
Graphene Elasticsearch/OpenSearch (DSL) integration

Home Page: https://pypi.org/project/graphene-elastic/



graphene-elastic

Elasticsearch (DSL)/ OpenSearch (DSL) integration for Graphene.


Prerequisites

  • Graphene 2.x. Support for Graphene 1.x is not intended.
  • Python 3.6, 3.7, 3.8, 3.9 and 3.10. Support for Python 2 is not intended.
  • Elasticsearch 6.x, 7.x. Support for Elasticsearch 5.x is not intended.
  • OpenSearch 1.x, 2.x.

Main features and highlights

  • ElasticsearchConnectionField and ElasticsearchObjectType are the core classes for working with Graphene.
  • Pluggable backends for searching, filtering, ordering, etc. Don't like existing ones? Override, extend or write your own.
  • Search backend.
  • Filter backend.
  • Ordering backend.
  • Pagination.
  • Highlighting backend.
  • Source filter backend.
  • Faceted search backend (including global aggregations).
  • Post filter backend.
  • Score filter backend.
  • Query string backend.
  • Simple query string backend.

See the Road-map for what is planned but not yet implemented.

Do you need a similar tool for Django REST Framework? Check django-elasticsearch-dsl-drf.

Demo

Check the live demo app (FastAPI + Graphene 2 + Elasticsearch 7) hosted on Heroku and bonsai.io.

Documentation

Documentation is available on Read the Docs.

Installation

Install latest stable version from PyPI:

pip install graphene-elastic

Or latest development version from GitHub:

pip install https://github.com/barseghyanartur/graphene-elastic/archive/master.zip

Note

Starting from version 0.8, the elasticsearch and elasticsearch-dsl packages are no longer installed by default. Either add them explicitly to your requirements or install them as optional dependencies: pip install graphene-elastic[elasticsearch]. Alternatively, you can use opensearch-py and opensearch-dsl: add them explicitly to your requirements or install them as optional dependencies: pip install graphene-elastic[opensearch].

Examples

Note

In the examples, we use the elasticsearch_dsl package for schema definition. You can, however, use opensearch_dsl or, if you want portability between Elasticsearch and OpenSearch, the anysearch package. Read more here.

Install requirements

pip install -r requirements.txt

Populate sample data

The following command will create indexes for User and Post documents and populate them with sample data:

./scripts/populate_elasticsearch_data.sh

Sample document definition

search_index/documents/post.py

See examples/search_index/documents/post.py for full example.

import datetime
from elasticsearch_dsl import (
    Boolean,
    Date,
    Document,
    InnerDoc,
    Keyword,
    Nested,
    Text,
    Integer,
    analyzer,
)

# Custom analyzer used by the `tags` field below.
html_strip = analyzer(
    'html_strip',
    tokenizer='standard',
    filter=['lowercase', 'stop', 'snowball'],
    char_filter=['html_strip'],
)

class Comment(InnerDoc):

    author = Text(fields={'raw': Keyword()})
    content = Text(analyzer='snowball')
    created_at = Date()

    def age(self):
        return datetime.datetime.now() - self.created_at


class Post(Document):

    title = Text(
        fields={'raw': Keyword()}
    )
    content = Text()
    created_at = Date()
    published = Boolean()
    category = Text(
        fields={'raw': Keyword()}
    )
    comments = Nested(Comment)
    tags = Text(
        analyzer=html_strip,
        fields={'raw': Keyword(multi=True)},
        multi=True
    )
    num_views = Integer()

    class Index:
        name = 'blog_post'
        settings = {
            'number_of_shards': 1,
            'number_of_replicas': 1,
            'blocks': {'read_only_allow_delete': None},
        }

Sample apps

Sample Flask app

Run the sample Flask app:

./scripts/run_flask.sh

Open Flask graphiql client

http://127.0.0.1:8001/graphql

Sample Django app

Run the sample Django app:

./scripts/run_django.sh runserver

Open Django graphiql client

http://127.0.0.1:8000/graphql

ConnectionField example

ConnectionField is the most flexible and feature-rich option. It uses filter backends, which you can combine to fit your needs in a declarative manner.

Sample schema definition

import graphene
from graphene import Node
from graphene_elastic import (
    ElasticsearchObjectType,
    ElasticsearchConnectionField,
)

# The Elasticsearch document defined earlier, assuming the sample
# module layout (examples/search_index/documents/post.py).
from search_index.documents.post import Post as PostDocument
from graphene_elastic.filter_backends import (
    FilteringFilterBackend,
    SearchFilterBackend,
    HighlightFilterBackend,
    OrderingFilterBackend,
    DefaultOrderingFilterBackend,
)
from graphene_elastic.constants import (
    LOOKUP_FILTER_PREFIX,
    LOOKUP_FILTER_TERM,
    LOOKUP_FILTER_TERMS,
    LOOKUP_FILTER_WILDCARD,
    LOOKUP_QUERY_EXCLUDE,
    LOOKUP_QUERY_IN,
)

# Object type definition
class Post(ElasticsearchObjectType):

    class Meta(object):
        document = PostDocument
        interfaces = (Node,)
        filter_backends = [
            FilteringFilterBackend,
            SearchFilterBackend,
            HighlightFilterBackend,
            OrderingFilterBackend,
            DefaultOrderingFilterBackend,
        ]

        # For `FilteringFilterBackend` backend
        filter_fields = {
            # The dictionary key (in this case `title`) is the name of
            # the corresponding GraphQL query argument. The dictionary
            # value could be simple or complex structure (in this case
            # complex). The `field` key points to the `title.raw`, which
            # is the field name in the Elasticsearch document
            # (`PostDocument`). Since the `lookups` key is provided,
            # the available lookups are limited to the given set, with
            # term as the default lookup (as specified in
            # `default_lookup`).
            'title': {
                'field': 'title.raw',
                # Available lookups
                'lookups': [
                    LOOKUP_FILTER_TERM,
                    LOOKUP_FILTER_TERMS,
                    LOOKUP_FILTER_PREFIX,
                    LOOKUP_FILTER_WILDCARD,
                    LOOKUP_QUERY_IN,
                    LOOKUP_QUERY_EXCLUDE,
                ],
                # Default lookup
                'default_lookup': LOOKUP_FILTER_TERM,
            },

            # The dictionary key (in this case `category`) is the name of
            # the corresponding GraphQL query argument. Since no lookups
            # or default_lookup is provided, defaults are used (all lookups
            # available, term is the default lookup). The dictionary value
            # (in this case `category.raw`) is the field name in the
            # Elasticsearch document (`PostDocument`).
            'category': 'category.raw',

            # The dictionary key (in this case `tags`) is the name of
            # the corresponding GraphQL query argument. Since no lookups
            # or default_lookup is provided, defaults are used (all lookups
            # available, term is the default lookup). The dictionary value
            # (in this case `tags.raw`) is the field name in the
            # Elasticsearch document (`PostDocument`).
            'tags': 'tags.raw',

            # The dictionary key (in this case `num_views`) is the name of
            # the corresponding GraphQL query argument. Since no lookups
            # or default_lookup is provided, defaults are used (all lookups
            # available, term is the default lookup). The dictionary value
            # (in this case `num_views`) is the field name in the
            # Elasticsearch document (`PostDocument`).
            'num_views': 'num_views',
        }

        # For `SearchFilterBackend` backend
        search_fields = {
            'title': {'boost': 4},
            'content': {'boost': 2},
            'category': None,
        }

        # For `OrderingFilterBackend` backend
        ordering_fields = {
            # The dictionary key (in this case `title`) is the name of
            # the corresponding GraphQL query argument. The dictionary
            # value (in this case `title.raw`) is the field name in the
            # Elasticsearch document (`PostDocument`).
            'title': 'title.raw',

            # The dictionary key (in this case `created_at`) is the name of
            # the corresponding GraphQL query argument. The dictionary
            # value (in this case `created_at`) is the field name in the
            # Elasticsearch document (`PostDocument`).
            'created_at': 'created_at',

            # The dictionary key (in this case `num_views`) is the name of
            # the corresponding GraphQL query argument. The dictionary
            # value (in this case `num_views`) is the field name in the
            # Elasticsearch document (`PostDocument`).
            'num_views': 'num_views',
        }

        # For `DefaultOrderingFilterBackend` backend
        ordering_defaults = (
            '-num_views',  # Field name in the Elasticsearch document
            'title.raw',  # Field name in the Elasticsearch document
        )

        # For `HighlightFilterBackend` backend
        highlight_fields = {
            'title': {
                'enabled': True,
                'options': {
                    'pre_tags': ["<b>"],
                    'post_tags': ["</b>"],
                }
            },
            'content': {
                'options': {
                    'fragment_size': 50,
                    'number_of_fragments': 3
                }
            },
            'category': {},
        }

# Query definition
class Query(graphene.ObjectType):
    all_post_documents = ElasticsearchConnectionField(Post)

# Schema definition
schema = graphene.Schema(query=Query)
Filter
Sample queries

Since we didn't specify any lookups on category, all lookups are available by default and the default lookup is term. Note that in the {value:"Elastic"} part, value stands for the default lookup, whatever it has been set to.

query PostsQuery {
  allPostDocuments(filter:{category:{value:"Elastic"}}) {
    edges {
      node {
        id
        title
        category
        content
        createdAt
        comments
      }
    }
  }
}
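Under the hood, such a filter argument is translated into an Elasticsearch query clause. The sketch below is illustrative only (the function name and structure are ours, not graphene-elastic's internals), but it shows how a GraphQL argument maps onto a term query, given a `filter_fields` definition like the one above:

```python
# Illustrative sketch only -- not graphene-elastic's actual internals.

def to_es_filter(arg_name, arg_value, filter_fields):
    """Translate one GraphQL filter argument into an ES query clause."""
    spec = filter_fields[arg_name]
    # A plain string value is shorthand for {'field': <string>}.
    field = spec['field'] if isinstance(spec, dict) else spec
    lookup, value = next(iter(arg_value.items()))
    if lookup == 'value':
        # `value` means: apply the configured default lookup.
        if isinstance(spec, dict):
            lookup = spec.get('default_lookup', 'term')
        else:
            lookup = 'term'
    return {lookup: {field: value}}


filter_fields = {'category': 'category.raw'}
print(to_es_filter('category', {'value': 'Elastic'}, filter_fields))
# {'term': {'category.raw': 'Elastic'}}
```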

But we could use another lookup (in the example below, terms). Note that in the {terms:["Elastic", "Python"]} part, terms is the lookup name.

query PostsQuery {
  allPostDocuments(
        filter:{category:{terms:["Elastic", "Python"]}}
    ) {
    edges {
      node {
        id
        title
        category
        content
        createdAt
        comments
      }
    }
  }
}

Or apply a gt (range) query in addition to filtering:

{
  allPostDocuments(filter:{
        category:{term:"Python"},
        numViews:{gt:"700"}
    }) {
    edges {
      node {
        category
        title
        comments
        numViews
      }
    }
  }
}
Implemented filter lookups

The following lookups are available:

  • contains
  • ends_with (or endsWith for camelCase)
  • exclude
  • exists
  • gt
  • gte
  • in
  • is_null (or isNull for camelCase)
  • lt
  • lte
  • prefix
  • range
  • starts_with (or startsWith for camelCase)
  • term
  • terms
  • wildcard

See dedicated documentation on filter lookups for more information.
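The camelCase variants are mechanical transforms of the snake_case lookup names; for illustration:

```python
# Derive the camelCase alias of a snake_case lookup name.
def to_camel_case(lookup):
    head, *rest = lookup.split('_')
    return head + ''.join(part.capitalize() for part in rest)


print(to_camel_case('starts_with'))  # startsWith
print(to_camel_case('is_null'))      # isNull
```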

Search

Search in all fields:

query {
  allPostDocuments(
    search:{query:"Release Box"}
  ) {
    edges {
      node {
        category
        title
        content
      }
    }
  }
}

Search in specific fields:

query {
  allPostDocuments(
    search:{
        title:{value:"Release", boost:2},
        content:{value:"Box"}
    }
  ) {
    edges {
      node {
        category
        title
        content
      }
    }
  }
}
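The boost values from search_fields translate, roughly, into per-field boosts on the underlying search query. A toy sketch of this mapping (illustrative, not the actual backend):

```python
# Illustrative: turn a `search_fields` definition into the kind of
# per-field boost list an Elasticsearch multi_match-style query takes.
def boosted_fields(search_fields):
    result = []
    for name, options in search_fields.items():
        boost = (options or {}).get('boost')
        result.append(f'{name}^{boost}' if boost else name)
    return result


print(boosted_fields({'title': {'boost': 4}, 'content': {'boost': 2}, 'category': None}))
# ['title^4', 'content^2', 'category']
```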
Ordering

Possible choices are ASC and DESC.

query {
  allPostDocuments(
        filter:{category:{term:"Photography"}},
        ordering:{title:ASC}
    ) {
    edges {
      node {
        category
        title
        content
        numViews
        tags
      }
    }
  }
}
Pagination

The first, last, before and after arguments are supported. By default, the number of results is limited to 100.

query {
  allPostDocuments(first:12) {
    pageInfo {
      startCursor
      endCursor
      hasNextPage
      hasPreviousPage
    }
    edges {
      cursor
      node {
        category
        title
        content
        numViews
      }
    }
  }
}
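Relay-style cursor pagination of this kind can be sketched in a few lines of plain Python. The cursor format below follows the common base64 "arrayconnection:&lt;offset&gt;" convention; the library's actual cursors may differ:

```python
import base64

# Toy illustration of relay-style cursor pagination with first/after.
# Cursor format is an assumption ("arrayconnection:<offset>"), not
# necessarily what graphene-elastic emits.

def offset_to_cursor(offset):
    return base64.b64encode(f'arrayconnection:{offset}'.encode()).decode()

def cursor_to_offset(cursor):
    return int(base64.b64decode(cursor).decode().rsplit(':', 1)[1])

def paginate(items, first=None, after=None):
    # `after` points at the last item of the previous page.
    start = cursor_to_offset(after) + 1 if after else 0
    end = len(items) if first is None else min(start + first, len(items))
    return [
        {'cursor': offset_to_cursor(i), 'node': items[i]}
        for i in range(start, end)
    ]


page = paginate(['a', 'b', 'c', 'd'], first=2)
next_page = paginate(['a', 'b', 'c', 'd'], first=2, after=page[-1]['cursor'])
print([edge['node'] for edge in next_page])  # ['c', 'd']
```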
Highlighting

Simply list the fields you want to highlight. This works only in combination with search.

query {
  allPostDocuments(
        search:{content:{value:"alice"}, title:{value:"alice"}},
        highlight:[category, content]
    ) {
    edges {
      node {
        title
        content
        highlight
      }
      cursor
    }
  }
}
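Elasticsearch produces the highlighted fragments server-side, wrapping matches in the configured pre/post tags. A toy client-side illustration of the effect (not what the library does internally):

```python
import re

# Toy illustration: wrap case-insensitive matches of `term` in the
# pre/post tags, mimicking what `highlight_fields` options configure
# on the server side.
def highlight(text, term, pre='<b>', post='</b>'):
    pattern = re.compile(f'({re.escape(term)})', re.IGNORECASE)
    return pattern.sub(rf'{pre}\1{post}', text)


print(highlight('Alice in Wonderland', 'alice'))
# <b>Alice</b> in Wonderland
```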

Road-map

Road-map and development plans.

This package is designed after django-elasticsearch-dsl-drf and is intended to offer similar functionality.

Lots of features are planned to be released in the upcoming Beta releases:

  • Suggester backend.
  • Nested backend.
  • Geo-spatial backend.
  • Filter lookup geo_bounding_box (or geoBoundingBox for camelCase).
  • Filter lookup geo_distance (or geoDistance for camelCase).
  • Filter lookup geo_polygon (or geoPolygon for camelCase).
  • More-like-this backend.

Stay tuned or reach out if you want to help.

Testing

The project is covered with tests.

Testing with Docker

make docker-test

Running tests with virtualenv or tox

By default, tests are executed against Elasticsearch 7.x.

Run Elasticsearch 7.x with Docker

docker-compose up elasticsearch

Install test requirements

pip install -r requirements/test.txt

To test with all supported Python versions type:

tox

To test against a specific environment, type:

tox -e py38-elastic7

To test just your working environment type:

./runtests.py

To run a single test module in your working environment type:

./runtests.py src/graphene_elastic/tests/test_filter_backend.py

To run a single test class in a given test module in your working environment type:

./runtests.py src/graphene_elastic/tests/test_filter_backend.py::FilterBackendElasticTestCase

Debugging

For development purposes, you can use the Flask app (easy to debug). The standard pdb works (import pdb; pdb.set_trace()). If ipdb does not work well for you, use ptpdb.

Writing documentation

Keep the following hierarchy.

=====
title
=====

header
======

sub-header
----------

sub-sub-header
~~~~~~~~~~~~~~

sub-sub-sub-header
^^^^^^^^^^^^^^^^^^

sub-sub-sub-sub-header
++++++++++++++++++++++

sub-sub-sub-sub-sub-header
**************************

License

GPL-2.0-only OR LGPL-2.1-or-later

Support

For any security issues contact me at the e-mail given in the Author section. For overall issues, go to GitHub.

Author

Artur Barseghyan <[email protected]>

graphene-elastic's People

Contributors

barseghyanartur, binary-sort, caffeinatedgaze, chpmrc, davidsmith166, epdasarathy, lingfromsh, philiplee15


graphene-elastic's Issues

Upgrade reqs

Including Django, Flask and Responder reqs and apps.

Retrieving ES relevancy score?

Hello, first off let me say that I love this integration and that it has made integrating ES with Graphene very easy. However, I am curious if it is possible to retrieve the calculated relevancy score of each search result? If it is possible, how would this be done using this library?

Refactoring

  • Since it's typical for backends to have input filter fields, make things like the filter_fields property and the filter_fields_mapping property more generic. In the filter backend class, it should be possible to define a class variable which points to the dictionary key that shall be used. That's all.

I can't install graphene_elastic

If I install with pip, either via the package or git+repo, I only get the examples dir:

15:24 $ pip install graphene-elastic
Collecting graphene-elastic
  Downloading https://files.pythonhosted.org/packages/66/8d/62c395eb625d4b7e51703e4eb3a53427339daea58d5ca59daab8b2179dd8/graphene_elastic-0.0.3-py3-none-any.whl
Requirement already satisfied: iso8601>=0.1.12 in /home/ignacio/infoxel/envs//lib/python3.5/site-packages (from graphene-elastic)
Requirement already satisfied: elasticsearch>=6.0 in /home/ignacio/infoxel/envs/wewelolo/lib/python3.5/site-packages (from graphene-elastic)
Requirement already satisfied: graphene<3,>=2.1.3 in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from graphene-elastic)
Requirement already satisfied: singledispatch>=3.4.0.3 in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from graphene-elastic)
Requirement already satisfied: elasticsearch-dsl>=6.0 in /home/ignacio/infoxel/envs//lib/python3.5/site-packages (from graphene-elastic)
Requirement already satisfied: urllib3>=1.21.1 in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from elasticsearch>=6.0->graphene-elastic)
Requirement already satisfied: six<2,>=1.10.0 in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from graphene<3,>=2.1.3->graphene-elastic)
Requirement already satisfied: graphql-core<3,>=2.1 in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from graphene<3,>=2.1.3->graphene-elastic)
Requirement already satisfied: promise<3,>=2.1 in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from graphene<3,>=2.1.3->graphene-elastic)
  Ignoring typing: markers 'python_version < "3.5"' don't match your environment
Requirement already satisfied: graphql-relay<1,>=0.4.5 in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from graphene<3,>=2.1.3->graphene-elastic)
Requirement already satisfied: aniso8601<4,>=3 in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from graphene<3,>=2.1.3->graphene-elastic)
Requirement already satisfied: python-dateutil in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from elasticsearch-dsl>=6.0->graphene-elastic)
Requirement already satisfied: rx>=1.6.0 in /home/ignacio/infoxel/envs/lib/python3.5/site-packages (from graphql-core<3,>=2.1->graphene<3,>=2.1.3->graphene-elastic)
Installing collected packages: graphene-elastic
Successfully installed graphene-elastic-0.0.3
15:24 $ python
Python 3.5.3 (default, Sep 27 2018, 17:25:39) 
[GCC 6.3.0 20170516] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import graphene_elastic
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named 'graphene_elastic'
>>> 
15:24 $ ls /home/ignacio/infoxel/envs/lib/python3.5/site-packages/graphene_elastic
ls: cannot access '/home/ignacio/infoxel/envs/lib/python3.5/site-packages/graphene_elastic': No such file or directory
15:24 $ ls /home/ignacio/infoxel/envs/lib/python3.5/site-packages/examples/
apps  factories  __init__.py  oauth.py  __pycache__  schema  search_index  starwars  starwars_relay  streaming.py
15:24 $ 

Support for complex bool queries

@barseghyanartur, I took a look at the django-es stuff. They seem to rely on a set of conventions for bool queries (must, should). I had started to implement something different by constructing the root query to have a couple of self-relations. The class as implemented looked like this. Don't take this code too literally; I just want to show the concept of using self-reference with and and or.

class BaseFilter(graphene.InputObjectType):

    @classmethod
    def __init_subclass_with_meta__(cls, *args, **kwargs):
        super().__init_subclass_with_meta__(*args, **kwargs)
        field = graphene.InputField(graphene.List(graphene.NonNull(cls)))
        cls._meta.fields.update({
            'or': field,
            'and': field,
            'match': graphene.Argument(
                graphene.String,
                description='abc [stuff](https://cnn.com)',
            ),
            'multi_match': field,
        })

    class Meta:
        abstract = True

Here, field is pretty simple, but one could replace field with a list of filtering and searching backends. Facets, postfilter, etc. would remain at the "top level" since it doesn't make sense for them to be included in bool operations.

If I again overlooked something in the docs, ignore this comment.

[Question]: Where does one configure the elasticsearch connection?

First, thanks for putting this library together. I'm working to get things up and running so that I might be able to contribute a bit.

I have an elasticsearch server that has a hostname and requires basic auth. Where can I configure this with graphene-elastic? I'm used to configuring the es client and then using it with basic elasticsearch_dsl Search.

I apologize if I have missed this in the docs and a quick scan of the code didn't point me to the right place.

Filter several fields using OR

I need to find records whose fields description OR searchKeys contain SEARCHED_STRING:

{
  productsSearching(
    filter: {
      description: {wildcard: "*SEARCHED_STRING*"},
      searchKeys: {wildcard: "*SEARCHED_STRING*"}
    }
  ) {
    edges {
      node {
        id,
        description,
        ownCode,
        searchKeys
      }
    }
  }
}

but I only achieve AND results

When we build using graphene_federation's build_schema instead of graphene.Schema we get a JSONString error

When we define a schema with graphene-elastic types and try to load it using graphene_federation's build_schema instead of graphene.Schema, we get an error:

the JSON object must be str, bytes or bytearray, not dict
Traceback (most recent call last):
  File "manage.py", line 12, in <module>
    execute_from_command_line(sys.argv)
  File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 381, in execute_from_command_line
    utility.execute()
  File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 375, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 224, in fetch_command
    klass = load_command_class(app_name, subcommand)
  File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 36, in load_command_class
    module = import_module('%s.management.commands.%s' % (app_name, name))
  File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/app/saleor/graphql/management/commands/get_graphql_elastic_schema.py", line 5, in <module>
    from ...api_elastic import elastic_schema
  File "/app/saleor/graphql/api_elastic.py", line 32, in <module>
    elastic_schema = build_schema(query=Query)
  File "/usr/local/lib/python3.8/site-packages/graphene_federation/main.py", line 20, in build_schema
    return graphene.Schema(query=_get_query(schema, query), **kwargs)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/schema.py", line 78, in __init__
    self.build_typemap()
  File "/usr/local/lib/python3.8/site-packages/graphene/types/schema.py", line 167, in build_typemap
    self._type_map = TypeMap(
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 80, in __init__
    super(TypeMap, self).__init__(types)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/typemap.py", line 28, in __init__
    self.update(reduce(self.reducer, types, OrderedDict()))  # type: ignore
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 88, in reducer
    return self.graphene_reducer(map, type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 117, in graphene_reducer
    return GraphQLTypeMap.reducer(map, internal_type)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/typemap.py", line 106, in reducer
    field_map = type.fields
  File "/usr/local/lib/python3.8/site-packages/graphql/pyutils/cached_property.py", line 22, in __get__
    value = obj.__dict__[self.func.__name__] = self.func(obj)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/definition.py", line 226, in fields
    return define_field_map(self, self._fields)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/definition.py", line 240, in define_field_map
    field_map = field_map()
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 275, in construct_fields_for_type
    map = self.reducer(map, field.type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 88, in reducer
    return self.graphene_reducer(map, type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 93, in graphene_reducer
    return self.reducer(map, type.of_type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 88, in reducer
    return self.graphene_reducer(map, type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 117, in graphene_reducer
    return GraphQLTypeMap.reducer(map, internal_type)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/typemap.py", line 96, in reducer
    for t in type.types:
  File "/usr/local/lib/python3.8/site-packages/graphql/pyutils/cached_property.py", line 22, in __get__
    value = obj.__dict__[self.func.__name__] = self.func(obj)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/definition.py", line 453, in types
    return define_types(self, self._types)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/definition.py", line 464, in define_types
    types = types()
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 249, in types
    self.graphene_reducer(map, objecttype)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 117, in graphene_reducer
    return GraphQLTypeMap.reducer(map, internal_type)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/typemap.py", line 106, in reducer
    field_map = type.fields
  File "/usr/local/lib/python3.8/site-packages/graphql/pyutils/cached_property.py", line 22, in __get__
    value = obj.__dict__[self.func.__name__] = self.func(obj)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/definition.py", line 226, in fields
    return define_field_map(self, self._fields)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/definition.py", line 240, in define_field_map
    field_map = field_map()
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 275, in construct_fields_for_type
    map = self.reducer(map, field.type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 88, in reducer
    return self.graphene_reducer(map, type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 93, in graphene_reducer
    return self.reducer(map, type.of_type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 88, in reducer
    return self.graphene_reducer(map, type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 117, in graphene_reducer
    return GraphQLTypeMap.reducer(map, internal_type)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/typemap.py", line 106, in reducer
    field_map = type.fields
  File "/usr/local/lib/python3.8/site-packages/graphql/pyutils/cached_property.py", line 22, in __get__
    value = obj.__dict__[self.func.__name__] = self.func(obj)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/definition.py", line 226, in fields
    return define_field_map(self, self._fields)
  File "/usr/local/lib/python3.8/site-packages/graphql/type/definition.py", line 240, in define_field_map
    field_map = field_map()
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 275, in construct_fields_for_type
    map = self.reducer(map, field.type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 88, in reducer
    return self.graphene_reducer(map, type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 93, in graphene_reducer
    return self.reducer(map, type.of_type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 88, in reducer
    return self.graphene_reducer(map, type)
  File "/usr/local/lib/python3.8/site-packages/graphene/types/typemap.py", line 97, in graphene_reducer
    assert _type.graphene_type == type, (

[Question]: How does one do filtering on date fields?

I'm interested in applying range and gt/lt filters on one or more date fields. It appears that those filters are hard-coded to be used with "Decimal" types. Do you have an approach in mind for doing so? I can probably map the dates to something like a unix timestamp, but that complicates input for filtering.

If one wanted to add such functionality (if it doesn't yet exist), where would you suggest starting, @barseghyanartur?

Add documentation on query string and simple query string backends

For simple query string:

    query PostsQuery {
      allPostDocuments(
        simpleQueryString:"'trouble candidate' +Python"
      ) {
        edges {
          node {
            id
            title
            category
            content
            createdAt
            comments
          }
        }
      }
    }

For query string:

    query PostsQuery {
      allPostDocuments(
        queryString:"(trouble candidate) AND Python"
      ) {
        edges {
          node {
            id
            title
            category
            content
            createdAt
            comments
          }
        }
      }
    }

Handle complex types conversion

At the moment, complex types such as elasticsearch_fields.Nested or elasticsearch_fields.Object are converted to JSONField. They should instead be converted into List and ObjectType.

Is it possible to combine graphene-elastic with graphene-django?

I'm working on a project that uses graphene-django and I'd like to add search functionality with graphene-elastic. Currently, most objects in my app (50+) are subclassing DjangoObjectType, i.e.:

class PostNode(DjangoObjectType):
    class Meta:
        model = Post
        interfaces = (graphene.Node,)

Elastic version could look something like this:

class PostElasticNode(ElasticsearchObjectType):
    class Meta:
        interfaces = (graphene.Node,)
        document = PostDocument

        filter_backends = [
            FilteringFilterBackend,
        ]

        filter_fields = {
            'title': {
                'field': 'title',
                'default_lookup': LOOKUP_FILTER_TERM,
            },
        }

So as far as I understand:

  1. DjangoObjectType determines what fields a GraphQL object has by looking at model = Post
  2. ElasticsearchObjectType does it by looking at document = PostDocument

I guess my question is - is there a way to somehow combine ElasticsearchObjectType and DjangoObjectType, so that the ElasticsearchConnectionField resolves query with PostNode type and uses PostElasticNode only to determine available filters?

Currently, as far as I understand, I'd have to rewrite all the existing nodes as ElasticsearchObjectTypes, create relevant elastic Documents, including relations. Is there an easier way to go about this issue?

Add tests

Test

  • Add initial test suite
  • Test filter backend
  • Test search backend
  • Test ordering backend
  • Test pagination

Filter backend tests

  • contains
  • ends_with (or endsWith for camelCase)
  • exclude
  • exists
  • geo_bounding_box (or geoBoundingBox for camelCase)
  • geo_distance (or geoDistance for camelCase)
  • geo_polygon (or geoPolygon for camelCase)
  • gt
  • gte
  • in
  • is_null (or isNull for camelCase)
  • lt
  • lte
  • prefix
  • range
  • starts_with (or startsWith for camelCase)
  • term
  • terms
  • wildcard

Search backend tests

  • All fields
  • Specific field
  • Boost

Ordering backend tests

  • Ordering backend
  • Default ordering backend

Pagination tests

  • first
  • last
  • before
  • after

Problems getting started with the lib

Hello @barseghyanartur

I'm having a little trouble getting everything up and running; I'm new to Elastic and GraphQL, maybe that's the reason 😆

I have a consumer (RabbitMQ) app that populates Elasticsearch's "animal" index with this kind of data [1].

I created another app (API) to consume this data from Elastic, and I'm trying to use graphene-elastic for that; this is what I have so far [2]. For now, I just want to filter on the "_id" field.

My problem so far:

File "/usr/local/lib/python3.6/site-packages/graphene_elastic/fields.py", line 89, in type
), "The type {} doesn't have a connection".format(_type.name)
AssertionError: The type Animal doesn't have a connection

Is that connection the Elasticsearch connection? Where should I declare my Elasticsearch hostname and port?

[1] https://gist.github.com/csarcom/a9980a8bf62358e6a0b9e019469c9cd9
[2] https://gist.github.com/csarcom/0873f6e7816f7b75240a3828e11a7136

How to use connection between indices

I looked in the documentation and couldn't find anything.
Say I have some indices defined like this:

from elasticsearch_dsl import Document, Text

class Student(Document):
    name = Text()
    teacher = Text()

class Teacher(Document):
    name = Text()

If the teacher field holds the Teacher id, how can I create a connection between the student and the teacher?

[Question] First example throws RequestError

Hi, great job starting this lib.

I am trying out the library: created the virtualenv, set up all the requirements and populated the Elasticsearch data, all from the script, as explained.

After I start up the flask app example, the first query example in the documentation throws an error:

query PostsQuery {
  allPostDocuments(filter:{category:{value:"Elastic"}}) {
    edges {
      node {
        id
        title
        category
        content
        createdAt
        comments
      }
    }
  }
}

Response:

{
  "errors": [
    {
      "message": "RequestError(400, 'search_phase_execution_exception', 'No mapping found for [title.raw] in order to sort on')",
      "locations": [
        {
          "line": 2,
          "column": 3
        }
      ],
      "path": [
        "allPostDocuments"
      ]
    }
  ],
  "data": {
    "allPostDocuments": null
  }
}

Any hints?

Improve tests

Filter backend tests

  • geo_bounding_box (or geoBoundingBox for camelCase)
  • geo_distance (or geoDistance for camelCase)
  • geo_polygon (or geoPolygon for camelCase)

Search backend tests

  • Boost

Pagination tests

  • last
  • before
  • after

[Question] Mutations and Subscriptions?

Hi,

Thanks for creating this library. I was unable to find code examples of mutations or subscriptions. Is it possible to accomplish these operations with this library?

Thank you for your time.

Specifying facets in query with FacetedSearchFilter does not limit aggs.

Here is what I am seeing when I specify only one facet in the query:

query newQuery {
  document (first:1, facets: [published]){
    edges {
      node {
        id
        accession
      }
      cursor
    }
    facets 
  }
}

and here is the response to that query, which includes both of the available facets:

"facets": {
        "center_name": {
          "doc_count": 215818,
          "aggs": {
            "doc_count_error_upper_bound": 0,
            "sum_other_doc_count": 32624,
            "buckets": [
...
            ]
          }
        },
        "published": {
          "doc_count": 215818,
          "aggs": {
            "buckets": [
...
            ]
          }

RELAY_CONNECTION_MAX_LIMIT not being respected

Hello, I am trying to raise the maximum limit for the first argument. I added RELAY_CONNECTION_MAX_LIMIT to my GRAPHENE setting in Django. The override works for database GraphQL queries, but if you try to request more than 100 items for an ElasticsearchObjectType, it errors out and reports that the requested number of items exceeds the limit.

I've been able to correct this by modifying the default value in settings.py. It appears that the issue may be related to the IMPORT_STRINGS variable in your settings.py.
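For reference, the Django-side override the report describes looks like this (a sketch; the schema path is hypothetical, and per the report graphene-elastic currently falls back to its own default instead of reading this value):

```python
# settings.py (Django). The SCHEMA path below is hypothetical.
GRAPHENE = {
    "SCHEMA": "myapp.schema.schema",
    "RELAY_CONNECTION_MAX_LIMIT": 500,
}
```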

Possibility to use the request object in Filter backends

Hey, just found this project, looks really good!

While testing this out on my project (which currently uses django-elasticsearch-dsl-drf), I wanted to pre-filter the queryset to only the documents the user has access to. But to do that I need the request object.

The filter backend I currently use in django-elasticsearch-dsl-drf (and am trying to replicate) looks like this:

from elasticsearch_dsl import Q as E_Q

def filter_queryset(self, request, queryset, view):
    return queryset.filter(
        E_Q(
            'nested',
            path='followers',
            query=E_Q(
                'match',
                followers__id=request.user.id
            )
        ) | E_Q(
            'nested',
            path='job.followers',
            query=E_Q('match', job__followers__id=request.user.id)
        )
    )

Any idea how I could solve it with your project?
