
Introduction

Welcome to project Druidry!

License: Apache License 2

Druid is an extremely popular tool for performing OLAP queries on event data, and it drives real-time dashboards in many organisations today. We at Zapr love Druid! We therefore want to contribute towards making Druid even friendlier to its ever-expanding community.

We want to make deep, meaningful conversations with Druid a little easier. Concretely, we don't want developers to hand-write big, scary JSON anymore; instead, they can use a simple Java-based query generator.

Hand-crafting JSON invites tedious bugs such as data-type mistakes or spelling mistakes, and the code tends to grow bigger, messier, and less readable. We want the main focus of querying to be the use case, not the type checks.

We would be excited to know whether you liked it or loved it, so please reach out to us at [email protected]

Description

Druidry is an open-source, Java-based utility library that supports building queries to Druid while automatically taking care of the following:

  • Type checking.
  • Spelling checks.
  • Code reviewability and readability.

This library is still growing and does not yet support each and every construct; however, it supports the most common ones used internally at Zapr.

Getting Started

Prerequisite

  • Maven
  • Java 8

Usage

Add this to your pom.xml (assuming a Maven-based project):

        <dependency>
            <groupId>in.zapr.druid</groupId>
            <artifactId>druidry</artifactId>
            <version>${LATEST_VERSION}</version>
        </dependency>

Replace ${LATEST_VERSION} with the latest release version.

Examples

Taking Druid's example query:

{
     "queryType": "topN",
     "dataSource": "sample_data",
     "dimension": "sample_dim",
     "threshold": 5,
     "metric": "count",
     "granularity": "all",
     "filter": {
        "type": "and",
          "fields": [
            {
              "type": "selector",
              "dimension": "dim1",
              "value": "some_value"
            },
            {
              "type": "selector",
              "dimension": "dim2",
              "value": "some_other_val"
            }
          ]
     },
     "aggregations": [
        {
          "type": "longSum",
          "name": "count",
          "fieldName": "count"
        },
        {
          "type": "doubleSum",
          "name": "some_metric",
          "fieldName": "some_metric"
        }
     ],
     "postAggregations": [
         {
            "type": "arithmetic",
            "name": "sample_divide",
            "fn": "/",
            "fields": [
              {
                "type": "fieldAccess",
                "name": "some_metric",
                "fieldName": "some_metric"
              },
              {
                "type": "fieldAccess",
                "name": "count",
                "fieldName": "count"
              }
            ]
         }
      ],
      "intervals": [
        "2013-08-31T00:00:00.000/2013-09-03T00:00:00.000"
      ]
}
The equivalent query built with Druidry:

SelectorFilter selectorFilter1 = new SelectorFilter("dim1", "some_value");
SelectorFilter selectorFilter2 = new SelectorFilter("dim2", "some_other_val");

AndFilter filter = new AndFilter(Arrays.asList(selectorFilter1, selectorFilter2));

DruidAggregator aggregator1 = new LongSumAggregator("count", "count");
DruidAggregator aggregator2 = new DoubleSumAggregator("some_metric", "some_metric");

FieldAccessPostAggregator fieldAccessPostAggregator1
        = new FieldAccessPostAggregator("some_metric", "some_metric");

FieldAccessPostAggregator fieldAccessPostAggregator2
        = new FieldAccessPostAggregator("count", "count");

DruidPostAggregator postAggregator = ArithmeticPostAggregator.builder()
        .name("sample_divide")
        .function(ArithmeticFunction.DIVIDE)
        .fields(Arrays.asList(fieldAccessPostAggregator1, fieldAccessPostAggregator2))
        .build();

DateTime startTime = new DateTime(2013, 8, 31, 0, 0, 0, DateTimeZone.UTC);
DateTime endTime = new DateTime(2013, 9, 3, 0, 0, 0, DateTimeZone.UTC);
Interval interval = new Interval(startTime, endTime);

Granularity granularity = new SimpleGranularity(PredefinedGranularity.ALL);
DruidDimension dimension = new SimpleDimension("sample_dim");
TopNMetric metric = new SimpleMetric("count");

DruidTopNQuery query = DruidTopNQuery.builder()
        .dataSource("sample_data")
        .dimension(dimension)
        .threshold(5)
        .topNMetric(metric)
        .granularity(granularity)
        .filter(filter)
        .aggregators(Arrays.asList(aggregator1, aggregator2))
        .postAggregators(Collections.singletonList(postAggregator))
        .intervals(Collections.singletonList(interval))
        .build();

ObjectMapper mapper = new ObjectMapper();
String requiredJson = mapper.writeValueAsString(query);
DruidConfiguration config = DruidConfiguration
               .builder()
               .host("druid.io")
               .endpoint("druid/v2/")
               .build();

DruidClient client = new DruidJerseyClient(config);
client.connect();
List<DruidResponse> responses = client.query(query, DruidResponse.class);
client.close();

Supported Features

Queries

  • Aggregation Queries
    • TopN
    • TimeSeries
    • GroupBy
  • DruidScanQuery
  • DruidSelectQuery

Aggregators

  • Cardinality
  • Count
  • DoubleMax
  • DoubleMin
  • DoubleSum
  • DoubleLast
  • DoubleFirst
  • FloatFirst
  • FloatLast
  • Filtered
  • HyperUnique
  • Javascript
  • LongMax
  • LongMin
  • LongSum
  • LongFirst
  • LongLast
  • DistinctCount
  • Histogram
  • Data Sketches
    • ThetaSketch
    • TupleSketch
    • QuantilesSketch
    • HllSketchBuild
    • HllSketchMerge

Filters

  • And
  • Bound
  • In
  • Interval (Without Extraction Function)
  • Javascript
  • Not
  • Or
  • Regex
  • Search (Without Extraction Function)
  • Selector

Post Aggregators

  • Arithmetic
  • Constant
  • FieldAccess
  • HyperUniqueCardinality
  • Javascript
  • Data Sketches
    • Theta Sketch
      • ThetaSketchEstimate
      • ThetaSketchSetOp
    • Tuple Sketch
      • TupleSketchToEstimate
      • TupleSketchToEstimateAndBounds
      • TupleSketchToNumEntries
      • TupleSketchToMeans
      • TupleSketchToVariances
      • TupleSketchToQuantilesSketch
      • TupleSketchSetOp
      • TupleSketchTTest
      • TupleSketchToString
    • Quantiles Sketch
      • QuantilesSketchToQuantile
      • QuantilesSketchToQuantiles
      • QuantilesSketchToHistogram
      • QuantilesSketchToString
    • HLL Sketch
      • HllSketchEstimateWithBounds
      • HllSketchUnion
      • HllSketchToString

Virtual Columns

  • Expression

Granularity

  • Duration
  • Period
  • Predefined

Contact

For any feature requests or bugs, please raise an issue in the issues section.

For anything else, get in touch with us at [email protected]

druidry's People

Contributors

abhi-zapr, abhi195, dependabot[bot], gagangupt16, gg-zapr, hari-om-888, hgvanpariya, jacekc3, jonarzz, moezubair, nihit-zapr, parul-zapr, patilvikram, senthilec566, siddharthsa, tunix, vchimishuk


druidry's Issues

Enum type resolution problem?

The ObjectMapper's writeValueAsString method handles the enum type properly, as seen in the console. However, when the query reaches the query method defined in DruidJerseyClient, the Entity.entity method seems to serialize the enum type (QueryType) improperly. From the server log, it seems the query couldn't be parsed. My Druidry version is 2.3.

@Override
public <T> List<T> query(DruidQuery druidQuery, Class<T> className) throws QueryException {
    try (Response response = this.queryWebTarget
            .request(MediaType.APPLICATION_JSON)
            .post(Entity.entity(druidQuery, MediaType.APPLICATION_JSON))) {
        ...

The server log shows "Exception occurred on request [unparsable query]
com.fasterxml.jackson.databind.JsonMappingException: Could not resolve type id 'TIMESERIES' into a subtype of [simple type, class io.druid.query.Query]"

Thanks for any comments.

how to query null dimension?

Hi,
I want to query Druid like this SQL: select * from table_a where name is null. I used EXPLAIN PLAN FOR in dsql, and it shows a filter of: {"type":"selector","dimension":"name","value":null,"extractionFn":null}. But I get an error when using SelectorFilter("name", null), and SelectorFilter("name") fails with a private-access error. How should I express this query?

Druidry DruidJerseyClient Connection Issue

I am trying to use Druidry (latest version: 2.13).
Connection code:

DruidConfiguration config = DruidConfiguration.builder()
        .protocol(DruidQueryProtocol.HTTP)
        .host("<host>")
        .port(8082)
        .endpoint("druid/v2/")
        .concurrentConnectionsRequired(5)
        .build();
ClientConfig clientConfig = new ClientConfig();
clientConfig.register(JacksonFeature.class);
DruidJerseyClient client = new DruidJerseyClient(config, clientConfig);
client.connect();
client.close();

Error as below:
{ "timestamp": "2019-06-03T08:11:07.597+0000", "status": 500, "error": "Internal Server Error", "message": "javax.ws.rs.core.UriBuilder.uri(Ljava/lang/String;)Ljavax/ws/rs/core/UriBuilder;", "trace": "java.lang.AbstractMethodError: javax.ws.rs.core.UriBuilder.uri(Ljava/lang/String;)Ljavax/ws/rs/core/UriBuilder;\n\tat javax.ws.rs.core.UriBuilder.fromUri(UriBuilder.java:120)\n\tat org.glassfish.jersey.client.JerseyWebTarget.<init>(JerseyWebTarget.java:72)\n\tat org.glassfish.jersey.client.JerseyClient.target(JerseyClient.java:344)\n\tat org.glassfish.jersey.client.JerseyClient.target(JerseyClient.java:80)\n\tat in.zapr.druid.druidry.client.DruidJerseyClient.connect(DruidJerseyClient.java:80)\n\tat <>.getAvgListsPerCustomer(ListsController.java:191)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.lang.reflect.Method.invoke(Method.java:498)\n\tat org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:189)\n\tat org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:138)\n\tat org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:102)\n\tat org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:892)\n\tat org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:797)\n\tat org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87)\n\tat org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1038)\n\tat 
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942)\n\tat org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005)\n\tat org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897)\n\tat javax.servlet.http.HttpServlet.service(HttpServlet.java:634)\n\tat org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882)\n\tat javax.servlet.http.HttpServlet.service(HttpServlet.java:741)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:92)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat 
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)\n\tat org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:200)\n\tat org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)\n\tat org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:490)\n\tat org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:139)\n\tat org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92)\n\tat org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)\n\tat org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:343)\n\tat org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:408)\n\tat org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)\n\tat org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:834)\n\tat org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1415)\n\tat org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)\n\tat java.lang.Thread.run(Thread.java:748)\n", "path": "<>" }

How can I resolve this?

Thank you.

Kerberos support for client

Is anyone using the DruidClient to make requests to druid cluster that's protected by kerberos authentication?

does this druid client conflict with fastjson?

When I import fastjson in the pom, the response code is 500; removing it makes things work again.
Code:

public static void main(String[] args) {
        DateTime startTime = new DateTime(2016, 1, 1, 0,
                0, 0, DateTimeZone.UTC);
        DateTime endTime = new DateTime(2017, 1, 2, 0,
                0, 0, DateTimeZone.UTC);
        Interval interval = new Interval(startTime, endTime);
        PagingSpec pagingSpec = new PagingSpec(5, new HashMap<>());
        Granularity granularity = new SimpleGranularity(PredefinedGranularity.ALL);
        DruidSelectQuery query = DruidSelectQuery.builder()
                .dataSource("wikipedia")
                .descending(false)
                .granularity(granularity)
                .intervals(Collections.singletonList(interval))
                .pagingSpec(pagingSpec)
                .build();
        try {
            DruidConfiguration config = DruidConfiguration
                    .builder()
                    .host("localhost")
                    .port(8082)
                    .endpoint("druid/v2/")
                    .build();
            DruidClient client = new DruidJerseyClient(config);
            client.connect();
            String query1 = client.query(query);
            System.out.println(query1);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

exception:

in.zapr.druid.druidry.client.exception.QueryException: null
	at in.zapr.druid.druidry.client.DruidJerseyClient.handleInternalServerResponse(DruidJerseyClient.java:145)
	at in.zapr.druid.druidry.client.DruidJerseyClient.query(DruidJerseyClient.java:108)
	at com.aaa.demo.springboot.DemoApplicationTests.main(DemoApplicationTests.java:50)
in.zapr.druid.druidry.client.exception.QueryException
	at in.zapr.druid.druidry.client.DruidJerseyClient.handleInternalServerResponse(DruidJerseyClient.java:145)
	at in.zapr.druid.druidry.client.DruidJerseyClient.query(DruidJerseyClient.java:108)
	at com.aaa.demo.springboot.DemoApplicationTests.main(DemoApplicationTests.java:50)
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.2.58</version>
        </dependency>

Issue with TopN Query

Hi,

I am facing an issue with a TopN query. When I generate a Druidry topN query in code and query Druid with it (client.query()), I get the following exception:

javax.ws.rs.ProcessingException: Error reading entity from input stream.

Complete trace:

Caused by: com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize instance of java.util.LinkedHashMap out of START_ARRAY token
at [Source: (org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$UnCloseableInputStream); line: 1, column: 51] (through reference chain: java.util.ArrayList[0]->com.uipath.analytics.dataPlatform.store.DruidResponse["result"])
at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1342) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.DeserializationContext.handleUnexpectedToken(DeserializationContext.java:1138) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.DeserializationContext.handleUnexpectedToken(DeserializationContext.java:1092) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.std.StdDeserializer._deserializeFromEmpty(StdDeserializer.java:599) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:360) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:29) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.impl.FieldProperty.deserializeAndSet(FieldProperty.java:136) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:288) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:151) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:286) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:245) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:27) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.ObjectReader._bind(ObjectReader.java:1574) ~[jackson-databind-2.9.6.jar:2.9.6]
at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:965) ~[jackson-databind-2.9.6.jar:2.9.6]
at org.glassfish.jersey.jackson.internal.jackson.jaxrs.base.ProviderBase.readFrom(ProviderBase.java:838) ~[jersey-media-json-jackson-2.26.jar:na]
at org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$TerminalReaderInterceptor.invokeReadFrom(ReaderInterceptorExecutor.java:257) ~[jersey-common-2.26.jar:na]
at org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$TerminalReaderInterceptor.aroundReadFrom(ReaderInterceptorExecutor.java:236) ~[jersey-common-2.26.jar:na]
at org.glassfish.jersey.message.internal.ReaderInterceptorExecutor.proceed(ReaderInterceptorExecutor.java:156) ~[jersey-common-2.26.jar:na]
at org.glassfish.jersey.message.internal.MessageBodyFactory.readFrom(MessageBodyFactory.java:1091) ~[jersey-common-2.26.jar:na]
at org.glassfish.jersey.message.internal.InboundMessageContext.readEntity(InboundMessageContext.java:874) ~[jersey-common-2.26.jar:na]
... 77 common frames omitted

2018-10-17 12:28:34 INFO DruidReader:66 - Druid execution exception :in.zapr.druid.druidry.client.exception.QueryException: javax.ws.rs.ProcessingException: Error reading entity from input stream.

The query which was generated is:

{
"dataSource": "system-process-insights",
"queryType": "topN",
"intervals": ["2018-10-06T11:30:00.000+05:30/2018-10-06T12:30:00.000+05:30"],
"granularity": "hour",
"aggregations": [{
"type": "longSum",
"name": "a1",
"fieldName": "memUsedVirtual"
}, {
"type": "longSum",
"name": "a2",
"fieldName": "memUsedRam"
}],
"dimension": "processName",
"threshold": 5,
"metric": "a1"
}

If I run this same query directly (using curl), it gives the proper result, but through Druidry it fails.
Other queries (groupBy/timeSeries) are fine, and if I change this query's type to groupBy it works.
Please help.
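For context, the response shapes below follow Druid's native query documentation (the row values are illustrative, and the DruidResponse class mentioned in the trace is the reporter's own). A topN response carries "result" as a JSON array of rows, while a timeseries response carries it as a single object, so a POJO whose "result" field is a Map can deserialize timeseries responses but fails on topN with exactly this START_ARRAY error:

```
// topN: "result" is an array of rows
[ { "timestamp": "2018-10-06T11:00:00.000+05:30",
    "result": [ { "processName": "someProcess", "a1": 123, "a2": 45 } ] } ]

// timeseries: "result" is a single object
[ { "timestamp": "2018-10-06T11:00:00.000+05:30",
    "result": { "a1": 123, "a2": 45 } } ]
```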

Interval returns different format than expected

Hi
The interval returns this format:

return String.format(DRUID_INTERVAL_FORMAT, startTime.toDateTimeISO(), endTime.toDateTimeISO());
which results in:

"intervals": ["2018-11-19T00:00:00.000-05:00/2018-11-20T00:00:00.000-05:00"],

However, my query requires the format to be:

"intervals": ["2018-11-01T00:00:00.000Z/2018-11-02T00:00:00.000Z"]

Any idea on how to fix that?
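One workaround is to normalize both endpoints to UTC before the interval string is built, which makes ISO-8601 formatting emit the Z suffix instead of a local offset. A stdlib-only sketch using java.time (the README's own example uses Joda-Time, where constructing the DateTime with DateTimeZone.UTC has the same effect; the helper name here is hypothetical):

```java
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class IntervalFormat {

    // ISO-8601 with milliseconds; 'X' prints "Z" when the offset is zero.
    static final DateTimeFormatter ISO_MILLIS =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSX");

    // Hypothetical helper: renders a start/end pair as a Druid interval
    // string, converting both endpoints to UTC first.
    static String toUtcInterval(ZonedDateTime start, ZonedDateTime end) {
        return start.withZoneSameInstant(ZoneOffset.UTC).format(ISO_MILLIS)
                + "/" + end.withZoneSameInstant(ZoneOffset.UTC).format(ISO_MILLIS);
    }

    public static void main(String[] args) {
        // Midnight in a local zone becomes an offset instant in UTC.
        ZonedDateTime start = ZonedDateTime.of(2018, 11, 1, 0, 0, 0, 0,
                ZoneId.of("America/New_York"));
        System.out.println(toUtcInterval(start, start.plusDays(1)));
        // 2018-11-01T04:00:00.000Z/2018-11-02T04:00:00.000Z
    }
}
```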

granularity setting

How should I set the timezone of the granularity to Asia/Hong_Kong? And where can I find the API documentation? Thanks!
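For reference, Druid's native queries express a zoned granularity as a period granularity (field names per Druid's granularity documentation), and Druidry lists Period among its supported granularities, so the JSON to aim for is:

```json
{
  "type": "period",
  "period": "P1D",
  "timeZone": "Asia/Hong_Kong"
}
```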

Adding a client version that supports legacy version of jersey (1.*)

I'm working on a legacy project that uses an old version of jersey (com.sun.jersey 1.19.1), which is not compatible with Druidry (using org.glassfish.jersey 2.*). Upgrading is currently a pain, since lots of other dependencies depend on it.

I looked at the code of Druidry, and it seems that Jersey-client is only used to create the DruidJerseyClient class. I am wondering whether it would be simple enough to have a Druidry version that supports the legacy version of Jersey. If so, that'd be great.

Thanks!

Select query with pagingSpec

I am trying to use a select query with pagination, but it is not working. After connecting to Druid using Druidry, the query passes through DruidClient but the response comes back null. I checked the same query in JSON form using a curl command, and that worked. Please reply as soon as possible.

add no-args constructor to DruidGroupByQuery class

Hi,
when executing a query with this class on AWS Lambda I am getting:
MessageBodyWriter not found for media type=application/json, type=class in.zapr.druid.druidry.query.aggregation.DruidGroupByQuery, genericType=class in.zapr.druid.druidry.query.aggregation.DruidGroupByQuery.

15:41:38
15:41:38.581 [main] ERROR in.zapr.druid.druidry.client.DruidJerseyClient - Exception while querying {}

It seems that the solution is to add a constructor with no args.

DruidJerseyClient is not thread-safe

The connect() and close() methods in the DruidJerseyClient class are not thread-safe, which will throw an exception in a multi-threaded environment.


DruidJerseyClient Exception with spark 2.3

When I use Druidry in Spark Streaming (version: 2.3.2), I get the following error:

19/03/22 15:27:00 ERROR DruidJerseyClient: Exception while querying {}
in.zapr.druid.druidry.client.exception.QueryException
at in.zapr.druid.druidry.client.DruidJerseyClient.handleInternalServerResponse(DruidJerseyClient.java:128)
at in.zapr.druid.druidry.client.DruidJerseyClient.query(DruidJerseyClient.java:91)

Any help for this?

Removal of JAX-RS Dependency

Hi,

druidry uses Jersey for communication over HTTP. This causes issues if the host app includes dependencies that still use JAX-RS 1.x. An example is a Spring Boot app running Spring Cloud and therefore libraries such as Netflix's Eureka, Feign, and Ribbon. There are open issues in the Spring Cloud Netflix project for this.

Would you consider removing JAX-RS and use something else instead?

TimeFormatExtractionFunction can't extract the day-of-week number

TimeFormatExtractionFunction uses SimpleDateFormat for its format, but the patterns in that class differ from Joda's DateTimeFormat. DateTimeFormat has the 'e' pattern for the day-of-week number, whereas SimpleDateFormat uses the pattern 'u' and does not support 'e'.
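The mismatch above can be demonstrated with the stdlib alone, independent of Druidry: SimpleDateFormat knows 'u' (ISO day-of-week number, 1 = Monday) but rejects Joda's 'e' pattern letter.

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.GregorianCalendar;

public class DayOfWeekPattern {

    // SimpleDateFormat's day-of-week number pattern is 'u' (1 = Monday).
    public static String dayOfWeek(Date date) {
        return new SimpleDateFormat("u").format(date);
    }

    public static void main(String[] args) {
        // 2013-08-31 is a Saturday (calendar months are 0-based: 7 = August).
        Date saturday = new GregorianCalendar(2013, 7, 31).getTime();
        System.out.println(dayOfWeek(saturday)); // prints "6"

        try {
            new SimpleDateFormat("e"); // Joda-only pattern letter
        } catch (IllegalArgumentException ex) {
            System.out.println("'e' is not a valid SimpleDateFormat pattern");
        }
    }
}
```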

Is the postAggregations type "expression" not supported?

Through the 'case when' statement on the /console/druid/ page, this JSON is obtained:

{
  "queryType": "groupBy",
  "dataSource": { "type": "table", "name": "dw_action_sec_data_test" },
  "intervals": {
    "type": "intervals",
    "intervals": [ "-146136543-09-08T08:23:32.096Z/146140482-04-24T15:36:27.903Z" ]
  },
  "virtualColumns": [],
  "filter": null,
  "granularity": { "type": "all" },
  "dimensions": [
    { "type": "default", "dimension": "actionId", "outputName": "d0", "outputType": "LONG" },
    { "type": "default", "dimension": "applicationId", "outputName": "d1", "outputType": "LONG" }
  ],
  "aggregations": [
    { "type": "count", "name": "a0" },
    { "type": "longSum", "name": "a1", "fieldName": "reqCount", "expression": null },
    { "type": "longSum", "name": "a2", "fieldName": "errorCount", "expression": null }
  ],
  "postAggregations": [
    {
      "type": "expression",
      "name": "p0",
      "expression": "case_searched(((\"a1\" - \"a2\") > 1),1,11110)",
      "ordering": null
    }
  ],
  "having": null,
  "limitSpec": { "type": "default", "columns": [], "limit": 100 },
  "context": { "sqlOuterLimit": 100, "sqlQueryId": "d45d2637-1a15-43aa-9e6f-fe86a7b281f7" },
  "descending": false
}

The type of the postAggregation here is 'expression', but I have not found a corresponding type in druidry. Is this type not supported? Or did I just not find the right class?

Use Java 8's java.time module instead of joda

I wanted to bring this up in case it's something you'd consider. In projects that have already adopted Java 8's java.time package, it's frustrating to add the joda package back and convert between objects.

411 Length Required

After querying Druid I got HTTP status code 411. Setting the property below on the ClientConfig resolved it:

            this.jerseyConfig.property(ClientProperties.REQUEST_ENTITY_PROCESSING, RequestEntityProcessing.BUFFERED);

Druid Authentication

Hi!
I have added an authentication step in my Druid environment. How can I continue using this library?

The type field inside IntervalFilter is never initialized; should it be initialized to "interval"?

public class IntervalFilter extends DruidFilter {

    private static String INTERVAL_DRUID_FILTER_TYPE = "interval";

    private String type;
    private String dimension;
    private List<Interval> intervals;

    public IntervalFilter(String dimension, List<Interval> intervals) {
        this.dimension = dimension;
        this.intervals = intervals;
        // note: this.type is never set to INTERVAL_DRUID_FILTER_TYPE
    }

    // TODO: support for Extraction Function
}

Build is platform dependent

While building with Maven, we should specify the encoding so the build doesn't pick up the encoding of the machine where it is built.

Builder for DruidSearchQuery

There are builders on the aggregation queries such as DruidGroupByQuery and DruidTopNQuery, but there is no builder on DruidSearchQuery.

Druid Connection pool shuts down once close is called and connection is not set even after calling connect again

I have an API which does something like this:

runDruidQuery(DruidQuery query)
{
    DruidConfiguration config = DruidConfiguration.builder().host(appConfig.getDruidHost())
            .port(appConfig.getDruidPort()).endpoint(appConfig.getDruidEndpoint()).build();
    client = new DruidJerseyClient(config);
    client.connect();
    client.query(query);
    client.close();
}
This API works fine the first time, but on the second call I get:
in.zapr.druid.druidry.client.exception.QueryException: javax.ws.rs.ProcessingException: java.lang.IllegalStateException: Connection pool shut down

I tried removing the close() call. With that, the second call works, but I still get the exception randomly after some time.
Do I need to configure a timeout somewhere?

DruidJerseyClient error

Hi,
I use Druidry like this:

        DruidTimeSeriesQuery query = DruidTimeSeriesQuery.builder()
                .dataSource("druid_test_2")
                .granularity(granularity)
                .intervals(Collections.singletonList(interval))
                .descending(true)
                .filter(filter)
                .aggregators(Collections.singletonList(aggregator1))
                .intervals(Collections.singletonList(interval))
                .build();
        ObjectMapper mapper = new ObjectMapper();
        String requiredJson = mapper.writeValueAsString(query);
        DruidConfiguration config = DruidConfiguration
                .builder()
                .protocol(DruidQueryProtocol.HTTP)
                .host("my druid host")
                .port(8082)
                .endpoint("druid/v2/")
                .concurrentConnectionsRequired(5)
                .build();
        DruidClient client = new DruidJerseyClient(config);
        client.connect();
        List<DruidResponse> responses = client.query(query,DruidResponse.class);

and I get the following error:

{"error":"Unknown exception","errorMessage":"Could not resolve type id 'TIMESERIES' into a subtype of [simple type, class org.apache.druid.query.Query]: known type ids = [Query, dataSourceMetadata, groupBy, scan, search, segmentMetadata, select, timeBoundary, timeseries, topN]\n at [Source: HttpInputOverHTTP@167674ce[c=1242,q=0,[0]=null,s=STREAM]; line: 1, column: 1217]","errorClass":"com.fasterxml.jackson.databind.JsonMappingException","host":null}

Java 11 compatibility

The current version is not compatible with Java 11 because of an outdated Lombok library. I think that to make it usable with Java 11, Lombok must be updated from version 1.16.14 to the latest version, 1.18.10. Could you apply this change?
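For reference, the suggested bump in druidry's pom.xml would look like this sketch (version number taken from the report above; the provided scope is an assumption about how the project consumes Lombok):

```xml
<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <version>1.18.10</version>
    <scope>provided</scope>
</dependency>
```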

QueryException

When I checked, I got an error: in.zapr.druid.druidry.client.exception.QueryException: javax.ws.rs.ProcessingException: java.net.UnknownHostException: http
