onyx-kafka's People

Contributors

ckarlsen84, enragedginger, gardnervickers, jcsims, lbradstreet, magemasher, mccraigmccraig, michaeldrogalis, teaforthecat, vise890, vsolovyov

onyx-kafka's Issues

Typo in README.md

Please search for kafak/wrap-with-metadata; it's under the read-messages section.

Add schema checks of task-maps

The task-map entries are getting more complicated over time, and we're having to add validation for key-value pairs that change between versions. We should add a schema check for task-map entries to make this process easier.

Schema checks should be relatively non-strict, i.e. they should only check the keys and values known to the plugin, and ignore any additional keys.
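
A minimal sketch of such a non-strict check using Prismatic Schema (the keys shown are illustrative, not the full task-map): specific entries for the known keys, plus a catch-all that lets unknown keys pass.

(require '[schema.core :as s])

(def KafkaTaskMap
  {:kafka/topic                     s/Str
   (s/optional-key :kafka/group-id) s/Str
   s/Keyword                        s/Any}) ; unknown keys pass untouched

(s/validate KafkaTaskMap
            {:kafka/topic "events"
             :kafka/group-id "g1"
             :onyx/n-peers 4}) ; the extra :onyx key is ignored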

Document checkpointing method

Checkpointing is partially described via the task-map options, but the overall strategy deserves a fuller write-up.

Functionality that enables autoscaling

I'm not sure whether a suggestion belongs here, but for us it would be nice if the number of input peers could autoscale based on the number of instances.

For example:
There is a Kafka cluster with a topic that has 100 partitions.
The number of peers that read from Kafka increases linearly, up to 100, as the number of instances grows.

Or, at a minimum, that you would not have to set n-peers or min/max-peers, and the partition-assignment function would do the partitioning based on the eventual number of peers.

Do you think this functionality is within the realm of possibility?
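
A sketch of the second idea (the helper name is hypothetical): derive :onyx/n-peers for the input task from the topic's partition count and the instance count, rather than hard-coding it.

;; Scale the input task's peer count linearly with instances,
;; capped at the number of partitions.
(defn with-derived-n-peers
  [task-map partition-count n-instances]
  (assoc task-map :onyx/n-peers (min partition-count n-instances)))

(with-derived-n-peers {:onyx/name :read-messages
                       :kafka/topic "events"}
                      100 25)
;; => {:onyx/name :read-messages, :kafka/topic "events", :onyx/n-peers 25}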

:kafka/commit-interval deprecated in or before 0.14.5.0?

Based on https://github.com/onyx-platform/onyx-kafka/blob/b092ad0e96bd52bb38e9b447834d78c072a0fb47/src/onyx/kafka/information_model.cljc, it should be valid to pass :kafka/commit-interval as a kafka-opt to the plugin.

Actually passing it results in a Schema validation error (I don't recall the exact message, but it was something akin to "key not in namespace :onyx OR :kafka").

I expect that either:
A) The information model should be updated with "deprecated" for this field
B) This option is supposed to be valid but is missing from the Schema

Look into transient CI failure

May be due to multiple partitions, or to await-job-completion not actually waiting for all tasks to shut down before returning.

FAIL at (output_test.clj:118)
Expected: [{:key 1, :partition nil, :value {:n 0}} {:key nil, :partition nil, :value {:n 1}} {:key "tarein", :partition 1, :value {:n 2}}]
Actual: ({:key 1, :partition nil, :value {:n 0}})
Diffs: expected length of sequence is 3, actual length is 1.
actual is missing 2 elements: ({:key nil, :partition nil, :value {:n 1}} {:key "tarein", :partition 1, :value {:n 2}})
FAILURE: 1 check failed. (But 6 succeeded.)
Error encountered performing task 'midje' with profile(s): 'dev,circle-ci'
Subprocess failed

Incompatible slf4j versions causes Datomic initialisation to fail

I've been investigating an issue with @kennyjwilli which manifests itself when running a process from Cursive. The issue is that onyx-kafka contains a version of slf4j-log4j12 which is incompatible with the slf4j-api version in onyx. This causes Datomic to fail during initialisation.

Here's a sample project.clj:

(defproject datomic-slf4j-test "0.1.0-SNAPSHOT"
  :description "FIXME: write description"
  :url "http://example.com/FIXME"
  :license {:name "Eclipse Public License"
            :url "http://www.eclipse.org/legal/epl-v10.html"}
  :repositories {"my.datomic.com" {:url   "https://my.datomic.com/repo"
                                   :creds :gpg}}
  :dependencies [[org.clojure/clojure "1.8.0"]
                 [com.datomic/datomic-pro "0.9.5561.50"]
                 [org.onyxplatform/onyx "0.11.0"]
                 [org.onyxplatform/onyx-kafka "0.11.0.0"]])

And a simple namespace:

(ns datomic-slf4j-test.core)

(require 'datomic.api)

With this configuration, running lein run -m datomic-slf4j-test.core produces:

Caused by: java.lang.NoSuchMethodError: org.slf4j.helpers.Util.safeGetSystemProperty(Ljava/lang/String;)Ljava/lang/String;
	at org.slf4j.impl.VersionUtil.getJavaMajorVersion(VersionUtil.java:11)
	at org.slf4j.impl.Log4jMDCAdapter.<clinit>(Log4jMDCAdapter.java:37)
	at org.slf4j.impl.StaticMDCBinder.getMDCA(StaticMDCBinder.java:59)
	at org.slf4j.MDC.<clinit>(MDC.java:90)
	at datomic.slf4j__init.load(Unknown Source)
	at datomic.slf4j__init.<clinit>(Unknown Source)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at clojure.lang.RT.classForName(RT.java:2168)
	at clojure.lang.RT.classForName(RT.java:2177)
	at clojure.lang.RT.loadClassForName(RT.java:2196)
	at clojure.lang.RT.load(RT.java:443)
	at clojure.lang.RT.load(RT.java:419)
	at clojure.core$load$fn__5677.invoke(core.clj:5893)

If onyx-kafka is removed from the dependencies, this error is not produced.

Here's the deps tree:

 [clojure-complete "0.2.4" :exclusions [[org.clojure/clojure]]]
 [com.datomic/datomic-pro "0.9.5561.50"]
   [com.datomic/datomic-lucene-core "3.3.0"]
   [com.google.guava/guava "18.0"]
   [com.h2database/h2 "1.3.171"]
   [commons-codec "1.10"]
   [net.spy/spymemcached "2.11.4"]
   [org.apache.activemq/artemis-core-client "1.4.0" :exclusions [[org.jgroups/jgroups] [commons-logging]]]
     [io.netty/netty-all "4.0.39.Final"]
     [org.apache.activemq/artemis-commons "1.4.0"]
       [commons-beanutils "1.9.2"]
         [commons-collections "3.2.1"]
       [org.jboss.logging/jboss-logging "3.3.0.Final"]
     [org.apache.geronimo.specs/geronimo-json_1.0_spec "1.0-alpha-1"]
     [org.apache.johnzon/johnzon-core "0.9.4"]
   [org.apache.httpcomponents/httpclient "4.5.2" :exclusions [[commons-logging]]]
     [org.apache.httpcomponents/httpcore "4.4.4"]
   [org.apache.tomcat/tomcat-jdbc "7.0.27" :exclusions [[commons-logging]]]
     [org.apache.tomcat/tomcat-juli "7.0.27"]
   [org.clojure/tools.cli "0.3.5" :exclusions [[org.clojure/clojure]]]
   [org.codehaus.janino/commons-compiler-jdk "2.6.1"]
     [org.codehaus.janino/commons-compiler "2.6.1"]
   [org.fressian/fressian "0.6.5"]
   [org.slf4j/jcl-over-slf4j "1.7.7"]
   [org.slf4j/jul-to-slf4j "1.7.7"]
   [org.slf4j/log4j-over-slf4j "1.7.7" :scope "runtime"]
   [org.slf4j/slf4j-nop "1.7.7"]
 [org.clojure/clojure "1.8.0"]
 [org.clojure/tools.nrepl "0.2.12" :exclusions [[org.clojure/clojure]]]
 [org.onyxplatform/onyx-kafka "0.11.0.0"]
   [cheshire "5.7.0"]
     [com.fasterxml.jackson.core/jackson-core "2.8.6"]
     [com.fasterxml.jackson.dataformat/jackson-dataformat-cbor "2.8.6"]
     [com.fasterxml.jackson.dataformat/jackson-dataformat-smile "2.8.6"]
     [tigris "0.1.1"]
   [org.apache.kafka/kafka-clients "0.11.0.0"]
     [org.xerial.snappy/snappy-java "1.1.2.6"]
   [org.apache.kafka/kafka_2.11 "0.11.0.0"]
     [com.101tec/zkclient "0.10" :exclusions [[*/mail] [*/jline] [*/netty] [*/jms] [*/javax] [*/jmxri] [*/jmxtools]]]
     [com.yammer.metrics/metrics-core "2.2.0" :exclusions [[*/mail] [*/jline] [*/netty] [*/jms] [*/javax] [*/jmxri] [*/jmxtools]]]
     [net.sf.jopt-simple/jopt-simple "5.0.3" :exclusions [[*/mail] [*/jline] [*/netty] [*/jms] [*/javax] [*/jmxri] [*/jmxtools]]]
     [org.scala-lang.modules/scala-parser-combinators_2.11 "1.0.4" :exclusions [[*/mail] [*/jline] [*/netty] [*/jms] [*/javax] [*/jmxri] [*/jmxtools]]]
     [org.scala-lang/scala-library "2.11.11" :exclusions [[*/mail] [*/jline] [*/netty] [*/jms] [*/javax] [*/jmxri] [*/jmxtools]]]
     [org.slf4j/slf4j-log4j12 "1.7.25" :exclusions [[*/mail] [*/jline] [*/netty] [*/jms] [*/javax] [*/jmxri] [*/jmxtools]]]
 [org.onyxplatform/onyx "0.11.0"]
   [clj-fuzzy "0.3.1" :exclusions [[org.clojure/clojurescript]]]
   [clj-tuple "0.2.2"]
   [com.amazonaws/aws-java-sdk-s3 "1.11.190"]
     [com.amazonaws/aws-java-sdk-core "1.11.190"]
       [com.fasterxml.jackson.core/jackson-databind "2.6.7.1"]
         [com.fasterxml.jackson.core/jackson-annotations "2.6.0"]
       [commons-logging "1.1.3"]
       [joda-time "2.8.1"]
       [software.amazon.ion/ion-java "1.0.2"]
     [com.amazonaws/aws-java-sdk-kms "1.11.190"]
     [com.amazonaws/jmespath-java "1.11.190"]
   [com.stuartsierra/component "0.3.2"]
   [com.stuartsierra/dependency "0.2.0"]
   [com.taoensso/nippy "2.13.0"]
     [net.jpountz.lz4/lz4 "1.3"]
     [org.clojure/tools.reader "0.10.0"]
     [org.iq80.snappy/snappy "0.4"]
     [org.tukaani/xz "1.6"]
   [com.taoensso/timbre "4.8.0"]
     [com.taoensso/encore "2.88.0"]
       [com.taoensso/truss "1.3.6"]
     [io.aviso/pretty "0.1.33"]
   [io.aeron/aeron-all "1.4.1"]
   [io.replikativ/hasch "0.3.3" :exclusions [[org.clojure/clojurescript] [com.cognitect/transit-clj] [com.cognitect/transit-cljs] [org.clojure/data.fressian] [com.cemerick/austin]]]
     [io.replikativ/incognito "0.2.0"]
   [metrics-clojure "2.9.0"]
     [io.dropwizard.metrics/metrics-core "3.1.2"]
   [net.cgrand/xforms "0.9.3"]
     [net.cgrand/macrovich "0.2.0"]
     [org.clojure/clojurescript "1.9.293"]
       [com.google.javascript/closure-compiler-unshaded "v20160911"]
         [com.google.code.findbugs/jsr305 "1.3.9"]
         [com.google.code.gson/gson "2.2.4"]
         [com.google.javascript/closure-compiler-externs "v20160911"]
         [com.google.jsinterop/jsinterop-annotations "1.0.0"]
         [com.google.protobuf/protobuf-java "2.5.0"]
       [org.clojure/data.json "0.2.6"]
       [org.clojure/google-closure-library "0.0-20160609-f42b4a24"]
         [org.clojure/google-closure-library-third-party "0.0-20160609-f42b4a24"]
       [org.mozilla/rhino "1.7R5"]
   [org.apache.curator/curator-framework "2.9.1"]
     [org.apache.curator/curator-client "2.9.1"]
   [org.apache.curator/curator-test "2.9.1"]
     [org.apache.commons/commons-math "2.2"]
     [org.javassist/javassist "3.18.1-GA"]
   [org.apache.zookeeper/zookeeper "3.4.10" :exclusions [[org.slf4j/slf4j-log4j12]]]
     [io.netty/netty "3.10.5.Final"]
     [jline "0.9.94" :exclusions [[*]]]
     [log4j "1.2.16" :exclusions [[*]]]
   [org.btrplace/scheduler-api "0.46"]
     [net.sf.trove4j/trove4j "3.0.3"]
   [org.btrplace/scheduler-choco "0.46"]
     [it.unimi.dsi/fastutil "7.0.12"]
     [org.choco-solver/choco-solver "3.3.3"]
       [args4j "2.32"]
       [com.github.cp-profiler/cpprof-java "1.1.0"]
         [org.zeromq/jeromq "0.3.4"]
       [dk.brics.automaton/automaton "1.11-8"]
       [org.javabits.jgrapht/jgrapht-core "0.9.3"]
   [org.clojure/core.async "0.3.443"]
     [org.clojure/tools.analyzer.jvm "0.7.0"]
       [org.clojure/core.memoize "0.5.9"]
         [org.clojure/core.cache "0.6.5"]
           [org.clojure/data.priority-map "0.0.7"]
       [org.clojure/tools.analyzer "0.6.9"]
       [org.ow2.asm/asm-all "4.2"]
   [org.deephacks.lmdbjni/lmdbjni-linux64 "0.4.6"]
   [org.deephacks.lmdbjni/lmdbjni-osx64 "0.4.6"]
   [org.deephacks.lmdbjni/lmdbjni-win64 "0.4.6"]
   [org.deephacks.lmdbjni/lmdbjni "0.4.6"]
   [org.slf4j/slf4j-api "1.7.12"]
   [prismatic/schema "1.1.6"]

The conflict is between [org.slf4j/slf4j-api "1.7.12"] and [org.slf4j/slf4j-log4j12 "1.7.25" :exclusions ...]. The SLF4J documentation states:

Mixing different versions of slf4j-api.jar and SLF4J binding can cause problems. For example, if you are using slf4j-api-1.8.0-alpha2.jar, then you should also use slf4j-simple-1.8.0-alpha2.jar, using slf4j-simple-1.5.5.jar will not work.

At initialization time, if SLF4J suspects that there may be an slf4j-api vs. binding version mismatch problem, it will emit a warning about the suspected mismatch.
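
One possible workaround (an untested sketch, based on the project.clj above) is to exclude the conflicting binding from onyx-kafka, leaving a single consistent SLF4J version on the classpath:

(defproject datomic-slf4j-test "0.1.0-SNAPSHOT"
  :dependencies [[org.clojure/clojure "1.8.0"]
                 [com.datomic/datomic-pro "0.9.5561.50"]
                 [org.onyxplatform/onyx "0.11.0"]
                 ;; exclude the slf4j binding pulled in via kafka_2.11
                 [org.onyxplatform/onyx-kafka "0.11.0.0"
                  :exclusions [org.slf4j/slf4j-log4j12]]])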

Document embedded kafka server

The embedded Kafka server included in 0.7.2.0 is pretty useful; we should document it in the README, as it would be helpful to users.
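
A minimal usage sketch, assuming the component lives in onyx.kafka.embedded-server and takes roughly these options (the namespace and option names here are illustrative and unverified):

(require '[com.stuartsierra.component :as component]
         '[onyx.kafka.embedded-server :as ek])

;; Start an in-process broker for tests (option keys illustrative).
(def kafka-server
  (component/start
   (ek/map->EmbeddedKafka {:hostname "127.0.0.1"
                           :port 9092
                           :broker-id 0
                           :zookeeper-addr "127.0.0.1:2181"
                           :log-dir "/tmp/embedded-kafka"})))

;; ... run tests against localhost:9092 ...
(component/stop kafka-server)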

Keyed messages support

While learning about Kafka, I realized I could have a use for keyed messages in combination with log compaction. It made me wonder whether keyed-message support is planned for this plugin, and how straightforward the necessary changes would be, i.e. would adding and implementing a parameter for key serialization/deserialization suffice?
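
For illustration, a hypothetical task-map shape if such a parameter were added (:kafka/key-serializer-fn is not an existing option):

{:onyx/name :write-messages
 :onyx/plugin :onyx.plugin.kafka/write-messages
 :kafka/topic "compacted-topic"
 :kafka/serializer-fn :my.app/serialize-value
 :kafka/key-serializer-fn :my.app/serialize-key} ; hypothetical new option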

Surprising replay in development

onyx-0.9.9
onyx-kafka-0.9.9.0
kafka 0.9.0.1 & kafka 0.10.0.0

In development I am seeing messages replayed when Onyx is restarted (tools.ns reload, new environment with a random onyx-id created, new job submitted).

In production, processing correctly resumes after the last acknowledged message (static onyx-id, job resumed).

Here's my onyx-kafka task catalog entry: https://www.refheap.com/470754c4f1daab1e0f2610f71

If I force a reset, that seems to work OK; processing resumes at the largest offset as expected, so the problem appears to be related to reading the last acknowledged offset.

StringSerializer/StringDeserializer instead of ByteArraySerializer/ByteArrayDeserializer

I would really like to use org.apache.kafka.common.serialization.StringSerializer/StringDeserializer for key.serializer and key.deserializer. My data is small (<100-byte strings), and being able to kafkacat a topic and read the results would give me great leverage.

This pull request seems to enable what I want, and it was never merged: #55

  • Why was it never merged?
  • Is this more easily done by just adding key.deserializer and key.serializer to the consumer-config and constructing (for example) the key-deserializer from that, defaulting to (h/byte-array-deserializer) when it's not defined?
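
A sketch of the second bullet (function name illustrative): build the key deserializer from the consumer config, defaulting to the byte-array deserializer when nothing is specified.

(import '(org.apache.kafka.common.serialization
          ByteArrayDeserializer StringDeserializer))

(defn key-deserializer
  [consumer-config]
  ;; Dispatch on the standard Kafka config key; anything unrecognised
  ;; falls through to the current byte-array default.
  (case (get consumer-config "key.deserializer")
    "org.apache.kafka.common.serialization.StringDeserializer"
    (StringDeserializer.)
    (ByteArrayDeserializer.)))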

Only restart jobs that are likely to succeed

Moved over from: onyx-platform/onyx#635

When I start my system, I encounter an error because something is misconfigured. Since the task can't be started, it gets restarted again and again, only to fail over and over.

clojure.lang.ExceptionInfo: Caught exception inside task lifecycle. Rebooting the task. -> Exception type: clojure.lang.ExceptionInfo. Exception message: :onyx/min-peers must equal :onyx/max-peers and the number of partitions, or :onyx/n-peers must equal number of kafka partitions

This fills up my logs and doesn't really help me solve the underlying issue.
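
One possible mitigation is Onyx's :lifecycle/handle-exception hook, which can kill the job instead of restarting it when the failure is a configuration error a restart cannot fix. A sketch (matching on the message text is illustrative only):

(defn handle-startup-exception
  [event lifecycle lifecycle-phase e]
  (if (re-find #"must equal" (or (.getMessage e) ""))
    :kill       ; configuration error: restarting cannot fix it
    :restart))  ; assume anything else is transient

(def calls
  {:lifecycle/handle-exception handle-startup-exception})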

Need a throughput / latency test

We need performance tests for the plugins, not just Onyx core; onyx-kafka would be the best place to start.

Initially it would be fine to run the test with an embedded broker on a single machine. A test that detects the most egregious problems would be enough to start.

Consider how to allow arbitrary config options to be passed to the producer / consumer

Option 1. Since we're pinned to specific producer/consumer versions, we can just use an opts map (potentially blacklisting some options that we don't want users to override).

Option 2. Inject a producer/consumer directly. This could get messy to support, and is still somewhat cumbersome for users.

Option 3. Code up a rule to convert the parameter styles and support all config options. This doesn't future-proof us very well and requires us to document everything.

I vastly prefer option 3 or 1 over option 2.
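
A sketch of option 1 (the blacklist contents are illustrative): merge a user-supplied opts map into the base consumer config, dropping any keys the plugin must control.

(def blacklisted-opts
  #{"key.deserializer" "value.deserializer" "enable.auto.commit"})

(defn merge-consumer-opts
  [base-config user-opts]
  ;; user opts win for anything that is not blacklisted
  (merge base-config (apply dissoc user-opts blacklisted-opts)))

(merge-consumer-opts {"bootstrap.servers" "localhost:9092"}
                     {"max.poll.records" "500"
                      "key.deserializer" "would-be-dropped"})
;; => {"bootstrap.servers" "localhost:9092", "max.poll.records" "500"}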

Troubleshooting simple job not writing to a topic

Hey there,

I'm playing with a recent Onyx Kafka plugin ("0.10.0.0-SNAPSHOT"). For some reason, I can't get Onyx to write to an output Kafka topic. The code is in this git sample project, on the cloud-orchestration-2 branch. I'm using the very simple workflow below; here, :process-commands is just Clojure's identity function.

[[:read-commands :process-commands]
 [:process-commands :write-messages]]

My setup uses docker-compose. A simple workflow to stand everything up might look like the following.

# Build a base image
docker build -f Dockerfile.app.base -t twashing/simple-kafka-onyx-commander-base:latest .

# Bring up the composed services
docker-compose up

# In another terminal, connect to a kafka tools workbench
docker-compose exec kafka-tools /bin/bash

# Create 2 basic topics
kafka-topics --create \
             --zookeeper zookeeper:2181 \
             --topic read-messages \
             --replication-factor 1 \
             --partitions 1 \
             --config cleanup.policy=compact

kafka-topics --create \
             --zookeeper zookeeper:2181 \
             --topic write-messages \
             --replication-factor 1 \
             --partitions 1 \
             --config cleanup.policy=compact

kafka-topics --list --zookeeper zookeeper:2181

# Fire up a console producer
kafka-console-producer \
  --broker-list kafka:9092 \
  --topic read-messages \
  --property "parse.key=true" \
  --property "key.separator=,"

# Produce some messages to the topic
4b6d8c94-4674-4466-9ee2-872bf1678e78,{:foo :bar}
6af3345a-5634-4b41-9762-ce8ad6f6eaa8,{:qwerty :asdf}
310fcfe4-3cf6-4b8e-9c26-5557634dc6b3,{:thing :amabob}

# Listen to the output topic for processed messages
kafka-console-consumer --bootstrap-server kafka:9092 --topic read-messages --new-consumer --from-beginning

# Now cider-connect to localhost:5444, and run the job here:
# https://github.com/twashing/simple-kafka-onyx-commander/blob/cloud-orchestration-2/src/clojure/com/interrupt/streaming/core.clj#L192-L213

Now, the docker service logs for app and kafka look fine. But the zookeeper logs keep giving me these kinds of Error:KeeperErrorCode = NoNode for /onyx/dev/... errors when trying to make Onyx write to Kafka. Googling this gave me some leads, but no answers. And producing to and consuming from the topics using just the Kafka tools works fine.

Now, I've set the number of peers for this job to 1, as there's only one machine running the job. But somehow I must have configured Onyx incorrectly.

I feel I've overlooked some small detail that I can't put my finger on, probably something obvious to someone who sees it. Any ideas? Here's ZooKeeper's full error log, in this pastebin link.

# Presumably, a consumer is somehow trying to access a non-existent node, even though I specified KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1 for Kafka.
# But this error appears before I start the job, so I don't THINK these are related.

zookeeper_1    | [2017-11-20 17:50:59,092] INFO Got user-level KeeperException when processing sessionid:0x15fda8266af0001 type:create cxid:0x160 zxid:0x151 txntype:-1 reqpath:n/a Error Path:/brokers/topics/__consumer_offsets/partitions/16 Error:KeeperErrorCode = NoNode for /brokers/topics/__consumer_offsets/partitions/16 (org.apache.zookeeper.server.PrepRequestProcessor)
zookeeper_1    | [2017-11-20 17:50:59,108] INFO Got user-level KeeperException when processing sessionid:0x15fda8266af0001 type:create cxid:0x163 zxid:0x154 txntype:-1 reqpath:n/a Error Path:/brokers/topics/__consumer_offsets/partitions/2 Error:KeeperErrorCode = NoNode for /brokers/topics/__consumer_offsets/partitions/2 (org.apache.zookeeper.server.PrepRequestProcessor)

# These errors appear in ZooKeeper when I run the job. Presumably Onyx is trying to run a job on a non-existent machine.
# Full output in this pastebin link: https://pastebin.com/dwjvCCsk

zookeeper_1    | [2017-11-20 18:46:22,397] INFO Got user-level KeeperException when processing sessionid:0x15fdab49b0d0008 type:create cxid:0x15 zxid:0x13c txntype:-1 reqpath:n/a Error Path:/onyx/e679ad1d-da33-480f-8a7c-d3854c799078/resume-point Error:KeeperErrorCode = NodeExists for /onyx/e679ad1d-da33-480f-8a7c-d3854c799078/resume-point (org.apache.zookeeper.server.PrepRequestProcessor)
zookeeper_1    | [2017-11-20 18:46:22,404] INFO Got user-level KeeperException when processing sessionid:0x15fdab49b0d0008 type:create cxid:0x16 zxid:0x13d txntype:-1 reqpath:n/a Error Path:/onyx/e679ad1d-da33-480f-8a7c-d3854c799078/origin/origin Error:KeeperErrorCode = NodeExists for /onyx/e679ad1d-da33-480f-8a7c-d3854c799078/origin/origin (org.apache.zookeeper.server.PrepRequestProcessor)

Kafka 0.9 consumer offset rename

Kafka 0.9 switched from "smallest" and "largest" to "earliest" and "latest". We should make the change for this repo only, and not for onyx-kafka-0.8.
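
For reference, the rename amounts to the following mapping (the corresponding task-map values would presumably follow suit):

;; pre-0.9 value -> 0.9 consumer value
(def offset-reset-rename
  {"smallest" "earliest"
   "largest"  "latest"})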

Missing dependency for Cheshire JSON encoder

I have just updated from a 0.10.x beta (beta7 or somesuch) to 0.10.0-beta11

and running lein fails on my setup due to a missing dependency on:
https://github.com/dakrone/cheshire

lein run

Exception in thread "main" java.io.FileNotFoundException: Could not locate cheshire/core__init.class or cheshire/core.clj on classpath., compiling:(onyx/tasks/kafka.clj:1:1)
	at clojure.lang.Compiler.load(Compiler.java:7391)
	at clojure.lang.RT.loadResourceScript(RT.java:372)
	at clojure.lang.RT.loadResourceScript(RT.java:363)
	at clojure.lang.RT.load(RT.java:453)
	at clojure.lang.RT.load(RT.java:419)
	at clojure.core$load$fn__5677.invoke(core.clj:5893)
	at clojure.core$load.invokeStatic(core.clj:5892)
	at clojure.core$load.doInvoke(core.clj:5876)
	at clojure.lang.RestFn.invoke(RestFn.java:408)
	at clojure.core$load_one.invokeStatic(core.clj:5697)
	at clojure.core$load_one.invoke(core.clj:5692)
	at clojure.core$load_lib$fn__5626.invoke(core.clj:5737)
	at clojure.core$load_lib.invokeStatic(core.clj:5736)
	at clojure.core$load_lib.doInvoke(core.clj:5717)
	at clojure.lang.RestFn.applyTo(RestFn.java:142)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$load_libs.invokeStatic(core.clj:5774)
	at clojure.core$load_libs.doInvoke(core.clj:5758)
	at clojure.lang.RestFn.applyTo(RestFn.java:137)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$require.invokeStatic(core.clj:5796)
	at clojure.core$require.doInvoke(core.clj:5796)
	at clojure.lang.RestFn.invoke(RestFn.java:3659)
	at onyx.plugin.kafka$eval3763$loading__5569__auto____3764.invoke(kafka.clj:1)
	at onyx.plugin.kafka$eval3763.invokeStatic(kafka.clj:1)
	at onyx.plugin.kafka$eval3763.invoke(kafka.clj:1)
	at clojure.lang.Compiler.eval(Compiler.java:6927)
	at clojure.lang.Compiler.eval(Compiler.java:6916)
	at clojure.lang.Compiler.load(Compiler.java:7379)
	at clojure.lang.RT.loadResourceScript(RT.java:372)
	at clojure.lang.RT.loadResourceScript(RT.java:363)
	at clojure.lang.RT.load(RT.java:453)
	at clojure.lang.RT.load(RT.java:419)
	at clojure.core$load$fn__5677.invoke(core.clj:5893)
	at clojure.core$load.invokeStatic(core.clj:5892)
	at clojure.core$load.doInvoke(core.clj:5876)
	at clojure.lang.RestFn.invoke(RestFn.java:408)
	at clojure.core$load_one.invokeStatic(core.clj:5697)
	at clojure.core$load_one.invoke(core.clj:5692)
	at clojure.core$load_lib$fn__5626.invoke(core.clj:5737)
	at clojure.core$load_lib.invokeStatic(core.clj:5736)
	at clojure.core$load_lib.doInvoke(core.clj:5717)
	at clojure.lang.RestFn.applyTo(RestFn.java:142)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$load_libs.invokeStatic(core.clj:5774)
	at clojure.core$load_libs.doInvoke(core.clj:5758)
	at clojure.lang.RestFn.applyTo(RestFn.java:137)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$require.invokeStatic(core.clj:5796)
	at clojure.core$require.doInvoke(core.clj:5796)
	at clojure.lang.RestFn.invoke(RestFn.java:619)
	at onyx.kafka.utils$eval2257$loading__5569__auto____2258.invoke(utils.clj:1)
	at onyx.kafka.utils$eval2257.invokeStatic(utils.clj:1)
	at onyx.kafka.utils$eval2257.invoke(utils.clj:1)
	at clojure.lang.Compiler.eval(Compiler.java:6927)
	at clojure.lang.Compiler.eval(Compiler.java:6916)
	at clojure.lang.Compiler.load(Compiler.java:7379)
	at clojure.lang.RT.loadResourceScript(RT.java:372)
	at clojure.lang.RT.loadResourceScript(RT.java:363)
	at clojure.lang.RT.load(RT.java:453)
	at clojure.lang.RT.load(RT.java:419)
	at clojure.core$load$fn__5677.invoke(core.clj:5893)
	at clojure.core$load.invokeStatic(core.clj:5892)
	at clojure.core$load.doInvoke(core.clj:5876)
	at clojure.lang.RestFn.invoke(RestFn.java:408)
	at clojure.core$load_one.invokeStatic(core.clj:5697)
	at clojure.core$load_one.invoke(core.clj:5692)
	at clojure.core$load_lib$fn__5626.invoke(core.clj:5737)
	at clojure.core$load_lib.invokeStatic(core.clj:5736)
	at clojure.core$load_lib.doInvoke(core.clj:5717)
	at clojure.lang.RestFn.applyTo(RestFn.java:142)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$load_libs.invokeStatic(core.clj:5774)
	at clojure.core$load_libs.doInvoke(core.clj:5758)
	at clojure.lang.RestFn.applyTo(RestFn.java:137)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$require.invokeStatic(core.clj:5796)
	at clojure.core$require.doInvoke(core.clj:5796)
	at clojure.lang.RestFn.invoke(RestFn.java:2793)
	at clojure_onyx.core$eval22$loading__5569__auto____23.invoke(core.clj:1)
	at clojure_onyx.core$eval22.invokeStatic(core.clj:1)
	at clojure_onyx.core$eval22.invoke(core.clj:1)
	at clojure.lang.Compiler.eval(Compiler.java:6927)
	at clojure.lang.Compiler.eval(Compiler.java:6916)
	at clojure.lang.Compiler.load(Compiler.java:7379)
	at clojure.lang.RT.loadResourceScript(RT.java:372)
	at clojure.lang.RT.loadResourceScript(RT.java:363)
	at clojure.lang.RT.load(RT.java:453)
	at clojure.lang.RT.load(RT.java:419)
	at clojure.core$load$fn__5677.invoke(core.clj:5893)
	at clojure.core$load.invokeStatic(core.clj:5892)
	at clojure.core$load.doInvoke(core.clj:5876)
	at clojure.lang.RestFn.invoke(RestFn.java:408)
	at clojure.core$load_one.invokeStatic(core.clj:5697)
	at clojure.core$load_one.invoke(core.clj:5692)
	at clojure.core$load_lib$fn__5626.invoke(core.clj:5737)
	at clojure.core$load_lib.invokeStatic(core.clj:5736)
	at clojure.core$load_lib.doInvoke(core.clj:5717)
	at clojure.lang.RestFn.applyTo(RestFn.java:142)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$load_libs.invokeStatic(core.clj:5774)
	at clojure.core$load_libs.doInvoke(core.clj:5758)
	at clojure.lang.RestFn.applyTo(RestFn.java:137)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$require.invokeStatic(core.clj:5796)
	at clojure.core$require.doInvoke(core.clj:5796)
	at clojure.lang.RestFn.invoke(RestFn.java:408)
	at user$eval7$fn__9.invoke(form-init641685057106199864.clj:1)
	at user$eval7.invokeStatic(form-init641685057106199864.clj:1)
	at user$eval7.invoke(form-init641685057106199864.clj:1)
	at clojure.lang.Compiler.eval(Compiler.java:6927)
	at clojure.lang.Compiler.eval(Compiler.java:6917)
	at clojure.lang.Compiler.load(Compiler.java:7379)
	at clojure.lang.Compiler.loadFile(Compiler.java:7317)
	at clojure.main$load_script.invokeStatic(main.clj:275)
	at clojure.main$init_opt.invokeStatic(main.clj:277)
	at clojure.main$init_opt.invoke(main.clj:277)
	at clojure.main$initialize.invokeStatic(main.clj:308)
	at clojure.main$null_opt.invokeStatic(main.clj:342)
	at clojure.main$null_opt.invoke(main.clj:339)
	at clojure.main$main.invokeStatic(main.clj:421)
	at clojure.main$main.doInvoke(main.clj:384)
	at clojure.lang.RestFn.invoke(RestFn.java:421)
	at clojure.lang.Var.invoke(Var.java:383)
	at clojure.lang.AFn.applyToHelper(AFn.java:156)
	at clojure.lang.Var.applyTo(Var.java:700)
	at clojure.main.main(main.java:37)
Caused by: java.io.FileNotFoundException: Could not locate cheshire/core__init.class or cheshire/core.clj on classpath.
	at clojure.lang.RT.load(RT.java:456)
	at clojure.lang.RT.load(RT.java:419)
	at clojure.core$load$fn__5677.invoke(core.clj:5893)
	at clojure.core$load.invokeStatic(core.clj:5892)
	at clojure.core$load.doInvoke(core.clj:5876)
	at clojure.lang.RestFn.invoke(RestFn.java:408)
	at clojure.core$load_one.invokeStatic(core.clj:5697)
	at clojure.core$load_one.invoke(core.clj:5692)
	at clojure.core$load_lib$fn__5626.invoke(core.clj:5737)
	at clojure.core$load_lib.invokeStatic(core.clj:5736)
	at clojure.core$load_lib.doInvoke(core.clj:5717)
	at clojure.lang.RestFn.applyTo(RestFn.java:142)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$load_libs.invokeStatic(core.clj:5774)
	at clojure.core$load_libs.doInvoke(core.clj:5758)
	at clojure.lang.RestFn.applyTo(RestFn.java:137)
	at clojure.core$apply.invokeStatic(core.clj:648)
	at clojure.core$require.invokeStatic(core.clj:5796)
	at clojure.core$require.doInvoke(core.clj:5796)
	at clojure.lang.RestFn.invoke(RestFn.java:457)
	at onyx.tasks.kafka$eval16720$loading__5569__auto____16721.invoke(kafka.clj:1)
	at onyx.tasks.kafka$eval16720.invokeStatic(kafka.clj:1)
	at onyx.tasks.kafka$eval16720.invoke(kafka.clj:1)
	at clojure.lang.Compiler.eval(Compiler.java:6927)
	at clojure.lang.Compiler.eval(Compiler.java:6916)
	at clojure.lang.Compiler.load(Compiler.java:7379)
	... 126 more

Simply adding the dependency [cheshire "5.7.0"] to my project.clj avoids the issue.
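
For reference, the workaround as a project.clj fragment (project name and versions illustrative):

(defproject my-app "0.1.0-SNAPSHOT"
  :dependencies [[org.onyxplatform/onyx-kafka "0.10.0-beta11"]
                 [cheshire "5.7.0"]]) ; declaring Cheshire explicitly avoids the error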

Make kafka/zookeeper optional

Kafka no longer requires ZooKeeper to bootstrap, so we should make it possible to supply the bootstrap addresses directly rather than looking them up in ZooKeeper ourselves.
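
A hypothetical task-map shape for the proposal (:kafka/bootstrap-servers is illustrative, not an existing option):

{:onyx/name :read-messages
 :onyx/plugin :onyx.plugin.kafka/read-messages
 :kafka/bootstrap-servers "broker1:9092,broker2:9092" ; replaces the :kafka/zookeeper lookup
 :kafka/topic "events"}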
