miguno / kafka-storm-starter

[PROJECT IS NO LONGER MAINTAINED] Code examples that show how to integrate Apache Kafka 0.8+ with Apache Storm 0.9+ and Apache Spark Streaming 1.1+, while using Apache Avro as the data serialization format.

Home Page: http://www.michael-noll.com/blog/2014/05/27/kafka-storm-integration-example-tutorial/

License: Other


kafka-storm-starter's Introduction

THIS PROJECT IS NO LONGER MAINTAINED

kafka-storm-starter

Code examples that show how to integrate Apache Kafka 0.8+ with Apache Storm 0.9+ and Apache Spark 1.1+ while using Apache Avro as the data serialization format.

A great alternative to the examples in this repository, which require you to operate a Spark or Storm processing cluster: build elastic, distributed, fault-tolerant stream processing applications with Kafka's Streams API (read: no additional cluster required).

"Kafka Streams (source code), a component of open source Apache Kafka, is a powerful, easy-to-use library for building highly scalable, fault-tolerant, distributed stream processing applications on top of Apache Kafka. It builds upon important concepts for stream processing such as properly distinguishing between event-time and processing-time, handling of late-arriving data, and efficient management of application state."

Take a look at the Kafka Streams code examples at https://github.com/confluentinc/examples.
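
As a taste of that API, here is a minimal, hedged sketch (not part of this repository, and assuming a recent kafka-streams dependency) of a pass-through Kafka Streams application written in Scala against the Java Streams API; topic names and the broker address are placeholders:

import java.util.Properties
import org.apache.kafka.common.serialization.Serdes
import org.apache.kafka.streams.{KafkaStreams, StreamsBuilder, StreamsConfig}

object PassThroughApp extends App {
  val props = new Properties()
  props.put(StreamsConfig.APPLICATION_ID_CONFIG, "pass-through-demo")
  props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
  props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, classOf[Serdes.ByteArraySerde])
  props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, classOf[Serdes.ByteArraySerde])

  // Read every record from the input topic and write it unchanged to the output topic.
  val builder = new StreamsBuilder()
  builder.stream[Array[Byte], Array[Byte]]("testing-input").to("testing-output")

  val streams = new KafkaStreams(builder.build(), props)
  streams.start()
  sys.addShutdownHook(streams.close())
}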



Quick start

Show me!

$ ./sbt test

This command launches our test suite.

Notably, it will run end-to-end tests of Kafka and Storm as well as of the Kafka/Storm and Kafka/Spark Streaming integrations. See this abridged version of the test output:

[...other tests removed...]

[info] KafkaSpec:
[info] Kafka
[info] - should synchronously send and receive a Tweet in Avro format
[info]   + Given a ZooKeeper instance
[info]   + And a Kafka broker instance
[info]   + And some tweets
[info]   + And a single-threaded Kafka consumer group
[info]   + When I start a synchronous Kafka producer that sends the tweets in Avro binary format
[info]   + Then the consumer app should receive the tweets
[info] - should asynchronously send and receive a Tweet in Avro format
[info]   + Given a ZooKeeper instance
[info]   + And a Kafka broker instance
[info]   + And some tweets
[info]   + And a single-threaded Kafka consumer group
[info]   + When I start an asynchronous Kafka producer that sends the tweets in Avro binary format
[info]   + Then the consumer app should receive the tweets
[info] StormSpec:
[info] Storm
[info] - should start a local cluster
[info]   + Given no cluster
[info]   + When I start a LocalCluster instance
[info]   + Then the local cluster should start properly
[info] - should run a basic topology
[info]   + Given a local cluster
[info]   + And a wordcount topology
[info]   + And the input words alice, bob, joe, alice
[info]   + When I submit the topology
[info]   + Then the topology should properly count the words
[info] KafkaStormSpec:
[info] As a user of Storm
[info] I want to read Avro-encoded data from Kafka
[info] so that I can quickly build Kafka<->Storm data flows
[info] Feature: AvroDecoderBolt[T]
[info]   Scenario: User creates a Storm topology that uses AvroDecoderBolt
[info]     Given a ZooKeeper instance
[info]     And a Kafka broker instance
[info]     And a Storm topology that uses AvroDecoderBolt and that reads tweets from topic testing-input and writes them as-is to topic testing-output
[info]     And some tweets
[info]     And a synchronous Kafka producer app that writes to the topic testing-input
[info]     And a single-threaded Kafka consumer app that reads from topic testing-output and Avro-decodes the incoming data
[info]     And a Storm topology configuration that registers an Avro Kryo decorator for Tweet
[info]     When I run the Storm topology
[info]     And I Avro-encode the tweets and use the Kafka producer app to send them to Kafka
[info]     Synchronously sending Tweet {"username": "ANY_USER_1", "text": "ANY_TEXT_1", "timestamp": 1411993272} to topic Some(testing-input)
[info]     Synchronously sending Tweet {"username": "ANY_USER_2", "text": "ANY_TEXT_2", "timestamp": 0} to topic Some(testing-input)
[info]     Synchronously sending Tweet {"username": "ANY_USER_3", "text": "ANY_TEXT_3", "timestamp": 1234} to topic Some(testing-input)
[info]     Then the Kafka consumer app should receive the original tweets from the Storm topology
[info] Feature: AvroScheme[T] for Kafka spout
[info]   Scenario: User creates a Storm topology that uses AvroScheme in Kafka spout
[info]     Given a ZooKeeper instance
[info]     And a Kafka broker instance
[info]     And a Storm topology that uses AvroScheme and that reads tweets from topic testing-input and writes them as-is to topic testing-output
[info]     And some tweets
[info]     And a synchronous Kafka producer app that writes to the topic testing-input
[info]     And a single-threaded Kafka consumer app that reads from topic testing-output and Avro-decodes the incoming data
[info]     And a Storm topology configuration that registers an Avro Kryo decorator for Tweet
[info]     When I run the Storm topology
[info]     And I Avro-encode the tweets and use the Kafka producer app to send them to Kafka
[info]     Synchronously sending Tweet {"username": "ANY_USER_1", "text": "ANY_TEXT_1", "timestamp": 1411993272} to topic Some(testing-input)
[info]     Synchronously sending Tweet {"username": "ANY_USER_2", "text": "ANY_TEXT_2", "timestamp": 0} to topic Some(testing-input)
[info]     Synchronously sending Tweet {"username": "ANY_USER_3", "text": "ANY_TEXT_3", "timestamp": 1234} to topic Some(testing-input)
[info]     Then the Kafka consumer app should receive the original tweets from the Storm topology
[info] KafkaSparkStreamingSpec:
[info] As a user of Spark Streaming
[info] I want to read Avro-encoded data from Kafka
[info] so that I can quickly build Kafka<->Spark Streaming data flows
[info] Feature: Basic functionality
[info]   Scenario: User creates a Spark Streaming job that reads from and writes to Kafka
[info]     Given a ZooKeeper instance
[info]     And a Kafka broker instance
[info]     And some tweets
[info]     And a synchronous Kafka producer app that writes to the topic KafkaTopic(testing-input,1,1,{})
[info]     And a single-threaded Kafka consumer app that reads from topic KafkaTopic(testing-output,1,1,{}) and Avro-decodes the incoming data
[info]     When I Avro-encode the tweets and use the Kafka producer app to send them to Kafka
[info]     And I run a streaming job that reads tweets from topic KafkaTopic(testing-input,1,1,{}) and writes them as-is to topic KafkaTopic(testing-output,1,1,{})
[info]     Then the Spark Streaming job should consume all tweets from Kafka
[info]     And the job should write back all tweets to Kafka
[info]     And the Kafka consumer app should receive the original tweets from the Spark Streaming job
[info] Run completed in 45 seconds, 787 milliseconds.
[info] Total number of tests run: 27
[info] Suites: completed 9, aborted 0
[info] Tests: succeeded 27, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.

Show me one more time!

$ ./sbt run

This command launches KafkaStormDemo. This demo starts in-memory instances of ZooKeeper, Kafka, and Storm. It then runs a demo Storm topology that connects to and reads from the Kafka instance.

You will see output similar to the following (some parts removed to improve readability):

7031 [Thread-19] INFO  backtype.storm.daemon.worker - Worker 3f7f1a51-5c9e-43a5-b431-e39a7272215e for storm kafka-storm-starter-1-1400839826 on daa60807-d440-4b45-94fc-8dd7798453d2:1027 has finished loading
7033 [Thread-29-kafka-spout] INFO  storm.kafka.DynamicBrokersReader - Read partition info from zookeeper: GlobalPartitionInformation{partitionMap={0=127.0.0.1:9092}}
7050 [Thread-29-kafka-spout] INFO  backtype.storm.daemon.executor - Opened spout kafka-spout:(1)
7051 [Thread-29-kafka-spout] INFO  backtype.storm.daemon.executor - Activating spout kafka-spout:(1)
7051 [Thread-29-kafka-spout] INFO  storm.kafka.ZkCoordinator - Refreshing partition manager connections
7065 [Thread-29-kafka-spout] INFO  storm.kafka.DynamicBrokersReader - Read partition info from zookeeper: GlobalPartitionInformation{partitionMap={0=127.0.0.1:9092}}
7066 [Thread-29-kafka-spout] INFO  storm.kafka.ZkCoordinator - Deleted partition managers: []
7066 [Thread-29-kafka-spout] INFO  storm.kafka.ZkCoordinator - New partition managers: [Partition{host=127.0.0.1:9092, partition=0}]
7083 [Thread-29-kafka-spout] INFO  storm.kafka.PartitionManager - Read partition information from: /kafka-spout/kafka-storm-starter/partition_0  --> null
7100 [Thread-29-kafka-spout] INFO  storm.kafka.PartitionManager - No partition information found, using configuration to determine offset
7105 [Thread-29-kafka-spout] INFO  storm.kafka.PartitionManager - Starting Kafka 127.0.0.1:0 from offset 18
7106 [Thread-29-kafka-spout] INFO  storm.kafka.ZkCoordinator - Finished refreshing

At this point Storm is connected to Kafka (more precisely: to the testing topic in Kafka). Not much will happen afterwards because a) we are not sending any data to the Kafka topic and b) this demo Storm topology only reads from the Kafka topic; it does nothing with the data it reads.

Note that this example will actually run two in-memory instances of ZooKeeper: the first (listening at 127.0.0.1:2181/tcp) is used by the Kafka instance, the second (listening at 127.0.0.1:2000/tcp) is automatically started and used by the in-memory Storm cluster. This is because, when running in local (in-memory) mode, Storm versions < 0.9.3 do not allow you to reconfigure or disable their own ZooKeeper instance (see the Storm FAQ below for further information).

To stop the demo application, kill the process or press Ctrl-C in the terminal.

You can use KafkaStormDemo as a starting point to create your own "real" Storm topologies that read from a "real" Kafka, Storm, and ZooKeeper infrastructure. An easy way to get started with such an infrastructure is by deploying Kafka, Storm, and ZooKeeper via a tool such as Wirbelsturm.
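
For illustration, here is a hedged sketch of such a topology setup against the Storm 0.9.x API, using the storm-kafka spout and this project's AvroDecoderBolt (the AvroDecoderBolt import path and constructor details are assumptions; topic, ZooKeeper address, and component names are placeholders):

import backtype.storm.{Config, StormSubmitter}
import backtype.storm.topology.TopologyBuilder
import com.miguno.avro.Tweet
import com.miguno.kafkastorm.storm.AvroDecoderBolt // package path assumed; adjust if needed
import storm.kafka.{KafkaSpout, SpoutConfig, ZkHosts}

object MyKafkaStormTopology extends App {
  // Point the Kafka spout at a "real" ZooKeeper/Kafka infrastructure.
  val zkHosts     = new ZkHosts("zookeeper1:2181")
  val spoutConfig = new SpoutConfig(zkHosts, "testing", "/kafka-spout", "my-topology")
  // Without an explicit scheme the spout emits the raw Kafka message bytes,
  // which AvroDecoderBolt[Tweet] then Avro-decodes downstream.

  val builder = new TopologyBuilder
  builder.setSpout("kafka-spout", new KafkaSpout(spoutConfig), 1)
  builder.setBolt("decoder-bolt", new AvroDecoderBolt[Tweet](), 1).shuffleGrouping("kafka-spout")

  // Submit to a real cluster instead of an in-memory LocalCluster.
  StormSubmitter.submitTopology("my-topology", new Config, builder.createTopology())
}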

Features

What features do we showcase in kafka-storm-starter? Note that we focus on showcasing, not necessarily on production readiness.

  • How to integrate Kafka and Storm as well as Kafka and Spark Streaming.
  • How to use Avro with Kafka, Storm, and Spark Streaming.
  • Kafka standalone code examples
    • KafkaProducerApp: A simple Kafka producer app for writing Avro-encoded data into Kafka. KafkaSpec puts this producer to use and shows how to use Twitter Bijection to Avro-encode the messages being sent to Kafka (see the first sketch after this list).
    • KafkaConsumerApp: A simple Kafka consumer app for reading Avro-encoded data from Kafka. KafkaSpec puts this consumer to use and shows how to use Twitter Bijection to Avro-decode the messages being read from Kafka.
  • Storm standalone code examples
    • AvroDecoderBolt[T]: An AvroDecoderBolt[T <: org.apache.avro.specific.SpecificRecordBase] that can be parameterized with the type of the Avro record T it will deserialize its data to (i.e. no need to write another decoder bolt just because the bolt needs to handle a different Avro schema).
    • AvroScheme[T]: An AvroScheme[T <: org.apache.avro.specific.SpecificRecordBase] scheme, i.e. a custom backtype.storm.spout.Scheme to auto-deserialize a spout's incoming data. The scheme can be parameterized with the type of the Avro record T it will deserialize its data to (i.e. no need to write another scheme just because the scheme needs to handle a different Avro schema).
      • You can opt to configure a spout (such as the Kafka spout) with AvroScheme if you want to perform the Avro decoding step directly in the spout instead of placing an AvroDecoderBolt after the Kafka spout (see the second sketch after this list). You may want to profile your topology to determine which of the two approaches works best for your use case.
    • TweetAvroKryoDecorator: A custom backtype.storm.serialization.IKryoDecorator, i.e. a custom Kryo serializer for Storm.
      • Unfortunately we have not figured out a way to implement a parameterized AvroKryoDecorator[T] variant yet. (A "straightforward" approach we tried -- similar to the other parameterized components -- compiled fine but failed at runtime when running the tests.) Code contributions are welcome!
  • Kafka and Storm integration
    • AvroKafkaSinkBolt[T]: An AvroKafkaSinkBolt[T <: org.apache.avro.specific.SpecificRecordBase] that can be parameterized with the type of the Avro record T it will serialize its data to before sending the encoded data to Kafka (i.e. no need to write another Kafka sink bolt just because the bolt needs to handle a different Avro schema).
    • Storm topologies that read Avro-encoded data from Kafka: KafkaStormDemo and KafkaStormSpec
    • A Storm topology that writes Avro-encoded data to Kafka: KafkaStormSpec
  • Kafka and Spark Streaming integration
    • KafkaSparkStreamingSpec: a streaming job that reads input data from Kafka and writes output data to Kafka. It demonstrates how to read from all partitions of a topic in parallel, how to decouple the downstream parallelism from the number of partitions (think: use 20 "threads" to process the Kafka data even though the Kafka topic has only 5 partitions), and how to write the output of the streaming job back into Kafka (see the third sketch after this list). The input and output data is in Avro format, and we use Twitter Bijection for the serialization work. See my blog post on Integrating Kafka and Spark Streaming for further details.
  • Unit testing
  • Integration testing
    • KafkaSpec: Tests for Kafka, which launch and run against in-memory instances of Kafka and ZooKeeper. See EmbeddedKafkaZooKeeperCluster and its constituents KafkaEmbedded and ZooKeeperEmbedded.
    • StormSpec: Tests for Storm, which launch and run against in-memory instances of Storm and ZooKeeper.
    • KafkaStormSpec: Tests for integrating Storm and Kafka, which launch and run against in-memory instances of Kafka, Storm, and ZooKeeper.
    • KafkaSparkStreamingSpec: Tests for integrating Spark Streaming and Kafka, which launch and run against in-memory instances of Kafka, Spark Streaming, and ZooKeeper.
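
The first sketch illustrates the Bijection-based Avro encoding/decoding used by KafkaProducerApp and KafkaConsumerApp; it assumes the Avro-generated class com.miguno.avro.Tweet from this project's twitter.avsc schema is on the classpath:

import com.miguno.avro.Tweet
import com.twitter.bijection.Injection
import com.twitter.bijection.avro.SpecificAvroCodecs

object AvroBijectionExample extends App {
  // Injection[Tweet, Array[Byte]] converts Tweet objects to/from Avro binary format.
  val codec: Injection[Tweet, Array[Byte]] = SpecificAvroCodecs.toBinary[Tweet]

  val tweet = new Tweet("ANY_USER_1", "ANY_TEXT_1", 1411993272L)

  // Encode before sending to Kafka...
  val bytes: Array[Byte] = codec(tweet)

  // ...and decode after reading from Kafka (`invert` returns the decoding attempt).
  val decoded: Tweet = codec.invert(bytes).get
  assert(decoded.getUsername.toString == "ANY_USER_1")
}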
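
The second sketch shows the AvroScheme alternative: plugging the scheme into the storm-kafka spout so that the Avro decoding happens inside the spout itself (the AvroScheme import path is inferred from the test suite names and may need adjusting; the SpoutConfig arguments are placeholders):

import backtype.storm.spout.SchemeAsMultiScheme
import com.miguno.avro.Tweet
import com.miguno.kafkastorm.storm.serialization.AvroScheme // package path assumed
import storm.kafka.{KafkaSpout, SpoutConfig, ZkHosts}

object AvroSchemeSpoutExample {
  val spoutConfig = new SpoutConfig(new ZkHosts("127.0.0.1:2181"), "testing-input", "/kafka-spout", "my-app")
  // Decode Avro binary records into Tweet objects directly in the spout,
  // instead of emitting raw bytes for a downstream AvroDecoderBolt.
  spoutConfig.scheme = new SchemeAsMultiScheme(new AvroScheme[Tweet])
  val kafkaSpout = new KafkaSpout(spoutConfig)
}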
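
The third sketch outlines the Kafka/Spark Streaming pattern described for KafkaSparkStreamingSpec, using the receiver-based API of Spark Streaming 1.1: several input streams read the topic's partitions in parallel, and repartition() decouples the processing parallelism from the partition count (topic name, ZooKeeper address, and the counts are placeholders; the actual Avro decoding and write-back are elided):

import kafka.serializer.{DefaultDecoder, StringDecoder}
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaSparkStreamingSketch extends App {
  val ssc = new StreamingContext(new SparkConf().setAppName("kafka-spark-sketch"), Seconds(1))
  val kafkaParams = Map("zookeeper.connect" -> "127.0.0.1:2181", "group.id" -> "spark-sketch")

  // One receiver-based input stream per Kafka partition (here: 5), consumed in parallel...
  val numInputPartitions = 5
  val streams = (1 to numInputPartitions).map { _ =>
    KafkaUtils.createStream[String, Array[Byte], StringDecoder, DefaultDecoder](
      ssc, kafkaParams, Map("testing-input" -> 1), StorageLevel.MEMORY_ONLY_SER)
  }

  // ...then unioned and repartitioned: 20 downstream tasks despite only 5 partitions.
  val encodedTweets = ssc.union(streams).repartition(20)
  encodedTweets.foreachRDD { rdd =>
    rdd.foreachPartition { _ => () } // Avro-decode records and write them back to Kafka here
  }

  ssc.start()
  ssc.awaitTermination()
}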

Implementation details

Development

Git setup: git-flow

This project follows the git-flow approach. This means, for instance, that:

  • The branch develop is used for integration of the "next release".
  • The branch master is used for bringing forth production releases.

Follow the git-flow installation instructions for your development machine.

See git-flow and the introduction article Why aren't you using git-flow? for details.

Build requirements

Your development machine requires:

  • Oracle JDK or OpenJDK for Java 7 (Oracle JDK preferred).

This project also needs Scala 2.10.4 and sbt 0.13.2, but these will be automatically downloaded and made available (locally/sandboxed) to the project as part of the build setup.

Building the code

$ ./sbt clean compile

If you want to only (re)generate Java classes from Avro schemas:

$ ./sbt avro:generate

Generated Java sources are stored under target/scala-*/src_managed/main/compiled_avro/.

Running the tests

$ ./sbt clean test

Here are some examples that demonstrate how you can run only a certain subset of tests:

# Use `-l` to exclude tests by tag:
# Run all tests WITH THE EXCEPTION of those tagged as integration tests
$ ./sbt "test-only * -- -l com.miguno.kafkastorm.integration.IntegrationTest"

# Use `-n` to include tests by tag (and skip all tests that lack the tag):
# Run ONLY tests tagged as integration tests
$ ./sbt "test-only * -- -n com.miguno.kafkastorm.integration.IntegrationTest"

# Run only the tests in suite AvroSchemeSpec:
$ ./sbt "test-only com.miguno.kafkastorm.storm.serialization.AvroSchemeSpec"

# You can also combine the examples above, of course.
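
For reference, here is a hedged sketch of how such a tag is defined and applied with ScalaTest, so that the -l/-n filters above can match it (the Tag object's string value must equal the fully qualified name used on the command line):

import org.scalatest.Tag

// The tag's string value is what `-n` / `-l` match against.
object IntegrationTest extends Tag("com.miguno.kafkastorm.integration.IntegrationTest")

// Inside a FunSpec-style suite, an individual test is then tagged like this:
//   it("should pass an end-to-end Kafka/Storm run", IntegrationTest) { ... }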

Test reports in JUnit XML format are written to target/test-reports/junitxml/*.xml. Make sure that your actual build steps run the ./sbt test task, otherwise the JUnit XML reports will not be generated (note that, unfortunately, ./sbt scoverage:test will not generate the JUnit XML reports).

Integration with CI servers:

  • Jenkins integration:
    • Configure the build job.
    • Go to Post-build Actions.
    • Add a post-build action for Publish JUnit test result report.
    • In the Test report XMLs field add the pattern **/target/test-reports/junitxml/*.xml.
    • Now each build of your job will have a Test Result link.
  • TeamCity integration:
    • Edit the build configuration.
    • Select configuration step 3, Build steps.
    • Under Additional Build Features add a new build feature.
    • Use the following build feature configuration:
      • Report type: Ant JUnit
      • Monitoring rules: target/test-reports/junitxml/*.xml
    • Now each build of your job will have a Tests tab.


Creating code coverage reports

We are using sbt-scoverage to create code coverage reports for unit tests.

Run the unit tests via:

$ ./sbt clean scoverage:test
  • An HTML report will be created at target/scala-2.10/scoverage-report/index.html.
  • XML reports will be created at:
    • ./target/scala-2.10/coverage-report/cobertura.xml
    • ./target/scala-2.10/scoverage-report/scoverage.xml

Integration with CI servers:

  • Jenkins integration:
    • Configure the build.
    • Go to Post-build Actions.
    • Add a post-build action for Publish Cobertura Coverage Report.
    • In the Cobertura xml report pattern field add the pattern **/target/scala-2.10/coverage-report/cobertura.xml.
    • Now each build of your job will have a Coverage Report link.
  • TeamCity integration:
    • Edit the build configuration.
    • Select configuration step 1, General settings.
    • In the Artifact Paths field add the path target/scala-2.10/scoverage-report/** => coberturareport/.
    • Now each build of your job will have a Cobertura Coverage Report tab.

Packaging the code

To create a normal ("slim") jar:

$ ./sbt clean package

>>> Generates `target/scala-2.10/kafka-storm-starter_2.10-0.2.0-SNAPSHOT.jar`

To create a fat jar, which includes any dependencies of kafka-storm-starter:

$ ./sbt assembly

>>> Generates `target/scala-2.10/kafka-storm-starter-assembly-0.2.0-SNAPSHOT.jar`

Note: By default, assembly by itself will NOT run any tests. If you want to run tests before assembly, chain the sbt commands in sequence, e.g. ./sbt test assembly. See assembly.sbt for details on why we do this.

To create a scaladoc/javadoc jar:

$ ./sbt packageDoc

>>> Generates `target/scala-2.10/kafka-storm-starter_2.10-0.2.0-SNAPSHOT-javadoc.jar`

To create a sources jar:

$ ./sbt packageSrc

>>> Generates `target/scala-2.10/kafka-storm-starter_2.10-0.2.0-SNAPSHOT-sources.jar`

To create API docs:

$ ./sbt doc

>>> Generates `target/scala-2.10/api/*` (HTML files)

IDE support

IntelliJ IDEA

kafka-storm-starter integrates the sbt-idea plugin. Use the following command to build IDEA project files:

$ ./sbt gen-idea

You can then open kafka-storm-starter as a project in IDEA via File > Open... and selecting the top-level directory of kafka-storm-starter.

Important note: There is a bug when using the sbt plugins for Avro and for IntelliJ IDEA in combination. The sbt plugin for Avro reads the Avro *.avsc schemas stored under src/main/avro and generates the corresponding Java classes, which it stores under target/scala-2.10/src_managed/main/compiled_avro (in the case of kafka-storm-starter, a Tweet.java class will be generated from the Avro schema twitter.avsc). The latter path must be added to IDEA's Source Folders setting, which happens automatically for you. However, the aforementioned bug adds a second, incorrect path to Source Folders as well, which causes IDEA to complain about not being able to find the Avro-generated Java classes (here: the Tweet class).

Until this bug is fixed upstream you can use the following workaround, which you must perform every time you run ./sbt gen-idea:

  1. In IntelliJ IDEA open the project structure for kafka-storm-starter via File > Project Structure....
  2. Under Project settings on the left-hand side select Modules.
  3. Select the Sources tab on the right-hand side.
  4. Remove the problematic target/scala-2.10/src_managed/main/compiled_avro/com entry from the Source Folders listing (the source folders are colored in light-blue). Note the trailing .../com, which comes from com.miguno.avro.Tweet in the twitter.avsc Avro schema.
  5. Click OK.


Eclipse

kafka-storm-starter integrates the sbt-eclipse plugin. Use the following command to build Eclipse project files:

$ ./sbt eclipse

Then use the Import Wizard in Eclipse to import Existing Projects into Workspace.

FAQ

Kafka

ZooKeeper exceptions "KeeperException: NoNode for /[ZK path]" logged at INFO level

In short, you can normally safely ignore those errors -- there is a reason they are logged at INFO level and not at ERROR level.

As described in the mailing list thread Zookeeper exceptions:

"The reason you see those NoNode error code is the following. Every time we want to create a new [ZK] path, say /brokers/ids/1, we try to create it directly. If this fails because the parent path doesn't exist, we try to create the parent path first. This will happen recursively. However, the NoNode error should show up only once, not every time a broker is started (assuming ZK data hasn't been cleaned up)."

A similar answer was given in the thread Clean up kafka environment:

"These info messages show up when Kafka tries to create new consumer groups. While trying to create the children of /consumers/[group], if the parent path doesn't exist, the zookeeper server logs these messages. Kafka internally handles these cases correctly by first creating the parent node."

Storm

Storm LocalCluster and ZooKeeper

LocalCluster starts an embedded ZooKeeper instance listening at localhost:2000/tcp. If a different process is already bound to 2000/tcp, then Storm will increment the embedded ZooKeeper's port until it finds a free port (2000 -> 2001 -> 2002, and so on). LocalCluster then reads the Storm defaults and overrides some of Storm's configuration (see the mk-local-storm-cluster function in testing.clj and the mk-inprocess-zookeeper function in zookeeper.clj for details):

STORM-CLUSTER-MODE "local"
STORM-ZOOKEEPER-PORT zk-port
STORM-ZOOKEEPER-SERVERS ["localhost"]

where zk-port is the final port chosen.

In Storm versions <= 0.9.2 it is not possible to launch a local Storm cluster via LocalCluster without its own embedded ZooKeeper. Likewise it is not possible to control on which port the embedded ZooKeeper process will listen -- it will always follow the 2000/tcp based algorithm above to set the port.

In Storm 0.9.3 and later you can configure LocalCluster to use a custom ZooKeeper instance, thanks to STORM-213.
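
A hedged sketch of the difference; the two-argument constructor is our reading of the STORM-213 change, so verify it against the Storm version you actually use:

import backtype.storm.LocalCluster

object LocalClusterZkExample {
  // Storm <= 0.9.2: always starts its own embedded ZooKeeper (2000/tcp or the next free port).
  val withEmbeddedZk = new LocalCluster()

  // Storm 0.9.3+: point LocalCluster at an existing ZooKeeper instance instead,
  // e.g. the one already used by an in-memory Kafka broker.
  val withSharedZk = new LocalCluster("127.0.0.1", 2181L)
}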

Known issues and limitations

This section lists known issues and limitations a) for the upstream projects such as Storm and Kafka, and b) for our own code.

Upstream code

ZooKeeper throws InstanceAlreadyExistsException during tests

Note: We squelch this message during test runs. See log4j.properties.

You may see the following exception when running the integration tests, which you can safely ignore:

[2014-03-07 11:56:59,250] WARN Failed to register with JMX (org.apache.zookeeper.server.ZooKeeperServer)
javax.management.InstanceAlreadyExistsException: org.apache.ZooKeeperService:name0=StandaloneServer_port-1

The root cause is that in-memory ZooKeeper instances have a hardcoded JMX setup. And because we cannot prevent Storm's LocalCluster from starting its own ZooKeeper instance alongside "ours" (see the FAQ section above), there will be two ZK instances trying to use the same JMX setup. Since the JMX setup is not relevant for our testing, the exception can be safely ignored, although we would of course prefer a proper fix.

See also ZOOKEEPER-1350: Make JMX registration optional in LearnerZooKeeperServer, which will make it possible to disable JMX registration when using Curator's TestServer to run an in-memory ZooKeeper instance (this patch will be included in ZooKeeper 3.5.0, see JIRA ticket above).

ZooKeeper version 3.3.4 recommended for use with Kafka 0.8

At the time of writing Kafka 0.8 is not officially compatible with ZooKeeper 3.4.x, which is the latest stable version of ZooKeeper. Instead the Kafka project recommends ZooKeeper 3.3.4.

So which version of ZooKeeper should you pick, particularly if you are already running a ZooKeeper cluster for other parts of your infrastructure (such as a Hadoop cluster)?

The TL;DR version is: Try using ZooKeeper 3.4.5 for both Kafka and Storm, but see the caveats and workarounds below. In the worst case use separate ZooKeeper clusters/versions for Storm (3.4.5) and Kafka (3.3.4). Generally speaking though, the best 3.3.x version of ZooKeeper is 3.3.6, which is the latest stable 3.3.x version. This is because 3.3.6 fixed a number of serious bugs that could lead to data corruption.

Tip: You can verify the exact ZK version used in kafka-storm-starter by running ./sbt dependency-graph.


kafka-storm-starter code

  • Some code in kafka-storm-starter does not look like idiomatic Scala code. While sometimes this may be our own fault, there is one area where we cannot easily prevent this from happening: When the underlying Java APIs (here: the Java API of Storm) do not lend themselves to a more Scala-like code style. You can see this, for instance, in the way we wire the spouts and bolts of a topology. One alternative, of course, would be to create Scala-fied wrappers but this seemed inappropriate for this project.
  • We are using Thread.sleep() in some tests instead of more intelligent approaches, which may cause transient test failures; we may therefore want to improve those tests. In Kafka's test suites, for instance, tests use waitUntilTrue() to detect more reliably when to proceed with (or fail/time out) the next step. See the related discussion in the review request 19696 for KAFKA-1317. (A minimal sketch of such a polling helper follows this list.)
  • We noticed that the tests may fail when using Oracle/Sun JDK 1.6.0_24. Later versions (e.g. 1.6.0_31) work fine.
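
A minimal sketch (not from this repository) of a waitUntilTrue-style polling helper that could replace Thread.sleep() in such tests:

import scala.annotation.tailrec

object TestConditions {

  // Polls `condition` every `pollMs` milliseconds until it holds or `timeoutMs` elapses.
  def waitUntilTrue(condition: () => Boolean, timeoutMs: Long, pollMs: Long = 100L): Boolean = {
    val deadline = System.currentTimeMillis() + timeoutMs

    @tailrec
    def loop(): Boolean =
      if (condition()) true
      else if (System.currentTimeMillis() >= deadline) false
      else { Thread.sleep(pollMs); loop() }

    loop()
  }
}

// Usage in a test: proceed as soon as the consumer has seen all messages,
// giving up after at most five seconds.
//   assert(TestConditions.waitUntilTrue(() => consumed.size == expected.size, timeoutMs = 5000L))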

Change log

See CHANGELOG.

Contributing to kafka-storm-starter

Code contributions, bug reports, feature requests etc. are all welcome.

If you are new to GitHub please read Contributing to a project for how to send patches and pull requests to kafka-storm-starter.

License

Copyright © 2014 Michael G. Noll

See LICENSE for licensing information.

References

Wirbelsturm

Want to perform 1-click deployments of Kafka clusters and/or Storm clusters (with a Graphite instance, with Redis, with...)? Take a look at Wirbelsturm, with which you can deploy such environments locally and to Amazon AWS.

Kafka

Unit testing:

  • buildlackey/cep/kafka-0.8.x -- A simple Kafka producer/consumer example with in-memory Kafka and Zookeeper instances. For a number of reasons we opted not to use that code. We list it in this section in case someone else may find it helpful.

Storm

Kafka spout wurstmeister/storm-kafka-0.8-plus

Kafka spout HolmesNL/kafka-spout, written by the Netherlands Forensics Institute


kafka-storm-starter's People

Contributors

miguno, viktortnk


kafka-storm-starter's Issues

org.eclipse.aether.resolution.ArtifactResolutionException:

I ran this to build the project:

mvn site

I got this exception:
Caused by: org.eclipse.aether.resolution.ArtifactResolutionException: Failure to transfer org.apache.kafka:kafka_2.11:pom:2.5.0 from http://www.terracotta.org/download/reflector/releases was cached in the local repository, resolution will not be reattempted until the update interval of terracotta-releases has elapsed or updates are forced. Original error: Could not transfer artifact org.apache.kafka:kafka_2.11:pom:2.5.0 from/to terracotta-releases (http://www.terracotta.org/download/reflector/releases): Access denied to: http://www.terracotta.org/download/reflector/releases/org/apache/kafka/kafka_2.11/2.5.0/kafka_2.11-2.5.0.pom , ReasonPhrase:Forbidden.
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve (DefaultArtifactResolver.java:422)

Test suite doesn't pass

Running on a Vagrant ubuntu/trusty64 box with OpenJDK 1.7.0_79.

Some output truncated:

[info]   + Then the local cluster should start properly 
2015-09-23 17:19:58 backtype.storm.testing4j [ERROR] Error in cluster
java.lang.AssertionError: Test timed out (5000ms)
    at backtype.storm.testing$wait_until_cluster_waiting.invoke(testing.clj:213) ~[storm-core-0.9.3.jar:0.9.3]
...
    at java.lang.Thread.run(Thread.java:745) [na:1.7.0_79]
[info] - should run a basic topology *** FAILED ***
[info]   java.lang.AssertionError: Test timed out (5000ms)
...
at com.miguno.kafkastorm.integration.StormSpec$$anonfun$1$$anonfun$apply$mcV$sp$2$$anon$2.run(StormSpec.scala:81)
[info]   ...
[info]   + Given a local cluster
...

KafkaSparkStreamingSpec test suite does not pass.

Hi,
I have a problem running ./sbt test:

[info]   Scenario: User creates a Spark Streaming job that reads from and writes to Kafka *** FAILED ***
[info]   0 was not equal to 3 (KafkaSparkStreamingSpec.scala:237)
[info]     Given a ZooKeeper instance
[info]     And a Kafka broker instance
[info]     And some tweets
[info]     And a synchronous Kafka producer app that writes to the topic KafkaTopic(testing-input,1,1,{})
[info]     And a single-threaded Kafka consumer app that reads from topic KafkaTopic(testing-output,1,1,{}) and Avro-decodes the incoming data
[info]     When I Avro-encode the tweets and use the Kafka producer app to send them to Kafka
[info]     And I run a streaming job that reads tweets from topic KafkaTopic(testing-input,1,1,{}) and writes them as-is to topic KafkaTopic(testing-output,1,1,{})
[info]     Then the Spark Streaming job should consume all tweets from Kafka
2014-10-09 19:43:09 org.apache.spark.streaming.scheduler.ReceiverTracker [ERROR] Deregistered receiver for stream 0: Error starting receiver 0 - org.I0Itec.zkclient.exception.ZkTimeoutException: Unable to connect to zookeeper server within timeout: 1000
    at org.I0Itec.zkclient.ZkClient.connect(ZkClient.java:895)
    at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:98)
    at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:84)
    at kafka.consumer.ZookeeperConsumerConnector.connectZk(ZookeeperConsumerConnector.scala:156)
    at kafka.consumer.ZookeeperConsumerConnector.<init>(ZookeeperConsumerConnector.scala:114)
    at kafka.consumer.ZookeeperConsumerConnector.<init>(ZookeeperConsumerConnector.scala:128)
    at kafka.consumer.Consumer$.create(ConsumerConnector.scala:89)
    at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:97)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:54)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

2014-10-09 19:43:09 org.apache.spark.streaming.receiver.ReceiverSupervisorImpl [ERROR] Stopped executor with error: org.I0Itec.zkclient.exception.ZkTimeoutException: Unable to connect to zookeeper server within timeout: 1000
2014-10-09 19:43:09 org.apache.spark.executor.Executor [ERROR] Exception in task 0.0 in stage 0.0 (TID 0)
org.I0Itec.zkclient.exception.ZkTimeoutException: Unable to connect to zookeeper server within timeout: 1000
    at org.I0Itec.zkclient.ZkClient.connect(ZkClient.java:895) ~[zkclient-0.4.jar:0.4]
    at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:98) ~[zkclient-0.4.jar:0.4]
    at org.I0Itec.zkclient.ZkClient.<init>(ZkClient.java:84) ~[zkclient-0.4.jar:0.4]
    at kafka.consumer.ZookeeperConsumerConnector.connectZk(ZookeeperConsumerConnector.scala:156) ~[kafka_2.10-0.8.1.1.jar:na]
    at kafka.consumer.ZookeeperConsumerConnector.<init>(ZookeeperConsumerConnector.scala:114) ~[kafka_2.10-0.8.1.1.jar:na]
    at kafka.consumer.ZookeeperConsumerConnector.<init>(ZookeeperConsumerConnector.scala:128) ~[kafka_2.10-0.8.1.1.jar:na]
    at kafka.consumer.Consumer$.create(ConsumerConnector.scala:89) ~[kafka_2.10-0.8.1.1.jar:na]
    at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:97) ~[spark-streaming-kafka_2.10-1.1.0.jar:1.1.0]
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121) ~[spark-streaming_2.10-1.1.0.jar:1.1.0]
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106) ~[spark-streaming_2.10-1.1.0.jar:1.1.0]
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264) ~[spark-streaming_2.10-1.1.0.jar:1.1.0]
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257) ~[spark-streaming_2.10-1.1.0.jar:1.1.0]
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121) ~[spark-core_2.10-1.1.0.jar:1.1.0]
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121) ~[spark-core_2.10-1.1.0.jar:1.1.0]
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62) ~[spark-core_2.10-1.1.0.jar:1.1.0]
    at org.apache.spark.scheduler.Task.run(Task.scala:54) ~[spark-core_2.10-1.1.0.jar:1.1.0]
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177) ~[spark-core_2.10-1.1.0.jar:1.1.0]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_65]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_65]
    at java.lang.Thread.run(Thread.java:745) [na:1.7.0_65]
2014-10-09 19:43:09 org.apache.spark.scheduler.TaskSetManager [ERROR] Task 0 in stage 0.0 failed 1 times; aborting job

Running the test suite within IntelliJ returns another exception:

As a user of Spark Streaming
I want to read Avro-encoded data from Kafka
so that I can quickly build Kafka<->Spark Streaming data flows
Exception encountered when invoking run on a nested suite - class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
    at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
    at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:794)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
    at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
    at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:98)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:89)
    at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:67)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:60)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:60)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:60)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:66)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:60)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:42)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:223)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
    at com.miguno.kafkastorm.spark.KafkaSparkStreamingSpec.prepareSparkStreaming(KafkaSparkStreamingSpec.scala:77)
    at com.miguno.kafkastorm.spark.KafkaSparkStreamingSpec.beforeEach(KafkaSparkStreamingSpec.scala:47)
    at org.scalatest.BeforeAndAfterEach$class.beforeEach(BeforeAndAfterEach.scala:154)
    at com.miguno.kafkastorm.spark.KafkaSparkStreamingSpec.beforeEach(KafkaSparkStreamingSpec.scala:30)
    at org.scalatest.BeforeAndAfterEach$class.beforeEach(BeforeAndAfterEach.scala:173)
    at com.miguno.kafkastorm.spark.KafkaSparkStreamingSpec.beforeEach(KafkaSparkStreamingSpec.scala:30)
    at org.scalatest.BeforeAndAfterEach$class.runTest(BeforeAndAfterEach.scala:253)
    at com.miguno.kafkastorm.spark.KafkaSparkStreamingSpec.runTest(KafkaSparkStreamingSpec.scala:30)
    at org.scalatest.FeatureSpecLike$$anonfun$runTests$1.apply(FeatureSpecLike.scala:267)
    at org.scalatest.FeatureSpecLike$$anonfun$runTests$1.apply(FeatureSpecLike.scala:267)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
    at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
    at org.scalatest.FeatureSpecLike$class.runTests(FeatureSpecLike.scala:267)
    at org.scalatest.FeatureSpec.runTests(FeatureSpec.scala:1836)
    at org.scalatest.Suite$class.run(Suite.scala:1424)
    at org.scalatest.FeatureSpec.org$scalatest$FeatureSpecLike$$super$run(FeatureSpec.scala:1836)
    at org.scalatest.FeatureSpecLike$$anonfun$run$1.apply(FeatureSpecLike.scala:309)
    at org.scalatest.FeatureSpecLike$$anonfun$run$1.apply(FeatureSpecLike.scala:309)
    at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
    at org.scalatest.FeatureSpecLike$class.run(FeatureSpecLike.scala:309)
    at org.scalatest.FeatureSpec.run(FeatureSpec.scala:1836)
    at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
    at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
    at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
    at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
    at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
    at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
    at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
    at org.scalatest.tools.Runner$.run(Runner.scala:883)
    at org.scalatest.tools.Runner.run(Runner.scala)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:141)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:32)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)


Process finished with exit code 0

Please help.

Thanks,
Huy

KafkaSparkStreamingSpec.scala: NotSerializableException: org.apache.commons.pool2.impl.GenericObjectPool

Hello

I tried to create the same pool of producers in my Spark application, but I got this error when Spark tried to broadcast the pool.

Exception in thread "Driver" java.io.NotSerializableException: org.apache.commons.pool2.impl.GenericObjectPool
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1183)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:42)
    at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:202)
    at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:101)
    at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:84)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
    at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
    at org.apache.spark.DefaultExecutionContext.broadcast(DefaultExecutionContext.scala:80)
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:961)
     ....

Could you please comment on this issue?

Failed to execute goal on project streams-examples

:~/IdeaProjects/streams-examples$ mvn test
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building streams-examples 3.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-resources-plugin/2.6/maven-resources-plugin-2.6.pom
Downloaded: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-resources-plugin/2.6/maven-resources-plugin-2.6.pom (8 KB at 6.7 KB/sec)
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-plugins/23/maven-plugins-23.pom
Downloaded: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-plugins/23/maven-plugins-23.pom (9 KB at 14.8 KB/sec)
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-resources-plugin/2.6/maven-resources-plugin-2.6.jar
Downloaded: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-resources-plugin/2.6/maven-resources-plugin-2.6.jar (29 KB at 86.6 KB/sec)
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-surefire-plugin/2.17/maven-surefire-plugin-2.17.pom
Downloaded: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-surefire-plugin/2.17/maven-surefire-plugin-2.17.pom (5 KB at 42.8 KB/sec)
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/surefire/surefire/2.17/surefire-2.17.pom
Downloaded: https://repo.maven.apache.org/maven2/org/apache/maven/surefire/surefire/2.17/surefire-2.17.pom (17 KB at 56.7 KB/sec)
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-surefire-plugin/2.17/maven-surefire-plugin-2.17.jar
Downloaded: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-surefire-plugin/2.17/maven-surefire-plugin-2.17.jar (34 KB at 102.5 KB/sec)
Downloading: http://packages.confluent.io/maven/org/apache/kafka/kafka-clients/0.10.1.0-SNAPSHOT/kafka-clients-0.10.1.0-SNAPSHOT.pom
[WARNING] The POM for org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT is missing, no dependency information available
Downloading: http://packages.confluent.io/maven/org/apache/kafka/kafka-streams/0.10.1.0-SNAPSHOT/kafka-streams-0.10.1.0-SNAPSHOT.pom
[WARNING] The POM for org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT is missing, no dependency information available
Downloading: http://packages.confluent.io/maven/org/apache/kafka/kafka_2.11/0.10.1.0-SNAPSHOT/kafka_2.11-0.10.1.0-SNAPSHOT.pom
[WARNING] The POM for org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT is missing, no dependency information available
Downloading: http://packages.confluent.io/maven/org/apache/kafka/kafka-clients/0.10.1.0-SNAPSHOT/kafka-clients-0.10.1.0-SNAPSHOT.jar
Downloading: http://packages.confluent.io/maven/org/apache/kafka/kafka-streams/0.10.1.0-SNAPSHOT/kafka-streams-0.10.1.0-SNAPSHOT.jar
Downloading: http://packages.confluent.io/maven/org/apache/kafka/kafka_2.11/0.10.1.0-SNAPSHOT/kafka_2.11-0.10.1.0-SNAPSHOT-test.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.692 s
[INFO] Finished at: 2016-07-12T14:39:52-04:00
[INFO] Final Memory: 14M/162M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project streams-examples: Could not resolve dependencies for project io.confluent:streams-examples:jar:3.1.0-SNAPSHOT: The following artifacts could not be resolved: org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT: Could not find artifact org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT in confluent (http://packages.confluent.io/maven/) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException



:~/IdeaProjects/streams-examples$ mvn test
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building streams-examples 3.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT is missing, no dependency information available
[WARNING] The POM for org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT is missing, no dependency information available
[WARNING] The POM for org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT is missing, no dependency information available
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.958 s
[INFO] Finished at: 2016-07-12T14:40:14-04:00
[INFO] Final Memory: 11M/193M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project streams-examples: Could not resolve dependencies for project io.confluent:streams-examples:jar:3.1.0-SNAPSHOT: The following artifacts could not be resolved: org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT: Failure to find org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException




:~/IdeaProjects/streams-examples$ mvn -e -X test
Apache Maven 3.3.9
Maven home: /usr/share/maven
Java version: 1.8.0_91, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-8-oracle/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "4.4.0-28-generic", arch: "amd64", family: "unix"
[DEBUG] Created new class realm maven.api
[DEBUG] Importing foreign packages into class realm maven.api
[DEBUG]   Imported: javax.enterprise.inject.* < plexus.core
[DEBUG]   Imported: javax.enterprise.util.* < plexus.core
[DEBUG]   Imported: javax.inject.* < plexus.core
[DEBUG]   Imported: org.apache.maven.* < plexus.core
[DEBUG]   Imported: org.apache.maven.artifact < plexus.core
[DEBUG]   Imported: org.apache.maven.classrealm < plexus.core
[DEBUG]   Imported: org.apache.maven.cli < plexus.core
[DEBUG]   Imported: org.apache.maven.configuration < plexus.core
[DEBUG]   Imported: org.apache.maven.exception < plexus.core
[DEBUG]   Imported: org.apache.maven.execution < plexus.core
[DEBUG]   Imported: org.apache.maven.execution.scope < plexus.core
[DEBUG]   Imported: org.apache.maven.lifecycle < plexus.core
[DEBUG]   Imported: org.apache.maven.model < plexus.core
[DEBUG]   Imported: org.apache.maven.monitor < plexus.core
[DEBUG]   Imported: org.apache.maven.plugin < plexus.core
[DEBUG]   Imported: org.apache.maven.profiles < plexus.core
[DEBUG]   Imported: org.apache.maven.project < plexus.core
[DEBUG]   Imported: org.apache.maven.reporting < plexus.core
[DEBUG]   Imported: org.apache.maven.repository < plexus.core
[DEBUG]   Imported: org.apache.maven.rtinfo < plexus.core
[DEBUG]   Imported: org.apache.maven.settings < plexus.core
[DEBUG]   Imported: org.apache.maven.toolchain < plexus.core
[DEBUG]   Imported: org.apache.maven.usability < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.* < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.authentication < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.authorization < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.events < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.observers < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.proxy < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.repository < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.resource < plexus.core
[DEBUG]   Imported: org.codehaus.classworlds < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.* < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.classworlds < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.component < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.configuration < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.container < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.context < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.lifecycle < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.logging < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.personality < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.util.xml.Xpp3Dom < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.util.xml.pull.XmlPullParser < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.util.xml.pull.XmlPullParserException < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.util.xml.pull.XmlSerializer < plexus.core
[DEBUG]   Imported: org.eclipse.aether.* < plexus.core
[DEBUG]   Imported: org.eclipse.aether.artifact < plexus.core
[DEBUG]   Imported: org.eclipse.aether.collection < plexus.core
[DEBUG]   Imported: org.eclipse.aether.deployment < plexus.core
[DEBUG]   Imported: org.eclipse.aether.graph < plexus.core
[DEBUG]   Imported: org.eclipse.aether.impl < plexus.core
[DEBUG]   Imported: org.eclipse.aether.installation < plexus.core
[DEBUG]   Imported: org.eclipse.aether.internal.impl < plexus.core
[DEBUG]   Imported: org.eclipse.aether.metadata < plexus.core
[DEBUG]   Imported: org.eclipse.aether.repository < plexus.core
[DEBUG]   Imported: org.eclipse.aether.resolution < plexus.core
[DEBUG]   Imported: org.eclipse.aether.spi < plexus.core
[DEBUG]   Imported: org.eclipse.aether.transfer < plexus.core
[DEBUG]   Imported: org.eclipse.aether.version < plexus.core
[DEBUG]   Imported: org.slf4j.* < plexus.core
[DEBUG]   Imported: org.slf4j.helpers.* < plexus.core
[DEBUG]   Imported: org.slf4j.spi.* < plexus.core
[DEBUG] Populating class realm maven.api
[INFO] Error stacktraces are turned on.
[DEBUG] Reading global settings from /usr/share/maven/conf/settings.xml
[DEBUG] Reading user settings from /home/john/.m2/settings.xml
[DEBUG] Reading global toolchains from /usr/share/maven/conf/toolchains.xml
[DEBUG] Reading user toolchains from /home/john/.m2/toolchains.xml
[DEBUG] Using local repository at /home/john/.m2/repository
[DEBUG] Using manager EnhancedLocalRepositoryManager with priority 10.0 for /home/john/.m2/repository
[INFO] Scanning for projects...
[DEBUG] Extension realms for project io.confluent:streams-examples:jar:3.1.0-SNAPSHOT: (none)
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[plexus.core, parent: null]
[DEBUG] === REACTOR BUILD PLAN ================================================
[DEBUG] Project: io.confluent:streams-examples:jar:3.1.0-SNAPSHOT
[DEBUG] Tasks:   [test]
[DEBUG] Style:   Regular
[DEBUG] =======================================================================
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building streams-examples 3.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[DEBUG] Lifecycle default -> [validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy]
[DEBUG] Lifecycle clean -> [pre-clean, clean, post-clean]
[DEBUG] Lifecycle site -> [pre-site, site, post-site, site-deploy]
[...the three lifecycle debug lines above repeat verbatim seven more times; duplicates removed...]
[DEBUG] === PROJECT BUILD PLAN ================================================
[DEBUG] Project:       io.confluent:streams-examples:3.1.0-SNAPSHOT
[DEBUG] Dependencies (collect): []
[DEBUG] Dependencies (resolve): [compile, test]
[DEBUG] Repositories (dependencies): [confluent (http://packages.confluent.io/maven/, default, releases+snapshots), central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] Repositories (plugins)     : [central (https://repo.maven.apache.org/maven2, default, releases)]
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          org.codehaus.mojo:build-helper-maven-plugin:1.10:add-source (add-source)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <project default-value="${project}"/>
  <sources>
    <source>src/main/scala</source>
  </sources>
</configuration>
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          org.apache.avro:avro-maven-plugin:1.7.7:schema (default)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <createSetters default-value="true"/>
  <fieldVisibility default-value="PUBLIC_DEPRECATED"/>
  <outputDirectory default-value="${project.build.directory}/generated-sources/avro">/home/john/IdeaProjects/streams-examples/target/generated-sources</outputDirectory>
  <project default-value="${project}"/>
  <sourceDirectory default-value="${basedir}/src/main/avro">src/main/resources/avro/io/confluent/examples/streams</sourceDirectory>
  <stringType>String</stringType>
  <templateDirectory>${templateDirectory}</templateDirectory>
  <testOutputDirectory default-value="${project.build.directory}/generated-test-sources/avro">${outputDirectory}</testOutputDirectory>
  <testSourceDirectory default-value="${basedir}/src/test/avro">${sourceDirectory}</testSourceDirectory>
</configuration>
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          org.apache.maven.plugins:maven-resources-plugin:2.6:resources (default-resources)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <buildFilters default-value="${project.build.filters}"/>
  <encoding default-value="${project.build.sourceEncoding}">${encoding}</encoding>
  <escapeString>${maven.resources.escapeString}</escapeString>
  <escapeWindowsPaths default-value="true">${maven.resources.escapeWindowsPaths}</escapeWindowsPaths>
  <includeEmptyDirs default-value="false">${maven.resources.includeEmptyDirs}</includeEmptyDirs>
  <outputDirectory default-value="${project.build.outputDirectory}"/>
  <overwrite default-value="false">${maven.resources.overwrite}</overwrite>
  <project default-value="${project}"/>
  <resources default-value="${project.resources}"/>
  <session default-value="${session}"/>
  <supportMultiLineFiltering default-value="false">${maven.resources.supportMultiLineFiltering}</supportMultiLineFiltering>
  <useBuildFilters default-value="true"/>
  <useDefaultDelimiters default-value="true"/>
</configuration>
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          org.apache.maven.plugins:maven-compiler-plugin:3.3:compile (default-compile)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <basedir default-value="${basedir}"/>
  <buildDirectory default-value="${project.build.directory}"/>
  <classpathElements default-value="${project.compileClasspathElements}"/>
  <compileSourceRoots default-value="${project.compileSourceRoots}"/>
  <compilerId default-value="javac">${maven.compiler.compilerId}</compilerId>
  <compilerReuseStrategy default-value="${reuseCreated}">${maven.compiler.compilerReuseStrategy}</compilerReuseStrategy>
  <compilerVersion>${maven.compiler.compilerVersion}</compilerVersion>
  <debug default-value="true">${maven.compiler.debug}</debug>
  <debuglevel>${maven.compiler.debuglevel}</debuglevel>
  <encoding default-value="${project.build.sourceEncoding}">${encoding}</encoding>
  <executable>${maven.compiler.executable}</executable>
  <failOnError default-value="true">${maven.compiler.failOnError}</failOnError>
  <forceJavacCompilerUse default-value="false">${maven.compiler.forceJavacCompilerUse}</forceJavacCompilerUse>
  <fork default-value="false">${maven.compiler.fork}</fork>
  <generatedSourcesDirectory default-value="${project.build.directory}/generated-sources/annotations"/>
  <maxmem>${maven.compiler.maxmem}</maxmem>
  <meminitial>${maven.compiler.meminitial}</meminitial>
  <mojoExecution default-value="${mojoExecution}"/>
  <optimize default-value="false">${maven.compiler.optimize}</optimize>
  <outputDirectory default-value="${project.build.outputDirectory}"/>
  <project default-value="${project}"/>
  <projectArtifact default-value="${project.artifact}"/>
  <session default-value="${session}"/>
  <showDeprecation default-value="false">${maven.compiler.showDeprecation}</showDeprecation>
  <showWarnings default-value="false">${maven.compiler.showWarnings}</showWarnings>
  <skipMain>${maven.main.skip}</skipMain>
  <skipMultiThreadWarning default-value="false">${maven.compiler.skipMultiThreadWarning}</skipMultiThreadWarning>
  <source default-value="1.5">1.8</source>
  <staleMillis default-value="0">${lastModGranularityMs}</staleMillis>
  <target default-value="1.5">1.8</target>
  <useIncrementalCompilation default-value="true">${maven.compiler.useIncrementalCompilation}</useIncrementalCompilation>
  <verbose default-value="false">${maven.compiler.verbose}</verbose>
</configuration>
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          net.alchim31.maven:scala-maven-plugin:3.2.1:compile (default)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <addJavacArgs>${addJavacArgs}</addJavacArgs>
  <addScalacArgs>${addScalacArgs}</addScalacArgs>
  <addZincArgs>${addZincArgs}</addZincArgs>
  <analysisCacheFile default-value="${project.build.directory}/analysis/compile">${analysisCacheFile}</analysisCacheFile>
  <args>
    <arg>-Xexperimental</arg>
  </args>
  <checkMultipleScalaVersions default-value="true">${maven.scala.checkConsistency}</checkMultipleScalaVersions>
  <compileOrder default-value="mixed">${compileOrder}</compileOrder>
  <displayCmd default-value="false">${displayCmd}</displayCmd>
  <encoding>${project.build.sourceEncoding}</encoding>
  <failOnMultipleScalaVersions default-value="false"/>
  <forceUseArgFile default-value="false"/>
  <fork default-value="true"/>
  <javacArgs>${javacArgs}</javacArgs>
  <javacGenerateDebugSymbols default-value="true">${javacGenerateDebugSymbols}</javacGenerateDebugSymbols>
  <localRepo>${localRepository}</localRepo>
  <localRepository>${localRepository}</localRepository>
  <notifyCompilation default-value="true">${notifyCompilation}</notifyCompilation>
  <outputDir>${project.build.outputDirectory}</outputDir>
  <pluginArtifacts default-value="${plugin.artifacts}"/>
  <project>${project}</project>
  <reactorProjects default-value="${reactorProjects}"/>
  <recompileMode default-value="all">${recompileMode}</recompileMode>
  <remoteRepos>${project.remoteArtifactRepositories}</remoteRepos>
  <scalaClassName default-value="scala.tools.nsc.Main">${maven.scala.className}</scalaClassName>
  <scalaCompatVersion>${scala.compat.version}</scalaCompatVersion>
  <scalaHome>${scala.home}</scalaHome>
  <scalaOrganization default-value="org.scala-lang">${scala.organization}</scalaOrganization>
  <scalaVersion>${scala.version}</scalaVersion>
  <sendJavaToScalac default-value="true"/>
  <session>${session}</session>
  <source>${maven.compiler.source}</source>
  <sourceDir default-value="${project.build.sourceDirectory}/../scala"/>
  <target>${maven.compiler.target}</target>
  <useCanonicalPath default-value="true">${maven.scala.useCanonicalPath}</useCanonicalPath>
  <useZincServer default-value="false">${useZincServer}</useZincServer>
  <zincPort default-value="3030">${zincPort}</zincPort>
</configuration>
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          org.codehaus.mojo:build-helper-maven-plugin:1.10:add-test-source (add-test-source)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <project default-value="${project}"/>
  <sources>
    <source>src/test/scala</source>
  </sources>
</configuration>
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          org.apache.maven.plugins:maven-resources-plugin:2.6:testResources (default-testResources)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <buildFilters default-value="${project.build.filters}"/>
  <encoding default-value="${project.build.sourceEncoding}">${encoding}</encoding>
  <escapeString>${maven.resources.escapeString}</escapeString>
  <escapeWindowsPaths default-value="true">${maven.resources.escapeWindowsPaths}</escapeWindowsPaths>
  <includeEmptyDirs default-value="false">${maven.resources.includeEmptyDirs}</includeEmptyDirs>
  <outputDirectory default-value="${project.build.testOutputDirectory}"/>
  <overwrite default-value="false">${maven.resources.overwrite}</overwrite>
  <project default-value="${project}"/>
  <resources default-value="${project.testResources}"/>
  <session default-value="${session}"/>
  <skip>${maven.test.skip}</skip>
  <supportMultiLineFiltering default-value="false">${maven.resources.supportMultiLineFiltering}</supportMultiLineFiltering>
  <useBuildFilters default-value="true"/>
  <useDefaultDelimiters default-value="true"/>
</configuration>
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          org.apache.maven.plugins:maven-compiler-plugin:3.3:testCompile (default-testCompile)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <basedir default-value="${basedir}"/>
  <buildDirectory default-value="${project.build.directory}"/>
  <classpathElements default-value="${project.testClasspathElements}"/>
  <compileSourceRoots default-value="${project.testCompileSourceRoots}"/>
  <compilerId default-value="javac">${maven.compiler.compilerId}</compilerId>
  <compilerReuseStrategy default-value="${reuseCreated}">${maven.compiler.compilerReuseStrategy}</compilerReuseStrategy>
  <compilerVersion>${maven.compiler.compilerVersion}</compilerVersion>
  <debug default-value="true">${maven.compiler.debug}</debug>
  <debuglevel>${maven.compiler.debuglevel}</debuglevel>
  <encoding default-value="${project.build.sourceEncoding}">${encoding}</encoding>
  <executable>${maven.compiler.executable}</executable>
  <failOnError default-value="true">${maven.compiler.failOnError}</failOnError>
  <forceJavacCompilerUse default-value="false">${maven.compiler.forceJavacCompilerUse}</forceJavacCompilerUse>
  <fork default-value="false">${maven.compiler.fork}</fork>
  <generatedTestSourcesDirectory default-value="${project.build.directory}/generated-test-sources/test-annotations"/>
  <maxmem>${maven.compiler.maxmem}</maxmem>
  <meminitial>${maven.compiler.meminitial}</meminitial>
  <mojoExecution default-value="${mojoExecution}"/>
  <optimize default-value="false">${maven.compiler.optimize}</optimize>
  <outputDirectory default-value="${project.build.testOutputDirectory}"/>
  <project default-value="${project}"/>
  <session default-value="${session}"/>
  <showDeprecation default-value="false">${maven.compiler.showDeprecation}</showDeprecation>
  <showWarnings default-value="false">${maven.compiler.showWarnings}</showWarnings>
  <skip>${maven.test.skip}</skip>
  <skipMultiThreadWarning default-value="false">${maven.compiler.skipMultiThreadWarning}</skipMultiThreadWarning>
  <source default-value="1.5">1.8</source>
  <staleMillis default-value="0">${lastModGranularityMs}</staleMillis>
  <target default-value="1.5">1.8</target>
  <testSource>${maven.compiler.testSource}</testSource>
  <testTarget>${maven.compiler.testTarget}</testTarget>
  <useIncrementalCompilation default-value="true">${maven.compiler.useIncrementalCompilation}</useIncrementalCompilation>
  <verbose default-value="false">${maven.compiler.verbose}</verbose>
</configuration>
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          net.alchim31.maven:scala-maven-plugin:3.2.1:testCompile (default)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <addJavacArgs>${addJavacArgs}</addJavacArgs>
  <addScalacArgs>${addScalacArgs}</addScalacArgs>
  <addZincArgs>${addZincArgs}</addZincArgs>
  <args>
    <arg>-Xexperimental</arg>
  </args>
  <checkMultipleScalaVersions default-value="true">${maven.scala.checkConsistency}</checkMultipleScalaVersions>
  <compileOrder default-value="mixed">${compileOrder}</compileOrder>
  <displayCmd default-value="false">${displayCmd}</displayCmd>
  <encoding>${project.build.sourceEncoding}</encoding>
  <failOnMultipleScalaVersions default-value="false"/>
  <forceUseArgFile default-value="false"/>
  <fork default-value="true"/>
  <javacArgs>${javacArgs}</javacArgs>
  <javacGenerateDebugSymbols default-value="true">${javacGenerateDebugSymbols}</javacGenerateDebugSymbols>
  <localRepo>${localRepository}</localRepo>
  <localRepository>${localRepository}</localRepository>
  <notifyCompilation default-value="true">${notifyCompilation}</notifyCompilation>
  <pluginArtifacts default-value="${plugin.artifacts}"/>
  <project>${project}</project>
  <reactorProjects default-value="${reactorProjects}"/>
  <recompileMode default-value="all">${recompileMode}</recompileMode>
  <remoteRepos>${project.remoteArtifactRepositories}</remoteRepos>
  <scalaClassName default-value="scala.tools.nsc.Main">${maven.scala.className}</scalaClassName>
  <scalaCompatVersion>${scala.compat.version}</scalaCompatVersion>
  <scalaHome>${scala.home}</scalaHome>
  <scalaOrganization default-value="org.scala-lang">${scala.organization}</scalaOrganization>
  <scalaVersion>${scala.version}</scalaVersion>
  <sendJavaToScalac default-value="true"/>
  <session>${session}</session>
  <skip>${maven.test.skip}</skip>
  <source>${maven.compiler.source}</source>
  <target>${maven.compiler.target}</target>
  <testAnalysisCacheFile default-value="${project.build.directory}/analysis/test-compile">${testAnalysisCacheFile}</testAnalysisCacheFile>
  <testOutputDir default-value="${project.build.testOutputDirectory}"/>
  <testSourceDir default-value="${project.build.testSourceDirectory}/../scala"/>
  <useCanonicalPath default-value="true">${maven.scala.useCanonicalPath}</useCanonicalPath>
  <useZincServer default-value="false">${useZincServer}</useZincServer>
  <zincPort default-value="3030">${zincPort}</zincPort>
</configuration>
[DEBUG] -----------------------------------------------------------------------
[DEBUG] Goal:          org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test)
[DEBUG] Style:         Regular
[DEBUG] Configuration: <?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <additionalClasspathElements>${maven.test.additionalClasspath}</additionalClasspathElements>
  <argLine>${argLine}</argLine>
  <basedir default-value="${basedir}"/>
  <childDelegation default-value="false">${childDelegation}</childDelegation>
  <classesDirectory default-value="${project.build.outputDirectory}"/>
  <classpathDependencyExcludes>${maven.test.dependency.excludes}</classpathDependencyExcludes>
  <debugForkedProcess>${maven.surefire.debug}</debugForkedProcess>
  <dependenciesToScan>${dependenciesToScan}</dependenciesToScan>
  <disableXmlReport default-value="false">${disableXmlReport}</disableXmlReport>
  <enableAssertions default-value="true">${enableAssertions}</enableAssertions>
  <excludedGroups>${excludedGroups}</excludedGroups>
  <failIfNoSpecifiedTests>${surefire.failIfNoSpecifiedTests}</failIfNoSpecifiedTests>
  <failIfNoTests>${failIfNoTests}</failIfNoTests>
  <forkCount default-value="1">${forkCount}</forkCount>
  <forkMode default-value="once">${forkMode}</forkMode>
  <forkedProcessTimeoutInSeconds>${surefire.timeout}</forkedProcessTimeoutInSeconds>
  <groups>${groups}</groups>
  <junitArtifactName default-value="junit:junit">${junitArtifactName}</junitArtifactName>
  <jvm>${jvm}</jvm>
  <localRepository default-value="${localRepository}"/>
  <objectFactory>${objectFactory}</objectFactory>
  <parallel>${parallel}</parallel>
  <parallelMavenExecution default-value="${session.parallel}"/>
  <parallelOptimized default-value="true">${parallelOptimized}</parallelOptimized>
  <parallelTestsTimeoutForcedInSeconds>${surefire.parallel.forcedTimeout}</parallelTestsTimeoutForcedInSeconds>
  <parallelTestsTimeoutInSeconds>${surefire.parallel.timeout}</parallelTestsTimeoutInSeconds>
  <perCoreThreadCount default-value="true">${perCoreThreadCount}</perCoreThreadCount>
  <pluginArtifactMap>${plugin.artifactMap}</pluginArtifactMap>
  <pluginDescriptor default-value="${plugin}"/>
  <printSummary default-value="true">${surefire.printSummary}</printSummary>
  <projectArtifactMap>${project.artifactMap}</projectArtifactMap>
  <redirectTestOutputToFile default-value="false">${maven.test.redirectTestOutputToFile}</redirectTestOutputToFile>
  <remoteRepositories default-value="${project.pluginArtifactRepositories}"/>
  <reportFormat default-value="brief">${surefire.reportFormat}</reportFormat>
  <reportNameSuffix default-value="">${surefire.reportNameSuffix}</reportNameSuffix>
  <reportsDirectory default-value="${project.build.directory}/surefire-reports"/>
  <reuseForks default-value="true">${reuseForks}</reuseForks>
  <runOrder default-value="filesystem"/>
  <skip default-value="false">${maven.test.skip}</skip>
  <skipExec>${maven.test.skip.exec}</skipExec>
  <skipTests default-value="false">${skipTests}</skipTests>
  <test>${test}</test>
  <testClassesDirectory default-value="${project.build.testOutputDirectory}"/>
  <testFailureIgnore default-value="false">${maven.test.failure.ignore}</testFailureIgnore>
  <testNGArtifactName default-value="org.testng:testng">${testNGArtifactName}</testNGArtifactName>
  <testSourceDirectory default-value="${project.build.testSourceDirectory}"/>
  <threadCount>${threadCount}</threadCount>
  <threadCountClasses default-value="0">${threadCountClasses}</threadCountClasses>
  <threadCountMethods default-value="0">${threadCountMethods}</threadCountMethods>
  <threadCountSuites default-value="0">${threadCountSuites}</threadCountSuites>
  <trimStackTrace default-value="true">${trimStackTrace}</trimStackTrace>
  <useFile default-value="true">${surefire.useFile}</useFile>
  <useManifestOnlyJar default-value="true">${surefire.useManifestOnlyJar}</useManifestOnlyJar>
  <useSystemClassLoader default-value="true">${surefire.useSystemClassLoader}</useSystemClassLoader>
  <useUnlimitedThreads default-value="false">${useUnlimitedThreads}</useUnlimitedThreads>
  <workingDirectory>${basedir}</workingDirectory>
  <project default-value="${project}"/>
  <session default-value="${session}"/>
</configuration>
[DEBUG] =======================================================================
[DEBUG] Could not find metadata org.apache.kafka:kafka-clients:0.10.1.0-SNAPSHOT/maven-metadata.xml in local (/home/john/.m2/repository)
[DEBUG] Failure to find org.apache.kafka:kafka-clients:0.10.1.0-SNAPSHOT/maven-metadata.xml in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
[DEBUG] Could not find metadata org.apache.kafka:kafka-clients:0.10.1.0-SNAPSHOT/maven-metadata.xml in local (/home/john/.m2/repository)
[DEBUG] Failure to find org.apache.kafka:kafka-clients:0.10.1.0-SNAPSHOT/maven-metadata.xml in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
[WARNING] The POM for org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT is missing, no dependency information available
[DEBUG] Could not find metadata org.apache.kafka:kafka-streams:0.10.1.0-SNAPSHOT/maven-metadata.xml in local (/home/john/.m2/repository)
[DEBUG] Failure to find org.apache.kafka:kafka-streams:0.10.1.0-SNAPSHOT/maven-metadata.xml in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
[DEBUG] Could not find metadata org.apache.kafka:kafka-streams:0.10.1.0-SNAPSHOT/maven-metadata.xml in local (/home/john/.m2/repository)
[DEBUG] Failure to find org.apache.kafka:kafka-streams:0.10.1.0-SNAPSHOT/maven-metadata.xml in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
[WARNING] The POM for org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT is missing, no dependency information available
[DEBUG] Could not find metadata org.apache.kafka:kafka_2.11:0.10.1.0-SNAPSHOT/maven-metadata.xml in local (/home/john/.m2/repository)
[DEBUG] Failure to find org.apache.kafka:kafka_2.11:0.10.1.0-SNAPSHOT/maven-metadata.xml in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
[DEBUG] Could not find metadata org.apache.kafka:kafka_2.11:0.10.1.0-SNAPSHOT/maven-metadata.xml in local (/home/john/.m2/repository)
[DEBUG] Failure to find org.apache.kafka:kafka_2.11:0.10.1.0-SNAPSHOT/maven-metadata.xml in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
[WARNING] The POM for org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT is missing, no dependency information available
[DEBUG] Dependency collection stats: {ConflictMarker.analyzeTime=3, ConflictMarker.markTime=1, ConflictMarker.nodeCount=307, ConflictIdSorter.graphTime=1, ConflictIdSorter.topsortTime=1, ConflictIdSorter.conflictIdCount=107, ConflictIdSorter.conflictIdCycleCount=0, ConflictResolver.totalTime=15, ConflictResolver.conflictItemCount=224, DefaultDependencyCollector.collectTime=588, DefaultDependencyCollector.transformTime=23}
[DEBUG] io.confluent:streams-examples:jar:3.1.0-SNAPSHOT
[DEBUG]    io.confluent:kafka-avro-serializer:jar:3.0.0:compile
[DEBUG]       io.confluent:common-config:jar:3.0.0:compile
[DEBUG]    io.confluent:kafka-schema-registry-client:jar:3.0.0:compile
[DEBUG]       com.fasterxml.jackson.core:jackson-databind:jar:2.5.4:compile
[DEBUG]          com.fasterxml.jackson.core:jackson-annotations:jar:2.5.0:compile
[DEBUG]          com.fasterxml.jackson.core:jackson-core:jar:2.5.4:compile
[DEBUG]       org.slf4j:slf4j-log4j12:jar:1.7.6:compile
[DEBUG]    org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT:compile
[DEBUG]    org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT:compile
[DEBUG]    org.apache.avro:avro:jar:1.7.7:compile
[DEBUG]       org.codehaus.jackson:jackson-core-asl:jar:1.9.13:compile
[DEBUG]       org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:compile
[DEBUG]       com.thoughtworks.paranamer:paranamer:jar:2.3:compile
[DEBUG]       org.xerial.snappy:snappy-java:jar:1.0.5:compile
[DEBUG]       org.apache.commons:commons-compress:jar:1.4.1:compile
[DEBUG]          org.tukaani:xz:jar:1.0:compile
[DEBUG]       org.slf4j:slf4j-api:jar:1.6.4:compile
[DEBUG]    org.apache.avro:avro-maven-plugin:jar:1.7.7:compile
[DEBUG]       org.apache.maven:maven-plugin-api:jar:2.0.10:compile
[DEBUG]       org.apache.maven:maven-project:jar:2.0.10:compile
[DEBUG]          org.apache.maven:maven-settings:jar:2.0.10:compile
[DEBUG]          org.apache.maven:maven-profile:jar:2.0.10:compile
[DEBUG]          org.apache.maven:maven-model:jar:2.0.10:compile
[DEBUG]          org.apache.maven:maven-artifact-manager:jar:2.0.10:compile
[DEBUG]             org.apache.maven:maven-repository-metadata:jar:2.0.10:compile
[DEBUG]             org.apache.maven.wagon:wagon-provider-api:jar:1.0-beta-2:compile
[DEBUG]          org.apache.maven:maven-plugin-registry:jar:2.0.10:compile
[DEBUG]          org.codehaus.plexus:plexus-interpolation:jar:1.1:compile
[DEBUG]          org.codehaus.plexus:plexus-utils:jar:1.5.5:compile
[DEBUG]          org.apache.maven:maven-artifact:jar:2.0.10:compile
[DEBUG]          org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile
[DEBUG]             classworlds:classworlds:jar:1.1-alpha-2:compile
[DEBUG]       org.apache.maven.shared:file-management:jar:1.2.1:compile
[DEBUG]          org.apache.maven.shared:maven-shared-io:jar:1.1:compile
[DEBUG]       org.apache.avro:avro-compiler:jar:1.7.7:compile
[DEBUG]          commons-lang:commons-lang:jar:2.6:compile
[DEBUG]          org.apache.velocity:velocity:jar:1.7:compile
[DEBUG]             commons-collections:commons-collections:jar:3.2.1:compile
[DEBUG]    org.scala-lang:scala-library:jar:2.11.8:compile
[DEBUG]    com.101tec:zkclient:jar:0.7:compile
[DEBUG]       log4j:log4j:jar:1.2.15:compile
[DEBUG]          javax.mail:mail:jar:1.4:compile
[DEBUG]             javax.activation:activation:jar:1.1:compile
[DEBUG]       org.apache.zookeeper:zookeeper:jar:3.4.6:compile
[DEBUG]          jline:jline:jar:0.9.94:compile
[DEBUG]          io.netty:netty:jar:3.7.0.Final:compile
[DEBUG]    junit:junit:jar:4.12:test
[DEBUG]       org.hamcrest:hamcrest-core:jar:1.3:test
[DEBUG]    org.assertj:assertj-core:jar:3.3.0:test
[DEBUG]    org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT:test
[DEBUG]    org.apache.curator:curator-test:jar:2.9.0:test
[DEBUG]       org.javassist:javassist:jar:3.18.1-GA:test
[DEBUG]       org.apache.commons:commons-math:jar:2.2:test
[DEBUG]       com.google.guava:guava:jar:16.0.1:test
[DEBUG]    io.confluent:kafka-schema-registry:jar:3.0.0:test
[DEBUG]       org.apache.kafka:kafka_2.11:jar:0.10.0.0-cp1:test
[DEBUG]          com.yammer.metrics:metrics-core:jar:2.2.0:test
[DEBUG]          org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.4:test
[DEBUG]          net.sf.jopt-simple:jopt-simple:jar:4.9:test
[DEBUG]       io.confluent:common-utils:jar:3.0.0:compile
[DEBUG]       org.glassfish.jersey.ext:jersey-bean-validation:jar:2.19:test
[DEBUG]          org.glassfish.hk2.external:javax.inject:jar:2.4.0-b25:test
[DEBUG]          org.glassfish.jersey.core:jersey-common:jar:2.19:test
[DEBUG]             javax.annotation:javax.annotation-api:jar:1.2:test
[DEBUG]             org.glassfish.jersey.bundles.repackaged:jersey-guava:jar:2.19:test
[DEBUG]             org.glassfish.hk2:hk2-api:jar:2.4.0-b25:test
[DEBUG]                org.glassfish.hk2:hk2-utils:jar:2.4.0-b25:test
[DEBUG]                org.glassfish.hk2.external:aopalliance-repackaged:jar:2.4.0-b25:test
[DEBUG]             org.glassfish.hk2:hk2-locator:jar:2.4.0-b25:test
[DEBUG]             org.glassfish.hk2:osgi-resource-locator:jar:1.0.1:test
[DEBUG]          org.glassfish.jersey.core:jersey-server:jar:2.19:test
[DEBUG]             org.glassfish.jersey.core:jersey-client:jar:2.19:test
[DEBUG]             org.glassfish.jersey.media:jersey-media-jaxb:jar:2.19:test
[DEBUG]          javax.validation:validation-api:jar:1.1.0.Final:test
[DEBUG]          org.hibernate:hibernate-validator:jar:5.1.2.Final:test
[DEBUG]             org.jboss.logging:jboss-logging:jar:3.1.3.GA:test
[DEBUG]             com.fasterxml:classmate:jar:1.0.0:test
[DEBUG]          javax.el:javax.el-api:jar:2.2.4:test
[DEBUG]          org.glassfish.web:javax.el:jar:2.2.4:test
[DEBUG]          javax.ws.rs:javax.ws.rs-api:jar:2.0.1:test
[DEBUG]       io.confluent:rest-utils:jar:3.0.0:test
[DEBUG]          io.confluent:common-metrics:jar:3.0.0:test
[DEBUG]          org.glassfish.jersey.containers:jersey-container-servlet:jar:2.19:test
[DEBUG]             org.glassfish.jersey.containers:jersey-container-servlet-core:jar:2.19:test
[DEBUG]          org.eclipse.jetty:jetty-server:jar:9.2.12.v20150709:test
[DEBUG]             javax.servlet:javax.servlet-api:jar:3.1.0:test
[DEBUG]             org.eclipse.jetty:jetty-http:jar:9.2.12.v20150709:test
[DEBUG]             org.eclipse.jetty:jetty-io:jar:9.2.12.v20150709:test
[DEBUG]          org.eclipse.jetty:jetty-servlet:jar:9.2.12.v20150709:test
[DEBUG]             org.eclipse.jetty:jetty-security:jar:9.2.12.v20150709:test
[DEBUG]          org.eclipse.jetty:jetty-servlets:jar:9.2.12.v20150709:test
[DEBUG]             org.eclipse.jetty:jetty-continuation:jar:9.2.12.v20150709:test
[DEBUG]             org.eclipse.jetty:jetty-util:jar:9.2.12.v20150709:test
[DEBUG]          com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:jar:2.5.4:test
[DEBUG]             com.fasterxml.jackson.module:jackson-module-jaxb-annotations:jar:2.5.4:test
[DEBUG]          com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:jar:2.5.4:test
[DEBUG]    io.confluent:kafka-schema-registry:jar:tests:3.0.0:test
[DEBUG]    org.scalactic:scalactic_2.11:jar:2.2.6:compile
[DEBUG]       org.scala-lang:scala-reflect:jar:2.11.7:compile
[DEBUG]    org.scalatest:scalatest_2.11:jar:2.2.6:test
[DEBUG]       org.scala-lang.modules:scala-xml_2.11:jar:1.0.2:test
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.094 s
[INFO] Finished at: 2016-07-12T14:45:13-04:00
[INFO] Final Memory: 11M/122M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project streams-examples: Could not resolve dependencies for project io.confluent:streams-examples:jar:3.1.0-SNAPSHOT: The following artifacts could not be resolved: org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT: Failure to find org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal on project streams-examples: Could not resolve dependencies for project io.confluent:streams-examples:jar:3.1.0-SNAPSHOT: The following artifacts could not be resolved: org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT: Failure to find org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
        at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.getDependencies(LifecycleDependencyResolver.java:221)
        at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.resolveProjectDependencies(LifecycleDependencyResolver.java:127)
        at org.apache.maven.lifecycle.internal.MojoExecutor.ensureDependenciesAreResolved(MojoExecutor.java:245)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:199)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
        at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
        at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
        at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
        at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
        at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.project.DependencyResolutionException: Could not resolve dependencies for project io.confluent:streams-examples:jar:3.1.0-SNAPSHOT: The following artifacts could not be resolved: org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT: Failure to find org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
        at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:211)
        at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.getDependencies(LifecycleDependencyResolver.java:195)
        ... 23 more
Caused by: org.eclipse.aether.resolution.DependencyResolutionException: The following artifacts could not be resolved: org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT: Failure to find org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
        at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:384)
        at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:205)
        ... 24 more
Caused by: org.eclipse.aether.resolution.ArtifactResolutionException: The following artifacts could not be resolved: org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka-streams:jar:0.10.1.0-SNAPSHOT, org.apache.kafka:kafka_2.11:jar:test:0.10.1.0-SNAPSHOT: Failure to find org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
        at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:444)
        at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:246)
        at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:367)
        ... 25 more
Caused by: org.eclipse.aether.transfer.ArtifactNotFoundException: Failure to find org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT in http://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced
        at org.eclipse.aether.internal.impl.DefaultUpdateCheckManager.newException(DefaultUpdateCheckManager.java:231)
        at org.eclipse.aether.internal.impl.DefaultUpdateCheckManager.checkArtifact(DefaultUpdateCheckManager.java:206)
        at org.eclipse.aether.internal.impl.DefaultArtifactResolver.gatherDownloads(DefaultArtifactResolver.java:585)
        at org.eclipse.aether.internal.impl.DefaultArtifactResolver.performDownloads(DefaultArtifactResolver.java:503)
        at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:421)
        ... 27 more
[ERROR] 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
john@john-VirtualBox:~/IdeaProjects/streams-examples$ 
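
The root cause is in the final [ERROR] block above: Maven looked for org.apache.kafka:kafka-clients:jar:0.10.1.0-SNAPSHOT (plus the matching kafka-streams and kafka_2.11 artifacts), failed to find them in any configured repository, and cached that negative result in the local repository, so it will not even retry until the confluent repository's update interval elapses. No public repository serves these SNAPSHOT artifacts, so the usual way out is to force an update check and/or build and install the snapshots locally. A minimal sketch, assuming a Kafka source checkout at the matching version; the installAll task name is an assumption based on Kafka's Gradle build of that era:

```shell
# Force Maven to re-check remote repositories instead of trusting the
# cached "artifact not found" result:
mvn -U clean test

# Build the missing 0.10.1.0-SNAPSHOT Kafka artifacts from source and
# install them into ~/.m2/repository (installAll is an assumed task name):
git clone https://github.com/apache/kafka.git
cd kafka
./gradlew installAll
```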

User creates a Storm topology that uses AvroScheme in Kafka spout *** FAILED ***

I get the following error when I run the tests after setup:

[info]   Scenario: User creates a Storm topology that uses AvroScheme in Kafka spout *** FAILED ***
[info]   SynchronizedQueue() was not equal to List({"username": "ANY_USER_1", "text": "ANY_TEXT_1", "timestamp": 1458751980}, {"username": "ANY_USER_2", "text": "ANY_TEXT_2", "timestamp": 0}, {"username": "ANY_USER_3", "text": "ANY_TEXT_3", "timestamp": 1234}) (KafkaStormSpec.scala:259)
[info]     Given a ZooKeeper instance
[info]     And a Kafka broker instance
[info]     And a Storm topology that uses AvroScheme and that reads tweets from topic testing-input and writes them as-is to topic testing-output
[info]     And some tweets
[info]     And a synchronous Kafka producer app that writes to the topic testing-input
[info]     And a single-threaded Kafka consumer app that reads from topic testing-output and Avro-decodes the incoming data
[info]     And a Storm topology configuration that registers an Avro Kryo decorator for Tweet
[info]     When I run the Storm topology
[info]     And I Avro-encode the tweets and use the Kafka producer app to sent them to Kafka
[info]     Synchronously sending Tweet {"username": "ANY_USER_1", "text": "ANY_TEXT_1", "timestamp": 1458751980} to topic Some(testing-input)
[info]     Synchronously sending Tweet {"username": "ANY_USER_2", "text": "ANY_TEXT_2", "timestamp": 0} to topic Some(testing-input)
[info]     Synchronously sending Tweet {"username": "ANY_USER_3", "text": "ANY_TEXT_3", "timestamp": 1234} to topic Some(testing-input)
[info]     Then the Kafka consumer app should receive the original tweets from the Storm topology
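
The assertion failure SynchronizedQueue() was not equal to List(...) means the Kafka consumer app received nothing from the testing-output topic before the spec asserted: the three tweets were demonstrably sent (see the "Synchronously sending Tweet" lines), but nothing came back through the Storm topology in time. On slower machines this is often a startup/timing problem rather than a logic bug, so a cheap first check is to re-run just the failing spec in isolation:

```shell
# Re-run only the failing spec; sbt 0.13's test-only task accepts glob
# patterns, so the full package name can be omitted:
./sbt "test-only *KafkaStormSpec"
```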

KafkaStormSpec fails on AvroScheme[T] and AvroDecoderBolt[T] features

On various Debian/Ubuntu boxes:

  • Java version "1.6.0_45" (Oracle JDK 6)
  • Scala 2.10.4
  • sbt 0.13.2

Sample output:

[info] Feature: AvroScheme[T] for Kafka spout
[2014-07-11 10:25:29,454] ERROR Closing socket for /127.0.0.1 because of error (kafka.network.Processor)
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:21)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:198)
    at sun.nio.ch.IOUtil.read(IOUtil.java:171)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:245)
    at kafka.utils.Utils$.read(Utils.scala:375)
    at kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
    at kafka.network.Processor.read(SocketServer.scala:347)
    at kafka.network.Processor.run(SocketServer.scala:245)
    at java.lang.Thread.run(Thread.java:662)
[info]   Scenario: User creates a Storm topology that uses AvroScheme in Kafka spout *** FAILED ***
[info]   SynchronizedQueue() was not equal to List({"username": "ANY_USER_1", "text": "ANY_TEXT_1", "timestamp": 1405088708}, {"username": "ANY_USER_2", "text": "ANY_TEXT_2", "timestamp": 0}, {"username": "ANY_USER_3", "text": "ANY_TEXT_3", "timestamp": 1234}) (KafkaStormSpec.scala:329)
[info]     Given a ZooKeeper instance 
[info]     And a Kafka broker instance 
[info]     And a Storm topology that uses AvroScheme and that reads tweets from topic testing-input and writes them as-is to topic testing-output 
[info]     And some tweets 
[info]     And a synchronous Kafka producer app that writes to the topic testing-input 
[info]     And a single-threaded Kafka consumer app that reads from topic testing-output and Avro-decodes the incoming data 
[info]     And a Storm topology configuration that registers an Avro Kryo decorator for Tweet 
[info]     When I run the Storm topology 
[info]     And I Avro-encode the tweets and use the Kafka producer app to sent them to Kafka 
[info]     Then the Kafka consumer app should receive the original tweets from the Storm topology 
[2014-07-11 10:25:55,971] ERROR Connection timed out for connection string (127.0.0.1:52204) and timeout (15000) / elapsed (15022) (org.apache.curator.ConnectionState)
org.apache.curator.CuratorConnectionLossException: KeeperErrorCode = ConnectionLoss
    at org.apache.curator.ConnectionState.checkTimeouts(ConnectionState.java:198)
    at org.apache.curator.ConnectionState.getZooKeeper(ConnectionState.java:88)
    at org.apache.curator.CuratorZookeeperClient.getZooKeeper(CuratorZookeeperClient.java:113)
    at org.apache.curator.framework.imps.CuratorFrameworkImpl.performBackgroundOperation(CuratorFrameworkImpl.java:763)
    at org.apache.curator.framework.imps.CuratorFrameworkImpl.backgroundOperationsLoop(CuratorFrameworkImpl.java:749)
    at org.apache.curator.framework.imps.CuratorFrameworkImpl.access$300(CuratorFrameworkImpl.java:56)
    at org.apache.curator.framework.imps.CuratorFrameworkImpl$3.call(CuratorFrameworkImpl.java:244)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:662)
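
The scenario failure and the Curator connection-loss error above point the same way: the embedded ZooKeeper/Kafka/Storm services never become fully reachable within the 15-second timeout, so the consumer side of the spec ends up with an empty SynchronizedQueue. On a long-EOL'd Oracle JDK 6, an environment problem is at least as plausible as a code problem, so verifying the toolchain is a cheap first step. A minimal sketch, assuming the bundled ./sbt launcher:

```shell
# Confirm the JVM and build-tool versions actually in use; a JDK 7+
# runtime is a safer baseline for this stack than Oracle JDK 6:
java -version
./sbt sbtVersion
```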

Failed to run "sbt test"

After running "sbt test" on my computer, the error output posted below appears. This is strange, because I have previously run this code successfully on my personal computer. I suspect the issue is related to the code generated from the Avro schema, but that is only an assumption.

I would really appreciate some help with this issue. Thanks in advance.

P.S.: Here is the error output:

[info] Loading project definition from /home/franco/sandbox/kafka-storm-starter/project
[info] Set current project to kafka-storm-starter (in build file:/home/franco/sandbox/kafka-storm-starter/)
[info] Compiling 12 Scala sources to /home/franco/sandbox/kafka-storm-starter/target/scala-2.10/test-classes...
[error] 
[error]      while compiling: /home/franco/sandbox/kafka-storm-starter/src/test/scala/com/miguno/kafkastorm/testing/EmbeddedKafkaZooKeeperCluster.scala
[error]         during phase: jvm
[error]      library version: version 2.10.5
[error]     compiler version: version 2.10.5
[error]   reconstructed args: -classpath /home/franco/sandbox/kafka-storm-starter/target/scala-2.10/test-classes:/home/franco/sandbox/kafka-storm-starter/target/scala-2.10/classes:[...long list of ~/.ivy2 cache entries omitted, including avro-1.7.7, bijection-0.7.1, chill-0.5.1, kafka_2.10-0.8.2.2, storm-core-0.9.6, storm-kafka-0.9.6, spark-core_2.10-1.1.1, spark-streaming_2.10-1.1.1, curator, zookeeper-3.4.5, scalatest_2.10-2.2.4, and their transitive dependencies...] -Ywarn-nullary-unit -deprecation -feature -Xlint -Ywarn-inaccessible -Ywarn-nullary-override -bootclasspath /usr/lib/jvm/java-8-oracle/jre/lib/resources.jar:/usr/lib/jvm/java-8-oracle/jre/lib/rt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jsse.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jce.jar:/usr/lib/jvm/java-8-oracle/jre/lib/charsets.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jfr.jar:/usr/lib/jvm/java-8-oracle/jre/classes:/home/franco/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.10.5.jar -unchecked -Ywarn-adapted-args -target:jvm-1.7
[error] 
[error]   last tree to typer: Literal(Constant(com.miguno.avro.Tweet))
[error]               symbol: null
[error]    symbol definition: null
[error]                  tpe: Class(classOf[com.miguno.avro.Tweet])
[error]        symbol owners: 
[error]       context owners: anonymous class anonfun$createAndStartConsumer$1 -> package testing
[error] 
[error] == Enclosing template or block ==
[error] 
[error] Template( // val <local $anonfun>: <notype>, tree.tpe=com.miguno.kafkastorm.testing.anonfun$createAndStartConsumer$1
[error]   "scala.runtime.AbstractFunction3", "scala.Serializable" // parents
[error]   ValDef(
[error]     private
[error]     "_"
[error]     <tpt>
[error]     <empty>
[error]   )
[error]   // 4 statements
[error]   DefDef( // final def apply(m: kafka.message.MessageAndMetadata,c: com.miguno.kafkastorm.kafka.ConsumerTaskContext,n: Option): Unit
[error]     <method> final <triedcooking>
[error]     "apply"
[error]     []
[error]     // 1 parameter list
[error]     ValDef( // m: kafka.message.MessageAndMetadata
[error]       <param> <triedcooking>
[error]       "m"
[error]       <tpt> // tree.tpe=kafka.message.MessageAndMetadata
[error]       <empty>
[error]     )
[error]     ValDef( // c: com.miguno.kafkastorm.kafka.ConsumerTaskContext
[error]       <param> <triedcooking>
[error]       "c"
[error]       <tpt> // tree.tpe=com.miguno.kafkastorm.kafka.ConsumerTaskContext
[error]       <empty>
[error]     )
[error]     ValDef( // n: Option
[error]       <param>
[error]       "n"
[error]       <tpt> // tree.tpe=Option
[error]       <empty>
[error]     )
[error]     <tpt> // tree.tpe=Unit
[error]     Block( // tree.tpe=Unit
[error]       Apply( // def apply(v1: Object,v2: Object): Object in trait Function2, tree.tpe=Object
[error]         EmbeddedKafkaZooKeeperCluster$$anonfun$createAndStartConsumer$1.this."consume$1"."apply" // def apply(v1: Object,v2: Object): Object in trait Function2, tree.tpe=(v1: Object, v2: Object)Object
[error]         // 2 arguments
[error]         "m" // m: kafka.message.MessageAndMetadata, tree.tpe=kafka.message.MessageAndMetadata
[error]         "c" // c: com.miguno.kafkastorm.kafka.ConsumerTaskContext, tree.tpe=com.miguno.kafkastorm.kafka.ConsumerTaskContext
[error]       )
[error]       ()
[error]     )
[error]   )
[error]   DefDef( // final def apply(v1: Object,v2: Object,v3: Object): Object
[error]     <method> final <bridge>
[error]     "apply"
[error]     []
[error]     // 1 parameter list
[error]     ValDef( // v1: Object
[error]       <param> <triedcooking>
[error]       "v1"
[error]       <tpt> // tree.tpe=Object
[error]       <empty>
[error]     )
[error]     ValDef( // v2: Object
[error]       <param> <triedcooking>
[error]       "v2"
[error]       <tpt> // tree.tpe=Object
[error]       <empty>
[error]     )
[error]     ValDef( // v3: Object
[error]       <param> <triedcooking>
[error]       "v3"
[error]       <tpt> // tree.tpe=Object
[error]       <empty>
[error]     )
[error]     <tpt> // tree.tpe=Object
[error]     Block( // tree.tpe=runtime.BoxedUnit
[error]       Apply( // final def apply(m: kafka.message.MessageAndMetadata,c: com.miguno.kafkastorm.kafka.ConsumerTaskContext,n: Option): Unit, tree.tpe=Unit
[error]         EmbeddedKafkaZooKeeperCluster$$anonfun$createAndStartConsumer$1.this."apply" // final def apply(m: kafka.message.MessageAndMetadata,c: com.miguno.kafkastorm.kafka.ConsumerTaskContext,n: Option): Unit, tree.tpe=(m: kafka.message.MessageAndMetadata, c: com.miguno.kafkastorm.kafka.ConsumerTaskContext, n: Option)Unit
[error]         // 3 arguments
[error]         Apply( // final def $asInstanceOf[T0 >: ? <: ?](): T0 in class Object, tree.tpe=kafka.message.MessageAndMetadata
[error]           TypeApply( // final def $asInstanceOf[T0 >: ? <: ?](): T0 in class Object, tree.tpe=()kafka.message.MessageAndMetadata
[error]             "v1"."$asInstanceOf" // final def $asInstanceOf[T0 >: ? <: ?](): T0 in class Object, tree.tpe=[T0 >: ? <: ?]()T0
[error]             <tpt> // tree.tpe=kafka.message.MessageAndMetadata
[error]           )
[error]           Nil
[error]         )
[error]         Apply( // final def $asInstanceOf[T0 >: ? <: ?](): T0 in class Object, tree.tpe=com.miguno.kafkastorm.kafka.ConsumerTaskContext
[error]           TypeApply( // final def $asInstanceOf[T0 >: ? <: ?](): T0 in class Object, tree.tpe=()com.miguno.kafkastorm.kafka.ConsumerTaskContext
[error]             "v2"."$asInstanceOf" // final def $asInstanceOf[T0 >: ? <: ?](): T0 in class Object, tree.tpe=[T0 >: ? <: ?]()T0
[error]             <tpt> // tree.tpe=com.miguno.kafkastorm.kafka.ConsumerTaskContext
[error]           )
[error]           Nil
[error]         )
[error]         Apply( // final def $asInstanceOf[T0 >: ? <: ?](): T0 in class Object, tree.tpe=Option
[error]           TypeApply( // final def $asInstanceOf[T0 >: ? <: ?](): T0 in class Object, tree.tpe=()Option
[error]             "v3"."$asInstanceOf" // final def $asInstanceOf[T0 >: ? <: ?](): T0 in class Object, tree.tpe=[T0 >: ? <: ?]()T0
[error]             <tpt> // tree.tpe=Option
[error]           )
[error]           Nil
[error]         )
[error]       )
[error]       "scala"."runtime"."BoxedUnit"."UNIT" // final val UNIT: runtime.BoxedUnit in object BoxedUnit, tree.tpe=runtime.BoxedUnit
[error]     )
[error]   )
[error]   ValDef( // private[this] val consume$1: Function2
[error]     private <local> <synthetic> <paramaccessor> <triedcooking>
[error]     "consume$1"
[error]     <tpt> // tree.tpe=Function2
[error]     <empty>
[error]   )
[error]   DefDef( // def <init>(arg$outer: com.miguno.kafkastorm.testing.EmbeddedKafkaZooKeeperCluster,consume$1: Function2): com.miguno.kafkastorm.testing.anonfun$createAndStartConsumer$1
[error]     <method> <triedcooking>
[error]     "<init>"
[error]     []
[error]     // 1 parameter list
[error]     ValDef( // $outer: com.miguno.kafkastorm.testing.EmbeddedKafkaZooKeeperCluster
[error]       <param>
[error]       "$outer"
[error]       <tpt> // tree.tpe=com.miguno.kafkastorm.testing.EmbeddedKafkaZooKeeperCluster
[error]       <empty>
[error]     )
[error]     ValDef( // consume$1: Function2
[error]       <param> <synthetic> <triedcooking>
[error]       "consume$1"
[error]       <tpt> // tree.tpe=Function2
[error]       <empty>
[error]     )
[error]     <tpt> // tree.tpe=com.miguno.kafkastorm.testing.anonfun$createAndStartConsumer$1
[error]     Block( // tree.tpe=Unit
[error]       // 2 statements
[error]       Assign( // tree.tpe=Unit
[error]         EmbeddedKafkaZooKeeperCluster$$anonfun$createAndStartConsumer$1.this."consume$1" // private[this] val consume$1: Function2, tree.tpe=Function2
[error]         "consume$1" // consume$1: Function2, tree.tpe=Function2
[error]       )
[error]       Apply( // def <init>(): scala.runtime.AbstractFunction3 in class AbstractFunction3, tree.tpe=scala.runtime.AbstractFunction3
[error]         EmbeddedKafkaZooKeeperCluster$$anonfun$createAndStartConsumer$1.super."<init>" // def <init>(): scala.runtime.AbstractFunction3 in class AbstractFunction3, tree.tpe=()scala.runtime.AbstractFunction3
[error]         Nil
[error]       )
[error]       ()
[error]     )
[error]   )
[error] )
[error] 
[error] == Expanded type of tree ==
[error] 
[error] ConstantType(value = Constant(com.miguno.avro.Tweet))
[error] 
[error] uncaught exception during compilation: java.io.IOException
[error] File name too long
[error] two errors found
[error] (test:compile) Compilation failed
[error] Total time: 9 s, completed Sep 1, 2016 9:00:17 AM
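
The underlying failure here is the java.io.IOException: File name too long. This typically occurs when compiling on an encrypted filesystem such as eCryptfs (common for encrypted home directories), where the .class file names scalac generates for anonymous classes exceed the filesystem's name-length limit. A minimal workaround sketch, assuming an sbt build on Scala 2.10/2.11 (the value 128 is an arbitrary safe limit, not something mandated by the project):

    // build.sbt -- cap the length of generated .class file names so the
    // compiler output fits on filesystems with short name limits (eCryptfs)
    scalacOptions ++= Seq("-Xmax-classfile-name", "128")

Alternatively, moving the working copy to an unencrypted filesystem avoids the limit entirely.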

Weird NoRouteToHostException trying to run the tests

Hey there, I'm getting this weird NoRouteToHostException when I try to run the tests (Java 1.7.0_60). Any idea what might cause this?

> test

[error] Uncaught exception when running tests: java.net.NoRouteToHostException: No route to host
[trace] Stack trace suppressed: run last test:test for the full output.
Exception in thread "Thread-1" java.io.EOFException
    at java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2598)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1318)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at sbt.React.react(ForkTests.scala:116)
    at sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
    at java.lang.Thread.run(Thread.java:745)
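
A hedged diagnostic, since the thread offers no resolution: the test suite starts in-memory ZooKeeper and Kafka instances that bind to the local hostname, and NoRouteToHostException usually means that hostname resolves to an address that is firewalled or otherwise unreachable. A small Scala sketch (names are illustrative, not from the project) to check what the tests would resolve:

    // Hypothetical diagnostic: print what the local hostname resolves to.
    // If this throws or prints an unreachable address, fix /etc/hosts or
    // local firewall rules before re-running the tests.
    import java.net.InetAddress

    object HostnameCheck extends App {
      val local = InetAddress.getLocalHost
      println(s"${local.getHostName} -> ${local.getHostAddress}")
    }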

java.lang.UnsatisfiedLinkError

I am using Spark Streaming with S3, but when I test my code on Windows 10 it fails as follows.

Exception in thread "pool-17-thread-1" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0(Ljava/lang/String;JJJI)Ljava/io/FileDescriptor;
	at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0(Native Method)
	at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileOutputStreamWithMode(NativeIO.java:559)
	at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:219)
	at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
	at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
	at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
	at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
	at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:398)
	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:461)
	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
	at org.apache.spark.streaming.CheckpointWriter$CheckpointWriteHandler.run(Checkpoint.scala:234)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Could you give me some guidance?
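
A hedged pointer rather than a confirmed fix: this stack trace comes from Hadoop's NativeIO layer on Windows, which Spark Streaming's checkpoint writer uses for local files, and which requires the winutils.exe/hadoop.dll binaries matching your Hadoop version. A minimal sketch, assuming those binaries are installed under the hypothetical path C:\hadoop:

    // Sketch only: tell Hadoop where winutils.exe lives, *before* the
    // StreamingContext is created. C:\\hadoop is an assumed install path.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")
    // hadoop.dll must also be loadable (e.g. put C:\hadoop\bin on PATH),
    // otherwise NativeIO$Windows.createFileWithMode0 cannot link.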

Storm UI showing improper values for Kafka Trident

I have been trying out a Trident topology for Kafka, i.e. TransactionalTridentKafkaSpout. All works fine except the Storm UI: even though I've not produced any data to my topic, the UI keeps showing invalid emitted/transferred values, meaning the count keeps increasing even when there is no data in the topic. I've tried deleting the data/logs stored in ZooKeeper, Storm, and Kafka, recreating the Kafka topics, and also setting
topology.stats.sample.rate: 1.0

Here is my configuration and source code snippet:

// Cluster connectivity and topology tuning
final Config config = new Config();
config.put(Config.TOPOLOGY_TRIDENT_BATCH_EMIT_INTERVAL_MILLIS, 3000);
config.setNumWorkers(2);
config.put(Config.NIMBUS_HOST, "192.168.125.20");
config.put(Config.NIMBUS_THRIFT_PORT, 6627);
config.put(Config.STORM_ZOOKEEPER_PORT, 2181);
config.put(Config.STORM_ZOOKEEPER_SERVERS, Arrays.asList("192.168.125.20"));
config.put(Config.TOPOLOGY_EXECUTOR_RECEIVE_BUFFER_SIZE, 16384);
config.put(Config.TOPOLOGY_ACKER_EXECUTORS, 1);
config.put(Config.TOPOLOGY_MAX_SPOUT_PENDING, 10);
config.put(Config.DRPC_SERVERS, Arrays.asList("192.168.125.20"));
config.put(Config.DRPC_PORT, 3772);

// Kafka spout configuration
final BrokerHosts zkHosts = new ZkHosts("192.168.125.20");
final TridentKafkaConfig kafkaConfig = new TridentKafkaConfig(zkHosts, "Test_Topic", "");
kafkaConfig.scheme = new SchemeAsMultiScheme(new StringScheme());
kafkaConfig.bufferSizeBytes = 1024 * 1024 * 4;
kafkaConfig.fetchSizeBytes = 1024 * 1024 * 4;
kafkaConfig.forceFromStart = false;

// Wire the spout into a Trident stream
final TransactionalTridentKafkaSpout kafkaSpout = new TransactionalTridentKafkaSpout(kafkaConfig);
final TridentTopology topology = new TridentTopology();
topology.newStream("spout", kafkaSpout)
        .each(new Fields("str"), new TestFunction(), new Fields("test"))
        .each(new Fields("str"), new PrintFilter());
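
A hedged explanation, not confirmed in this thread: the counters are probably not invalid. Trident's master batch coordinator emits batch-tracking tuples on its internal coordination streams at every batch interval, whether or not the topic contains any data, and the Storm UI includes those tuples in the emitted/transferred counts. With the 3000 ms interval configured above, the baseline growth would be roughly:

    // Back-of-the-envelope sketch using the batch interval from the
    // snippet above; illustration only, the real per-stream rate varies.
    val batchIntervalMs = 3000
    val coordTuplesPerMinute = 60000 / batchIntervalMs  // = 20
    println(s"~$coordTuplesPerMinute coordination tuples/minute per stream")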
