simplesteph / kafka-avro-course

Learn the Confluent Schema Registry & REST Proxy

Home Page: https://www.udemy.com/confluent-schema-registry/?couponCode=GITHUB

Language: Java 100.00%
Topics: kafka, confluent-kafka, schema-registry, kafka-rest-proxy, udemy, learning

kafka-avro-course's People

Contributors

anks2024, cc-lemery, johnboy-leeds, mihirlimbachia, simplesteph

kafka-avro-course's Issues

Exception in KafkaAvroJavaProducerV1Demo

https://github.com/simplesteph/kafka-avro-course/blob/c991400baaa62352a87255234f306a04d275e780/kafka-avro-v1/src/main/java/com/github/simplesteph/kafka/apps/v1/KafkaAvroJavaProducerV1Demo.java#L23C14-L23C14

This line results in the following exception:

Exception in thread "main" java.lang.NoClassDefFoundError: io/confluent/kafka/schemaregistry/client/rest/exceptions/RestClientException
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:735)
	at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:490)
	at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:483)
	at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:113)
	at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:133)
	at org.apache.kafka.clients.producer.ProducerConfig.<init>(ProducerConfig.java:553)
	at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:289)
	at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:316)
	at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:301)
	at com.example.KafkaAvroProducerV1.main(KafkaAvroProducerV1.java:23)
Caused by: java.lang.ClassNotFoundException: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException
	at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 12 more

The way to resolve it is to stop excluding kafka-schema-registry-client in the corresponding pom.xml.
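A minimal sketch of what the dependency declaration could look like (the 3.3.1 version is an assumption, taken from the dependency tree further down this page; adjust to your course version):

```xml
<!-- Keep the schema registry client on the classpath: declare the serializer
     WITHOUT an <exclusions> block for kafka-schema-registry-client. -->
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <version>3.3.1</version>
</dependency>
```

The serializer pulls in io.confluent:kafka-schema-registry-client transitively, which provides the RestClientException class the producer fails to load.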

Error serializing Avro message

I'm creating an Avro class that contains a string and a map as fields.
I can generate the Avro class through Maven, and I was able to register the schema at localhost:8081.

.avsc file:

{
  "type": "record",
  "name": "AvroClass",
  "namespace": "belliPack.avro",
  "fields": [
    {"name": "title", "type": "string"},
    {"name": "map", "type": {"type": "map", "values": "double"}}
  ]
}

Schema registry returns this:
$ curl -X GET http://localhost:8081/subjects/teste1-value/versions/1

{"subject":"teste1-value","version":1,"id":42,"schema":"{\"type\":\"record\",\"name\":\"AvroClass\",\"namespace\":\"belliPack.avro\",\"fields\":[{\"name\":\"title\",\"type\":\"string\"},{\"name\":\"map\",\"type\":{\"type\":\"map\",\"values\":\"double\"}}]}"}

But when running my Kafka producer I get this error:
Exception in thread "Thread-1" Exception in thread "Thread-3" org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.net.MalformedURLException: no protocol: 127.0.0.1:8081/subjects/teste1-value/versions
at java.base/java.net.URL.(URL.java:644)
at java.base/java.net.URL.(URL.java:540)
at java.base/java.net.URL.(URL.java:487)
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:175)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:256)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:356)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:348)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:334)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:168)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:222)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:198)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:70)
at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:53)
at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:903)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:865)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:752)
at belliPack.Kafka.Kafka_Producer.sendData(Kafka_Producer.java:32)
at belliPack.OPC.ExtractNodeValues.run(ExtractNodeValues.java:82)
at java.base/java.lang.Thread.run(Thread.java:834)
(The same SerializationException and stack trace is printed by the second thread.)

I appreciate any help...
Thanks
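The likely cause is the schema.registry.url property missing its URL scheme: the Confluent client builds a java.net.URL from that value, so "127.0.0.1:8081" throws MalformedURLException ("no protocol") while "http://127.0.0.1:8081" works. A minimal sketch of the producer configuration, assuming a local broker and registry:

```java
import java.util.Properties;

public class RegistryUrlFix {
    public static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "127.0.0.1:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // "no protocol" means the scheme is missing:
        // use "http://127.0.0.1:8081", NOT "127.0.0.1:8081".
        props.put("schema.registry.url", "http://127.0.0.1:8081");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("schema.registry.url"));
    }
}
```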

Suggestion: Encourage the use of mvn install

Hello Stéphane,

I was following the instructions in the course, but the generated specific class was not being found by the runnable class. I'm not very experienced with Maven, so I followed all the steps as described in your videos without success. I then thought it was something in my code, but after cloning this repo the same thing happened: the runnable class couldn't find the generated classes.

The way I solved it was by running mvn install; however, this was not obvious to me, and I don't really understand why it was needed given that I had enabled auto-import in IntelliJ and ran mvn clean and mvn package.

I'm creating this issue so that other people can find a solution when the main class doesn't recognize the generated specific classes from the .avsc files.

How to use an already existing POJO?

In this example you use the .avsc file and generate the class automatically.

In most situations, though, we already have a class with the variables we need, and we only want to send or receive data with it. How can I do the same Avro serialization in that case?
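One option (an assumption, not covered in the course material above): newer versions of the Confluent serializer, roughly 5.4 and later, support Avro reflection, which derives a schema from an existing plain class at runtime instead of requiring code-generated SpecificRecord classes. A configuration sketch:

```java
import java.util.Properties;

public class ReflectionAvroConfig {
    // Sketch: configure the Confluent Avro serializer to derive schemas
    // from existing POJOs via Avro reflection (Confluent 5.4+ assumed).
    public static Properties reflectionProducerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "127.0.0.1:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://127.0.0.1:8081");
        // With this flag the serializer uses org.apache.avro.reflect.ReflectData
        // on your plain class instead of expecting a generated SpecificRecord.
        props.put("schema.reflection", "true");
        return props;
    }
    // Usage (not runnable here without the Kafka/Confluent jars):
    //   new KafkaProducer<String, MyPojo>(reflectionProducerProps())
    //       .send(new ProducerRecord<>("my-topic", myPojo));
}
```

On older versions the usual alternative is to map the POJO's fields into an Avro GenericRecord by hand before sending.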

Fix confluent repository URL

To fix the error below:

Could not resolve dependencies for project com.github.simplesteph.udemy.kafka:kafka-avro-v2:jar:1.0-SNAPSHOT: The following artifacts could not be resolved: io.confluent:kafka-avro-serializer:jar:3.1.1, io.confluent:kafka-schema-registry-client:jar:3.1.1, io.confluent:common-config:jar:3.1.1, io.confluent:common-utils:jar:3.1.1: Could not transfer artifact io.confluent:kafka-avro-serializer:jar:3.1.1 from/to maven-default-http-blocker (http://0.0.0.0/): Blocked mirror for repositories: [confluent (http://packages.confluent.io/maven/, default, releases+snapshots)] 

I needed to change the Confluent repository URL to https in all pom.xml files:

    <!--necessary to resolve confluent dependencies-->
    <repositories>
        <repository>
            <id>confluent</id>
            <url>https://packages.confluent.io/maven/</url>
        </repository>
    </repositories>

Dependency conflicts on org.xerial.snappy:snappy-java, leading to inconsistent program behaviors

Hi, in kafka-avro-course/kafka-avro-v2 there are multiple versions of the library org.xerial.snappy:snappy-java. However, according to Maven's dependency mediation strategy ("first declaration wins" at equal depth), only org.xerial.snappy:snappy-java:1.1.1.3 is loaded, and org.xerial.snappy:snappy-java:1.1.2.6 is shadowed.

In total, there are 4 conflicting API pairs between these two library versions.

As shown in the following figure, your project expects to invoke the method <org.xerial.snappy.SnappyOutputStream: write([BII)V> in library org.xerial.snappy:snappy-java:1.1.2.6 (along the original dependency path). As that version has been shadowed, the method defined in org.xerial.snappy:snappy-java:1.1.1.3 is actually the one referenced, via the following invocation path (the actual dependency path):

<com.github.simplesteph.kafka.apps.v2.KafkaAvroJavaConsumerV2Demo: main([Ljava/lang/String;)V> /home/wwww/wangSensor/unzip/kafka-avro-course-master/kafka-avro-v2/target/classes
<org.apache.kafka.clients.consumer.KafkaConsumer: <init>(Ljava/util/Properties;)V> /home/wwww/.m2/repository/org/apache/kafka/kafka-clients/0.11.0.1/kafka-clients-0.11.0.1.jar
<org.apache.kafka.clients.consumer.KafkaConsumer: <init>(Ljava/util/Properties;Lorg/apache/kafka/common/serialization/Deserializer;Lorg/apache/kafka/common/serialization/Deserializer;)V> /home/wwww/.m2/repository/org/apache/kafka/kafka-clients/0.11.0.1/kafka-clients-0.11.0.1.jar
<org.apache.kafka.clients.consumer.KafkaConsumer: <init>(Lorg/apache/kafka/clients/consumer/ConsumerConfig;Lorg/apache/kafka/common/serialization/Deserializer;Lorg/apache/kafka/common/serialization/Deserializer;)V> /home/wwww/.m2/repository/org/apache/kafka/kafka-clients/0.11.0.1/kafka-clients-0.11.0.1.jar
<org.apache.kafka.clients.consumer.KafkaConsumer: close(JZ)V> /home/wwww/.m2/repository/org/apache/kafka/kafka-clients/0.11.0.1/kafka-clients-0.11.0.1.jar
<org.apache.kafka.clients.ClientUtils: closeQuietly(Ljava/io/Closeable;Ljava/lang/String;Ljava/util/concurrent/atomic/AtomicReference;)V> /home/wwww/.m2/repository/org/apache/kafka/kafka-clients/0.11.0.1/kafka-clients-0.11.0.1.jar
<org.apache.kafka.common.record.KafkaLZ4BlockOutputStream: close()V> /home/wwww/.m2/repository/org/apache/kafka/kafka-clients/0.11.0.1/kafka-clients-0.11.0.1.jar
<org.apache.kafka.common.record.KafkaLZ4BlockOutputStream: writeBlock()V> /home/wwww/.m2/repository/org/apache/kafka/kafka-clients/0.11.0.1/kafka-clients-0.11.0.1.jar
<org.xerial.snappy.SnappyOutputStream: write([BII)V>


Although both of these conflicting libraries contain the referenced method (with the same signature), they have different implementations. This issue will not cause runtime crashes, but it can introduce inconsistent semantic program behaviors.

Code snippet of <org.xerial.snappy.SnappyOutputStream: write([BII)V> in org.xerial.snappy:snappy-java:1.1.2.6 (shadowed, but the version expected to be invoked):
@Override
    public void write(byte[] b, int byteOffset, int byteLength)
            throws IOException
    {
        if (closed) {
            throw new IOException("Stream is closed");
        }
        int cursor = 0;
        while (cursor < byteLength) {
            int readLen = Math.min(byteLength - cursor, blockSize - inputCursor);
            // copy the input data to uncompressed buffer
            if (readLen > 0) {
                System.arraycopy(b, byteOffset + cursor, inputBuffer, inputCursor, readLen);
                inputCursor += readLen;
            }
            if (inputCursor < blockSize) {
                return;
            }

            compressInput();
            cursor += readLen;
        }
    }

Code snippet of <org.xerial.snappy.SnappyOutputStream: write([BII)V> in org.xerial.snappy:snappy-java:1.1.1.3 (the version actually loaded):
@Override
public void write(byte[] b, int off, int len) throws IOException {
    rawWrite(b, off, len);
}

public void rawWrite(Object array, int byteOffset, int byteLength) throws IOException {

        if(inputCursor + byteLength < MIN_BLOCK_SIZE) {
            // copy the input data to uncompressed buffer
            Snappy.arrayCopy(array, byteOffset, byteLength, inputBuffer, inputCursor);
            inputCursor += byteLength;
            return;
        }

        compressInput();

        for(int readBytes = 0; readBytes < byteLength; ) {
            int inputLen = Math.min(blockSize, byteLength - readBytes);
            if(!hasSufficientOutputBufferFor(inputLen)) {
                dumpOutput();
            }
            int compressedSize = Snappy.rawCompress(array, byteOffset + readBytes, inputLen, outputBuffer, outputCursor + 4);
            writeInt(outputBuffer, outputCursor, compressedSize);
            outputCursor += 4 + compressedSize;
            readBytes += inputLen;
        }
    }

The detailed information on the remaining 3 conflicting API pairs can be found in the following attachment.
4 conflicting API pairs in project kafka-avro-v2.txt

Dependency tree--

[INFO] com.github.simplesteph.udemy.kafka:kafka-avro-v2:jar:1.0-SNAPSHOT
[INFO] +- org.apache.avro:avro:jar:1.8.2:compile
[INFO] | +- org.codehaus.jackson:jackson-core-asl:jar:1.9.13:compile
[INFO] | +- org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:compile
[INFO] | | - (org.codehaus.jackson:jackson-core-asl:jar:1.9.13:compile - omitted for duplicate)
[INFO] | +- com.thoughtworks.paranamer:paranamer:jar:2.7:compile
[INFO] | +- org.xerial.snappy:snappy-java:jar:1.1.1.3:compile
[INFO] | +- org.apache.commons:commons-compress:jar:1.8.1:compile
[INFO] | +- org.tukaani:xz:jar:1.5:compile
[INFO] | - org.slf4j:slf4j-api:jar:1.7.7:compile
[INFO] +- org.apache.kafka:kafka-clients:jar:0.11.0.1:compile
[INFO] | +- net.jpountz.lz4:lz4:jar:1.3.0:compile
[INFO] | +- (org.xerial.snappy:snappy-java:jar:1.1.2.6:compile - omitted for conflict with 1.1.1.3)
[INFO] | - (org.slf4j:slf4j-api:jar:1.7.25:compile - omitted for conflict with 1.7.7)
[INFO] - io.confluent:kafka-avro-serializer:jar:3.3.1:compile
[INFO] +- (org.apache.avro:avro:jar:1.8.2:compile - omitted for duplicate)
[INFO] +- io.confluent:kafka-schema-registry-client:jar:3.3.1:compile
[INFO] | +- (org.apache.avro:avro:jar:1.8.2:compile - omitted for duplicate)
[INFO] | +- com.fasterxml.jackson.core:jackson-databind:jar:2.8.4:compile
[INFO] | | +- com.fasterxml.jackson.core:jackson-annotations:jar:2.8.0:compile
[INFO] | | - com.fasterxml.jackson.core:jackson-core:jar:2.8.4:compile
[INFO] | +- org.slf4j:slf4j-log4j12:jar:1.7.21:compile
[INFO] | | +- (org.slf4j:slf4j-api:jar:1.7.21:compile - omitted for conflict with 1.7.7)
[INFO] | | - log4j:log4j:jar:1.2.17:compile
[INFO] | - (io.confluent:common-utils:jar:3.3.1:compile - omitted for duplicate)
[INFO] +- io.confluent:common-config:jar:3.3.1:compile
[INFO] | +- (io.confluent:common-utils:jar:3.3.1:compile - omitted for duplicate)
[INFO] | - (org.slf4j:slf4j-api:jar:1.7.21:compile - omitted for conflict with 1.7.7)
[INFO] - io.confluent:common-utils:jar:3.3.1:compile
[INFO] +- (org.slf4j:slf4j-api:jar:1.7.21:compile - omitted for conflict with 1.7.7)
[INFO] +- org.apache.zookeeper:zookeeper:jar:3.4.10:compile
[INFO] | +- (org.slf4j:slf4j-api:jar:1.6.1:compile - omitted for conflict with 1.7.7)
[INFO] | +- (org.slf4j:slf4j-log4j12:jar:1.6.1:compile - omitted for conflict with 1.7.21)
[INFO] | +- (log4j:log4j:jar:1.2.16:compile - omitted for conflict with 1.2.17)
[INFO] | +- jline:jline:jar:0.9.94:compile
[INFO] | | - junit:junit:jar:3.8.1:compile
[INFO] | - io.netty:netty:jar:3.10.5.Final:compile
[INFO] - com.101tec:zkclient:jar:0.10:compile
[INFO] +- (org.slf4j:slf4j-api:jar:1.6.1:compile - omitted for conflict with 1.7.7)
[INFO] - (org.apache.zookeeper:zookeeper:jar:3.4.8:compile - omitted for conflict with 3.4.10)

Suggested solutions:

Solution 1: declare version org.xerial.snappy:snappy-java:1.1.2.6 as a direct dependency to override version 1.1.1.3 (based on Maven's "nearest wins" loading strategy).

Solution 2: reverse the declaration order of the two libraries (org.apache.avro:avro:jar:1.8.2 and org.apache.kafka:kafka-clients:jar:0.11.0.1) in the pom file.
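A sketch of what Solution 1 could look like in kafka-avro-v2/pom.xml (the coordinates come from the dependency tree above; the placement is an assumption):

```xml
<!-- Pin snappy-java directly so the 1.1.2.6 version that kafka-clients
     expects wins over the 1.1.1.3 version pulled in transitively by avro. -->
<dependency>
    <groupId>org.xerial.snappy</groupId>
    <artifactId>snappy-java</artifactId>
    <version>1.1.2.6</version>
</dependency>
```

A direct dependency sits at depth 0 in the tree, so under "nearest wins" it takes precedence over both transitive versions.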

Thanks.
Best regards,
Coco

Error due to camel case

In SchemaEvolutionExamples there is a camel-case error and the sample fails. It should be:
final File file = new File("customerV1.avro");

Rest Proxy files

Hi Stéphane,

I was just wondering if you could add the REST Proxy files, like rest-proxy-insomnia.json, to this repository.

Thank you for your work!

Error while building Kafka-avro-v1

While building with Maven I am getting this error in both v1 and v2:

Downloaded: https://repo.maven.apache.org/maven2/joda-time/joda-time/2.7/joda-time-2.7.jar (576 KB at 114.5 KB/sec)
[WARNING] Error injecting: org.apache.avro.mojo.SchemaMojo
com.google.inject.ProvisionException: Unable to provision, see the following errors:

1) Error injecting constructor, java.lang.NoClassDefFoundError: org/codehaus/jackson/JsonParseException
  at org.apache.avro.mojo.SchemaMojo.<init>(Unknown Source)
  while locating org.apache.avro.mojo.SchemaMojo

1 error
	at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1025)
	at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1051)
	at org.eclipse.sisu.space.AbstractDeferredClass.get(AbstractDeferredClass.java:48)
	at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:81)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.provision(InternalFactoryToInitializableAdapter.java:53)
	at com.google.inject.internal.ProviderInternalFactory$1.call(ProviderInternalFactory.java:65)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:115)
	at org.eclipse.sisu.bean.BeanScheduler$Activator.onProvision(BeanScheduler.java:176)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:126)
	at com.google.inject.internal.ProvisionListenerStackCallback.provision(ProvisionListenerStackCallback.java:68)
	at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:63)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.get(InternalFactoryToInitializableAdapter.java:45)
	at com.google.inject.internal.InjectorImpl$2$1.call(InjectorImpl.java:1016)
	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1092)
	at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1012)
	at org.eclipse.sisu.inject.Guice4$1.get(Guice4.java:162)
	at org.eclipse.sisu.inject.LazyBeanEntry.getValue(LazyBeanEntry.java:81)
	at org.eclipse.sisu.plexus.LazyPlexusBean.getValue(LazyPlexusBean.java:51)
	at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:263)
	at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:255)
	at org.apache.maven.plugin.internal.DefaultMavenPluginManager.getConfiguredMojo(DefaultMavenPluginManager.java:517)
	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:121)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
	at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
	at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
	at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
	at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
	at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
	at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
	at org.codehaus.classworlds.Launcher.main(Launcher.java:47)
Caused by: java.lang.NoClassDefFoundError: org/codehaus/jackson/JsonParseException
	at org.apache.avro.mojo.SchemaMojo.<init>(SchemaMojo.java:41)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.google.inject.internal.DefaultConstructionProxyFactory$1.newInstance(DefaultConstructionProxyFactory.java:86)
	at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:105)
	at com.google.inject.internal.ConstructorInjector.access$000(ConstructorInjector.java:32)
	at com.google.inject.internal.ConstructorInjector$1.call(ConstructorInjector.java:89)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:115)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:133)
	at com.google.inject.internal.ProvisionListenerStackCallback.provision(ProvisionListenerStackCallback.java:68)
	at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:87)
	at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:267)
	at com.google.inject.internal.InjectorImpl$2$1.call(InjectorImpl.java:1016)
	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1103)
	at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1012)
	... 43 more
Caused by: java.lang.ClassNotFoundException: org.codehaus.jackson.JsonParseException
	at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:271)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:247)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:239)
	... 60 more
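A hedged workaround (an assumption; not stated in the original thread): this NoClassDefFoundError usually means the legacy org.codehaus.jackson classes are missing from the avro-maven-plugin's plugin classpath, often because of a corrupted artifact in the local repository. Deleting the broken artifact from ~/.m2 so Maven re-downloads it, or declaring the Jackson artifacts as plugin-level dependencies, is worth trying:

```xml
<plugin>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-maven-plugin</artifactId>
    <version>1.8.2</version>
    <dependencies>
        <!-- Assumption: supply the legacy Jackson classes the mojo loads -->
        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-core-asl</artifactId>
            <version>1.9.13</version>
        </dependency>
        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-mapper-asl</artifactId>
            <version>1.9.13</version>
        </dependency>
    </dependencies>
</plugin>
```

Plugin-level <dependencies> affect only the plugin's own classpath, so this does not change the dependencies of the produced jar.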

