
nodefluent / kafka-connect

equivalent to kafka-connect :wrench: for nodejs :sparkles::turtle::rocket::sparkles:

License: MIT License

JavaScript 100.00%
Topics: kafka, connect, framework, datastore, etl, nodejs, kafka-connect

kafka-connect's People

Contributors

dependabot[bot], greenkeeper[bot], holgeradam, krystianity, rob3000, rsilvestre


kafka-connect's Issues

An in-range update of coveralls is breaking the build 🚨

The devDependency coveralls was updated from 3.0.2 to 3.0.3.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

coveralls is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Release Notes for Dependency security updates

As suggested by NPM and Snyk.

Commits

The new version differs by 1 commit.

  • aa2519c dependency security audit fixes from npm & snyk (#210)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

An in-range update of debug is breaking the build 🚨

The dependency debug was updated from 4.1.0 to 4.1.1.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

debug is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Commits

The new version differs by 4 commits.

  • 68b4dc8 4.1.1
  • 7571608 remove .coveralls.yaml
  • 57ef085 copy custom logger to namespace extension (fixes #646)
  • d0e498f test: only run coveralls on travis

See the full diff


Possible issue with native producer call

I am evaluating the nodefluent kafka-connect framework for an ETL project. While testing the framework with the native consumer/producer, I noticed that SourceConfig.js calls the NProducer.buffer() method with possibly incorrect parameters. This is the call:

    return this.producer.buffer(
        this.config.topic,
        record.key,
        record,
        this.config.produceCompressionType || 0
    );

The NProducer buffer API looks like this:

    async buffer(topic, identifier, payload, partition = null, version = null, partitionKey = null)

It looks like compressionType is the right fourth argument for the non-native Producer, but it is incorrect for NProducer.

I am using "kafka-connect": "^3.6.0". Is this a real bug or am I not reading the code correctly?
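A minimal sketch of the mismatch, using a stub that mirrors the NProducer buffer signature quoted above (the real class lives in the sinek package; the stub is only here to keep the example self-contained):

```javascript
// Stub mirroring NProducer's buffer signature from the report above;
// it simply echoes back which value landed in which parameter.
class StubNProducer {
    async buffer(topic, identifier, payload, partition = null, version = null, partitionKey = null) {
        return { topic, identifier, payload, partition };
    }
}

async function demo() {
    const producer = new StubNProducer();
    const record = { key: "k1", value: { some: "data" } };

    // The call as SourceConfig.js makes it: the compression type (0)
    // lands in the `partition` parameter, which looks unintended.
    const asReported = await producer.buffer("my-topic", record.key, record, 0);

    // A call that matches the NProducer signature: omit the compression
    // argument and let `partition` default to null.
    const corrected = await producer.buffer("my-topic", record.key, record);

    console.log(asReported.partition); // 0 (compression type misread as partition)
    console.log(corrected.partition);  // null
}

demo();
```

With the reported call, every message would be forced onto partition 0 (or whatever the configured compression type happens to be), which matches the suspicion that the argument order was taken from the non-native Producer.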

Version 10 of node.js has been released

Version 10 of Node.js (code name Dubnium) has been released! 🎊

To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:

  • Added the new Node.js version to your .travis.yml

If you’re interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.

More information on this issue

Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.

  • engines was only updated if it defined a single version, not a range.
  • .nvmrc was updated to Node.js 10
  • .travis.yml was only changed if there was a root-level node_js that didn’t already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn’t touch job or matrix configurations because these tend to be quite specific and complex, and it’s difficult to infer what the intentions were.

For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you’re doing it may require additional work or may not be applicable at all. We’re also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I’m a humble robot and won’t feel rejected 🤖



An in-range update of bluebird is breaking the build 🚨

The dependency bluebird was updated from 3.5.3 to 3.5.4.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

bluebird is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Release Notes for v3.5.4
  • Proper version check supporting VSCode(#1576)
Commits

The new version differs by 6 commits.

  • e0222e3 Release v3.5.4
  • 4b9fa33 missing --expose-gc flag (#1586)
  • 63b15da docs: improve and compare Promise.each and Promise.mapSeries (#1565)
  • 9dcefe2 .md syntax fix for coming-from-other-languages.md (#1584)
  • b97c0d2 added proper version check supporting VSCode (#1576)
  • 499cf8e Update jsdelivr url in docs (#1571)

See the full diff


You have to install node-rdkafka to use NProducer

Tests fail because node-rdkafka is not part of the dependencies in the package.json file.

(screenshot: node-rdkakfa-missing)

After installing node-rdkafka:

(screenshot: node-rdkafka-added)

My question is: why wasn't node-rdkafka added to the package.json dependencies? As it stands, developers have to install node-rdkafka manually wherever they want to use this package.

An in-range update of sinek is breaking the build 🚨

The dependency sinek was updated from 6.22.0 to 6.22.1.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

sinek is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Commits

The new version differs by 5 commits.

  • 6c803a5 6.22.1
  • 8262b76 Merge pull request #73 from nodefluent/add-auto-option-to-type-defintion
  • 6276c7f Merge pull request #74 from nodefluent/optional-identifier-in-nproducer-type-definition
  • fff9268 The identifier parameter in NProducer is optional
  • c3338a2 Add "auto" as value for defaultPartitionCount in NProducer

See the full diff


Commit after all messages got processed

Hi,

I have a small question: I'm trying to understand how batching works here. Does it mean that a fetch will (potentially) retrieve multiple messages? What about committing them all together (so we can have an at-least-once policy)? Is that something that exists?

For instance, here it looks like only one offset will be committed at a time. And if we look at the gcloud pubsub sink implementation, it seems all messages are stored in memory and only get processed once buffer > batchSize, so if the server goes down for some reason, those messages will be lost because their offsets were already committed (so no at-least-once policy here).

Also, does config.batch only guarantee a minimum number of bytes, or something like that?

Thanks!
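The at-least-once pattern being asked about can be sketched in plain Node.js; `fakeFetch` and `commitOffset` below are hypothetical stand-ins for the real consumer APIs (which this framework does not currently expose in this form), used only to illustrate committing after the whole batch has been processed:

```javascript
// Pretend the broker returned a batch of messages for one partition.
function fakeFetch() {
    return [
        { offset: 0, value: "a" },
        { offset: 1, value: "b" },
        { offset: 2, value: "c" },
    ];
}

const committed = [];
function commitOffset(offset) {
    committed.push(offset);
}

async function processBatch(process) {
    const batch = fakeFetch();
    // Process every message first; if any of these throws, nothing
    // below runs and no offset is committed, so the batch will be
    // re-delivered on restart (at-least-once).
    for (const message of batch) {
        await process(message);
    }
    // Commit once, after the whole batch succeeded: the next offset
    // to consume is lastOffset + 1.
    const lastOffset = batch[batch.length - 1].offset;
    commitOffset(lastOffset + 1);
}

processBatch(async (m) => console.log("processed", m.value))
    .then(() => console.log("committed up to", committed[committed.length - 1]));
```

The key design point is the ordering: process first, commit last. Committing (or buffering past an already-committed offset, as described for the pubsub sink above) before processing completes is what breaks the at-least-once guarantee.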

An in-range update of prom-client is breaking the build 🚨

The dependency prom-client was updated from 11.1.1 to 11.1.2.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

prom-client is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Release Notes for 11.1.2

Changed

  • Allow setting Gauge values to NaN, +Inf, and -Inf
  • Fixed histogram scrape performance by using acc.push instead of acc.concat. Fixes #216 with #219
Commits

The new version differs by 9 commits.

  • 25255c3 11.1.2
  • 707bff9 prepare for 11.1.2
  • 45e0c3e Fix histogram scrape performance (#219)
  • 6a71f47 chore: upgrade prettier (#218)
  • 7fd5477 Remove Sinon dev dependency (#217)
  • 00f1b79 Allow setting Gauge values to NaN. (#202)
  • d5d8b7d adding prefix to DefaultMetricsCollectorConfiguration, related to the release v11.1.0. (#208)
  • 8d792aa Add getMetricsAsArray to index.d.ts (#211)
  • f651c41 Updated README.md (#204)

See the full diff


Support prom-client cluster mode

Hi, there.

I'm building a monitoring system for my kafka-connect instances, which are spawned by pm2. But it seems there's no support for prom-client cluster mode to gather the metrics of all instances. Could you look into how prom-client cluster mode could be implemented in kafka-connect, or may I contribute the code? I'll try to do this.

Thank you.

An in-range update of bluebird is breaking the build 🚨

The dependency bluebird was updated from 3.5.2 to 3.5.3.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

bluebird is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Release Notes for v3.5.3

Bugfixes:

  • Update acorn dependency
Commits

The new version differs by 7 commits.

  • a5a5b57 Release v3.5.3
  • c8a7714 update packagelock
  • 8a765fd Update getting-started.md (#1561)
  • f541801 deps: update acorn and add acorn-walk (#1560)
  • 247e512 Update promise.each.md (#1555)
  • e2756e5 fixed browser cdn links (#1554)
  • 7cfa9f7 Changed expected behaviour when promisifying (#1545)

See the full diff


SinkConfig.js message value null

class SinkConfig

    static _messageToRecord(message) {

        //check if a converter has already turned this message into a record
        if (message && typeof message.value === "object" &&
            message instanceof SinkRecord) {
            return message;
        }

        try {
            const record = new SinkRecord();

            record.kafkaOffset = message.offset;
            record.key = message.key;
            record.partition = message.partition;

            record.keySchema = message.value.keySchema;
            record.timestamp = message.value.timestamp;
            record.value = message.value.value;
            record.valueSchema = message.value.valueSchema;

            return record;
        } catch (error) {
            debug("Failed to turn message into sink record.", error.message);
            super.emit("error", "Failed to turn message into SinkRecord: " + error.message);
            return message;
        }
    }
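One way to harden the method against a null message.value (a sketch of the idea, not the project's actual fix; `messageToRecord` is a simplified standalone version of the static method above) is to guard before dereferencing:

```javascript
// Hypothetical guard sketch: fall back to the raw message fields when
// message.value is null or not an object, instead of throwing on
// `message.value.keySchema` (Kafka tombstone messages have value === null).
function messageToRecord(message) {
    const record = {
        kafkaOffset: message.offset,
        key: message.key,
        partition: message.partition,
    };

    if (message.value && typeof message.value === "object") {
        record.keySchema = message.value.keySchema;
        record.timestamp = message.value.timestamp;
        record.value = message.value.value;
        record.valueSchema = message.value.valueSchema;
    } else {
        // tombstone or plain payload: keep the value as-is
        record.value = message.value;
    }

    return record;
}

console.log(messageToRecord({ offset: 5, key: "k", partition: 0, value: null }).value); // null
```

Note also that the quoted code calls `super.emit` inside a static method, which would itself throw before the error event could ever be emitted.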

Refresh Sinker to reroute the messages from and to

I am using multiple instances of the sinker objects and refresh those instances at 24-hour intervals, so I have a couple of questions:

  1. If I refresh the object, or want to reroute it to another path or topic of the sink, can I do that with the Sink object?
  2. If I recreate the sinker object, Metrics.js still throws an error that the metric is already registered. I don't want to keep the older values; how can I clear them, given that the client is defined locally and can't be accessed through the object using it?
    https://github.com/nodefluent/kafka-connect/blob/master/lib/common/Metrics.js
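For context on question 2: prom-client's registry does expose `register.clear()` and `register.removeSingleMetric(name)` for exactly this situation. A self-contained sketch of the pattern, with a stub registry standing in for prom-client so the example runs on its own:

```javascript
// Stub registry mimicking prom-client's "already registered" behaviour.
class StubRegistry {
    constructor() {
        this.metrics = new Map();
    }
    register(name) {
        if (this.metrics.has(name)) {
            throw new Error(`A metric with the name ${name} has already been registered.`);
        }
        this.metrics.set(name, { name });
    }
    clear() {
        this.metrics.clear();
    }
}

const registry = new StubRegistry();
registry.register("sink_messages_total");

// Recreating the sinker re-registers the same metric name and throws...
let threw = false;
try {
    registry.register("sink_messages_total");
} catch (e) {
    threw = true;
}

// ...so clear the registry before recreating the sinker
// (with the real library: client.register.clear()).
registry.clear();
registry.register("sink_messages_total"); // succeeds now

console.log(threw); // true
```

Since Metrics.js keeps its prom-client reference module-local, clearing would likely have to go through the global default registry rather than through the sinker object itself.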

How do I process messages in parallel?

I have a Kafka topic that is partitioned into 32 partitions.

I've tried to write a simple sink that calls a REST API once per Kafka message.

But it seems like it's processing my Kafka messages in sequence.

I've set maxTasks: 32 in my config, without any result.

How can I make it either process n partitions in parallel, and/or process multiple messages from each partition? The put method seems to get an array, but for me it's always an array with one element.
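Framework support aside, processing a batch of messages with bounded concurrency can be sketched in plain Node.js; `fakeApi` below is a hypothetical stand-in for the per-message REST call, and `processInParallel` is not part of kafka-connect:

```javascript
// Hypothetical sketch: process messages with up to `limit` concurrent
// REST calls instead of strictly one at a time.
async function processInParallel(messages, limit, callApi) {
    const results = new Array(messages.length);
    let next = 0;
    // Spawn `limit` workers that each pull the next unprocessed index
    // until none remain. `next++` is safe here because JavaScript runs
    // these continuations on a single thread.
    const workers = Array.from({ length: Math.min(limit, messages.length) }, async () => {
        while (next < messages.length) {
            const i = next++;
            results[i] = await callApi(messages[i]);
        }
    });
    await Promise.all(workers);
    return results;
}

// Stand-in for the REST call made per message.
const fakeApi = async (m) => `sent:${m}`;

processInParallel(["m1", "m2", "m3", "m4"], 2, fakeApi)
    .then((r) => console.log(r));
```

One caveat: processing messages of the same partition concurrently gives up per-partition ordering, and committing offsets before all earlier messages have succeeded weakens at-least-once delivery, so this pattern fits best when calls are idempotent.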
