bridata / dbus

DBus

Home Page: https://bridata.github.io/DBus/

License: Apache License 2.0

Java 65.85% Shell 0.20% JavaScript 22.67% HTML 4.35% CSS 0.62% PLSQL 0.08% Ruby 0.01% Python 5.74% Less 0.38% Handlebars 0.11%

dbus's People

Contributors

bridata, star-brilliant, zhenlinzhong


dbus's Issues

The package (0.5.0) on the cloud drive does not match the actual behavior

  1. When adding a FileBeat data line, the data line check should not require configuring primary/standby databases (the doc says the default is empty), yet the backend logic still attempts to connect to a database.

  2. In the Storm module, com.creditease.dbus.stream.common.appender.utils.DBFacade#queryDataTables(long id) queries the is_open and is_auto_complete columns, but the provided scripts do not create them. Were these columns deprecated, or were they left out of the scripts?

Using dbus 0.4.0, starting dbus-heartbeat reports an error

[pool-5-thread-1] ERROR: 2019/03/26 09:33:15.481 GlobalControlKafkaConsumerEvent 144 - [Global-Control-kafka-Consumer-event] topic: global_ctrl_topic, value:null
java.lang.NullPointerException
at com.creditease.dbus.heartbeat.event.impl.GlobalControlKafkaConsumerEvent.run(GlobalControlKafkaConsumerEvent.java:71)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
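
For reference, a minimal generic kafka-clients sketch (not the DBus heartbeat code; broker address and topic handling are placeholders) of the kind of null guard that avoids this NullPointerException when a control-topic record arrives with a null value, e.g. a tombstone message:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ControlTopicNullGuard {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "heartbeat-demo");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("global_ctrl_topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                // Skip records whose value is null instead of handing them to the parser.
                if (record.value() == null) {
                    continue;
                }
                System.out.println("control message: " + record.value());
            }
        }
    }
}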

Front-end source build fails

  1. Running npm install reports the following:
    npm notice created a lockfile as package-lock.json. You should commit this file.
    npm WARN [email protected] requires a peer of jquery@>=1.8.0 but none is installed. You must install peer dependencies yourself.
    npm WARN [email protected] requires a peer of webpack@^1.11.0 || ^2.1.0-beta || ^2.1.0 but none is installed. You must install peer dependencies yourself.
    npm WARN [email protected] requires a peer of webpack@1 || ^2.1.0-beta but none is installed. You must install peer dependencies yourself.
    npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules\fsevents):
    npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})

  2. Running npm run build reports the following:
    WARNING in ./~/moment/src/lib/locale/locales.js
    Module not found: Error: Can't resolve './locale' in 'E:\workspace\DBusNew10\DBus\dbus-keeper\keeper-web\node_modules\moment\src\lib\locale'
    @ ./~/moment/src/lib/locale/locales.js 56:12-46
    @ ./~/moment/src/lib/locale/locale.js
    @ ./~/moment/src/moment.js
    @ ./~/antd/lib/locale-provider/en_US.js
    @ ./app/i18n.js
    @ ./app/app.js
    @ multi main

WARNING in ./app/components/Login/saga/index.js
30:23 export 'LIST_API' was not found in '../../../containers/Login/api'

WARNING in ./app/containers/toolSet/redux/index.js
There are multiple modules with names that only differ in casing.
This can lead to unexpected behavior when compiling on a filesystem with other case-semantic.
Use equal casing. Compare these module identifiers:

  • E:\workspace\DBusNew10\DBus\dbus-keeper\keeper-web\node_modules\babel-loader\lib\index.js!E:\workspace\DBusNew10\DBus\dbus-keeper\keeper-web\app\containers\ToolSet\redux\index.js
    Used by 4 module(s), i. e.
    E:\workspace\DBusNew10\DBus\dbus-keeper\keeper-web\node_modules\babel-loader\lib\index.js!E:\workspace\DBusNew10\DBus\dbus-keeper\keeper-web\app\containers\ToolSet\ControlMessageWrapper.js
  • E:\workspace\DBusNew10\DBus\dbus-keeper\keeper-web\node_modules\babel-loader\lib\index.js!E:\workspace\DBusNew10\DBus\dbus-keeper\keeper-web\app\containers\toolSet\redux\index.js
    Used by 3 module(s), i. e.
    E:\workspace\DBusNew10\DBus\dbus-keeper\keeper-web\node_modules\babel-loader\lib\index.js!E:\workspace\DBusNew10\DBus\dbus-keeper\keeper-web\app\reducers.js

Accessing the site after deployment returns a 500 error

I downloaded the latest version, which now includes an nginx configuration. After setting it up and accessing port 8080, the page returns 500 Internal Server Error.
The nginx log shows the following errors:
2018/11/08 17:44:42 [error] 34760#0: *37 rewrite or internal redirection cycle while internally redirecting to "/index.html", client: 10.130.236.128, server: localhost, request: "GET / HTTP/1.1", host: "xxx (IP redacted):8080"
2018/11/08 17:44:42 [error] 34760#0: *36 rewrite or internal redirection cycle while internally redirecting to "/index.html", client: 10.130.236.128, server: localhost, request: "GET /favicon.ico HTTP/1.1", host: "10.138.225.194:8080", referrer: "http://xxx (IP redacted):8080/"

The application logs show no errors either.

The table detail board does not show canal's captured data

I deployed DBus and the related plugins following the DBus Quick Start guide. Everything starts fine and heartbeat logs are present, but the table detail board never shows any insert events for test_table.
canal-example/entry.log does contain data:
[screenshot]

But nothing is ever displayed for testdb.testschema.test_table:
[screenshot]

Is there some DBus configuration that still needs to be changed?

Are fault tolerance and transactions supported?

If a data sync task fails and is restarted, can it resume from where it left off and guarantee that data is written to Kafka completely and without duplicates?
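
For reference, a minimal generic Kafka producer sketch (not DBus's own mechanism; broker, topic and transactional.id are placeholders) of the kind of guarantee the question asks about: with idempotence and transactions enabled, a restarted task can resend from its last committed position without creating duplicates in the target topic.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ExactlyOnceProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");       // placeholder broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("enable.idempotence", "true");
        props.put("transactional.id", "sync-task-1");            // stable id across restarts

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            producer.beginTransaction();
            producer.send(new ProducerRecord<>("target_topic", "key", "value"));
            producer.commitTransaction();                         // either all records land or none do
        }
    }
}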

dbus cannot start the topology properly?

I am using dbus keeper v0.5.0.
Problem description:
1. In the last step of adding a data line, the topology list is not displayed.

[screenshot]

2. As a result, the topology cannot be started.
[screenshot]

All other cluster components are running normally.

On Linux, webpack compilation fails because the ext4 filesystem is case-sensitive

When compiling on Linux I get the following error:
ERROR in ./src/main.js
Module not found: Error: Cannot resolve 'file' or 'directory' ./components/monitor/Monitor in /home/hadoop/projects/opensource/DBus/dbus-manager/web/src
@ ./src/main.js 135:15-54

Following the error message, check line 46 of this file:
https://github.com/BriData/DBus/blob/master/dbus-manager/web/src/main.js

import Monitor from './components/monitor/Monitor'

The 'Monitor' in this import should be lowercase: ext4 on Linux is case-sensitive, so the Monitor component cannot be resolved otherwise.

About five minutes after Canal starts, it begins reporting: SessionHandler - something goes wrong with channel

2019-11-29 00:00:12.350 [New I/O server worker #1-3] ERROR c.a.otter.canal.server.netty.handler.SessionHandler - something goes wrong with channel:[id: 0x15024850, /10.10.0.210:51166 => /10.10.0.85:10000], exception=java.io.IOException: Connection reset by peer
at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:192)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:322)
at org.jboss.netty.channel.socket.nio.NioWorker.processSelectedKeys(NioWorker.java:281)
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:201)
at org.jboss.netty.util.internal.IoWorkerRunnable.run(IoWorkerRunnable.java:46)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

2019-11-29 00:00:17.475 [New I/O server worker #1-4] ERROR c.a.otter.canal.server.netty.handler.SessionHandler - something goes wrong with channel:[id: 0x0b7935f6, /10.10.0.210:51168 => /10.10.0.85:10000], exception=java.io.IOException: Connection reset by peer
at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:192)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:322)
at org.jboss.netty.channel.socket.nio.NioWorker.processSelectedKeys(NioWorker.java:281)
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:201)
at org.jboss.netty.util.internal.IoWorkerRunnable.run(IoWorkerRunnable.java:46)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

dbus 0.5.0: a MySQL data source cannot be deleted after it has been added

Clicking "delete source" shows the message "the service you accessed encountered an error". How can this be resolved?

Actually I only mis-entered the data source's standby "IP:Port", but I cannot find anywhere to edit it, which is why I want to delete the data source.

DBus v0.5.0: selecting a datasource under the source data menu reports Column 'Tables_in_dbus' not found.

2019-11-15 09:47:39.857 ERROR 2908 --- [io-10001-exec-3] c.c.d.controller.DataSourceController : Error encountered while validate datasources with parameter:{"password":"xxxxx","URL":"jdbc:mysql://xxxxxxx3306/testschema?useSSL=false&characterEncoding=utf8&zeroDateTimeBehavior=convertToNull","dsType":"mysql","user":"canal","dsId":614}

java.sql.SQLException: Column 'Tables_in_dbus' not found.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:965) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:898) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:887) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:861) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
at com.mysql.jdbc.ResultSetImpl.findColumn(ResultSetImpl.java:1080) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
at com.mysql.jdbc.ResultSetImpl.getString(ResultSetImpl.java:5177) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
at com.creditease.dbus.service.source.SourceFetcher.fetchTableStructure(SourceFetcher.java:96) ~[classes!/:na]
at com.creditease.dbus.controller.DataSourceController.searchFromSource(DataSourceController.java:120) ~[classes!/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_92]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_92]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_92]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_92]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205) [spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133) [spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
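
For what it's worth, a minimal JDBC sketch of an assumed workaround (not the DBus fix): when validating a MySQL source with SHOW TABLES, read the result set by column index rather than by the hard-coded name 'Tables_in_dbus', since MySQL names that column after the current schema. URL and credentials below are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ShowTablesByIndex {
    public static void main(String[] args) throws Exception {
        // Placeholder connection settings; adjust to the data source being validated.
        String url = "jdbc:mysql://127.0.0.1:3306/testschema?useSSL=false";
        try (Connection conn = DriverManager.getConnection(url, "canal", "password");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                // Column 1 always holds the table name, whatever the schema is called.
                System.out.println(rs.getString(1));
            }
        }
    }
}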

DBus v0.6.1: creating a new data line fails with: com/creditease/dbus/canal/auto/AddLine : Unsupported major.minor version 52.0

  • JDK version:
    [root@dbus-n1 logs]# java -version
    java version "1.8.0_251"
    Java(TM) SE Runtime Environment (build 1.8.0_251-b08)
    Java HotSpot(TM) 64-Bit Server VM (build 25.251-b08, mixed mode)

  • Exception:
    2020-07-15 13:09:39.786 INFO 29640 --- [nio-8901-exec-9] com.creditease.dbus.utils.SSHUtils : errorMsg:Exception in thread "main" java.lang.UnsupportedClassVersionError: com/creditease/dbus/canal/auto/AddLine : Unsupported major.minor version 52.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:803)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
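
"Unsupported major.minor version 52.0" means the AddLine class was compiled for Java 8 (class file major version 52) but is being executed by an older JVM; since the error comes back through SSHUtils, the JVM that matters is likely the one on the remote canal host, not the JDK 1.8.0_251 shown above. A small diagnostic sketch, purely illustrative, that prints the class-file version of a compiled .class so you can check which JDK built the deployed jar:

import java.io.DataInputStream;
import java.io.FileInputStream;

public class ClassVersion {
    public static void main(String[] args) throws Exception {
        // Usage: java ClassVersion path/to/SomeClass.class
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            int magic = in.readInt();            // 0xCAFEBABE
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();  // 52 = Java 8, 51 = Java 7
            System.out.printf("magic=%x, class file version %d.%d%n", magic, major, minor);
        }
    }
}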

Starting the Topology fails when creating a new data line

Traceback (most recent call last):
File "/data/dbus/apache-storm-1.0.1//bin/storm.py", line 766, in
main()
File "/data/dbus/apache-storm-1.0.1//bin/storm.py", line 763, in main
(COMMANDS.get(COMMAND, unknown_command))(*ARGS)
File "/data/dbus/apache-storm-1.0.1//bin/storm.py", line 234, in jar
transform_class = confvalue("client.jartransformer.class", [CLUSTER_CONF_DIR])
File "/data/dbus/apache-storm-1.0.1//bin/storm.py", line 144, in confvalue
p = sub.Popen(command, stdout=sub.PIPE)
File "/usr/lib64/python2.7/subprocess.py", line 711, in init
errread, errwrite)
File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory

Where could the problem be?

The newest version auto deploy error

I am using the latest version downloaded from the cloud drive, which includes automatic canal deployment. Initializing dbusmgr went fine, but warnings appeared while adding a data line and an error was reported after it finished.
The error message is shown below:
[screenshot]

DBus issue: extraction process failed

  1. When checking the data line, the following error is shown:
    [screenshot]

  2. Inspecting the topology reveals no errors:
    [screenshot]

  3. Inspecting canal reveals no errors either:
    [screenshot]

How can the extraction-process failure be resolved?

The offsets consumed by the router cannot be committed

First, look at AckWindows.java:

public void flush() {
        if (System.currentTimeMillis() - baseTime >= time) {
            for (String key : ackBooks.keySet()) {
                Map<Long, Ack> map = ackBooks.get(key);
                Ack preValue = null;
                boolean isFail = false;
                List<Long> okOffsetList = new ArrayList<>();
                for (Map.Entry<Long, Ack> entry : map.entrySet()) {
                    Ack value = entry.getValue();
                    if (Ack.INIT == value.getStatus()) {
                        logger.info("topic:{}, partaiton:{}, offset:{}, status: init",
                                value.getTopic(), value.getPartition(), value.getOffset());
                        if (preValue != null) this.callBack.ack(preValue);
                        break;
                    } else if (Ack.OK == value.getStatus()) {
                        logger.info("topic:{}, partaiton:{}, offset:{}, status: ok",
                                value.getTopic(), value.getPartition(), value.getOffset());
                        okOffsetList.add(value.getOffset());
                        preValue = value;
                        continue;
                    } else if (Ack.FAIL == value.getStatus()) {
                        logger.info("topic:{}, partaiton:{}, offset:{}, status: fail",
                                value.getTopic(), value.getPartition(), value.getOffset());
                        this.callBack.fail(value);
                        isFail = true;
                        break;
                    }
                }
                if (isFail) {
                    ackBooks.get(key).clear();
                } else {
                    for (Long offset : okOffsetList) {
                        ackBooks.get(key).remove(offset);
                    }
                }
            }
            baseTime = System.currentTimeMillis();
        }
    }

flush() only triggers the callback that commits the Kafka consumer offset when ackBooks contains an ack in the INIT state. Judging from the code, it is hard to guarantee that this branch ever executes; in my environment's logs it basically never runs, so the offsets are never committed.

2019-09-24 10:44:43.960 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5574, trigger ack.
2019-09-24 10:44:43.960 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5574, status: ok
2019-09-24 10:47:59.187 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5575, trigger ack.
2019-09-24 10:47:59.187 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5575, status: ok
2019-09-24 10:48:18.986 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5576, trigger ack.
2019-09-24 10:48:18.986 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5576, status: ok
2019-09-24 10:50:01.592 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5577, trigger ack.
2019-09-24 10:50:01.592 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5577, status: ok
2019-09-24 10:56:23.716 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5578, trigger ack.
2019-09-24 10:56:23.717 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5578, status: ok
2019-09-24 10:56:39.722 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5579, trigger ack.
2019-09-24 10:56:39.722 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5579, status: ok
2019-09-24 10:57:25.731 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5580, trigger ack.
2019-09-24 10:57:25.732 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5580, status: ok
2019-09-24 11:00:32.136 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5581, trigger ack.
2019-09-24 11:00:32.136 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5581, status: ok
2019-09-24 11:00:42.198 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5582, trigger ack.
2019-09-24 11:00:42.198 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5582, status: ok
2019-09-24 11:01:04.342 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5583, trigger ack.
2019-09-24 11:01:04.342 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5583, status: ok
2019-09-24 11:01:38.118 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:31, trigger ack.
2019-09-24 11:04:21.340 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:31, status: ok
2019-09-24 11:04:21.445 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:32, trigger ack.
2019-09-24 11:04:49.213 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:32, status: ok
2019-09-24 11:04:49.322 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:33, trigger ack.
2019-09-24 11:06:06.776 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:33, status: ok
2019-09-24 11:06:07.508 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:34, trigger ack.
2019-09-24 11:07:29.620 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:34, status: ok
2019-09-24 11:07:29.652 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:35, trigger ack.
2019-09-24 11:11:34.924 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:35, status: ok
2019-09-24 11:11:35.042 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:36, trigger ack.
2019-09-24 11:12:22.407 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5588, trigger ack.
2019-09-24 11:12:22.407 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5588, status: ok
2019-09-24 11:12:22.407 c.c.d.r.s.a.AckWindows [INFO] topic:bi_flink_test_tr1_ctrl, partaiton:0, offset:36, status: ok
2019-09-24 11:12:30.231 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5589, trigger ack.
2019-09-24 11:12:30.231 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5589, status: ok
2019-09-24 11:12:39.805 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5590, trigger ack.
2019-09-24 11:12:39.805 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5590, status: ok
2019-09-24 11:31:06.331 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5591, trigger ack.
2019-09-24 11:31:06.331 c.c.d.r.s.a.AckWindows [INFO] topic:db_172_20_13_238_3306.qq_data.result, partaiton:0, offset:5591, status: ok

I do not understand why the callback is only triggered when an INIT entry exists. Please advise.
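
A minimal, self-contained sketch (simplified stand-ins, not the real DBus classes or its actual fix) of the alternative behaviour the question implies: commit the highest contiguous OK offset even when no INIT entry follows it.

import java.util.Map;
import java.util.SortedMap;
import java.util.TreeMap;
import java.util.function.LongConsumer;

public class AckWindowSketch {
    static final int INIT = 0, OK = 1, FAIL = 2;

    // Walk the offset-ordered book and return the highest contiguous OK offset
    // (or null), stopping at the first in-flight (INIT) or failed entry.
    static Long highestContiguousOk(SortedMap<Long, Integer> book, LongConsumer onFail) {
        Long lastOk = null;
        for (Map.Entry<Long, Integer> e : book.entrySet()) {
            int status = e.getValue();
            if (status == OK) {
                lastOk = e.getKey();
            } else if (status == FAIL) {
                onFail.accept(e.getKey());
                return null;
            } else {
                break; // INIT: stop, but still commit what is already contiguous
            }
        }
        return lastOk;
    }

    public static void main(String[] args) {
        SortedMap<Long, Integer> book = new TreeMap<>();
        book.put(5574L, OK);
        book.put(5575L, OK); // no INIT entry follows, yet 5575 is still returned
        System.out.println(highestContiguousOk(book, offset -> {}));
    }
}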

DBus v0.6.1: automatic canal deployment fails when creating a new data line

In DBus v0.6, creating a new data line against a MySQL data source fails to auto-deploy canal, even though the canal account can access the database.
2020-11-24 17:25:48.491 INFO 640 --- [nio-8901-exec-1] com.creditease.dbus.utils.SSHUtils : errorMsg:com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create connection to database server.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:389)
at com.mysql.jdbc.Util.getInstance(Util.java:372)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:958)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:937)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:926)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:872)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2316)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2069)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:794)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:389)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:399)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at com.creditease.dbus.canal.utils.DBUtils.getConn(DBUtils.java:97)
at com.creditease.dbus.canal.utils.DBUtils.checkDBAccount(DBUtils.java:49)
at com.creditease.dbus.canal.auto.AutoDeployStart.main(AutoDeployStart.java:92)
at com.creditease.dbus.canal.auto.AddLine.autoDeploy(AddLine.java:72)
at com.creditease.dbus.canal.auto.AddLine.main(AddLine.java:62)
Caused by: java.lang.NullPointerException
at com.mysql.jdbc.ConnectionImpl.getServerCharset(ConnectionImpl.java:2989)
at com.mysql.jdbc.MysqlIO.sendConnectionAttributes(MysqlIO.java:1873)
at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1802)
at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1206)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2239)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2270)
... 17 more

Hash verification fails when syncing Oracle via OGG

Syncing Oracle data with DBus and OGG:

  1. The data in the source topic that OGG writes to Kafka is normal.
  2. DBus throws the following exception when parsing the data in the source topic:
    [screenshot]
    What is the likely cause? Many thanks.
  3. DBus and the related software were installed with the versions recommended in the documentation.

com.jcraft.jsch.JSchException: Auth fail

Loading config.properties...
nginx address check passed
Testing database connectivity...
Tue Oct 20 09:05:42 CST 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
Database connectivity test passed
kafka check passed
influxdb address check passed
Testing ZK connectivity...
[main] INFO org.apache.curator.framework.imps.CuratorFrameworkImpl - Starting
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.8--1, built on 02/06/2016 03:18 GMT
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:host.name=centos151
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_262
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.262.b10-0.el7_8.x86_64/jre
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=../lib/keeper-auto-deploy-0.6.1-jar-with-dependencies.jar
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=/tmp
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:os.version=3.10.0-957.el7.x86_64
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:user.name=root
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:user.home=/root
[main] INFO org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/app/dbus/dbus-keeper/bin
[main] INFO org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=192.168.186.151:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@3159c4b8
[main-SendThread(192.168.186.151:2181)] INFO org.apache.zookeeper.ClientCnxn - Opening socket connection to server 192.168.186.151/192.168.186.151:2181. Will not attempt to authenticate using SASL (unknown error)
[main-SendThread(192.168.186.151:2181)] INFO org.apache.zookeeper.ClientCnxn - Socket connection established to 192.168.186.151/192.168.186.151:2181, initiating session
[main-SendThread(192.168.186.151:2181)] INFO org.apache.zookeeper.ClientCnxn - Session establishment complete on server 192.168.186.151/192.168.186.151:2181, sessionid = 0x175403e4296001c, negotiated timeout = 40000
[main-EventThread] INFO org.apache.curator.framework.state.ConnectionStateManager - State change: CONNECTED
ZK connectivity test passed
[main] INFO org.apache.zookeeper.ZooKeeper - Session: 0x175403e4296001c closed
Verifying that the SSH key exists...
The ~/.ssh/id_rsa key file exists
[main-EventThread] INFO org.apache.zookeeper.ClientCnxn - EventThread shut down for session: 0x175403e4296001c
com.jcraft.jsch.JSchException: Auth fail
java.lang.RuntimeException: com.jcraft.jsch.JSchException: Auth fail
at com.creditease.dbus.auto.deploy.DBusKeeperInitAll.checkSSH(DBusKeeperInitAll.java:604)
at com.creditease.dbus.auto.deploy.DBusKeeperInitAll.checkSSH(DBusKeeperInitAll.java:86)
at com.creditease.dbus.auto.deploy.DBusKeeperInitAll.main(DBusKeeperInitAll.java:54)
Caused by: com.jcraft.jsch.JSchException: Auth fail
at com.jcraft.jsch.Session.connect(Session.java:519)
at com.creditease.dbus.auto.deploy.DBusKeeperInitAll.checkSSH(DBusKeeperInitAll.java:589)
... 2 more
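
A minimal JSch sketch (illustrative only, not DBusKeeperInitAll itself; host, user and key path are placeholders) for testing key-based SSH by hand. "Auth fail" at Session.connect typically means the target host rejected the user or key, so check that the matching public key is in that user's authorized_keys, and try an absolute key path to rule out tilde-expansion issues:

import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class SshKeyCheck {
    public static void main(String[] args) throws Exception {
        JSch jsch = new JSch();
        jsch.addIdentity("/root/.ssh/id_rsa");            // absolute private-key path
        Session session = jsch.getSession("root", "dbus-n1", 22);
        session.setConfig("StrictHostKeyChecking", "no"); // demo only
        session.connect(10000);                           // "Auth fail" here = key/user rejected
        System.out.println("connected: " + session.isConnected());
        session.disconnect();
    }
}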

Init fails: the logs show zknodes being created in ZooKeeper and then rolled back. What is the reason?

user:root,host:dbus-n3,port:22,keyPath:~/.ssh/id_rsa,command:ps -ef | grep 'com.creditease.dbus.heartbeat.start.Start' | grep -v grep | awk '{print $2}'
2019-12-19 08:11:57.194 INFO 31064 --- [nio-8901-exec-9] com.creditease.dbus.utils.SSHUtils : inputMsg:2978
3392

2019-12-19 08:11:57.194 INFO 31064 --- [nio-8901-exec-9] com.creditease.dbus.utils.SSHUtils : errorMsg:
2019-12-19 08:11:57.195 INFO 31064 --- [nio-8901-exec-9] c.c.dbus.service.ConfigCenterService : cmd:kill -s 2978
3392

2019-12-19 08:11:57.195 INFO 31064 --- [nio-8901-exec-9] com.creditease.dbus.utils.SSHUtils : user:root,host:dbus-n3,port:22,keyPath:~/.ssh/id_rsa,command:kill -s 2978
3392

2019-12-19 08:12:00.721 INFO 31064 --- [nio-8901-exec-9] com.creditease.dbus.utils.SSHUtils : inputMsg:
2019-12-19 08:12:00.721 INFO 31064 --- [nio-8901-exec-9] com.creditease.dbus.utils.SSHUtils : errorMsg:bash: line 0: kill: 2978: invalid signal specification
bash: line 1: 3392: command not found

2019-12-19 08:12:00.744 INFO 31064 --- [nio-8901-exec-9] com.creditease.dbus.commons.ZkService : Node deleted successfully, Path: /DBus/Streaming
2019-12-19 08:12:00.750 INFO 31064 --- [nio-8901-exec-9] com.creditease.dbus.commons.ZkService : Node deleted successfully, Path: /DBus/HeartBeat/Control
2019-12-19 08:12:00.758 INFO 31064 --- [nio-8901-exec-9] com.creditease.dbus.commons.ZkService : Node deleted successfully, Path: /DBus/HeartBeat/ProjectMonitor
2019-12-19 08:12:00.768 INFO 31064 --- [nio-8901-exec-9] com.creditease.dbus.commons.ZkService : Node deleted successfully, Path:

Does the log processor require the data to contain a host field?

After feeding data into Kafka with logstash and creating a few rules, the Storm UI showed an ArrayIndexOutOfBoundsException. Checking the source code, the host field is required because the host is part of the namespace.
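
A tiny illustrative sketch of the workaround this implies (the namespace layout below is an assumption, not the log processor's real format): default the host field before the namespace is assembled so records shipped without it do not fail.

import java.util.HashMap;
import java.util.Map;

public class HostDefaulting {
    // Illustrative namespace builder; the real DBus namespace format may differ.
    static String namespace(Map<String, Object> event, String dsName, String logType) {
        Object host = event.getOrDefault("host", "unknown-host");
        return dsName + "." + logType + "." + host;
    }

    public static void main(String[] args) {
        Map<String, Object> event = new HashMap<>(); // a logstash record missing "host"
        event.put("message", "hello");
        System.out.println(namespace(event, "logds", "logstash"));
    }
}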

A type-mismatch error occurs during a DBus full pull from MySQL to Kafka

MySQL table definition:
mysql> desc actor;
+-------------+----------------------+------+-----+-------------------+-----------------------------+
| Field | Type | Null | Key | Default | Extra |
+-------------+----------------------+------+-----+-------------------+-----------------------------+
| actor_id | smallint(5) unsigned | NO | PRI | NULL | auto_increment |
| first_name | varchar(45) | NO | | NULL | |
| last_name | varchar(45) | NO | MUL | NULL | |
| last_update | timestamp | NO | | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
+-------------+----------------------+------+-----+-------------------+-----------------------------+
4 rows in set (0.00 sec)
Full-pull error log recorded in ZK:
{
"Partitions" : "1",
"TotalCount" : "1",
"FinishedCount" : null,
"TotalRows" : "201",
"FinishedRows" : null,
"StartSecs" : "1510910309",
"ConsumeSecs" : "0s",
"CreateTime" : "2017-11-17 17:18:29.774",
"UpdateTime" : "2017-11-17 17:18:29.966",
"StartTime" : "2017-11-17 17:18:29.774",
"EndTime" : null,
"ErrorMsg" : "sakila_actor_s23_v0.0:Exception happened when fetching data of split: {"TotalRows":201,"dataChunkSplitIndex":1,"dataChunkCount":1,"data.source.param":"{\"payload\":{\"OP_TS\":\"2017-11-17 17:18:24.000\",\"TABLE_NAME\":\"actor\",\"INCREASE_BATCH_NO\":true,\"PULL_REMARK\":\"src_37\",\"PULL_REQ_CREATE_TIME\":\"2017-11-17 17:18:24.000000\",\"BATCH_NO\":\"7\",\"SCHEMA_NAME\":\"sakila\",\"SEQNO\":\"23\",\"DBUS_DATASOURCE_ID\":5,\"POS\":\"0000040000537369462\",\"PHYSICAL_TABLES\":\"actor\",\"VERSION\":\"0\",\"topic\":\"src_37.sakila\"},\"from\":\"src_37-appender\",\"id\":1510910308764,\"type\":\"FULL_DATA_PULL_REQ\",\"timestamp\":\"2017-11-17 17:18:28.764\"}","dataChunkSplit":{"collate":" ","lowerOperator":" >= ","sqlType":4,"upperValue":201,"upperOperator":" <= ","length":0,"start":0,"condWithPlaceholder":"actor_id >= ? AND actor_id <= ?","end":0,"targetTableName":"sakila.actor","lowerValue":1,"splitCol":"actor_id"}}.Data type not match with String;Spout got fail!, record offset is:16, sakila_actor_s23_v0: split index is 1",
"Status" : "pulling",
"Version" : "0",
"BatchNo" : "7"
}
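
A minimal JDBC sketch of one assumed reading of the error (not the DBus implementation): since the split column actor_id is numeric (the split metadata reports sqlType 4, i.e. java.sql.Types.INTEGER), the chunk bounds should be bound with that SQL type rather than as strings.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;

public class SplitChunkBinding {
    // Bind the lower/upper bounds of a pull chunk using the reported SQL type.
    static PreparedStatement bindChunk(Connection conn, long lower, long upper) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT * FROM sakila.actor WHERE actor_id >= ? AND actor_id <= ?");
        ps.setObject(1, lower, Types.INTEGER); // not setString(...)
        ps.setObject(2, upper, Types.INTEGER);
        return ps;
    }
}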

Storm reports an error when starting splitter_puller

The error is as follows:
1465 [main] INFO c.c.d.c.FullPullPropertiesHolder - getCommonConf(). customizeConfPath is test_schema1-fullsplitter/common-config

2019-05-06 11:08:27.688 INFO 10929 --- [nio-8901-exec-9] com.creditease.dbus.utils.SSHUtils : errorMsg:Exception in thread "main" java.lang.NumberFormatException: null
at java.lang.Integer.parseInt(Integer.java:542)
at java.lang.Integer.valueOf(Integer.java:766)
at com.creditease.dbus.FullPullerTopology.main(FullPullerTopology.java:71)

2019-05-06 11:08:43.660 ERROR 10929 --- [nio-8901-exec-2] c.c.dbus.base.DBusControllerAdvice : uncatch exception.

org.springframework.web.client.HttpServerErrorException: 500 Server Error
at org.springframework.web.client.DefaultResponseErrorHandler.handleError(DefaultResponseErrorHandler.java:89) ~[spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.client.RestTemplate.handleResponse(RestTemplate.java:708) ~[spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:661) ~[spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.client.RestTemplate.execute(RestTemplate.java:621) ~[spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.client.RestTemplate.getForObject(RestTemplate.java:295) ~[spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at com.creditease.dbus.utils.StormToplogyOpHelper.getForResult(StormToplogyOpHelper.java:256) ~[keeper-base-0.5.0.jar!/:na]
at com.creditease.dbus.utils.StormToplogyOpHelper.getTopoRunningInfoById(StormToplogyOpHelper.java:90) ~[keeper-base-0.5.0.jar!/:na]
at com.creditease.dbus.service.DataSourceService.viewLog(DataSourceService.java:338) ~[classes!/:na]
at com.creditease.dbus.controller.DataSourceController.viewLog(DataSourceController.java:215) ~[classes!/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_112]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_112]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_112]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_112]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205) ~[spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133) ~[spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97) ~[spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827) ~[spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738) ~[spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) ~[spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967) ~[spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901) ~[spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:635) [tomcat-embed-core-8.5.29.jar!/:8.5.29]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846) [spring-webmvc-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:742) [tomcat-embed-core-8.5.29.jar!/:8.5.29]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231) [tomcat-embed-core-8.5.29.jar!/:8.5.29]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) [tomcat-embed-core-8.5.29.jar!/:8.5.29]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) [tomcat-embed-websocket-8.5.29.jar!/:8.5.29]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) [tomcat-embed-core-8.5.29.jar!/:8.5.29]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) [tomcat-embed-core-8.5.29.jar!/:8.5.29]
at org.springframework.boot.web.filter.ApplicationContextHeaderFilter.doFilterInternal(ApplicationContextHeaderFilter.java:55) [spring-boot-1.5.11.RELEASE.jar!/:1.5.11.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.3.15.RELEASE.jar!/:4.3.15.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) [tomcat-embed-core-8.5.29.jar!/:8.5.29]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) [tomcat-embed-core-8.5.29.jar!/:8.5.29]
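
The first log shows java.lang.NumberFormatException: null in FullPullerTopology.main, which suggests Integer.parseInt/valueOf received a missing configuration value. A small illustrative sketch (the property key is hypothetical) of guarding numeric config reads so a missing entry yields a clear message or default instead:

import java.util.Properties;

public class SafeIntConfig {
    // Parse an integer property, falling back to a default when it is absent.
    static int getInt(Properties conf, String key, int defaultValue) {
        String raw = conf.getProperty(key);
        if (raw == null || raw.trim().isEmpty()) {
            System.err.println("config '" + key + "' is missing, using " + defaultValue);
            return defaultValue;
        }
        return Integer.parseInt(raw.trim());
    }

    public static void main(String[] args) {
        Properties conf = new Properties(); // e.g. loaded from the topology's common-config
        System.out.println(getInt(conf, "fullpull.splitting.parallel", 1)); // hypothetical key
    }
}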

Would you consider introducing canal admin?

Currently canal is auto-deployed by running deployment commands over SSH. Would you consider managing the relationship between canal and data source configuration through canal admin in the future? With admin in place, an existing canal server instance could be selected when adding a data line, and canal servers managed through admin could easily be containerized, scaled out, and set up in HA cluster mode.

The Storm UI address is rejected during basic configuration

My UI is reachable at 192.168.100.101:8080/index.html, and that is exactly what I entered: 192.168.100.101:8080/index.html.
The error log is as follows:
java.net.MalformedURLException: no protocol: 192.168.100.101:8080/index.html/nimbus/summary
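
java.net.URL requires a scheme, so an address entered as 192.168.100.101:8080/index.html fails with "no protocol". A minimal illustrative sketch (not the keeper's actual normalization) of turning the entered address into a usable base URL with an http:// prefix:

import java.net.URL;

public class StormUiAddress {
    // Normalize a user-entered Storm UI address into a usable base URL.
    static URL nimbusSummary(String stormUi) throws Exception {
        String base = stormUi.replaceAll("/index\\.html$", "");
        if (!base.startsWith("http://") && !base.startsWith("https://")) {
            base = "http://" + base;
        }
        return new URL(base + "/nimbus/summary");
    }

    public static void main(String[] args) throws Exception {
        // Prints http://192.168.100.101:8080/nimbus/summary
        System.out.println(nimbusSummary("192.168.100.101:8080/index.html"));
    }
}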
