
webankfintech / qualitis

658 stars · 41 watchers · 292 forks · 48.73 MB

Qualitis is a one-stop data quality management platform that supports quality verification, notification, and management for various data sources. It is used to solve data quality problems caused by data processing. https://github.com/WeBankFinTech/Qualitis

License: Apache License 2.0

Java 74.85% Shell 0.03% JavaScript 4.53% HTML 0.01% Vue 20.22% Less 0.35%
quality quality-check quality-improvement data-quality linkis dss datashperestudio workflow data-quality-model compare

qualitis's People

Contributors

chaogefeng, chenhjia, davidhua1996, dependabot[bot], howeye, ivanzhongyq, kayle1994, liaoyt, mayinrain, peacewong, shlpeng, tangjiafeng, webankadmin, wushengyeyouya, yh2388, zqburde


qualitis's Issues

Garbled characters in the UI

Hi, I found the following small problem while using Qualitis and hope it can be fixed. Thanks.

Qualitis version: 0.6.0

After importing the source code into IDEA, starting it, and logging in, parts of the UI render as garbled characters.

gradle clean distZip fails


Jumping from the DSS 1.0.1 workflow page to Qualitis 0.9.0 fails

1. The jump succeeds with Chrome 83 but fails with Chrome 99. Does the project require a specific browser version?
2. With Chrome 83, the first click on the Qualitis node in the workflow fails to jump and the page shows "Oops, you're lost". On the first request to http://192.168.217.145:8090/qualitis/api/v1/redirect, the Request Headers carry no cookie while the Response Headers do contain cookie information; on the second request the Request Headers do carry the cookie. What causes this?
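A likely explanation for point 2 (an assumption based on the symptom, not confirmed by the project): starting around Chrome 80, cookies without `SameSite=None; Secure` are no longer sent on cross-site requests, which matches "Set-Cookie received but not sent back on the next request". A minimal Java sketch of building a header that survives cross-site navigation (the cookie name and value are hypothetical):

```java
public class CrossSiteCookie {
    // Build a Set-Cookie header value usable across sites in modern Chrome:
    // SameSite=None must be combined with the Secure attribute.
    public static String build(String name, String value) {
        return name + "=" + value + "; Path=/; HttpOnly; Secure; SameSite=None";
    }

    public static void main(String[] args) {
        System.out.println(build("qualitis_session", "abc123"));
    }
}
```

Note that `Secure` also implies the cookie is only sent over HTTPS, so this workaround does not apply to a plain-HTTP deployment like the one in the URL above.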

Login failed

After starting up and visiting the web page, I did not get the login page but was taken directly to the page shown in the screenshot.

java.lang.ClassNotFoundException: org.apache.zookeeper.Watcher

Describe the bug
With zkCurator at 2.12.0 I get java.lang.ClassNotFoundException: org.apache.zookeeper.Watcher when starting QualitisServer,
but no exception at 2.7.1.

To Reproduce
Steps to reproduce the behavior:

  1. just start QualitisServer

Expected behavior
The application starts successfully.

Screenshots

Desktop (please complete the following information):

  • OS: Ubuntu 18.04
  • IDEA: 2019.2.4

Additional context
The application starts successfully with zkCurator 2.7.1.

Cannot add the Swagger documentation tool

1. Symptom
Importing Swagger 2.5.0 or above causes compilation to fail.
2. Suspected cause
A jar dependency conflict.
The Multimaps class that ApiDocumentationScanner references resolves to the copy bundled in org.apache.hive:hive-exec, but the one it should reference is the com.google.guava:guava version.
3. Possible solutions
1) Upgrade org.apache.hive:hive-exec.
2) Make ApiDocumentationScanner reference the correct class. This is hard to do, because the two Multimaps classes share exactly the same package path.
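One hedged workaround for this class of conflict: hive-exec is an uber-jar that bundles Guava classes under their original package names, so a plain dependency exclude cannot remove them; depending on the `core` classifier of hive-exec, which ships without the bundled third-party classes, may help. The `versions.hive` property below is a hypothetical placeholder following this project's `$versions.*` convention:

```groovy
// build.gradle (sketch): use the slim hive-exec artifact instead of the uber-jar
dependencies {
    compile group: 'org.apache.hive', name: 'hive-exec',
            version: versions.hive, classifier: 'core'
}
```

Whether the `core` classifier exists for your Hive version should be checked in your repository before committing to this route.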

The dss-appjoint-auth 0.9.0.WEBANK dependency of version 0.8.0 cannot be pulled from the repository

In gradle/dependencies.gradle there are the following entries:

"dss": "0.9.0.WEBANK",
"dssAppJointAuth": "com.webank.wedatasphere.dss:dss-appjoint-auth:$versions.dss",

The latest version Maven can pull is 0.7.0.

In the DSS project, dss-appjoint-auth has been released as 0.7.0 -> 0.8.0 -> 0.9.0 -> 0.9.1.

Suggestion: roll dss back to 0.7.0 or 0.8.0, or update it to 0.9.0 or 0.9.1.
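A sketch of the suggested change in gradle/dependencies.gradle, assuming you pin to a version that is actually published (0.7.0 here; the newer versions work the same way if your repository carries them):

```groovy
// gradle/dependencies.gradle (sketch): pin dss to a resolvable version
"dss": "0.7.0",
"dssAppJointAuth": "com.webank.wedatasphere.dss:dss-appjoint-auth:$versions.dss",
```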

0.8.0 binary release version conflicts with source code version

Describe the bug
Unzipping the 0.8.0 release binary package shows that all core-*.jar files in the lib directory are versioned 0.9.0, while rebuilding the unzipped 0.8.0 source package produces jars versioned 0.8.0 in the build directories. This is confusing: can anyone explain how to match the 0.8.0 source code with its binary?


How do I update Hive metadata in Qualitis 0.6.0?

Describe the bug
I added a CDH cluster and then created a rule, but I cannot see the Hive databases and tables. How do I update the metadata? Version 0.5.0 had a metadata management module.

Compilation fails when adapting Qualitis to Hadoop 3.0.0-CDH6.1.0

Describe the bug
When adapting to Hadoop 3.0.0-CDH6.1.0, compilation fails with the following error:

FAILURE: Build failed with an exception.

:core/common:compileJava > Resolving dependencies ':core/common:compileClasspath'

  • What went wrong:
    Could not resolve all dependencies for configuration ':core/common:compileClasspath'.

Could not determine artifacts for javax.ws.rs:javax.ws.rs-api:2.1
Could not get resource 'http://10.18.101.56:8081/repository/maven-public/javax/ws/rs/javax.ws.rs-api/2.1/javax.ws.rs-api-2.1.$%7Bpackaging.type%7D'.
> Could not HEAD 'http://10.18.101.56:8081/repository/maven-public/javax/ws/rs/javax.ws.rs-api/2.1/javax.ws.rs-api-2.1.$%7Bpackaging.type%7D'. Received status code 400 from server: Invalid repository path

  • Try:
    Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

BUILD FAILED

Total time: 1 mins 28.557 secs
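The unresolved `${packaging.type}` placeholder in the requested URL indicates the javax.ws.rs-api:2.1 POM was read with an unsubstituted Maven property. A commonly reported workaround (an assumption, not verified against this particular build) is to force a later patch release whose POM does not carry the placeholder:

```groovy
// build.gradle (sketch): avoid the broken javax.ws.rs-api:2.1 POM
configurations.all {
    resolutionStrategy {
        force 'javax.ws.rs:javax.ws.rs-api:2.1.1'
    }
}
```

If your internal mirror (10.18.101.56:8081 above) does not carry 2.1.1, it needs to be proxied or uploaded there first.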

Create rule failed

Describe the bug

2019-12-10 05:10:03,888 INFO [Monitor Thread] monitor: Start to monitor application
2019-12-10 05:10:03,889 INFO [Monitor Thread] monitor: Trying to acquire lock of zk, lock_path: /qualitis/tmp/monitor, lock: 97826494
2019-12-10 05:10:03,898 INFO [Monitor Thread] monitor: Succeed to acquire lock of zk, lock_path: /qualitis/tmp/monitor, lock: 97826494
2019-12-10 05:10:03,906 INFO [Monitor Thread] monitor: Succeed to find applications that are not end. Application: []
2019-12-10 05:10:03,906 INFO [Monitor Thread] monitor: Finish to monitor application
2019-12-10 05:10:05,526 WARN [DefaultQuartzScheduler_Worker-1] spi.SqlExceptionHelper: SQL Error: 1406, SQLState: 22001
2019-12-10 05:10:05,528 ERROR [DefaultQuartzScheduler_Worker-1] spi.SqlExceptionHelper: Data truncation: Data too long for column 'column_type' at row 1
2019-12-10 05:10:05,544 ERROR [DefaultQuartzScheduler_Worker-1] core.JobRunShell: Job Meta data refresh job group.Meta data refresh job threw an unhandled Exception:

To Reproduce

Configuring a database name or table name longer than 50 characters in "Add technical rule" makes rule creation fail.

Root Cause

MariaDB [qualitis]> desc rule_variable;
+-----------------------------------+--------------+------+-----+---------+----------------+
| Field                             | Type         | Null | Key | Default | Extra          |
+-----------------------------------+--------------+------+-----+---------+----------------+
| id                                | bigint(20)   | NO   | PRI | NULL    | auto_increment |
| cluster_name                      | varchar(50)  | YES  |     | NULL    |                |
| db_name                           | varchar(50)  | YES  |     | NULL    |                |
| input_action_step                 | int(11)      | YES  |     | NULL    |                |
| origin_value                      | varchar(100) | YES  |     | NULL    |                |
| table_name                        | varchar(50)  | YES  |     | NULL    |                |
| value                             | varchar(500) | YES  |     | NULL    |                |
| rule_id                           | bigint(20)   | YES  | MUL | NULL    |                |
| template_mid_table_input_meta_id  | bigint(20)   | YES  | MUL | NULL    |                |
| template_statistics_input_meta_id | bigint(20)   | YES  | MUL | NULL    |                |
+-----------------------------------+--------------+------+-----+---------+----------------+

The rule_variable table defines these columns as varchar(50). Adjust the lengths to fit your actual data, and creating the rule will then succeed.
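Following the analysis above, widening the affected columns is one way out (the new length of 200 is an arbitrary example; MariaDB/MySQL syntax):

```sql
-- Widen the varchar(50) columns in rule_variable that truncate long names
ALTER TABLE rule_variable MODIFY COLUMN cluster_name VARCHAR(200);
ALTER TABLE rule_variable MODIFY COLUMN db_name      VARCHAR(200);
ALTER TABLE rule_variable MODIFY COLUMN table_name   VARCHAR(200);
```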

Frontend open source plan

I see fes.js is already in an open-source repository. When will the frontend components be open-sourced? Thanks.

It would be great to provide a demo video, making it easier for others to see whether this software is what they need


[Feature][0.9.1] Qualitis test case

Search before asking

  • I had searched in the issues and found no similar feature requirement.

Problem Description

Test cases based on Linkis 1.0.3 and DSS 1.0.1.

Description

Precondition Function module Operational steps and data Expected results New/Modified Version Use case type Applicable stage Use case status Status change Test results
Network normal, linkis, DSS normal Sign in 1. Enter the website and enter the correct account and password 2. Click the login button 2. Log in normally V1.0.3 Functional test Manual test Normal Through
Network normal, linkis, DSS normal Rule query Query rules, fuzzy query based on data source/database name /table name / table name V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule query Click the table name to view the table record information V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule template Add Basic Template V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule template Delete the base template V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule template Edit Base Rule Template V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule template View template details V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule template Add a cross-table template V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule template Delete Cross-Table Template V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule template Edit Cross-Table Rule Template V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule template View Cross-Table Template Detail V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule template Batch delete templates V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Task query Query by Task Status/Project/Data Source/Task No./Exception Notes Query to task list V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Task query Advanced filtering Query according to advanced filter criteria / Re-execute V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Task query Data quality analysis exception V1.0.3 Functional test Manual test Abnormal
Network normal, linkis, DSS normal Task query Batch re-execution, stop, Execute and stop in batch V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Task query View failed task status detail V1.0.3 Functional test Manual test Exception (Obtain the red log) Use the hfds user
Network normal, linkis, DSS normal System settings Add / Edit /Delete Cluster Configuration V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Staffing Add/Edit V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Staffing Delete V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Staffing New Agent User-> New V1.0.3 Functional test Manual test Abnormal (explosive red) Input string explosive red (Exception due to input format problem)
Network normal, linkis, DSS normal Permission settings Add Special Permission V1.0.3 Functional test Manual test Abnormal (explosive red) Input string explosive red (Exception due to input format problem)
Network normal, linkis, DSS normal Permission settings New Role Permission Management V1.0.3 Functional test Manual test Abnormal (explosive red) Input string explosive red (Exception due to input format problem)
Network normal, linkis, DSS normal Staffing Add user role management V1.0.3 Functional test Manual test Abnormal (explosive red) Input string explosive red (Exception due to input format problem)
Network normal, linkis, DSS normal Engine configuration Modify the engine configuration Modification succeeded V1.0.3 Normal
Network normal, linkis, DSS normal Indicators are added Fill in indicator data and add indicator V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Index search Normal query can be performed according to the indicator name / indicator classification / whether the indicator is available / English code / subsystem V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Indicator import/export function Indicator Click Indicator Import and Export Import and export metrics V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal View metrics Click to view metric details View metric details V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Edit the indicator Click Edit Metrics Modify indicator, save and modify successfully V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Delete an indicator Click Delete Indicator Indicator deleted successfully V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Associated indicators Click Associate Metrics V1.0.3 Functional test Manual test To be tested
Network normal, linkis, DSS normal Historical value Click the historical value V1.0.3 Functional test Manual test To be tested
Network normal, linkis, DSS normal Abnormal display of indicator list management drop-down box V1.0.3 Functional test Manual test Abnormal Front-end joint debugging
Network normal, linkis, DSS normal New addition of common item Item addition is normal Successfully added V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Normal item deletion The item deletion is normal Deleted successfully V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Project Import/Export Import and export normal Import and export succeeded V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Project execution Click Task Execution The task was executed successfully V1.0.3 Functional test Manual test Task execution failed (task execution timeout, KIll)
Network normal, linkis, DSS normal Item label function Add a task label Labeling tasks V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Project editing function V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Edit History \ Execution History V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Project permission management Give hdfs user, view permission You can only view the item and cannot perform other operations V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Project permission management Click to display the Permission Management pack Displays the rights management V1.0.3 Functional test Manual test Exception (permission management list display exception) Front-end joint debugging
Network normal, linkis, DSS normal Add Verification Rule-Single Table Verification Single table verification rule added successfully V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Add Verification Rule-Cross-table Comparison Cross-table comparison check rule added successfully V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Add Verification Rule-File Verification File validation rule added successfully V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Add Verification Rule-Cross-database Comparison Failed to add cross-database comparison verification rule Need more than one library, only one V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Add Verification Rule-Single Indicator Verification Verification rule of single pointer table is added successfully V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Add Verification Rule-Multiple Indicator Verification Multi-indicator verification rule is added successfully V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule enforcement Function is normal V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Import and export rules Successfully imported and exported V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Details of the rules View rule details succeeded V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule editing Rule editing function is normal V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule Rename Rule Rename OK V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Rule deletion The delete rule is normal V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Number of Checks Passed/Failed/Failed Today Get normal Displays the number of passed/failed/failed verification V1.0.3 Functional test Manual test Normal
Network normal, linkis, DSS normal Get Execution Task Log Get "" Failed Task "" Log Error Displays the number of passed/failed/failed verification V1.0.3 Functional test Manual test Get log exception Use the hdfs user
Network normal, linkis, DSS normal Logout login exception The toolbar in the upper right corner of the user is repeatedly logged out. V1.0.3 Functional test Manual test User Toolbar Exception Front-end joint debugging
Network normal, linkis, DSS normal Login page exception User login page appears, inexplicable jump exception V1.0.3 Functional test Manual test Login home page exception Front-end joint debugging
Network normal, linkis, DSS normal DSS-> Project Management-> Workflow-> Data Quality Data quality node, get jump address, projectID is null V1.0.3 Functional test Manual test Go viral The hdfs user is not registered with LDAP

Use case

No response

solutions

No response

Anything else

No response

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

Custom rule is translated to wrong SQL when using the min function

Describe the bug
Adding a custom rule that uses the min function generates a Scala script without carrying over the alias name.

To Reproduce

  1. Add a custom rule
  2. Choose the min stat function
  3. Fill in the other fields
  4. Save
  5. From project -> rule -> execute task
  6. Check the log; the generated script looks like:
    val tmp1 = spark.sql("SELECT sales/projects AS mymin FROM test_hadoop.test_qualities WHERE pt_d='2021-05-26'");
    val schemas = tmp1.schema.fields.map(f => f.name).toList
    val newSchemas = schemas.map(s => s.replaceAll("[()]", "")).toList
    val tmp2 = tmp1.toDF(newSchemas: _*)
    tmp2.write.saveAsTable("test_qualitis_result.mid_application_19_20210527124203_868574");
    tmp2.selectExpr("min(sales/projects) as value", "'QUALITIS20210527124203243_813453' as application_id", "'Long' as result_type", "'19' as rule_id", "'2021-05-27 12:42:03' as create_time").write.mode(org.apache.spark.sql.SaveMode.Append).

Executing the script in spark-shell reproduces the exception: org.apache.spark.sql.AnalysisException: cannot resolve 'sales' given input columns: [mymin]; line 1 pos 4

Expected behavior
The alias name should be used instead of the original expression.
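The fix described above — carry the alias into the final aggregation instead of the original expression — can be sketched in plain Java (the AS parsing here is a simplified illustration, not the actual Qualitis code):

```java
public class AliasRewrite {
    // Given a projection like "sales/projects AS mymin", return the alias,
    // so the follow-up aggregation becomes min(mymin) instead of
    // min(sales/projects), which no longer resolves after toDF().
    public static String aliasOf(String projection) {
        String[] parts = projection.split("(?i)\\s+AS\\s+");
        return parts.length == 2 ? parts[1].trim() : projection.trim();
    }

    public static String minExpr(String projection) {
        return "min(" + aliasOf(projection) + ") as value";
    }

    public static void main(String[] args) {
        System.out.println(minExpr("sales/projects AS mymin")); // min(mymin) as value
    }
}
```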

Screenshots

Desktop (please complete the following information):

  • OS: Linux
  • Browser :Chrome
  • Version: 0.8.0

Submitting a task fails with com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

Submitting directly to the cluster via Linkis runs fine, but when the job is submitted through Qualitis, the shuffle RDD stage fails:

com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
	at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:981)
	at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:339)
	at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2253)
	at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2286)
	at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2085)
	at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:795)
	at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
	at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:400)
	at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:327)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:63)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:610)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2121)
	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2121)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:121)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$11.apply(Executor.scala:407)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1408)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:413)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
	at java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:60

init.sql is missing many fields


Error when adding a new technical rule


The database information cannot be recognized; the log was attached as a screenshot.


application-dev.yml


Qualitis does not support Kerberos authentication

Our clusters run CDH 5.11 and 5.15, both with Kerberos enabled. After configuring the Hive databases/tables and rules in Qualitis, running them fails to connect to the Hive metastore:
2019-12-11 21:43:10,588 ERROR [qtp1644213828-274] impl.OuterExecutionServiceImpl: com.webank.wedatasphere.qualitis.exception.HiveMetaStoreConnectException: Failed to connect to hive meta store, address: thrift://xxxxxx:9083

Querying the execution list fails with an SQL syntax error

Version: 0.8.0

Offending code:

public class TaskDataSourceDaoImpl implements TaskDataSourceDao {

  ....
    @Override
    public List<TaskDataSource> findByUser(String username, Integer page, Integer size) {
        Pageable pageable = PageRequest.of(page, size);
        return taskDataSourceRepository.findByCreateUser(username, pageable);
    }
...
}
public interface TaskDataSourceRepository extends JpaRepository<TaskDataSource, Long>, JpaSpecificationExecutor<TaskDataSource> {

    /**
     * Paging find task datasource by creator
     * @param createUser
     * @param pageable
     * @return
     */
    @Query("select j from TaskDataSource j where j.createUser = ?1 group by j.clusterName, j.databaseName, j.tableName")
    List<TaskDataSource> findByCreateUser(String createUser, Pageable pageable);
...
}

The SQL generated by the code above:

select id as id1_2_, cluster_name as cluster_2_2_, col_name as col_name3_2_, create_user as create_u4_2_, database_name as database5_2_, datasource_index as datasour6_2_, execute_user as execute_7_2_, rule_id as rule_id8_2_, table_name as table_na9_2_, task_id as task_id10_2_
from qualitis_application_task_datasource where create_user='root' group by cluster_name , database_name , table_name limit 10

As you can see, id is not in the GROUP BY clause but does appear in the select list, so the statement fails with a syntax error.
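Under MySQL's ONLY_FULL_GROUP_BY mode (and standard SQL generally), every selected column must either appear in GROUP BY or be aggregated. A sketch of restricting the projection to the grouped columns (plain string assembly for illustration; in the actual code this would mean selecting a projection or DTO in the JPQL query instead of the whole entity):

```java
import java.util.List;

public class GroupByProjection {
    // Build a SELECT that returns only the columns it groups by,
    // so the statement is valid under ONLY_FULL_GROUP_BY.
    public static String query(String table, String user, List<String> groupCols) {
        String cols = String.join(", ", groupCols);
        return "select " + cols + " from " + table
                + " where create_user = '" + user + "'"
                + " group by " + cols;
    }

    public static void main(String[] args) {
        System.out.println(query("qualitis_application_task_datasource", "root",
                List.of("cluster_name", "database_name", "table_name")));
    }
}
```

(The string concatenation is purely for demonstration; real code would use bound parameters rather than splicing user input into SQL.)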

When will MySQL and ClickHouse be supported?


A warning for anyone doing secondary development on the frontend

The web module of this project ships a prebuilt bundle of frontend static resources, and that bundle was not built from the ui code in this repository.

This means the ui code lags behind: a frontend built from ui will, for example, show missing icons on pages, and the mock proxy causes startup failures.

If, like me, you need to do secondary development on the existing code, you are unfortunately in for a rough time.

Support Linkis1.0

Since Linkis has released 1.0.3, Qualitis should release a new version adapted to the new Linkis; it would be a tremendous help for us.

Failed to login with the predefined admin user and password

In the log file I can see the following log message:

{"username":"admin","password":"8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918"}

Checked the database, the hashed passwords are the same.

mysql> select password from qualitis_auth_user where username = 'admin';
+------------------------------------------------------------------+
| password |
+------------------------------------------------------------------+
| 8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918 |
+------------------------------------------------------------------+
1 row in set (0.00 sec)
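For reference, the hash above is the SHA-256 of the string admin, which a few lines of Java can confirm (a sanity check that the client hashes the password the same way the database stores it):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class Sha256Check {
    // Hex-encode the SHA-256 digest of the input string.
    public static String sha256Hex(String input) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(input.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(sha256Hex("admin"));
        // 8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918
    }
}
```

Since the request payload and the stored value match, the failure is likely elsewhere in the login path rather than in the hashing itself.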

Hive metadata refresh error when the metastore database is PostgreSQL

Describe the bug
My Hive metastore database is PostgreSQL, and I created the metastore tables with quoted names (table names wrapped in double quotes), e.g.:

CREATE TABLE "GLOBAL_PRIVS" (
    "USER_GRANT_ID" bigint NOT NULL,
    "CREATE_TIME" bigint NOT NULL,
    "GRANT_OPTION" smallint NOT NULL,
    "GRANTOR" character varying(128) DEFAULT NULL::character varying,
    "GRANTOR_TYPE" character varying(128) DEFAULT NULL::character varying,
    "PRINCIPAL_NAME" character varying(128) DEFAULT NULL::character varying,
    "PRINCIPAL_TYPE" character varying(128) DEFAULT NULL::character varying,
    "USER_PRIV" character varying(128) DEFAULT NULL::character varying
);

When I open metadata management it tries to get the clusters; selecting a cluster then fetches the databases. The exception is:

2019-12-05 13:49:50,006 INFO [DefaultQuartzScheduler_Worker-3] cron.RefreshMetaData: Start to refresh meta data of cluster: BDAP
2019-12-05 13:49:50,010 ERROR [DefaultQuartzScheduler_Worker-3] cron.RefreshMetaData: Failed to execute sql
org.postgresql.util.PSQLException: ERROR: relation "dbs" does not exist
  Position: 18
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2477) ~[postgresql-42.1.4.jar:42.1.4]
        at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2190) ~[postgresql-42.1.4.jar:42.1.4]
        at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:300) ~[postgresql-42.1.4.jar:42.1.4]
        at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:428) ~[postgresql-42.1.4.jar:42.1.4]
        at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:354) ~[postgresql-42.1.4.jar:42.1.4]
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:301) ~[postgresql-42.1.4.jar:42.1.4]
        at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:287) ~[postgresql-42.1.4.jar:42.1.4]
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:264) ~[postgresql-42.1.4.jar:42.1.4]
        at org.postgresql.jdbc.PgStatement.executeQuery(PgStatement.java:231) ~[postgresql-42.1.4.jar:42.1.4]
        at com.zaxxer.hikari.pool.ProxyStatement.executeQuery(ProxyStatement.java:111) ~[HikariCP-2.7.9.jar:?]
        at com.zaxxer.hikari.pool.HikariProxyStatement.executeQuery(HikariProxyStatement.java) ~[HikariCP-2.7.9.jar:?]
        at com.webank.wedatasphere.qualitis.hive.dao.impl.HiveMetaDataDaoImpl.getHiveDbByCluster(HiveMetaDataDaoImpl.java:55) ~[core_meta_data-0.5.0.jar:?]
        at com.webank.wedatasphere.qualitis.hive.cron.RefreshMetaData.getAndSaveDbs(RefreshMetaData.java:159) ~[core_meta_data-0.5.0.jar:?]
        at com.webank.wedatasphere.qualitis.hive.cron.RefreshMetaData.execute(RefreshMetaData.java:107) [core_meta_data-0.5.0.jar:?]
        at org.quartz.core.JobRunShell.run(JobRunShell.java:202) [quartz-2.2.1.jar:?]
        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573) [quartz-2.2.1.jar:?]
2019-12-05 13:49:50,011 INFO [DefaultQuartzScheduler_Worker-3] core.JobRunShell: Job Meta data refresh job group.Meta data refresh job threw a JobExecutionException:
org.quartz.JobExecutionException: Failed to execute sql
        at com.webank.wedatasphere.qualitis.hive.cron.RefreshMetaData.execute(RefreshMetaData.java:115) ~[core_meta_data-0.5.0.jar:?]
        at org.quartz.core.JobRunShell.run(JobRunShell.java:202) [quartz-2.2.1.jar:?]
        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573) [quartz-2.2.1.jar:?]

The buggy code is in HiveMetaDataDaoImpl: String sql = "SELECT NAME FROM DBS"; should be String sql = "SELECT \"NAME\" FROM \"DBS\""; in my case. The same applies to the queries that fetch tables and columns.

The root issue is that the code queries the metastore database directly with raw SQL instead of going through the Hive metastore service or an ORM.
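Until that happens, the fix has to account for identifier quoting per dialect. A minimal sketch (a hypothetical helper, not Qualitis code) of building the metastore query: PostgreSQL folds unquoted identifiers to lower case, so the upper-case names created by the quoted DDL above are only reachable with double-quoted identifiers, whereas MySQL resolves SELECT NAME FROM DBS as written.

```java
// Sketch of dialect-aware identifier quoting for metastore queries.
// Hypothetical helper class; names are illustrative, not from Qualitis.
public class MetastoreSqlSketch {

    // PostgreSQL folds unquoted identifiers to lower case, so the
    // upper-case names in the metastore schema must be double-quoted.
    // MySQL resolves unquoted identifiers as written on most setups.
    static String quote(String identifier, boolean postgres) {
        return postgres ? "\"" + identifier + "\"" : identifier;
    }

    // Builds the "list databases" query for the chosen dialect.
    static String dbListSql(boolean postgres) {
        return "SELECT " + quote("NAME", postgres)
             + " FROM " + quote("DBS", postgres);
    }

    public static void main(String[] args) {
        System.out.println(dbListSql(false)); // SELECT NAME FROM DBS
        System.out.println(dbListSql(true));  // SELECT "NAME" FROM "DBS"
    }
}
```

The same switch would apply to the table and column queries; a cleaner long-term fix is delegating identifier handling to an ORM or the metastore service itself.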

qualitis version: release-0.5.0

Cannot access the page; the frontend reports errors on access

Describe the bug
Qualitis-release-0.9.0
After startup, the page fails to render.
[screenshots omitted]

To Reproduce
Steps to reproduce the behavior:

  1. Go to '...'
  2. Click on '....'
  3. Scroll down to '....'
  4. See error

Expected behavior
A clear and concise description of what you expected to happen.

Screenshots
If applicable, add screenshots to help explain your problem.

Desktop (please complete the following information):

  • OS: [e.g. iOS]
  • Browser [e.g. chrome, safari]
  • Version [e.g. 22]

Smartphone (please complete the following information):

  • Device: [e.g. iPhone6]
  • OS: [e.g. iOS8.1]
  • Browser [e.g. stock browser, safari]
  • Version [e.g. 22]

Additional context
Add any other context about the problem here.

Clicking OK after creating a custom rule reports an error

Describe the bug
After creating a custom rule and clicking OK, the web page reports an error: request failed

Screenshots
[screenshots omitted]

Desktop (please complete the following information):

  • OS: CentOS7
  • Version: Qualitis-0.7.0

Where is the frontend source code?

In the current Qualitis project, only the bundled and minified frontend code can be found under Qualitis/web/app/src/main/resources/static/js; the source code is missing. Where can the frontend source code be found? Thanks.

fes.js has been open sourced.

Describe the bug
The README.md says fes.js is not open sourced, but I see that it is.

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'https://github.com/WeBankFinTech/Qualitis#tips'
  2. See the statement that fes.js is not open sourced; however, we found this project: https://github.com/WeBankFinTech/fes.js

Expected behavior
Remove the statement that fes.js is not open sourced. And if you can open source your frontend code, that would be great.


npm install reports: npm ERR! 404 '@webank/[email protected]' is not in the npm registry.

Describe the bug
When running npm install, it reports: npm ERR! 404 '@webank/[email protected]' is not in the npm registry.


Qualitis system configuration

Describe the bug
wedatasphere-linkis-0.9.4-dist.tar.gz
wedatasphere-qualitis-0.8.0.zip

How should the data sources in the Qualitis system configuration (shown in the figure below) be configured? Are there any examples?
[screenshot omitted]
