cloudera / hue

Open source SQL Query Assistant service for Databases/Warehouses

Home Page: https://cloudera.com

License: Apache License 2.0

Makefile 0.12% Python 29.81% HTML 2.50% Mako 5.43% Shell 0.30% JavaScript 58.39% Thrift 0.45% CSS 0.48% PigLatin 0.01% Java 0.03% XSLT 0.55% C++ 0.01% Perl 0.01% Batchfile 0.01% Assembly 0.01% Go 0.01% PLpgSQL 0.01% Lex 0.33% Yacc 1.60% PHP 0.01%
sql sql-editor databases data-warehouse query-editor sql-assistant autocomplete compose

hue's Introduction


Hue Logo

Query. Explore. Share.

Hue is a mature SQL Assistant for querying Databases & Data Warehouses.

1000+ customers, including top Fortune 500 companies, use Hue to quickly answer questions via self-service querying and execute hundreds of thousands of queries daily.

Read more on gethue.com.

Hue Editor

Getting Started

Three ways to start the server then configure the databases you want to query:

Docker

Start Hue in a single click with the Docker Guide or the video blog post.

docker run -it -p 8888:8888 gethue/hue:latest

Now Hue should be up and running on your default Docker IP on port 8888, usually http://localhost:8888!

Kubernetes

helm repo add gethue https://helm.gethue.com
helm repo update
helm install hue gethue/hue

Read more about configurations at tools/kubernetes.

Development

For a very quick start, go with the Dev Environment Docker.

Or install the dependencies, clone the repository, build and get the server running.

# <install OS dependencies>
git clone https://github.com/cloudera/hue.git
cd hue
make apps
build/env/bin/hue runserver

Now Hue should be running on http://localhost:8000!

Read more in the documentation.

Components

SQL Editor, Parser components, and REST/Python/CLI APIs.

License

Apache License, Version 2.0


hue's Issues

hue-search: truncate function doesn't work anymore

In the Search app (master version), I'm using a custom HTML template with the truncate250 / truncate100 functions. However, the displayed results are not truncated anymore (it was working well with hue 3.5.0 from CDH 5.0.1).

Example:
{{#truncate250}}{{description}}{{/truncate250}}

Result (577 characters):
The acDevSoftware company is specialized in the development of software solutions in the industrial , financial and scientifical fields. It develops and sells products: - acZoomIn, software that allows to visualize scalar, vectorial or any data in 3 dimensions in a given spatial time. - acSCAM, framework to control devices and acquire data. It develops some free applications/tools: - acTelerikStylesAssembly: Application that generates an assembly of skins for the product ASP.NET AJAX of Telerik. - acWSAT.net: ASPNET website administration tool, to manage users and roles.

PS: it would be great if we could have truncateX where X is whatever number we'd like
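For illustration, a parametric truncate helper along the lines of the PS above could look roughly like this. This is a hypothetical Python sketch, not the actual Search app code; the helper names are made up:

def make_truncate(limit):
    # Return a filter that cuts text to at most `limit` characters.
    def truncate(text):
        return text if len(text) <= limit else text[:limit].rstrip() + '...'
    return truncate

# The fixed variants would then just be instances of the same helper:
truncate100 = make_truncate(100)
truncate250 = make_truncate(250)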

Error when task attempt error message contains non-ascii characters

Hi,

This bug report is for Hue version 1.2.0 (cdh3u3).

We have found that, if the error message output from a task attempt contains non-ascii characters, Hue fails to open the task attempt when double-clicking on it in the list.

We get the following error message:

An error occurred: 'ascii' codec can't decode byte 0xc3 in position 93: ordinal not in range(128)

At the same time, in the JobTracker we can see the error message (note that the indicated position, 93, points to a Spanish Í character with an accent):

Why Microsoft translates their error messages (and keystrokes, for that matter) is another interesting question, but we can't fix that. As a workaround for now, we change the locale setting of the cluster.
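For illustration, a defensive decode would avoid the crash. This is a minimal sketch assuming the task attempt message arrives as raw bytes; it is not the actual Hue code:

def to_text(raw, encoding='utf-8'):
    # Replace undecodable bytes instead of raising UnicodeDecodeError.
    if isinstance(raw, bytes):
        return raw.decode(encoding, 'replace')
    return raw

# to_text(b'... \xc3\x8d ...') -> '... Í ...' instead of an 'ascii' codec error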

If this bug has been fixed in the later versions, please ignore this bug report.

Thanks a lot!

Hue gives JS errors

When I open http://localhost:8000 it gives me JS errors:
window.addEvent is not a function and
ReferenceError: Hue is not defined
Does anybody know what the solution could be?

Beeswax query history often errors on long UTF-8 strings

Hi,

The Beeswax query history often errors on long UTF-8 query strings.

apps/beeswax/src/beeswax/templates/list_history.mako line:120
${collapse_whitespace(query.query[:100])}...

At this point, slicing the UTF-8 query can cut it mid-character, so the query string is broken and an error is returned.
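A sketch of a safer preview, assuming query.query can be a UTF-8 encoded byte string (a hypothetical helper, not the actual template code): decode first so the 100-character cut falls on a character boundary.

def safe_preview(raw_query, limit=100):
    # Decode before slicing so a multi-byte character is never cut in half.
    if isinstance(raw_query, bytes):
        raw_query = raw_query.decode('utf-8', 'replace')
    return raw_query[:limit]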

Error when making app

Hi Team,

I would like to thank you for your effort and your work. When I run "make apps" I get the following errors:

[root@ramzi hue]# make apps
cd /home/ramzi/hue/maven && mvn install
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hue Maven Parent POM 3.6.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-enforcer-plugin:1.0:enforce (default) @ hue-parent ---
[WARNING] Rule 1: org.apache.maven.plugins.enforcer.RequireJavaVersion failed with message:
Detected JDK Version: 1.8.0 is not in the allowed range [1.7.0,1.7.1000].
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.493s
[INFO] Finished at: Sun Jun 22 11:45:21 AST 2014
[INFO] Final Memory: 6M/283M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.0:enforce (default) on project hue-parent: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
make: *** [parent-pom] Error 1


Can Hue work on JDK 8?

Can't run the "make apps" command

I can't run the "make apps" command; I'm getting the following errors, please find the screenshot.

screenshot from 2014-05-29 15 56 22

Please do the needful on this.

Thanks,
Varma

Oozie workflow shell action does not respect 'capture output' from UI

Cloudera 2.1.0 distribution:

The template for generating the piece of workflow XML for the shell action
does not contain a capture-output element. Hence it's not possible
for shell scripts to output variables for configuring other workflow
actions.

Similar SSH action generates capture-output element correctly.

Compare:
apps/oozie/src/oozie/templates/editor/gen/workflow-shell.xml.mako
To
apps/oozie/src/oozie/templates/editor/gen/workflow-ssh.xml.mako

Thrift Compiler

After running make apps under HUE_ROOT, the cli_service, ImpalaService and Status Python modules do not show up in HUE_ROOT/apps/impala/src/impala, so the metastore's refresh function cannot be used. I guess there is a problem in the Makefile or elsewhere.

create table by semicolon delimited file

I am trying to create a table from a file. The file is delimited by semicolons, so I choose "Other..." and enter "\073". After clicking Preview, an InternalServerError occurs. From the logfile:

Thrift saw exception (this may be expected).
Traceback (most recent call last):
File "/usr/share/hue/desktop/core/src/desktop/lib/thrift_util.py", line 331, in wrapper
ret = res(_args, *_kwargs)
File "/usr/share/hue/apps/beeswax/gen-py/hive_metastore/ThriftHiveMetastore.py", line 1055, in get_table
return self.recv_get_table()
File "/usr/share/hue/apps/beeswax/gen-py/hive_metastore/ThriftHiveMetastore.py", line 1081, in recv_get_table
raise result.o2
NoSuchObjectException: NoSuchObjectException(message='default.semicolontest table not found')

It works if I replace the delimiter in the file with a comma.

[HUE] HDFS Deployment directory is missing "slash" symbol

Hi

While creating a workflow from the Oozie Editor panel, if you choose a path from the "HDFS Deployment Directory" box, it will be missing the "/" symbol. This causes a 404 Not Found when you later try to access the workspace folder by clicking the "Workspace" link in the Workflow Editor.

Attached images:

screen shot 2014-05-27 at 11 12 51 am

screen shot 2014-05-27 at 11 18 19 am

hive query uses default principal hue/localhost@<REALM> in secure cluster

I configured CDH4 with kerberos and my hive queries failed until I added beeswax_server_host value and tweaked beeswax/db_utils.py to support my non-standard hue principal name (needed to comply with my company's kerberos naming conventions).

I didn't discover the beeswax_server_host configuration option until digging through code. Since the default of localhost isn't great for a secured cluster I'd request that the beeswax_server_host configuration option be added in the Beeswax section with some comment that it might be needed for secure clusters. Further, the call thrift_util.get_client in beeswax/src/beeswax/db_utils.py uses a hardcoded kerberos_principal value which doesn't accommodate kerberos configurations which need to make use of kerberos principal to short name mapping such as is possible through "hadoop.security.auth_to_local".
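For illustration only, the idea of a configurable principal instead of the hardcoded hue/localhost@<REALM> might look like this minimal sketch; the parameter names are assumptions, not the real Hue configuration keys:

def beeswax_principal(service_name='hue', server_host='localhost', realm='EXAMPLE.COM'):
    # Compose the Kerberos principal from configurable parts instead of hardcoding it.
    return '%s/%s@%s' % (service_name, server_host, realm)

# beeswax_principal('hue', 'beeswax01.example.com', 'EXAMPLE.COM')
# -> 'hue/beeswax01.example.com@EXAMPLE.COM'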

Changes made in pseudo-distributed.ini are not reflected

Hi,

I have made changes in my pseudo-distributed.ini file as per the configuration, but the changes are not getting reflected even after restarting Hue.

screenshot from 2014-05-19 13 00 18

.ini file -
# Use WebHdfs/HttpFs as the communication mechanism.
# Domain should be the NameNode or HttpFs host.
# Default port is 14000 for HttpFs.
## webhdfs_url=http://localhost:50070

[sqoop][bug?] Hue doesn't show the error message when starting a job fails

hue version: hue-release-3.5.0
sqoop version: 1.9.3
hue log:
[07/Jan/2014 16:53:06 +0000] base ERROR Internal Server Error: /sqoop/api/jobs/2/start
Traceback (most recent call last):
File "/opt/huawei/Bigdata/hue-3.5.0.tar/hue/build/env/lib/python2.6/site-packages/Django-1.4.5-py2.6.egg/django/core/handlers/base.py", line 111, in get_response
response = callback(request, _callback_args, *_callback_kwargs)
File "/opt/huawei/Bigdata/hue-3.5.0.tar/hue/build/env/lib/python2.6/site-packages/Django-1.4.5-py2.6.egg/django/views/decorators/cache.py", line 89, in _wrapped_view_func
response = view_func(request, _args, *_kwargs)
File "/opt/huawei/Bigdata/hue-3.5.0.tar/hue/apps/sqoop/src/sqoop/api/decorators.py", line 92, in decorate
return view_func(request, job=job, _args, *_kwargs)
File "/opt/huawei/Bigdata/hue-3.5.0.tar/hue/apps/sqoop/src/sqoop/api/job.py", line 198, in job_start
response.update(handle_rest_exception(e, _('Could not start job.')))
File "/opt/huawei/Bigdata/hue-3.5.0.tar/hue/apps/sqoop/src/sqoop/api/exception.py", line 31, in handle_rest_exception

Suggestion: replace line 31 in hue/apps/sqoop/src/sqoop/api/exception.py with the following code:

  parent_ex = e.get_parent_ex()
  reason = None
  if hasattr(parent_ex, 'reason'):
    reason = parent_ex.reason

Would someone fix it, please?

stat as a directory name

Did you ever try to use "stat" as a folder name? I tested 2.1 and trunk.

/somepath/stat/subdirectory

You can't access the subdirectory...

Error in loading data

Hello,

I get an error while running this:

grunt> data = load 'atoz.csv' using PigStorage(',');

grunt> dump data;

ERROR org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl - Error while trying to run jobs.
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected.

I am using Ubuntu 12.02, 32-bit.
I installed Hadoop 2.2.0 and Pig 0.12 successfully.
Hadoop is running on my system.

But when I try to load a file, it gives an error.
Please point out where I am going wrong.


grunt> aatoz = load 'atoz.csv' using PigStorage(',');
grunt> dump aatoz;
2014-01-23 10:41:44,950 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2014-01-23 10:41:44,968 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2014-01-23 10:41:44,969 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2014-01-23 10:41:44,969 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2014-01-23 10:41:44,971 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2014-01-23 10:41:44,972 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig script settings are added to the job
2014-01-23 10:41:44,972 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2014-01-23 10:41:44,984 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2014-01-23 10:41:44,998 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2014-01-23 10:41:45,000 [Thread-9] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2014-01-23 10:41:45,001 [Thread-9] ERROR org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl - Error while trying to run jobs.
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:456)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:342)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
at java.lang.Thread.run(Thread.java:724)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:260)
2014-01-23 10:41:45,498 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2014-01-23 10:41:45,502 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job null has failed! Stop running all dependent jobs
2014-01-23 10:41:45,503 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2014-01-23 10:41:45,507 [main] ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backend error: Unexpected System Error Occured: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:456)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:342)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
at java.lang.Thread.run(Thread.java:724)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:260)

2014-01-23 10:41:45,507 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2014-01-23 10:41:45,507 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Detected Local mode. Stats reported below may be incomplete
2014-01-23 10:41:45,508 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:

HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.2.0 0.10.1 hardik 2014-01-23 10:41:44 2014-01-23 10:41:45 UNKNOWN

Failed!

Failed Jobs:
JobId Alias Feature Message Outputs
N/A aatoz MAP_ONLY Message: Unexpected System Error Occured: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:186)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:456)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:342)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
at java.lang.Thread.run(Thread.java:724)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:260)
file:/tmp/temp1979716161/tmp-189979005,

Input(s):
Failed to read data from "file:///home/hardik/pig10/bin/input/atoz.csv"

Output(s):
Failed to produce result in "file:/tmp/temp1979716161/tmp-189979005"

Job DAG:
null

2014-01-23 10:41:45,509 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2014-01-23 10:41:45,510 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias aatoz
Details at logfile: /home/hardik/pig10/bin/pig_1390453192689.log

Error during installation - lxml

I'm trying to install Hue on Ubuntu 11.10.

I'm getting the error below. Is there any way to fix it? I have installed all the deps for python-lxml, but it's still throwing this error.

[INFO] ------------------------------------------------------------------------
make[1]: Entering directory `/home/hduser/hue/desktop'
make -C core env-install
make[2]: Entering directory `/home/hduser/hue/desktop/core'
--- Installing lxml into virtual environment
Processing lxml-2.2.2-py2.7-linux-x86_64.egg
make[2]: *** [/home/hduser/hue/desktop/core/build/lxml/env-install.stamp] Error 1
make[2]: Leaving directory `/home/hduser/hue/desktop/core'
make[1]: *** [.recursive-env-install/core] Error 2
make[1]: Leaving directory `/home/hduser/hue/desktop'
make: *** [desktop] Error 2

make apps error

gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c src/kerberos.c -o build/temp.linux-x86_64-2.7/src/kerberos.o sh: 1: krb5-config: not found
gcc: error: sh:: No such file or directory
gcc: error: 1:: No such file or directory
gcc: error: krb5-config:: No such file or directory
gcc: error: not: No such file or directory
gcc: error: found: No such file or directory
error: command 'gcc' failed with exit status 1
make[2]: *** [/home/hy/workspace/hue/desktop/core/build/kerberos-1.1.1/egg.stamp] Error 1
make[2]: Leaving directory `/home/hy/workspace/hue/desktop/core'
make[1]: *** [.recursive-env-install/core] Error 2
make[1]: Leaving directory `/home/hy/workspace/hue/desktop'
make: *** [desktop] Error 2

Memory leak

We observed Hue/Django using more than 3 GB of memory today:

hue       3649  0.0 24.1 3900596 3711988 ?     S    Dec12   0:41 /usr/bin/python2.6

It was up for only about 3 days. We're running via CDH 4.1.2 on CentOS 6:

hue-2.1.0+221-1.cdh4.1.2.p0.9.el6.x86_64

Not sure what other info I can provide. I'll update the ticket if we see it happening again.

Hue hangs when Hive/HDFS are accessed through the hue-web-ui

Hi,

Firstly, I would like to acknowledge your work !!!
I was trying to install Hue in Mac OS X 10.7.4. (I hope Hue can work outside Cloudera VMs as well). Had sailed through the building of hue, fixing of mysql errrors, hue user/group configuration, changed the hue.ini to point to my HADOOP_HOME/HIVE_HOME but Oozie is working. I was getting these following errors. Let me know if I made a mistake and a ton of thanks in advance.

13/05/19 06:10:00 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
13/05/19 06:10:00 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : DATABASE_PARAMS]
13/05/19 06:10:00 INFO Datastore.Schema: Validating 2 index(es) for table DBS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 0 foreign key(s) for table DBS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 2 unique key(s) for table DBS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 2 index(es) for table DATABASE_PARAMS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 1 foreign key(s) for table DATABASE_PARAMS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 1 unique key(s) for table DATABASE_PARAMS
13/05/19 06:10:01 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase
13/05/19 06:10:01 INFO beeswax.Server: Started new Beeswax Thrift metaserver on port [8003]...
13/05/19 06:10:01 INFO beeswax.Server: minWorkerThreads = 5
13/05/19 06:10:01 INFO beeswax.Server: maxWorkerThreads = 2147483647
13/05/19 06:17:49 INFO metastore.HiveMetaStore: 1: get_all_databases
13/05/19 06:17:49 INFO metastore.HiveMetaStore: 1: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
13/05/19 06:17:49 INFO metastore.ObjectStore: ObjectStore, initialize called
13/05/19 06:17:49 INFO metastore.ObjectStore: Initialized ObjectStore
Exception in thread "pool-1-thread-1" java.lang.NoClassDefFoundError: org/apache/thrift/scheme/StandardScheme
at com.cloudera.beeswax.api.BeeswaxService$get_default_configuration_args.(BeeswaxService.java:8805)
at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1115)
at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1109)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:19)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.ClassNotFoundException: org.apache.thrift.scheme.StandardScheme
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 9 more
13/05/19 06:20:11 INFO metastore.HiveMetaStore: 1: get_all_databases
Exception in thread "pool-1-thread-2" java.lang.NoClassDefFoundError: Could not initialize class com.cloudera.beeswax.api.BeeswaxService$get_default_configuration_args
at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1115)
at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1109)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:19)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:680)
13/05/19 06:22:11 INFO metastore.HiveMetaStore: 1: get_all_databases
Exception in thread "pool-1-thread-3" java.lang.NoClassDefFoundError: org/apache/thrift/scheme/StandardScheme
at com.cloudera.beeswax.api.BeeswaxService$query_args.(BeeswaxService.java:1184)
at com.cloudera.beeswax.api.BeeswaxService$Processor$query.getEmptyArgsInstance(BeeswaxService.java:905)
at com.cloudera.beeswax.api.BeeswaxService$Processor$query.getEmptyArgsInstance(BeeswaxService.java:899)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:19)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.ClassNotFoundException: org.apache.thrift.scheme.StandardScheme
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 9 more

LDAP support for Authorization Manager on user import

Current LDAP support is based on Active Directory. I have to connect to a Sun iPlanet based LDAP server with no group definition. A user should be able to specify a BaseDN and a filter string for the LDAP user search.

Solr export missing some columns

In the Search tab, if you export the results of the current query, the columns exported depend on the first result row.
For example, if the first result has fields A and B, the resulting exported CSV file will only contain these columns. Even if the following rows have more fields (C and D), these will be ignored.

I've found in the code the source of the problem:
https://github.com/cloudera/hue/blob/master/apps/search/src/search/data_export.py#L52

The export file columns should not depend on the first row's fields.
The export file should contain all Solr fields, always in the same order.
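A minimal sketch of the fix described above, assuming docs is the list of Solr result dicts: collect the union of field names across all rows in a stable order before writing the CSV. This is illustrative only, not the actual data_export.py code:

import csv
import io

def export_csv(docs):
    # Union of all field names, in first-seen order, so no row's fields are dropped.
    fields = []
    for doc in docs:
        for name in doc:
            if name not in fields:
                fields.append(name)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, restval='')
    writer.writeheader()
    writer.writerows(docs)
    return out.getvalue()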

Error during installation

I came across a couple of issues during installation.

  1. In addition to the mentioned dev pre-reqs, we also need libyaml-dev. I think it is a good idea to add this to your documentation.
  2. I get this error when running make install:
$ HADOOP_HOME=/usr/lib/hadoop-0.20 PREFIX=/usr/local make install
...
...
...
init:

compile-gen:
     [echo] contrib: hue
    [javac] Compiling 55 source files to /home/ubuntu/test/hue-1.2.0/desktop/libs/hadoop/build/java/classes
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

compile:
     [echo] contrib: hue
     [echo] src dir: /home/ubuntu/test/hue-1.2.0/desktop/libs/hadoop/java/src/java
    [javac] Compiling 10 source files to /home/ubuntu/test/hue-1.2.0/desktop/libs/hadoop/build/java/classes
    [javac] /home/ubuntu/test/hue-1.2.0/desktop/libs/hadoop/java/src/java/org/apache/hadoop/mapred/ThriftJobTrackerPlugin.java:1036: getMapCounters(org.apache.hadoop.mapred.Counters) in org.apache.hadoop.mapred.JobInProgress cannot be applied to ()
    [javac]                 JTThriftUtils.toThrift(jip.getMapCounters()));
    [javac]                                           ^
    [javac] /home/ubuntu/test/hue-1.2.0/desktop/libs/hadoop/java/src/java/org/apache/hadoop/mapred/ThriftJobTrackerPlugin.java:1038: getReduceCounters(org.apache.hadoop.mapred.Counters) in org.apache.hadoop.mapred.JobInProgress cannot be applied to ()
    [javac]                 JTThriftUtils.toThrift(jip.getReduceCounters()));
    [javac]                                           ^
    [javac] 2 errors
make[2]: Leaving directory `/home/ubuntu/test/hue-1.2.0/desktop/libs/hadoop'
make[1]: Leaving directory `/home/ubuntu/test/hue-1.2.0/desktop'

Stderr:

BUILD FAILED
/home/ubuntu/test/hue-1.2.0/desktop/libs/hadoop/java/build.xml:85: Compile failed; see the compiler error output for details.

Total time: 3 seconds
make[2]: *** [/home/ubuntu/test/hue-1.2.0/desktop/libs/hadoop/java-lib/hue-plugins-1.2.0.jar] Error 1
make[1]: *** [.recursive-install-bdist/libs/hadoop] Error 2
make: *** [install-desktop] Error 2

Don't know if you need this, but in any case, the Hadoop version is:

$ hadoop version
Hadoop 0.20.2-cdh3u2
Subversion file:///tmp/nightly_2011-10-13_20-02-02_3/hadoop-0.20-0.20.2+923.142-1~lucid -r 95a824e4005b2a94fe1c11f1ef9db4c672ba43cb
Compiled by root on Thu Oct 13 21:52:18 PDT 2011
From source with checksum 644e5db6c59d45bca96cec7f220dda51

Log time_zone does not change after install

I installed Hue via the Cloudera Manager edition with the online repo on RedHat 6.4.
By default, the time zone is America/Los_Angeles, and the log file config is:
datefmt=%d/%b/%Y %H:%M:%S +0000
When I change it to Asia/Shanghai, the time_zone in the log.conf file should change to:
datefmt=%d/%b/%Y %H:%M:%S +0800
but it does not.

(Hue 3.6.0) make apps error, cannot build Hue

When I run make apps,
it shows errors like this:

[ERROR] /home/cloud-user/hue/desktop/libs/hadoop/java/src/main/gen-java/org/apache/hadoop/thriftfs/jobtracker/api/Jobtracker.java:[147,92] cannot find symbol
symbol: class ThriftTaskID
location: interface org.apache.hadoop.thriftfs.jobtracker.api.Jobtracker.Iface
[ERROR] /home/cloud-user/hue/desktop/libs/hadoop/java/src/main/gen-java/org/apache/hadoop/thriftfs/jobtracker/api/ThriftTaskInProgress.java:[73,10] cannot find symbol
symbol: class ThriftTaskID
location: class org.apache.hadoop.thriftfs.jobtracker.api.ThriftTaskInProgress
[ERROR] /home/cloud-user/hue/desktop/libs/hadoop/java/src/main/gen-java/org/apache/hadoop/thriftfs/jobtracker/api/ThriftTaskInProgress.java:[245,5] cannot find symbol
symbol: class ThriftTaskID
location: class org.apache.hadoop.thriftfs.jobtracker.api.ThriftTaskInProgress
[ERROR] /home/cloud-user/hue/desktop/libs/hadoop/java/src/main/gen-java/org/apache/hadoop/thriftfs/jobtracker/api/ThriftTaskInProgress.java:[515,10] cannot find symbol
symbol: class ThriftTaskID
location: class org.apache.hadoop.thriftfs.jobtracker.api.ThriftTaskInProgress
[ERROR] /home/cloud-user/hue/desktop/libs/hadoop/java/src/main/gen-java/org/apache/hadoop/thriftfs/jobtracker/api/ThriftTaskInProgress.java:[519,41] cannot find symbol
symbol: class ThriftTaskID
location: class org.apache.hadoop.thriftfs.jobtracker.api.ThriftTaskInProgress

[ERROR] location: class org.apache.hadoop.mapred.ThriftJobTrackerPlugin.JTThriftUtils
[ERROR] /home/cloud-user/hue/desktop/libs/hadoop/java/src/main/java/org/apache/hadoop/mapred/ThriftJobTrackerPlugin.java:[1266,66] incompatible types
[ERROR] required: org.apache.hadoop.mapred.TaskAttemptID
[ERROR] found: org.apache.hadoop.mapred.TaskID
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
make[2]: *** [/home/cloud-user/hue/desktop/libs/hadoop/java-lib/hue-plugins-3.6.0-SNAPSHOT.jar] Error 1
make[2]: Leaving directory `/home/cloud-user/hue/desktop/libs/hadoop'
make[1]: *** [.recursive-env-install/libs/hadoop] Error 2
make[1]: Leaving directory `/home/cloud-user/hue/desktop'
make: *** [desktop] Error 2

How can I fix this problem?
Installing HUE 3.5.0 is OK, but when I install HUE 3.6.0, the error above occurs.
It is very confusing!

[oozie] Can't submit Coordinator

While submitting an Oozie Coordinator, I received a 500 error from the Hue UI. Taking a look in the Hue error log, I found this message:

[02/May/2014 11:09:22 +0700] base ERROR Internal Server Error: /oozie/submit_coordinator/13
Traceback (most recent call last):
File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Django-1.4.5-py2.6.egg/django/core/handlers/base.py", line 111, in get_response
response = callback(request, _callback_args, *_callback_kwargs)
File "/usr/lib/hue/apps/oozie/src/oozie/decorators.py", line 52, in decorate
return view_func(request, _args, *_kwargs)
File "/usr/lib/hue/apps/oozie/src/oozie/views/editor.py", line 633, in submit_coordinator
job_id = submit_coordinator(request, coordinator, mapping)
File "/usr/lib/hue/apps/oozie/src/oozie/views/editor.py", line 653, in submit_coordinator
wf_dir = Submission(request.user, coordinator.workflow, request.fs, request.jt, mapping).deploy()
File "/usr/lib/hue/desktop/libs/liboozie/src/liboozie/submittion.py", line 143, in deploy
oozie_xml = self.job.to_xml(self.properties)
File "/usr/lib/hue/apps/oozie/src/oozie/models.py", line 562, in to_xml
xml = re.sub(re.compile('\s*\n+', re.MULTILINE), '\n', django_mako.render_to_string(tmpl, {'workflow': self, 'mapping': mapping}))
File "/usr/lib/hue/desktop/core/src/desktop/lib/django_mako.py", line 106, in render_to_string_normal
result = template.render(**data_dict)
File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Mako-0.8.1-py2.6.egg/mako/template.py", line 443, in render
return runtime.render(self, self.callable, args, data)
File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Mako-0.8.1-py2.6.egg/mako/runtime.py", line 786, in _render
**kwargs_for_callable(callable, data))
File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Mako-0.8.1-py2.6.egg/mako/runtime.py", line 818, in _render_context
_exec_template(inherit, lclcontext, args=args, kwargs=kwargs)
File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Mako-0.8.1-py2.6.egg/mako/runtime.py", line 844, in exec_template
callable
(context, _args, *_kwargs)
File "/tmp/tmpo7CHhO/oozie/editor/gen/workflow.xml.mako.py", line 74, in render_body
credential = mapping['credentials'][cred_type]
KeyError: 'credentials'

I'm using Hue + Oozie with a kerberized Hadoop cluster. The Hue version is 3.5.0.

Thanks for any help .

Alex

create_table with the interface:hiveserver2

When I use Hive 0.9 and the hiveserver interface to create a table:
after I click "create table", it raises a NoSuchObjectException (from db.get_table("default", name)). Since it is not an instance of Exception, it crashes while creating a table that is not in the "default" database.

The code (in beeswax/forms.py, around line 245 in branch 2.3 and line 254 in branch 2.5):

def _clean_tablename(db, name):
  try:
    table = db.get_table("default", name)
    if table.name:
      raise forms.ValidationError(_('Table "%(name)s" already exists') % {'name': name})
  except Exception:
    return name

When the database is not "default", it runs well on Hive 1.0, but on Hive 0.9
get_table("default", name) raises a NoSuchObjectException: <beeswax.server.dbms.NoSuchObjectException instance at 0x7ff379d60830>

By the way, why is the database always "default"? What happens when I create a table in another database?
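A hedged sketch of how _clean_tablename could be adjusted along these lines: check the database actually being used and handle the "table not found" case explicitly instead of relying on a bare except Exception. Here NoSuchObjectException stands in for whatever exception the configured interface raises:

def _clean_tablename(db, name, database='default'):
  try:
    table = db.get_table(database, name)
  except NoSuchObjectException:
    # The table does not exist, so the name is free to use.
    return name
  if table and table.name:
    raise forms.ValidationError(_('Table "%(name)s" already exists') % {'name': name})
  return name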

thanks.

problem with common-header.mako logging in

Hi

I am building an instance with Hue on Amazon AWS using Ubuntu 12.04 LTS.
I am following this guide:
http://edpflager.com/?p=1973
but using Ubuntu and not CentOS.
Once I installed the service, I could not log into http://blah:8888, getting an error saying search.Search_controller not found.

Looking at the error, I noticed that it was this file producing the error:
https://github.com/cloudera/hue/blob/master/desktop/core/src/desktop/templates/common_header.mako

So I went into /tmp/tmpILkfKQ/desktop and edited the common_header.mako.py file to remove all references to search.
Basically I removed all the template lines generated by lines 391-407.
I was then able to log in successfully as expected.

I can try reproduce the error if needed.

Brian

Can't install hue

The log is this:

--- Installing core source structure...
--- Installing Desktop core...
INSTALL_DIR=/usr/local/hue make -C desktop install
INSTALL_DIR=/usr/local/hue/desktop/core \
        INSTALL_CONF_DIR=/usr/local/hue/desktop/conf \
        make -C core install-bdist
--- Building egg for Twisted
running bdist_egg
running egg_info
writing requirements to Twisted.egg-info/requires.txt
writing Twisted.egg-info/PKG-INFO
writing top-level names to Twisted.egg-info/top_level.txt
writing dependency_links to Twisted.egg-info/dependency_links.txt
reading manifest file 'Twisted.egg-info/SOURCES.txt'
writing manifest file 'Twisted.egg-info/SOURCES.txt'
installing library code to build/bdist.macosx-10.8-intel/egg
running install_lib
running build_py
running build_ext
clang -fno-strict-aliasing -fno-common -dynamic -g -Os -pipe -fno-common -fno-strict-aliasing -fwrapv -mno-fused-madd -DENABLE_DTRACE -DMACOSX -DNDEBUG -Wall -Wstrict-prototypes -Wshorten-64-to-32 -DNDEBUG -g -Os -Wall -Wstrict-prototypes -DENABLE_DTRACE -arch i386 -arch x86_64 -pipe -I/System/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c conftest.c -o conftest.o
clang: warning: argument unused during compilation: '-mno-fused-madd'
clang -fno-strict-aliasing -fno-common -dynamic -g -Os -pipe -fno-common -fno-strict-aliasing -fwrapv -mno-fused-madd -DENABLE_DTRACE -DMACOSX -DNDEBUG -Wall -Wstrict-prototypes -Wshorten-64-to-32 -DNDEBUG -g -Os -Wall -Wstrict-prototypes -DENABLE_DTRACE -arch i386 -arch x86_64 -pipe -I/System/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c conftest.c -o conftest.o
clang: warning: argument unused during compilation: '-mno-fused-madd'
conftest.c:1:10: fatal error: 'sys/epoll.h' file not found
#include <sys/epoll.h>
         ^
1 error generated.
building 'twisted.internet.cfsupport' extension
clang -fno-strict-aliasing -fno-common -dynamic -g -Os -pipe -fno-common -fno-strict-aliasing -fwrapv -mno-fused-madd -DENABLE_DTRACE -DMACOSX -DNDEBUG -Wall -Wstrict-prototypes -Wshorten-64-to-32 -DNDEBUG -g -Os -Wall -Wstrict-prototypes -DENABLE_DTRACE -arch i386 -arch x86_64 -pipe -I/System/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c twisted/internet/cfsupport/cfsupport.c -o build/temp.macosx-10.8-intel-2.7/twisted/internet/cfsupport/cfsupport.o -w
twisted/internet/cfsupport/cfsupport.c:128:4: error: assignment to cast is illegal, lvalue casts are not supported
  ((PyObject*)__pyx_v_socket) = Py_None; Py_INCREF(((PyObject*)__pyx_v_socket));
  ~^~~~~~~~~~~~~~~~~~~~~~~~~~ ~
twisted/internet/cfsupport/cfsupport.c:134:4: error: assignment to cast is illegal, lvalue casts are not supported
  ((PyObject *)__pyx_v_socket) = __pyx_1;
  ~^~~~~~~~~~~~~~~~~~~~~~~~~~~ ~
twisted/internet/cfsupport/cfsupport.c:829:4: error: assignment to cast is illegal, lvalue casts are not supported
  ((PyObject*)__pyx_v_obj) = Py_None; Py_INCREF(((PyObject*)__pyx_v_obj));
  ~^~~~~~~~~~~~~~~~~~~~~~~ ~
twisted/internet/cfsupport/cfsupport.c:835:4: error: assignment to cast is illegal, lvalue casts are not supported
  ((PyObject *)__pyx_v_obj) = __pyx_1;
  ~^~~~~~~~~~~~~~~~~~~~~~~~ ~
4 errors generated.
error: command 'clang' failed with exit status 1
make[2]: *** [/Users/rajat.khandelwal/Downloads/hue-2.4.0/desktop/core/build/Twisted/egg.stamp] Error 1
make[1]: *** [.recursive-install-bdist/core] Error 2
make: *** [install-desktop] Error 2

HBase Browser binary row keys support?

Maybe I am missing something, but I can't filter the table view by my binary row keys, which consist of an integer concatenated with a reverse timestamp (day and year only, as shorts). For example, in the hbase shell the key looks like \x00\x00\x01\x09x!{` , so how do I put this value correctly in the search box? If it's not supported then I think it should be added...
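For illustration, a key of that shape (a 4-byte unsigned int followed by two big-endian shorts) can be built with struct; the values below are made up just to reproduce the escaped form shown above, and this says nothing about what the HBase Browser search box currently accepts:

import struct

def make_row_key(entity_id, rev_day, rev_year):
    # 4-byte unsigned int + two 2-byte shorts, big-endian.
    return struct.pack('>IHH', entity_id, rev_day, rev_year)

key = make_row_key(265, 0x7821, 0x7B60)
print(repr(key))   # b'\x00\x00\x01\tx!{`'  (\t is the \x09 byte)
print(key.hex())   # '0000010978217b60'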

KeyError: "Couldn't get user id for user hue"

[root@hbase2 hue]# build/env/bin/supervisor
Traceback (most recent call last):
File "build/env/bin/supervisor", line 8, in
load_entry_point('desktop==2.3.0', 'console_scripts', 'supervisor')()
File "/home/chamago/hue/desktop/core/src/desktop/supervisor.py", line 312, in main
setup_user_info()
File "/home/chamago/hue/desktop/core/src/desktop/supervisor.py", line 253, in setup_user_info
desktop.lib.daemon_utils.get_uid_gid(SETUID_USER, SETGID_GROUP)
File "/home/chamago/hue/desktop/core/src/desktop/lib/daemon_utils.py", line 45, in get_uid_gid
raise KeyError("Couldn't get user id for user %s" % (username,))
KeyError: "Couldn't get user id for user hue"

Need update $HUE/maven/pom.xml

When I compiled the project on a Mac, the Maven build failed, and I found there are no files under these repository paths:
https://repository.cloudera.com/content/repositories/snapshots/com/cloudera/parent/1.0-SNAPSHOT/
https://repository.cloudera.com/content/repositories/snapshots/com/cloudera/cdh/hadoop-root/2.1.0-mr1-cdh5.0.0-SNAPSHOT/
So I removed the parent project and modified 2.1.0-mr1-cdh5.0.0-SNAPSHOT to 2.2.0-mr1-cdh5.0.0-SNAPSHOT in $HUE/maven/pom.xml, and then I got it compiled.

Escaping ';' char in Beeswax.

(Beeswax, Hue version 2.3.0 with CDH4.5 )
When you run a query on a 'test' table like this,
select count(1),concat(';','a') from test;

it will result in ';' not being escaped, and one is forced to use things like '\073'.
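For reference, '\073' is just the octal escape for the semicolon character, which is why that workaround produces the intended value; a quick check in Python (illustrative only):

# 073 octal == 59 decimal == ';'
print(chr(0o73))  # ';'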

oauth usernames could become too long

Usernames are limited to 30 characters, but usernames generated from OAuth email addresses could easily become longer.

When only a single domain is whitelisted, it would be easy to remove the domain part from the email address to make the usernames shorter, but maybe there's some other solution too.
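A minimal sketch of the single-whitelisted-domain idea, just to illustrate; the domain, the truncation fallback, and the helper name are assumptions, not existing Hue settings:

WHITELISTED_DOMAIN = 'example.com'
MAX_USERNAME_LENGTH = 30

def username_from_email(email):
    # Drop the domain part only when it is the single whitelisted domain,
    # then fall back to truncation so the limit is never exceeded.
    local_part, _, domain = email.partition('@')
    username = local_part if domain == WHITELISTED_DOMAIN else email
    return username[:MAX_USERNAME_LENGTH]

# username_from_email('some.rather.long.name@example.com') -> 'some.rather.long.name'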

/usr/share/hue/build/env/bin/python not found

I tried to install Hue on Ubuntu 12.04 (previously installed and uninstalled CDH4) and ran into this error:

ubuntu@server1:$ sudo apt-get install hue
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following extra packages will be installed:
bigtop-jsvc hadoop hadoop-0.20-mapreduce hadoop-client hadoop-hdfs hadoop-mapreduce hadoop-yarn
hbase hive hue-about hue-beeswax hue-filebrowser hue-help hue-jobbrowser hue-jobsub hue-oozie
hue-plugins hue-proxy hue-shell hue-useradmin libcap2 libopts25 ntp
Suggested packages:
ntp-doc
The following NEW packages will be installed:
bigtop-jsvc hadoop hadoop-0.20-mapreduce hadoop-client hadoop-hdfs hadoop-mapreduce hadoop-yarn
hbase hive hue hue-about hue-beeswax hue-filebrowser hue-help hue-jobbrowser hue-jobsub hue-oozie
hue-plugins hue-proxy hue-shell hue-useradmin libcap2 libopts25 ntp
0 upgraded, 24 newly installed, 0 to remove and 2 not upgraded.
Need to get 0 B/149 MB of archives.
After this operation, 310 MB of additional disk space will be used.
Do you want to continue [Y/n]? Y
Selecting previously unselected package libcap2.
(Reading database ...
dpkg: warning: files list file for package `hue-common' missing, assuming package has no files currently installed.
(Reading database ... 36031 files and directories currently installed.)
Unpacking libcap2 (from .../libcap2_1%3a2.22-1ubuntu3_amd64.deb) ...
Selecting previously unselected package libopts25.
Unpacking libopts25 (from .../libopts25_1%3a5.12-0.1ubuntu1_amd64.deb) ...
Selecting previously unselected package ntp.
Unpacking ntp (from .../ntp_1%3a4.2.6.p3+dfsg-1ubuntu3.1_amd64.deb) ...
Selecting previously unselected package bigtop-jsvc.
Unpacking bigtop-jsvc (from .../bigtop-jsvc_0.4+391-1.cdh4.1.3.p0.27
precise-cdh4.1.3_amd64.deb) ...
Selecting previously unselected package hadoop.
Unpacking hadoop (from .../hadoop_2.0.0+556-1.cdh4.1.3.p0.23precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hadoop-hdfs.
Unpacking hadoop-hdfs (from .../hadoop-hdfs_2.0.0+556-1.cdh4.1.3.p0.23
precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hadoop-0.20-mapreduce.
Unpacking hadoop-0.20-mapreduce (from .../hadoop-0.20-mapreduce_0.20.2+1270-1.cdh4.1.3.p0.23precise-cdh4.1.3_amd64.deb) ...
Selecting previously unselected package hadoop-yarn.
Unpacking hadoop-yarn (from .../hadoop-yarn_2.0.0+556-1.cdh4.1.3.p0.23
precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hadoop-mapreduce.
Unpacking hadoop-mapreduce (from .../hadoop-mapreduce_2.0.0+556-1.cdh4.1.3.p0.23precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hadoop-client.
Unpacking hadoop-client (from .../hadoop-client_2.0.0+556-1.cdh4.1.3.p0.23
precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hbase.
Unpacking hbase (from .../hbase_0.92.1+165-1.cdh4.1.3.p0.23precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hive.
Unpacking hive (from .../hive_0.9.0+158-1.cdh4.1.3.p0.23
precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-plugins.
Unpacking hue-plugins (from .../hue-plugins_2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-help.
Unpacking hue-help (from .../hue-help_2.1.0+223-1.cdh4.1.3.p0.20
precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-filebrowser.
Unpacking hue-filebrowser (from .../hue-filebrowser_2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-about.
Unpacking hue-about (from .../hue-about_2.1.0+223-1.cdh4.1.3.p0.20
precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-useradmin.
Unpacking hue-useradmin (from .../hue-useradmin_2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-jobbrowser.
Unpacking hue-jobbrowser (from .../hue-jobbrowser_2.1.0+223-1.cdh4.1.3.p0.20
precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-jobsub.
Unpacking hue-jobsub (from .../hue-jobsub_2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-beeswax.
Unpacking hue-beeswax (from .../hue-beeswax_2.1.0+223-1.cdh4.1.3.p0.20
precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-proxy.
Unpacking hue-proxy (from .../hue-proxy_2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue-shell.
Unpacking hue-shell (from .../hue-shell_2.1.0+223-1.cdh4.1.3.p0.20
precise-cdh4.1.3_amd64.deb) ...
Selecting previously unselected package hue-oozie.
Unpacking hue-oozie (from .../hue-oozie_2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3_all.deb) ...
Selecting previously unselected package hue.
Unpacking hue (from .../hue_2.1.0+223-1.cdh4.1.3.p0.20
precise-cdh4.1.3_all.deb) ...
Processing triggers for ureadahead ...
Processing triggers for man-db ...
Setting up libcap2 (1:2.22-1ubuntu3) ...
Setting up libopts25 (1:5.12-0.1ubuntu1) ...
Setting up ntp (1:4.2.6.p3+dfsg-1ubuntu3.1) ...

  • Starting NTP server ntpd [ OK ]
    Setting up bigtop-jsvc (0.4+391-1.cdh4.1.3.p0.27precise-cdh4.1.3) ...
    Setting up hadoop (2.0.0+556-1.cdh4.1.3.p0.23
    precise-cdh4.1.3) ...
    update-alternatives: using /etc/hadoop/conf.empty to provide /etc/hadoop/conf (hadoop-conf) in auto mode.
    Setting up hadoop-hdfs (2.0.0+556-1.cdh4.1.3.p0.23precise-cdh4.1.3) ...
    Setting up hadoop-0.20-mapreduce (0.20.2+1270-1.cdh4.1.3.p0.23
    precise-cdh4.1.3) ...
    Setting up hadoop-yarn (2.0.0+556-1.cdh4.1.3.p0.23precise-cdh4.1.3) ...
    Setting up hadoop-mapreduce (2.0.0+556-1.cdh4.1.3.p0.23
    precise-cdh4.1.3) ...
    Setting up hadoop-client (2.0.0+556-1.cdh4.1.3.p0.23precise-cdh4.1.3) ...
    Setting up hbase (0.92.1+165-1.cdh4.1.3.p0.23
    precise-cdh4.1.3) ...
    update-alternatives: using /etc/hbase/conf.dist to provide /etc/hbase/conf (hbase-conf) in auto mode.
    Setting up hive (0.9.0+158-1.cdh4.1.3.p0.23precise-cdh4.1.3) ...
    update-alternatives: using /etc/hive/conf.dist to provide /etc/hive/conf (hive-conf) in auto mode.
    Setting up hue-plugins (2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3) ...
    Setting up hue-help (2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3) ...
    /var/lib/dpkg/info/hue-help.postinst: line 40: /usr/share/hue/build/env/bin/python: No such file or directory
    /var/lib/dpkg/info/hue-help.postinst: line 41: /usr/share/hue/build/env/bin/python: No such file or directory
    dpkg: error processing hue-help (--configure):
    subprocess installed post-installation script returned error exit status 1
    dpkg: dependency problems prevent configuration of hue-filebrowser:
    hue-filebrowser depends on hue-help (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-help is not configured yet.
    dpkg: error processing hue-filebrowser (--configure):
    dependency problems - leaving unconfigured
    Setting up hue-about (2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3) ...
    No apport report written because the error message indicates its a followup error from a previous failure.
    /var/lib/dpkg/info/hue-about.postinst: line 40: /usr/share/hue/build/env/bin/python: No such file or directory
    /var/lib/dpkg/info/hue-about.postinst: line 41: /usr/share/hue/build/env/bin/python: No such file or directory
    dpkg: error processing hue-about (--configure):
    subprocess installed post-installation script returned error exit status 1
    dpkg: dependency problems prevent configuration of hue-useradmin:
    hue-useradmin depends on hue-filebrowser (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-filebrowser is not configured yet.
    hue-useradmin depends on hue-about (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-about is not configured yet.
    hue-useradmin depends on hue-help (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-help is not configured yet.
    dpkg: error processing hue-useradmin (--configure):
    dependency problems - leaving unconfigured
    dpkg: dependency problems prevent configuration of hue-jobbrowser:
    hue-jobbrowser depends on hue-filebrowser (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-filebrowser is not configured yet.
    hue-jobbrowser depends on hue-help (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-help is not configured yet.
    dpkg: error processing hue-jobbrowser (--configure):
    dependency problemsNo apport report written because MaxReports is reached already
    No apport report written because MaxReports is reached already
    No apport report written because MaxReports is reached already
    No apport report written because MaxReports is reached already
    - leaving unconfigured
    dpkg: dependency problems prevent configuration of hue-jobsub:
    hue-jobsub depends on hue-jobbrowser (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-jobbrowser is not configured yet.
    hue-jobsub depends on hue-help (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-help is not configured yet.
    dpkg: error processing hue-jobsub (--configure):
    dependency problems - leaving unconfigured
    dpkg: dependency problems prevent configuration of hue-beeswax:
    hue-beeswax depends on hue-jobbrowser (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-jobbrowser is not configured yet.
    hue-beeswax depends on hue-jobsub (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-jobsub is not configured yet.
    hue-beeswax depends on hue-filebrowser (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-filebrowser is not configured yet.
    hue-beeswax depends on hue-help (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-help is not configured yet.
    dpkg: error processing hue-beeswax (--configure):
    dependency problems - leaving unconfigured
    Setting up hue-proxy (2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3) ...
    /var/lib/dpkg/info/hue-proxy.postinst: line 40: /usr/share/hue/build/env/bin/python: No such file or directory
    /var/lib/dpkg/info/hue-proxy.postinst: line 41: /usr/share/hue/build/env/bin/python: No such file or directory
    dpkg: error processing hue-proxy (--configure):
    subprocess installed post-installation script returned error exit status 1
    No apport report written because MaxReports is reached already
    dpkg: dependency problems prevent configuration of hue-shell:
    hue-shell depends on hue-help (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-help is not configured yet.
    dpkg: error processing hue-shell (--configure):
    dependency problems - leaving unconfigured
    No apport report written because MaxReports is reached already
    dpkg: dependency problems prevent configuration of hue-oozie:
    hue-oozie depends on hue-jobbrowser (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-jobbrowser is not configured yet.
    hue-oozie depends on hue-jobsub (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-jobsub is not configured yet.
    hue-oozie depends on hue-filebrowser (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-filebrowser is not configured yet.
    hue-oozie depends on hue-help (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-help is not configured yet.
    dpkg: error processing hue-oozie (--configure):
    dependency problems - leaving unconfigured
    No apport report written because MaxReports is reached already
    dpkg: dependency problems prevent configuration of hue:
    hue depends on hue-user; however:
    Package hue-user is not installed.
    Package hue-useradmin which provides hue-user is not configured yet.
    hue depends on hue-about (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-about is not configured yet.
    hue depends on hue-help (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-help is not configured yet.
    hue depends on hue-filebrowser (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-filebrowser is not configured yet.
    hue depends on hue-jobbrowser (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-jobbrowser is not configured yet.
    hue depends on hue-jobsub (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-jobsub is not configured yet.
    hue depends on hue-beeswax (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-beeswax is not configured yet.
    hue depends on hue-proxy (= 2.1.0+223-1.cdh4.1.3.p0.20precise-cdh4.1.3); however:
    Package hue-proxy is not configured yet.
    hue depends on hue-shell (= 2.1.0+223-1.cdh4.1.3.p0.20
    precise-cdh4.1.3); however:
    Package hue-shell is not configured yet.
    hue depends on hue-oozie (= 2.1.0+223-1.cdh4.1.3.p0.20~precise-cdh4.1.3); however:
    Package hue-oozie is not configured yet.
    dpkg: error processing hue (--configure):
    dependency problems - leaving unconfigured
    No apport report written because MaxReports is reached already
    Processing triggers for libc-bin ...
    ldconfig deferred processing now taking place
    Errors were encountered while processing:
    hue-help
    hue-filebrowser
    hue-about
    hue-useradmin
    hue-jobbrowser
    hue-jobsub
    hue-beeswax
    hue-proxy
    hue-shell
    hue-oozie
    hue
    E: Sub-process /usr/bin/dpkg returned an error code (1)

error during install

[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hue Maven Parent POM 2.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ hue-parent ---
[INFO] Installing /home/hadoop/hue/maven/pom.xml to /home/hadoop/.m2/repository/com/cloudera/hue/hue-parent/2.1.0-SNAPSHOT/hue-parent-2.1.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.497s
[INFO] Finished at: Tue Oct 30 17:51:46 CST 2012
[INFO] Final Memory: 3M/117M
[INFO] ------------------------------------------------------------------------
make[1]: Entering directory `/home/hadoop/hue/desktop'
make -C core env-install
Traceback (most recent call last):
File "setup.py", line 17, in
from hueversion import VERSION
File "/home/hadoop/hue/desktop/core/hueversion.py", line 1
../../VERSION
^
SyntaxError: invalid syntax

comma-separated values in conf.ini

CONFIG.get() returns an unexpected value when the entry in conf.ini contains a comma.

e.g.:
foo=one,two => FOO.get() returns a string(!): ['one','two']. I could live with it being a list, but this is hard to work with.
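A minimal sketch of reading such an entry as a real list, assuming the raw value comes back as the string "one,two" (a hypothetical helper, not the Hue config API):

def get_list(raw_value):
    # Split a comma-separated config entry into a clean list of strings.
    if not raw_value:
        return []
    return [item.strip() for item in raw_value.split(',') if item.strip()]

# get_list('one,two') -> ['one', 'two']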

Kill and resume not using doAs

Hi,

I'm running Hue 2.5 as shipped by cloudera, on a secure cluster. Using the oozie app, I can't kill or resume jobs that have suspended. I get a 401 from the remote oozie and looking in the logs, hue is not attempting a doAs for the kill or resume.

Submission does get a doAs for submitting user.

I realise this is a rather old version. I've tried digging around in the code but my python isn't too strong. If you can help it'd be appreciated, I realise things have moved on a fair bit.

[oozie] Args cannot contain spaces

It seems that the Hue interface for Oozie does not support spaces in args.

Namely, I am trying to add arguments with spaces in a Java action. Apparently the parser just splits the arguments string at spaces and escapes them. For instance --time "30 seconds" results in

<arg>--time</arg>
<arg>&quot;30</arg>
<arg>seconds&quot;</arg>

while --time 30\ seconds turns into

<arg>--time</arg>
<arg>30\</arg>
<arg>seconds</arg>

Neither results in

<arg>--time</arg>
<arg>30 seconds</arg>
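For comparison, quote-aware splitting (as Python's shlex module does) would keep the quoted value together; this is only a sketch of the desired behaviour, not how the Oozie editor currently parses args:

import shlex

args = shlex.split('--time "30 seconds"')
print(args)  # ['--time', '30 seconds']  ->  <arg>--time</arg> <arg>30 seconds</arg>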

libthrift 0.6 compatibility?

Are there any plans to upversion the libthrift dependency from 0.5 to 0.6, for compatibility with installations that use libthrift 0.6 ?

(I am not an expert in Thrift, I just have a Hadoop installation that uses libthrift 0.6, and I tried to install Hue, but ran into trouble due to API changes in libthrift, specifically org.apache.thrift.server.TThreadPoolServer$Options.)

Thank you for any information or advice.

More meaningful error message

What could be the cause of this error message? (importing collection for search on Hue 3.5)

There was an error importing the collection(s) or core(s)
Not imported:m4-collection: Start tag expected, '<' not found, line 1, column 1

hue hangs when creating first user

Hue seems to just lock up when creating the first user. This is on a vanilla CDH 4.2.1 install on CentOS 6.4 (Hue 2.2.0): it asks you to log in on the web page and then it just halts and never returns:

[16/May/2013 17:09:40 +0000] settings     INFO     Welcome to Hue 2.2.0
[16/May/2013 17:09:40 +0000] settings     WARNING  secret_key should be configured
[16/May/2013 10:09:44 +0000] settings     INFO     Welcome to Hue 2.2.0
[16/May/2013 10:09:44 +0000] settings     WARNING  secret_key should be configured
[16/May/2013 10:09:45 +0000] spawning_controller INFO     (13446) *** Controller starting at Thu May 16 10:09:45 2013
[16/May/2013 10:09:47 +0000] settings     INFO     Welcome to Hue 2.2.0
[16/May/2013 10:09:47 +0000] settings     WARNING  secret_key should be configured
[16/May/2013 10:09:51 +0000] middleware   INFO     Unloading SpnegoMiddleware
[16/May/2013 10:09:51 +0000] middleware   INFO     Unloading HueRemoteUserMiddleware
[16/May/2013 10:10:04 +0000] backend      INFO     Materializing user <XXX> in the database
[16/May/2013 10:10:05 +0000] backend      INFO     Augmenting users with class: <class 'desktop.auth.backend.DefaultUserAugmentor'>

HBase cell-edit-modal display incomplete

hbase1

CSS

cell_edit_modal {

margin-left: -40% !important;
max-height: 100%;
width: 80%;
left: 50%;
top: 10%;
margin-top: 0;
overflow: visible;
}

Changing width: 80% to width: 60% makes it look good :)

hbase2
