
docker-pinpoint's People

Contributors

dawidmalina, otlabs-ci, yezhiming

docker-pinpoint's Issues

pinpoint backend on kubernetes clusters

@dawidmalina I have forked your docker-pinpoint and tried to run the Pinpoint backend and the sample app (rancher) on a k8s cluster: https://github.com/proddam/docker-pinpoint

I used kompose to convert the docker-compose.yml into Kubernetes manifest YAMLs, but no luck so far. Your docker-compose.yml itself works well on plain Docker. I think there are connection issues between the Pinpoint modules when hbase, the collector, and the agent run on k8s.

Do you have any suggestions or a working solution for running the Pinpoint backend on k8s?
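One likely gap after a kompose conversion: the collector and web containers reach HBase through the hostname "hbase" taken from the compose file, but kompose only generates a Kubernetes Service for compose services that declare ports, so that DNS name may simply not exist inside the cluster. Below is a minimal sketch of the kind of headless Service that would restore it; the name, selector label and ports are assumptions and need to be checked against the Deployment kompose actually generated.

# Hypothetical Service manifest: "hbase" must match the hostname the collector
# and web containers were pointed at in docker-compose.yml, and the selector
# must match the labels kompose put on the hbase Deployment.
apiVersion: v1
kind: Service
metadata:
  name: hbase
spec:
  clusterIP: None                 # headless: DNS resolves the name straight to the pod IP
  selector:
    io.kompose.service: hbase     # kompose's default label; verify against the generated Deployment
  ports:
    - name: zookeeper
      port: 2181                  # embedded ZooKeeper the collector/web connect to first
    - name: hbase-master
      port: 16000                 # default HBase 1.x master port; adjust if the image differs
    - name: hbase-regionserver
      port: 16020                 # default region server port; adjust if the image differs

Note that in standalone mode HBase can also advertise an ephemeral region-server port (the log in the next issue shows the meta region at hbase,33098), which a fixed Service port cannot cover, so hostNetwork or a non-standalone HBase setup may ultimately be needed. Treat this sketch as a starting point, not a verified fix.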

hbase java.net.ConnectException: Connection refused

Hi @dawidmalina,
I think your Dockerfile is very good, but I ran into a problem when using it: the Pinpoint collector can't connect to HBase.
The collector log follows:

Caused by: org.springframework.beans.factory.BeanCreationException: Could not autowire field: private com.navercorp.pinpoint.collector.dao.TraceDao com.navercorp.pinpoint.collector.handler.SpanHandler.traceDao; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'hbaseTraceDaoFactory': FactoryBean threw exception on object creation; nested exception is com.navercorp.pinpoint.common.hbase.HbaseSystemException: Failed after attempts=36, exceptions:
Mon Feb 20 15:21:52 CST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68679: row 'TraceV2,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hbase,33098,1487574090526, seqNum=0
; nested exception is org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Mon Feb 20 15:21:52 CST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68679: row 'TraceV2,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hbase,33098,1487574090526, seqNum=0

	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:561)
	at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:88)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:331)
	... 26 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'hbaseTraceDaoFactory': FactoryBean threw exception on object creation; nested exception is com.navercorp.pinpoint.common.hbase.HbaseSystemException: Failed after attempts=36, exceptions:
Mon Feb 20 15:21:52 CST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68679: row 'TraceV2,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hbase,33098,1487574090526, seqNum=0
; nested exception is org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Mon Feb 20 15:21:52 CST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68679: row 'TraceV2,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hbase,33098,1487574090526, seqNum=0

	at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:175)
	at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.getObjectFromFactoryBean(FactoryBeanRegistrySupport.java:103)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getObjectForBeanInstance(AbstractBeanFactory.java:1525)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:251)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.findAutowireCandidates(DefaultListableBeanFactory.java:1120)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1044)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:942)
	at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:533)
	... 28 more
Caused by: com.navercorp.pinpoint.common.hbase.HbaseSystemException: Failed after attempts=36, exceptions:
Mon Feb 20 15:21:52 CST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68679: row 'TraceV2,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hbase,33098,1487574090526, seqNum=0
; nested exception is org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Mon Feb 20 15:21:52 CST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68679: row 'TraceV2,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hbase,33098,1487574090526, seqNum=0

	at com.navercorp.pinpoint.common.hbase.HBaseAdminTemplate.tableExists(HBaseAdminTemplate.java:61)
	at com.navercorp.pinpoint.collector.dao.hbase.HbaseTraceDaoFactory.getObject(HbaseTraceDaoFactory.java:54)
	at com.navercorp.pinpoint.collector.dao.hbase.HbaseTraceDaoFactory.getObject(HbaseTraceDaoFactory.java:18)
	at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:168)
	... 36 more
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Mon Feb 20 15:21:52 CST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68679: row 'TraceV2,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hbase,33098,1487574090526, seqNum=0

	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:276)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:207)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
	at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
	at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
	at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:410)
	at com.navercorp.pinpoint.common.hbase.HBaseAdminTemplate.tableExists(HBaseAdminTemplate.java:59)
	... 39 more
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68679: row 'TraceV2,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hbase,33098,1487574090526, seqNum=0
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65)
	... 3 more
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:416)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:34094)
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:394)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:203)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:64)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:360)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:334)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
	... 4 more
20-Feb-2017 15:21:53.167 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal One or more listeners failed to start. Full details will be found in the appropriate container log file
20-Feb-2017 15:21:55.890 INFO [localhost-startStop-1] org.apache.catalina.util.SessionIdGeneratorBase.createSecureRandom Creation of SecureRandom instance for session ID generation using [SHA1PRNG] took [2,721] milliseconds.
20-Feb-2017 15:21:55.892 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal Context [] startup failed due to previous errors
20-Feb-2017 15:21:55.929 WARNING [localhost-startStop-1] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [ROOT] appears to have started a thread named [org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
 java.lang.Object.wait(Native Method)
 java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
 java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
 org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner.run(FileSystem.java:3093)
 java.lang.Thread.run(Thread.java:745)
20-Feb-2017 15:21:55.932 SEVERE [localhost-startStop-1] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [ROOT] created a ThreadLocal with key of type [org.apache.htrace.core.Tracer.ThreadLocalContext] (value [org.apache.htrace.core.Tracer$ThreadLocalContext@248913bd]) and a value of type [org.apache.htrace.core.Tracer.ThreadContext] (value [org.apache.htrace.core.Tracer$ThreadContext@14d65ef8]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
20-Feb-2017 15:21:56.534 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /usr/local/tomcat/webapps/ROOT has finished in 97,092 ms
20-Feb-2017 15:21:56.539 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler [http-nio-9080]
20-Feb-2017 15:21:56.546 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler [ajp-nio-9009]
20-Feb-2017 15:21:56.557 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in 99071 ms

Could you give me some help?
Thanks
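For anyone hitting the same trace: the collector did reach ZooKeeper (it learned that the meta region lives at hbase,33098), but the follow-up TCP connection to that region-server port was refused, so either HBase had not finished starting or the advertised hostname/port is not reachable from the collector's network. A minimal sketch of the compose wiring that covers the naming and ordering part of this; the image and environment variable names are placeholders, not the exact ones from this repository.

# Hypothetical docker-compose sketch; adjust names to match the images built from this repo.
version: "2"
services:
  hbase:
    image: my-pinpoint-hbase        # placeholder; the image built from the hbase Dockerfile
    hostname: hbase                 # the name HBase advertises must resolve from the collector
  collector:
    image: my-pinpoint-collector    # placeholder
    environment:
      HBASE_HOST: hbase             # placeholder variable; point the collector at the service above
    depends_on:
      - hbase                       # ordering only; it does not wait for HBase to be ready

Even with this wiring, depends_on only controls start order. If the collector boots before HBase has created the Pinpoint tables, restarting the collector once the HBase log shows it is fully up is usually enough.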

docker-pinpoint in OpenShift

Hello,
I know this is not an issue with your docker-compose setup, because I tried it and it works very well. But when I run your hbase image in OpenShift it fails with a weird error I couldn't debug. If you have any idea what's missing, that would be a great help. These are the logs (a sketch of a possible workaround follows them):

OpenJDK 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /opt/hbase/hbase-1.2.6/logs/SecurityAuth.audit (Permission denied)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.log4j.Logger.getLogger(Logger.java:104)
	at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:262)
	at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:108)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.commons.logging.impl.LogFactoryImpl.createLogFromClass(LogFactoryImpl.java:1025)
	at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:844)
	at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:541)
	at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:292)
	at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:269)
	at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:655)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.<clinit>(HRegionServer.java:204)
2018-10-26 13:50:05,194 INFO  [main] util.VersionInfo: HBase 1.2.6
2018-10-26 13:50:05,195 INFO  [main] util.VersionInfo: Source code repository file:///home/busbey/projects/hbase/hbase-assembly/target/hbase-1.2.6 revision=Unknown
2018-10-26 13:50:05,195 INFO  [main] util.VersionInfo: Compiled by busbey on Mon May 29 02:25:32 CDT 2017
2018-10-26 13:50:05,195 INFO  [main] util.VersionInfo: From source with checksum 7e8ce83a648e252758e9dae1fbe779c9
2018-10-26 13:50:05,558 INFO  [main] master.HMasterCommandLine: Starting a zookeeper cluster
2018-10-26 13:50:05,584 INFO  [main] server.ZooKeeperServer: Server environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2018-10-26 13:50:05,584 INFO  [main] server.ZooKeeperServer: Server environment:host.name=pinpoint-hbase
2018-10-26 13:50:05,584 INFO  [main] server.ZooKeeperServer: Server environment:java.version=1.8.0_111
2018-10-26 13:50:05,584 INFO  [main] server.ZooKeeperServer: Server environment:java.vendor=Oracle Corporation
2018-10-26 13:50:05,584 INFO  [main] server.ZooKeeperServer: Server environment:java.home=/usr/lib/jvm/java-8-openjdk-amd64/jre
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:java.class.path=/opt/hbase/hbase-1.2.6/conf:/usr/lib/jvm/java-8-openjdk-amd64/lib/tools.jar:/opt/hbase/hbase-1.2.6:/opt/hbase/hbase-1.2.6/lib/activation-1.1.jar:/opt/hbase/hbase-1.2.6/lib/aopalliance-1.0.jar:/opt/hbase/hbase-1.2.6/lib/apacheds-i18n-2.0.0-M15.jar:/opt/hbase/hbase-1.2.6/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/hbase/hbase-1.2.6/lib/api-asn1-api-1.0.0-M20.jar:/opt/hbase/hbase-1.2.6/lib/api-util-1.0.0-M20.jar:/opt/hbase/hbase-1.2.6/lib/asm-3.1.jar:/opt/hbase/hbase-1.2.6/lib/avro-1.7.4.jar:/opt/hbase/hbase-1.2.6/lib/commons-beanutils-1.7.0.jar:/opt/hbase/hbase-1.2.6/lib/commons-beanutils-core-1.8.0.jar:/opt/hbase/hbase-1.2.6/lib/commons-cli-1.2.jar:/opt/hbase/hbase-1.2.6/lib/commons-codec-1.9.jar:/opt/hbase/hbase-1.2.6/lib/commons-collections-3.2.2.jar:/opt/hbase/hbase-1.2.6/lib/commons-compress-1.4.1.jar:/opt/hbase/hbase-1.2.6/lib/commons-configuration-1.6.jar:/opt/hbase/hbase-1.2.6/lib/commons-daemon-1.0.13.jar:/opt/hbase/hbase-1.2.6/lib/commons-digester-1.8.jar:/opt/hbase/hbase-1.2.6/lib/commons-el-1.0.jar:/opt/hbase/hbase-1.2.6/lib/commons-httpclient-3.1.jar:/opt/hbase/hbase-1.2.6/lib/commons-io-2.4.jar:/opt/hbase/hbase-1.2.6/lib/commons-lang-2.6.jar:/opt/hbase/hbase-1.2.6/lib/commons-logging-1.2.jar:/opt/hbase/hbase-1.2.6/lib/commons-math-2.2.jar:/opt/hbase/hbase-1.2.6/lib/commons-math3-3.1.1.jar:/opt/hbase/hbase-1.2.6/lib/commons-net-3.1.jar:/opt/hbase/hbase-1.2.6/lib/disruptor-3.3.0.jar:/opt/hbase/hbase-1.2.6/lib/findbugs-annotations-1.3.9-1.jar:/opt/hbase/hbase-1.2.6/lib/guava-12.0.1.jar:/opt/hbase/hbase-1.2.6/lib/guice-3.0.jar:/opt/hbase/hbase-1.2.6/lib/guice-servlet-3.0.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-annotations-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-auth-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-client-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-common-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-hdfs-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-mapreduce-client-app-2.5.1.jar:/opt/hbase/hbase-1
.2.6/lib/hadoop-mapreduce-client-common-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-mapreduce-client-core-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-mapreduce-client-jobclient-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-mapreduce-client-shuffle-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-yarn-api-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-yarn-client-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-yarn-common-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hadoop-yarn-server-common-2.5.1.jar:/opt/hbase/hbase-1.2.6/lib/hbase-annotations-1.2.6-tests.jar:/opt/hbase/hbase-1.2.6/lib/hbase-annotations-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-client-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-common-1.2.6-tests.jar:/opt/hbase/hbase-1.2.6/lib/hbase-common-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-examples-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-external-blockcache-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-hadoop-compat-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-hadoop2-compat-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-it-1.2.6-tests.jar:/opt/hbase/hbase-1.2.6/lib/hbase-it-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-prefix-tree-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-procedure-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-protocol-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-resource-bundle-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-rest-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-server-1.2.6-tests.jar:/opt/hbase/hbase-1.2.6/lib/hbase-server-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-shell-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/hbase-thrift-1.2.6.jar:/opt/hbase/hbase-1.2.6/lib/htrace-core-3.1.0-incubating.jar:/opt/hbase/hbase-1.2.6/lib/httpclient-4.2.5.jar:/opt/hbase/hbase-1.2.6/lib/httpcore-4.4.1.jar:/opt/hbase/hbase-1.2.6/lib/jackson-core-asl-1.9.13.jar:/opt/hbase/hbase-1.2.6/lib/jackson-jaxrs-1.9.13.jar:/opt/hbase/hbase-1.2.6/lib/jackson-mapper-asl-1.9.13.jar:/opt/hbase/hbase-1.2.6/lib/jackson-xc-1.9.13.jar:/opt/hbase/hbase-1.2.6/lib/jamon-runtime-2.4.1.jar:/opt/hbase/hbase-1.2.6/lib/jasper-compiler-5.5.23.jar:/opt/hbase/hbase-1.2
.6/lib/jasper-runtime-5.5.23.jar:/opt/hbase/hbase-1.2.6/lib/java-xmlbuilder-0.4.jar:/opt/hbase/hbase-1.2.6/lib/javax.inject-1.jar:/opt/hbase/hbase-1.2.6/lib/jaxb-api-2.2.2.jar:/opt/hbase/hbase-1.2.6/lib/jaxb-impl-2.2.3-1.jar:/opt/hbase/hbase-1.2.6/lib/jcodings-1.0.8.jar:/opt/hbase/hbase-1.2.6/lib/jersey-client-1.9.jar:/opt/hbase/hbase-1.2.6/lib/jersey-core-1.9.jar:/opt/hbase/hbase-1.2.6/lib/jersey-guice-1.9.jar:/opt/hbase/hbase-1.2.6/lib/jersey-json-1.9.jar:/opt/hbase/hbase-1.2.6/lib/jersey-server-1.9.jar:/opt/hbase/hbase-1.2.6/lib/jets3t-0.9.0.jar:/opt/hbase/hbase-1.2.6/lib/jettison-1.3.3.jar:/opt/hbase/hbase-1.2.6/lib/jetty-6.1.26.jar:/opt/hbase/hbase-1.2.6/lib/jetty-sslengine-6.1.26.jar:/opt/hbase/hbase-1.2.6/lib/jetty-util-6.1.26.jar:/opt/hbase/hbase-1.2.6/lib/joni-2.1.2.jar:/opt/hbase/hbase-1.2.6/lib/jruby-complete-1.6.8.jar:/opt/hbase/hbase-1.2.6/lib/jsch-0.1.42.jar:/opt/hbase/hbase-1.2.6/lib/jsp-2.1-6.1.14.jar:/opt/hbase/hbase-1.2.6/lib/jsp-api-2.1-6.1.14.jar:/opt/hbase/hbase-1.2.6/lib/junit-4.12.jar:/opt/hbase/hbase-1.2.6/lib/leveldbjni-all-1.8.jar:/opt/hbase/hbase-1.2.6/lib/libthrift-0.9.3.jar:/opt/hbase/hbase-1.2.6/lib/log4j-1.2.17.jar:/opt/hbase/hbase-1.2.6/lib/metrics-core-2.2.0.jar:/opt/hbase/hbase-1.2.6/lib/netty-all-4.0.23.Final.jar:/opt/hbase/hbase-1.2.6/lib/paranamer-2.3.jar:/opt/hbase/hbase-1.2.6/lib/protobuf-java-2.5.0.jar:/opt/hbase/hbase-1.2.6/lib/servlet-api-2.5-6.1.14.jar:/opt/hbase/hbase-1.2.6/lib/servlet-api-2.5.jar:/opt/hbase/hbase-1.2.6/lib/slf4j-api-1.7.7.jar:/opt/hbase/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar:/opt/hbase/hbase-1.2.6/lib/snappy-java-1.0.4.1.jar:/opt/hbase/hbase-1.2.6/lib/spymemcached-2.11.6.jar:/opt/hbase/hbase-1.2.6/lib/xmlenc-0.52.jar:/opt/hbase/hbase-1.2.6/lib/xz-1.0.jar:/opt/hbase/hbase-1.2.6/lib/zookeeper-3.4.6.jar:
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:java.io.tmpdir=/tmp
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:java.compiler=<NA>
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:os.name=Linux
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:os.arch=amd64
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:os.version=3.10.0-693.5.2.el7.x86_64
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:user.name=?
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:user.home=?
2018-10-26 13:50:05,585 INFO  [main] server.ZooKeeperServer: Server environment:user.dir=/
2018-10-26 13:50:05,600 INFO  [main] server.ZooKeeperServer: Created server with tickTime 2000 minSessionTimeout 4000 maxSessionTimeout 40000 datadir /home/pinpoint/zookeeper/zookeeper_0/version-2 snapdir /home/pinpoint/zookeeper/zookeeper_0/version-2
2018-10-26 13:50:05,617 INFO  [main] server.NIOServerCnxnFactory: binding to port 0.0.0.0/0.0.0.0:2181
2018-10-26 13:50:05,859 INFO  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181] server.NIOServerCnxnFactory: Accepted socket connection from /127.0.0.1:58212
2018-10-26 13:50:05,867 INFO  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181] server.NIOServerCnxn: Processing stat command from /127.0.0.1:58212
2018-10-26 13:50:05,872 INFO  [Thread-2] server.NIOServerCnxn: Stat command output
2018-10-26 13:50:05,873 INFO  [Thread-2] server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:58212 (no session established for client)
2018-10-26 13:50:05,873 INFO  [main] zookeeper.MiniZooKeeperCluster: Started MiniZooKeeperCluster and ran successful 'stat' on client port=2181
2018-10-26 13:50:05,873 INFO  [main] master.HMasterCommandLine: Starting up instance of localHBaseCluster; master=1, regionserversCount=1
2018-10-26 13:50:06,085 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-10-26 13:50:06,326 INFO  [main] regionserver.RSRpcServices: master/pinpoint-hbase/172.20.0.3:0 server-side HConnection retries=350
2018-10-26 13:50:06,462 INFO  [main] ipc.SimpleRpcScheduler: Using deadline as user call queue, count=3
2018-10-26 13:50:06,472 INFO  [main] ipc.RpcServer: master/pinpoint-hbase/172.20.0.3:0: started 10 reader(s) listening on port=44762
2018-10-26 13:50:06,703 ERROR [main] master.HMasterCommandLine: Master exiting
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMasterjava.lang.NullPointerException: invalid null input: name
	at com.sun.security.auth.UnixPrincipal.<init>(UnixPrincipal.java:71)
	at com.sun.security.auth.module.UnixLoginModule.login(UnixLoginModule.java:133)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
	at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:757)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:607)
	at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:293)
	at org.apache.hadoop.hbase.security.User.getCurrent(User.java:191)
	at org.apache.hadoop.hbase.security.Superusers.initialize(Superusers.java:59)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:552)
	at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:412)
	at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.<init>(HMasterCommandLine.java:312)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
	at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:220)
	at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
	at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:222)
	at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:137)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
	at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:2522)

	at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
	at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:220)
	at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
	at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:222)
	at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:137)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
	at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:2522)
Caused by: java.io.IOException: failure to login
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:782)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:607)
	at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:293)
	at org.apache.hadoop.hbase.security.User.getCurrent(User.java:191)
	at org.apache.hadoop.hbase.security.Superusers.initialize(Superusers.java:59)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:552)
	at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:412)
	at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.<init>(HMasterCommandLine.java:312)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
	... 7 more
Caused by: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null input: name
	at com.sun.security.auth.UnixPrincipal.<init>(UnixPrincipal.java:71)
	at com.sun.security.auth.module.UnixLoginModule.login(UnixLoginModule.java:133)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
	at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:757)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:607)
	at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:293)
	at org.apache.hadoop.hbase.security.User.getCurrent(User.java:191)
	at org.apache.hadoop.hbase.security.Superusers.initialize(Superusers.java:59)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:552)
	at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:412)
	at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.<init>(HMasterCommandLine.java:312)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
	at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:220)
	at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
	at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:222)
	at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:137)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
	at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:2522)

	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:856)
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
	at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:757)
	... 20 more
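Both failures in this log usually trace back to the same OpenShift behaviour: the restricted SCC starts the container under a random UID that has no /etc/passwd entry and no write access to /opt/hbase/hbase-1.2.6/logs, hence the "Permission denied" on SecurityAuth.audit and the "invalid null input: name" from UnixPrincipal during the Hadoop login. A hedged sketch of one workaround, letting the pod keep the image's own user by binding a dedicated service account to the anyuid SCC; the names and image tag below are assumptions, not values from this repository.

# First grant the SCC to a dedicated service account, e.g.:
#   oc create serviceaccount pinpoint-hbase
#   oc adm policy add-scc-to-user anyuid -z pinpoint-hbase
apiVersion: apps/v1
kind: Deployment
metadata:
  name: pinpoint-hbase
spec:
  replicas: 1
  selector:
    matchLabels:
      app: pinpoint-hbase
  template:
    metadata:
      labels:
        app: pinpoint-hbase
    spec:
      serviceAccountName: pinpoint-hbase   # the account bound to anyuid above
      containers:
        - name: hbase
          image: my-pinpoint-hbase         # placeholder; use the image built from this repo

The alternative that avoids anyuid is to make the image tolerate arbitrary UIDs (group-writable HBase directories plus an /etc/passwd entry added at startup, for example via nss_wrapper), but that requires changes to the Dockerfile itself.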
