dogbowl's Issues
Health checks not implemented
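The app is a Dropwizard service (per the traces below), so this issue presumably asks for a `com.codahale.metrics.health.HealthCheck` registered on the environment. A self-contained sketch of that contract, with a hypothetical `Pingable` dependency standing in for the real Spark/HDFS clients:

```java
// Minimal sketch in the shape Dropwizard's HealthCheck.check() expects;
// the Pingable dependency is hypothetical, standing in for SparkClient
// or HDFSClient connectivity probes.
public class HealthCheckSketch {

    interface Pingable {
        boolean ping();
    }

    // Mirrors HealthCheck.Result.healthy() / Result.unhealthy(msg).
    static String check(Pingable dependency) {
        try {
            return dependency.ping() ? "healthy" : "unhealthy: dependency not reachable";
        } catch (Exception e) {
            return "unhealthy: " + e.getMessage();
        }
    }
}
```

In Dropwizard itself this would subclass `HealthCheck` and be registered via `environment.healthChecks().register(...)` in `App.run`.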
InvocationTargetException
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/21/
Traceback (most recent call last):
File "org.apache.spark.deploy.worker.DriverWrapper.java", line -1, in main
File "org.apache.spark.deploy.worker.DriverWrapper$.java", line 58, in main
File "java.lang.reflect.Method.java", line 498, in invoke
File "sun.reflect.DelegatingMethodAccessorImpl.java", line 43, in invoke
File "sun.reflect.NativeMethodAccessorImpl.java", line 62, in invoke
File "sun.reflect.NativeMethodAccessorImpl.java", line -2, in invoke0
InvocationTargetException
Traceback (most recent call last):
File "org.apache.spark.deploy.worker.DriverWrapper.java", line -1, in main
File "org.apache.spark.deploy.worker.DriverWrapper$.java", line 58, in main
File "java.lang.reflect.Method.java", line 498, in invoke
File "sun.reflect.DelegatingMethodAccessorImpl.java", line 43, in invoke
File "sun.reflect.NativeMethodAccessorImpl.java", line 62, in invoke
File "sun.reflect.NativeMethodAccessorImpl.java", line -2, in invoke0
File "ai.dog.bowl.job.PerformancePresenceStatsUpdate.java", line 48, in main
File "org.apache.spark.api.java.AbstractJavaRDDLike.java", line 45, in foreach
File "org.apache.spark.api.java.JavaRDDLike$class.java", line 350, in foreach
File "org.apache.spark.rdd.RDD.java", line 892, in foreach
File "org.apache.spark.rdd.RDD.java", line 358, in withScope
File "org.apache.spark.rdd.RDDOperationScope$.java", line 112, in withScope
File "org.apache.spark.rdd.RDDOperationScope$.java", line 151, in withScope
File "org.apache.spark.rdd.RDD$$anonfun$foreach$1.java", line 892, in apply
File "org.apache.spark.rdd.RDD$$anonfun$foreach$1.java", line 894, in apply
File "org.apache.spark.SparkContext.java", line 1931, in runJob
File "org.apache.spark.SparkContext.java", line 1917, in runJob
File "org.apache.spark.SparkContext.java", line 1904, in runJob
File "org.apache.spark.SparkContext.java", line 1891, in runJob
File "org.apache.spark.scheduler.DAGScheduler.java", line 632, in runJob
File "org.apache.hadoop.util.ShutdownHookManager$1.java", line 54, in run
File "org.apache.spark.util.SparkShutdownHookManager$$anon$2.java", line 177, in run
File "org.apache.spark.util.SparkShutdownHookManager.java", line 187, in runAll
File "scala.util.Try$.java", line 192, in apply
File "org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.java", line 187, in apply
File "org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.java", line 187, in apply
File "org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.java", line 187, in apply$mcV$sp
File "org.apache.spark.util.Utils$.java", line 1957, in logUncaughtExceptions
File "org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.java", line 187, in apply
File "org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.java", line 187, in apply
File "org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.java", line 187, in apply$mcV$sp
File "org.apache.spark.util.SparkShutdownHook.java", line 215, in run
File "org.apache.spark.SparkContext$$anonfun$2.java", line 559, in apply$mcV$sp
File "org.apache.spark.SparkContext.java", line 1752, in stop
File "org.apache.spark.SparkContext.java", line 1798, in org$apache$spark$SparkContext$$_stop
File "org.apache.spark.util.Utils$.java", line 1290, in tryLogNonFatalError
File "org.apache.spark.SparkContext$$anonfun$org$apache$spark$SparkContext$$_stop$8.java", line 1799, in apply$mcV$sp
File "org.apache.spark.scheduler.DAGScheduler.java", line 1604, in stop
File "org.apache.spark.util.EventLoop.java", line 83, in stop
File "org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.java", line 1685, in onStop
File "org.apache.spark.scheduler.DAGScheduler.java", line 816, in cleanUpAfterSchedulerStop
File "scala.collection.mutable.HashSet.java", line 78, in foreach
File "org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.java", line 816, in apply
File "org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.java", line 818, in apply
SparkException: Job 0 cancelled because SparkContext was shut down
timezone
java.lang.NullPointerException
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/12/
java.util.concurrent.ExecutionException: java.lang.NullPointerException
java.util.concurrent.CompletableFuture.reportGet (CompletableFuture.java:357)
java.util.concurrent.CompletableFuture.get (CompletableFuture.java:1895)
ai.dog.bowl.App.lambda$new$0 (App.java:114)
ai.dog.bowl.client.firebase.queue.Task.process (Task.java:96)
ai.dog.bowl.client.firebase.queue.QueueTask.run (QueueTask.java:111)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:617)
java.lang.Thread.run (Thread.java:745)
java.lang.NullPointerException
ai.dog.bowl.client.spark.SparkClient.lambda$submit$1 (SparkClient.java:66)
java.util.concurrent.CompletableFuture$AsyncSupply.run (CompletableFuture.java:1590)
java.lang.Thread.run (Thread.java:745)
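The NullPointerException above is thrown inside the `submit` lambda and only surfaces wrapped in an ExecutionException at the `CompletableFuture.get` call in `App`. A sketch of failing fast with a named precondition so the offending field is identifiable (the `driverId` parameter here is hypothetical, not taken from SparkClient.java:66):

```java
import java.util.Objects;
import java.util.concurrent.CompletableFuture;

public class NullGuardSketch {
    // Hypothetical stand-in for the submit lambda: an explicit
    // requireNonNull with a message beats an anonymous NPE that is
    // rethrown from deep inside CompletableFuture.
    static CompletableFuture<String> submit(String driverId) {
        return CompletableFuture.supplyAsync(() -> {
            Objects.requireNonNull(driverId, "driverId was null before submit");
            return "submitted:" + driverId;
        });
    }
}
```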
java.lang.NullPointerException
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/15/
org.eclipse.jetty.util.MultiException: Multiple exceptions
org.eclipse.jetty.server.Server.doStart (Server.java:329)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:60)
sun.reflect.NativeMethodAccessorImpl.invoke0 (Unknown source)
sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke (Method.java:483)
com.intellij.rt.execution.application.AppMain.main (AppMain.java:147)
java.lang.NullPointerException
com.google.common.base.Preconditions.checkNotNull (Preconditions.java:210)
com.google.common.io.ByteStreams.copy (ByteStreams.java:65)
ai.dog.bowl.client.hdfs.HDFSClient.start (HDFSClient.java:49)
io.dropwizard.lifecycle.JettyManaged.doStart (JettyManaged.java:27)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
org.eclipse.jetty.util.component.ContainerLifeCycle.start (ContainerLifeCycle.java:132)
org.eclipse.jetty.server.Server.start (Server.java:387)
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart (ContainerLifeCycle.java:114)
org.eclipse.jetty.server.handler.AbstractHandler.doStart (AbstractHandler.java:61)
org.eclipse.jetty.server.Server.doStart (Server.java:354)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:60)
sun.reflect.NativeMethodAccessorImpl.invoke0 (Unknown source)
sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke (Method.java:483)
com.intellij.rt.execution.application.AppMain.main (AppMain.java:147)
InvocationTargetException
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/44/
Traceback (most recent call last):
File "org.apache.spark.deploy.worker.DriverWrapper.java", line -1, in main
File "org.apache.spark.deploy.worker.DriverWrapper$.java", line 58, in main
File "java.lang.reflect.Method.java", line 498, in invoke
File "sun.reflect.DelegatingMethodAccessorImpl.java", line 43, in invoke
File "sun.reflect.NativeMethodAccessorImpl.java", line 62, in invoke
File "sun.reflect.NativeMethodAccessorImpl.java", line -2, in invoke0
InvocationTargetException
Traceback (most recent call last):
File "org.apache.spark.deploy.worker.DriverWrapper.java", line -1, in main
File "org.apache.spark.deploy.worker.DriverWrapper$.java", line 58, in main
File "java.lang.reflect.Method.java", line 498, in invoke
File "sun.reflect.DelegatingMethodAccessorImpl.java", line 43, in invoke
File "sun.reflect.NativeMethodAccessorImpl.java", line 62, in invoke
File "sun.reflect.NativeMethodAccessorImpl.java", line -2, in invoke0
File "ai.dog.bowl.job.PerformancePresenceStatsUpdate.java", line 41, in main
File "org.apache.spark.sql.SparkSession$Builder.java", line 823, in getOrCreate
File "scala.Option.java", line 121, in getOrElse
File "org.apache.spark.sql.SparkSession$Builder$$anonfun$8.java", line 823, in apply
File "org.apache.spark.sql.SparkSession$Builder$$anonfun$8.java", line 831, in apply
File "org.apache.spark.SparkContext$.java", line 2276, in getOrCreate
File "org.apache.spark.SparkContext.java", line 546, in <init>
NullPointerException
replace maven cobertura plugin with an alternative
DateTimeParseException: Text '1488067200' could not be parsed at index 0
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/39/
Traceback (most recent call last):
File "java.lang.Thread.java", line 745, in run
File "java.util.concurrent.ThreadPoolExecutor$Worker.java", line 617, in run
File "java.util.concurrent.ThreadPoolExecutor.java", line 1142, in runWorker
File "org.apache.spark.executor.Executor$TaskRunner.java", line 274, in run
File "org.apache.spark.scheduler.Task.java", line 86, in run
File "org.apache.spark.scheduler.ResultTask.java", line 70, in runTask
File "org.apache.spark.SparkContext$$anonfun$runJob$5.java", line 1917, in apply
File "org.apache.spark.SparkContext$$anonfun$runJob$5.java", line 1917, in apply
File "org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$27.java", line 894, in apply
File "org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$27.java", line 894, in apply
File "org.apache.spark.InterruptibleIterator.java", line 28, in foreach
File "scala.collection.Iterator$class.java", line 893, in foreach
File "org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1.java", line 350, in apply
File "org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1.java", line 350, in apply
File "ai.dog.bowl.job.PerformancePresenceStatsUpdate.java", line 52, in lambda$main$f0a8df0b$1
File "ai.dog.bowl.stats.presence.UpdatePresenceStats.java", line 59, in updateEmployeeStats
File "ai.dog.bowl.repository.firebase.StatsFirebaseRestRepository.java", line 34, in retrieveAllTimePeriodEndDate
File "java.time.ZonedDateTime.java", line 597, in parse
File "java.time.format.DateTimeFormatter.java", line 1851, in parse
File "java.time.format.DateTimeFormatter.java", line 1949, in parseResolved0
DateTimeParseException: Text '1488067200' could not be parsed at index 0
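`1488067200` is an epoch-second timestamp, which `ZonedDateTime.parse` (ISO-8601 by default) cannot read — hence the failure at index 0 in `retrieveAllTimePeriodEndDate`. A sketch of converting the stored value via `Instant` instead (the helper name is hypothetical):

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class EpochParseSketch {
    // Hypothetical replacement for the ZonedDateTime.parse call: treat
    // the Firebase-stored value as epoch seconds and attach a zone.
    static ZonedDateTime fromEpochSecondString(String value) {
        return Instant.ofEpochSecond(Long.parseLong(value)).atZone(ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        // 1488067200 s after the epoch is midnight UTC, 2017-02-26.
        System.out.println(fromEpochSecondString("1488067200"));
    }
}
```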
Firebase task retry behaving weirdly
0 hours total duration
java.nio.channels.UnresolvedAddressException
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/20/
java.io.IOException: DataStreamer Exception:
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run (DFSOutputStream.java:563)
java.nio.channels.UnresolvedAddressException
sun.nio.ch.Net.checkAddress (Net.java:101)
sun.nio.ch.SocketChannelImpl.connect (SocketChannelImpl.java:622)
org.apache.hadoop.net.SocketIOWithTimeout.connect (SocketIOWithTimeout.java:192)
org.apache.hadoop.net.NetUtils.connect (NetUtils.java:531)
org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline (DFSOutputStream.java:1537)
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream (DFSOutputStream.java:1313)
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream (DFSOutputStream.java:1266)
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run (DFSOutputStream.java:449)
org.apache.hadoop.ipc.RemoteException: Cannot delete /.tmp/spark/spark.jar. Name node is in safe mode. The reported blocks 16 needs additional 2 blocks to reach the threshold 0.9990 of total blocks 18. The number of live datanodes 2 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/41/
org.eclipse.jetty.util.MultiException: Multiple exceptions
org.eclipse.jetty.server.Server.doStart (Server.java:329)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:74)
org.apache.hadoop.ipc.RemoteException: Cannot delete /.tmp/spark/spark.jar. Name node is in safe mode.
The reported blocks 16 needs additional 2 blocks to reach the threshold 0.9990 of total blocks 18.
The number of live datanodes 2 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1327)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:3711)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:952)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:611)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
org.apache.hadoop.ipc.Client.call (Client.java:1475)
org.apache.hadoop.ipc.Client.call (Client.java:1412)
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke (ProtobufRpcEngine.java:229)
com.sun.proxy.$Proxy41.delete (Unknown source)
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete (ClientNamenodeProtocolTranslatorPB.java:540)
sun.reflect.NativeMethodAccessorImpl.invoke0 (Unknown source)
sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke (Method.java:498)
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod (RetryInvocationHandler.java:191)
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke (RetryInvocationHandler.java:102)
com.sun.proxy.$Proxy42.delete (Unknown source)
org.apache.hadoop.hdfs.DFSClient.delete (DFSClient.java:2044)
org.apache.hadoop.hdfs.DistributedFileSystem$14.doCall (DistributedFileSystem.java:707)
org.apache.hadoop.hdfs.DistributedFileSystem$14.doCall (DistributedFileSystem.java:703)
org.apache.hadoop.fs.FileSystemLinkResolver.resolve (FileSystemLinkResolver.java:81)
org.apache.hadoop.hdfs.DistributedFileSystem.delete (DistributedFileSystem.java:714)
ai.dog.bowl.client.HDFSClient.uploadJar (HDFSClient.java:56)
ai.dog.bowl.App$1.start (App.java:105)
io.dropwizard.lifecycle.JettyManaged.doStart (JettyManaged.java:27)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
org.eclipse.jetty.util.component.ContainerLifeCycle.start (ContainerLifeCycle.java:132)
org.eclipse.jetty.server.Server.start (Server.java:387)
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart (ContainerLifeCycle.java:114)
org.eclipse.jetty.server.handler.AbstractHandler.doStart (AbstractHandler.java:61)
org.eclipse.jetty.server.Server.doStart (Server.java:354)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:74)
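The jar upload fails because `HDFSClient.uploadJar` runs at startup, before the NameNode has left safe mode. Since safe mode clears on its own once the block thresholds are met, a retry with backoff around the delete/upload is one plausible fix; a generic sketch (the wiring to `fs.delete(...)` is hypothetical):

```java
import java.util.concurrent.Callable;

public class SafeModeRetrySketch {
    // Generic retry with fixed backoff; in HDFSClient.uploadJar the
    // Callable would wrap the fs.delete(...) call that currently fails
    // while the NameNode is in safe mode.
    static <T> T withRetries(Callable<T> op, int attempts, long sleepMillis) throws Exception {
        Exception last = null;
        for (int i = 0; i < attempts; i++) {
            try {
                return op.call();
            } catch (Exception e) {
                last = e;
                if (i < attempts - 1) Thread.sleep(sleepMillis);
            }
        }
        throw last;
    }
}
```

Operationally, `hdfs dfsadmin -safemode wait` blocks until safe mode is off and could gate container startup instead.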
java.net.UnknownHostException: namenode.hadoop.development.feedeo.io
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/45/
org.eclipse.jetty.util.MultiException: Multiple exceptions
org.eclipse.jetty.server.Server.doStart (Server.java:329)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:74)
java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode.hadoop.development.feedeo.io
org.apache.hadoop.security.SecurityUtil.buildTokenService (SecurityUtil.java:378)
org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy (NameNodeProxies.java:310)
org.apache.hadoop.hdfs.NameNodeProxies.createProxy (NameNodeProxies.java:176)
org.apache.hadoop.hdfs.DFSClient.<init> (DFSClient.java:678)
org.apache.hadoop.hdfs.DFSClient.<init> (DFSClient.java:619)
org.apache.hadoop.hdfs.DistributedFileSystem.initialize (DistributedFileSystem.java:149)
org.apache.hadoop.fs.FileSystem.createFileSystem (FileSystem.java:2653)
org.apache.hadoop.fs.FileSystem.access$200 (FileSystem.java:92)
org.apache.hadoop.fs.FileSystem$Cache.getInternal (FileSystem.java:2687)
org.apache.hadoop.fs.FileSystem$Cache.get (FileSystem.java:2669)
org.apache.hadoop.fs.FileSystem.get (FileSystem.java:371)
org.apache.hadoop.fs.FileSystem.get (FileSystem.java:170)
ai.dog.bowl.client.HDFSClient.start (HDFSClient.java:39)
io.dropwizard.lifecycle.JettyManaged.doStart (JettyManaged.java:27)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
org.eclipse.jetty.util.component.ContainerLifeCycle.start (ContainerLifeCycle.java:132)
org.eclipse.jetty.server.Server.start (Server.java:387)
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart (ContainerLifeCycle.java:114)
org.eclipse.jetty.server.handler.AbstractHandler.doStart (AbstractHandler.java:61)
org.eclipse.jetty.server.Server.doStart (Server.java:354)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:74)
java.net.UnknownHostException: namenode.hadoop.development.feedeo.io
org.apache.hadoop.security.SecurityUtil.buildTokenService (SecurityUtil.java:378)
org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy (NameNodeProxies.java:310)
org.apache.hadoop.hdfs.NameNodeProxies.createProxy (NameNodeProxies.java:176)
org.apache.hadoop.hdfs.DFSClient.<init> (DFSClient.java:678)
org.apache.hadoop.hdfs.DFSClient.<init> (DFSClient.java:619)
org.apache.hadoop.hdfs.DistributedFileSystem.initialize (DistributedFileSystem.java:149)
org.apache.hadoop.fs.FileSystem.createFileSystem (FileSystem.java:2653)
org.apache.hadoop.fs.FileSystem.access$200 (FileSystem.java:92)
org.apache.hadoop.fs.FileSystem$Cache.getInternal (FileSystem.java:2687)
org.apache.hadoop.fs.FileSystem$Cache.get (FileSystem.java:2669)
org.apache.hadoop.fs.FileSystem.get (FileSystem.java:371)
org.apache.hadoop.fs.FileSystem.get (FileSystem.java:170)
ai.dog.bowl.client.HDFSClient.start (HDFSClient.java:39)
io.dropwizard.lifecycle.JettyManaged.doStart (JettyManaged.java:27)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
org.eclipse.jetty.util.component.ContainerLifeCycle.start (ContainerLifeCycle.java:132)
org.eclipse.jetty.server.Server.start (Server.java:387)
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart (ContainerLifeCycle.java:114)
org.eclipse.jetty.server.handler.AbstractHandler.doStart (AbstractHandler.java:61)
org.eclipse.jetty.server.Server.doStart (Server.java:354)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:74)
spark job should limit the amount of spark logging
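Spark of this vintage logs through log4j 1.x, so quieting the job usually means shipping a `log4j.properties` with the driver/executors. A sketch following the stock Spark template (logger names per that template; levels are a matter of taste):

```properties
# Sketch of conf/log4j.properties to cut Spark chatter to WARN
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.logger.org.apache.spark=WARN
log4j.logger.org.spark_project.jetty=ERROR
```

Alternatively, `SparkContext.setLogLevel("WARN")` can be called from the job itself after the context is created.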
java.net.UnknownHostException: spark-master.spark.development.feedeo.io: Name does not resolve
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/46/
ai.dog.bowl.client.spark.rest.FailedSparkRequestException: java.net.UnknownHostException: spark-master.spark.development.feedeo.io: Name does not resolve
ai.dog.bowl.client.spark.rest.HttpRequestUtil.executeHttpMethodAndGetResponse (HttpRequestUtil.java:29)
ai.dog.bowl.client.spark.rest.JobSubmitRequestSpecificationImpl.submit (JobSubmitRequestSpecificationImpl.java:144)
ai.dog.bowl.client.spark.SparkClient.submit (SparkClient.java:71)
ai.dog.bowl.App.lambda$new$0 (App.java:57)
ai.dog.bowl.client.firebase.queue.Task.process (Task.java:95)
ai.dog.bowl.client.firebase.queue.QueueTask.run (QueueTask.java:110)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:617)
java.lang.Thread.run (Thread.java:745)
java.net.UnknownHostException: spark-master.spark.development.feedeo.io: Name does not resolve
java.net.Inet6AddressImpl.lookupAllHostAddr (Unknown source)
java.net.InetAddress$2.lookupAllHostAddr (InetAddress.java:928)
java.net.InetAddress.getAddressesFromNameService (InetAddress.java:1323)
java.net.InetAddress.getAllByName0 (InetAddress.java:1276)
java.net.InetAddress.getAllByName (InetAddress.java:1192)
java.net.InetAddress.getAllByName (InetAddress.java:1126)
org.apache.http.impl.conn.SystemDefaultDnsResolver.resolve (SystemDefaultDnsResolver.java:45)
org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect (DefaultHttpClientConnectionOperator.java:111)
org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect (PoolingHttpClientConnectionManager.java:353)
org.apache.http.impl.execchain.MainClientExec.establishRoute (MainClientExec.java:380)
org.apache.http.impl.execchain.MainClientExec.execute (MainClientExec.java:236)
org.apache.http.impl.execchain.ProtocolExec.execute (ProtocolExec.java:184)
org.apache.http.impl.execchain.RetryExec.execute (RetryExec.java:88)
org.apache.http.impl.execchain.RedirectExec.execute (RedirectExec.java:110)
org.apache.http.impl.client.InternalHttpClient.doExecute (InternalHttpClient.java:184)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:71)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:220)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:164)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:139)
ai.dog.bowl.client.spark.rest.HttpRequestUtil.executeHttpMethodAndGetResponse (HttpRequestUtil.java:20)
ai.dog.bowl.client.spark.rest.JobSubmitRequestSpecificationImpl.submit (JobSubmitRequestSpecificationImpl.java:144)
ai.dog.bowl.client.spark.SparkClient.submit (SparkClient.java:71)
ai.dog.bowl.App.lambda$new$0 (App.java:57)
ai.dog.bowl.client.firebase.queue.Task.process (Task.java:95)
ai.dog.bowl.client.firebase.queue.QueueTask.run (QueueTask.java:110)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:617)
java.lang.Thread.run (Thread.java:745)
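This failure, like the namenode ones above, traces back to a hostname that does not resolve inside the container. A fail-fast DNS preflight at startup would surface the misconfiguration before queue tasks start failing; a sketch (the hostnames come from whatever the app's config provides):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class DnsPreflightSketch {
    // Returns true only if every configured host resolves; intended to
    // run during App startup, before the Firebase queue is consumed.
    static boolean allResolve(String... hosts) {
        for (String host : hosts) {
            try {
                InetAddress.getByName(host);
            } catch (UnknownHostException e) {
                return false;
            }
        }
        return true;
    }
}
```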
java.nio.channels.UnresolvedAddressException
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/14/
java.io.IOException: DataStreamer Exception:
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run (DFSOutputStream.java:563)
java.nio.channels.UnresolvedAddressException
sun.nio.ch.Net.checkAddress (Net.java:101)
sun.nio.ch.SocketChannelImpl.connect (SocketChannelImpl.java:622)
org.apache.hadoop.net.SocketIOWithTimeout.connect (SocketIOWithTimeout.java:192)
org.apache.hadoop.net.NetUtils.connect (NetUtils.java:531)
org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline (DFSOutputStream.java:1537)
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream (DFSOutputStream.java:1313)
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream (DFSOutputStream.java:1266)
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run (DFSOutputStream.java:449)
org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=namenode.hadoop.development.feedeo.io/10.0.2.251:8020]
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/18/
org.eclipse.jetty.util.MultiException: Multiple exceptions
org.eclipse.jetty.server.Server.doStart (Server.java:329)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:74)
org.apache.hadoop.net.ConnectTimeoutException: Call From 782fe38b8fdf/10.172.165.66 to namenode.hadoop.development.feedeo.io:8020 failed on socket timeout exception: org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=namenode.hadoop.development.feedeo.io/10.0.2.251:8020]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
sun.reflect.NativeConstructorAccessorImpl.newInstance0 (Unknown source)
sun.reflect.NativeConstructorAccessorImpl.newInstance (NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance (DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance (Constructor.java:423)
org.apache.hadoop.net.NetUtils.wrapWithMessage (NetUtils.java:792)
org.apache.hadoop.net.NetUtils.wrapException (NetUtils.java:751)
org.apache.hadoop.ipc.Client.call (Client.java:1479)
org.apache.hadoop.ipc.Client.call (Client.java:1412)
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke (ProtobufRpcEngine.java:229)
com.sun.proxy.$Proxy41.getFileInfo (Unknown source)
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo (ClientNamenodeProtocolTranslatorPB.java:771)
sun.reflect.NativeMethodAccessorImpl.invoke0 (Unknown source)
sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke (Method.java:498)
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod (RetryInvocationHandler.java:191)
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke (RetryInvocationHandler.java:102)
com.sun.proxy.$Proxy42.getFileInfo (Unknown source)
org.apache.hadoop.hdfs.DFSClient.getFileInfo (DFSClient.java:2108)
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall (DistributedFileSystem.java:1305)
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall (DistributedFileSystem.java:1301)
org.apache.hadoop.fs.FileSystemLinkResolver.resolve (FileSystemLinkResolver.java:81)
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus (DistributedFileSystem.java:1317)
org.apache.hadoop.fs.FileSystem.exists (FileSystem.java:1424)
ai.dog.bowl.client.HDFSClient.uploadJar (HDFSClient.java:55)
ai.dog.bowl.App$1.start (App.java:105)
io.dropwizard.lifecycle.JettyManaged.doStart (JettyManaged.java:27)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
org.eclipse.jetty.util.component.ContainerLifeCycle.start (ContainerLifeCycle.java:132)
org.eclipse.jetty.server.Server.start (Server.java:387)
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart (ContainerLifeCycle.java:114)
org.eclipse.jetty.server.handler.AbstractHandler.doStart (AbstractHandler.java:61)
org.eclipse.jetty.server.Server.doStart (Server.java:354)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:74)
org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=namenode.hadoop.development.feedeo.io/10.0.2.251:8020]
org.apache.hadoop.net.NetUtils.connect (NetUtils.java:534)
org.apache.hadoop.net.NetUtils.connect (NetUtils.java:495)
org.apache.hadoop.ipc.Client$Connection.setupConnection (Client.java:614)
org.apache.hadoop.ipc.Client$Connection.setupIOstreams (Client.java:712)
org.apache.hadoop.ipc.Client$Connection.access$2900 (Client.java:375)
org.apache.hadoop.ipc.Client.getConnection (Client.java:1528)
org.apache.hadoop.ipc.Client.call (Client.java:1451)
org.apache.hadoop.ipc.Client.call (Client.java:1412)
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke (ProtobufRpcEngine.java:229)
com.sun.proxy.$Proxy41.getFileInfo (Unknown source)
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo (ClientNamenodeProtocolTranslatorPB.java:771)
sun.reflect.NativeMethodAccessorImpl.invoke0 (Unknown source)
sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke (Method.java:498)
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod (RetryInvocationHandler.java:191)
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke (RetryInvocationHandler.java:102)
com.sun.proxy.$Proxy42.getFileInfo (Unknown source)
org.apache.hadoop.hdfs.DFSClient.getFileInfo (DFSClient.java:2108)
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall (DistributedFileSystem.java:1305)
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall (DistributedFileSystem.java:1301)
org.apache.hadoop.fs.FileSystemLinkResolver.resolve (FileSystemLinkResolver.java:81)
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus (DistributedFileSystem.java:1317)
org.apache.hadoop.fs.FileSystem.exists (FileSystem.java:1424)
ai.dog.bowl.client.HDFSClient.uploadJar (HDFSClient.java:55)
ai.dog.bowl.App$1.start (App.java:105)
io.dropwizard.lifecycle.JettyManaged.doStart (JettyManaged.java:27)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
org.eclipse.jetty.util.component.ContainerLifeCycle.start (ContainerLifeCycle.java:132)
org.eclipse.jetty.server.Server.start (Server.java:387)
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart (ContainerLifeCycle.java:114)
org.eclipse.jetty.server.handler.AbstractHandler.doStart (AbstractHandler.java:61)
org.eclipse.jetty.server.Server.doStart (Server.java:354)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentC
java.net.UnknownHostException: spark-master.spark.development.feedeo.io: Name does not resolve
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/47/
ai.dog.bowl.client.spark.rest.FailedSparkRequestException: java.net.UnknownHostException: spark-master.spark.development.feedeo.io: Name does not resolve
ai.dog.bowl.client.spark.rest.HttpRequestUtil.executeHttpMethodAndGetResponse (HttpRequestUtil.java:29)
ai.dog.bowl.client.spark.rest.JobSubmitRequestSpecificationImpl.submit (JobSubmitRequestSpecificationImpl.java:144)
ai.dog.bowl.client.spark.SparkClient.submit (SparkClient.java:71)
ai.dog.bowl.App.lambda$new$0 (App.java:57)
ai.dog.bowl.client.firebase.queue.Task.process (Task.java:95)
ai.dog.bowl.client.firebase.queue.QueueTask.run (QueueTask.java:110)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:617)
java.lang.Thread.run (Thread.java:745)
java.net.UnknownHostException: spark-master.spark.development.feedeo.io: Name does not resolve
java.net.Inet4AddressImpl.lookupAllHostAddr (Unknown source)
java.net.InetAddress$2.lookupAllHostAddr (InetAddress.java:928)
java.net.InetAddress.getAddressesFromNameService (InetAddress.java:1323)
java.net.InetAddress.getAllByName0 (InetAddress.java:1276)
java.net.InetAddress.getAllByName (InetAddress.java:1192)
java.net.InetAddress.getAllByName (InetAddress.java:1126)
org.apache.http.impl.conn.SystemDefaultDnsResolver.resolve (SystemDefaultDnsResolver.java:45)
org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect (DefaultHttpClientConnectionOperator.java:111)
org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect (PoolingHttpClientConnectionManager.java:353)
org.apache.http.impl.execchain.MainClientExec.establishRoute (MainClientExec.java:380)
org.apache.http.impl.execchain.MainClientExec.execute (MainClientExec.java:236)
org.apache.http.impl.execchain.ProtocolExec.execute (ProtocolExec.java:184)
org.apache.http.impl.execchain.RetryExec.execute (RetryExec.java:88)
org.apache.http.impl.execchain.RedirectExec.execute (RedirectExec.java:110)
org.apache.http.impl.client.InternalHttpClient.doExecute (InternalHttpClient.java:184)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:71)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:220)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:164)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:139)
ai.dog.bowl.client.spark.rest.HttpRequestUtil.executeHttpMethodAndGetResponse (HttpRequestUtil.java:20)
ai.dog.bowl.client.spark.rest.JobSubmitRequestSpecificationImpl.submit (JobSubmitRequestSpecificationImpl.java:144)
ai.dog.bowl.client.spark.SparkClient.submit (SparkClient.java:71)
ai.dog.bowl.App.lambda$new$0 (App.java:57)
ai.dog.bowl.client.firebase.queue.Task.process (Task.java:95)
ai.dog.bowl.client.firebase.queue.QueueTask.run (QueueTask.java:110)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:617)
java.lang.Thread.run (Thread.java:745)
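The traceback above fails inside `InetAddress` name resolution before any HTTP request is made, so the Spark master hostname simply does not resolve from inside the container. A minimal sketch of a fail-fast pre-flight check (the class and method names here are hypothetical, not part of dogbowl's API) that would surface this as a clear error instead of a wrapped `FailedSparkRequestException`:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class DnsCheck {
    // Returns true if the host resolves from this JVM's resolver,
    // false on UnknownHostException -- the exact failure seen above.
    static boolean resolves(String host) {
        try {
            InetAddress.getByName(host);
            return true;
        } catch (UnknownHostException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // "localhost" always resolves; the Spark master name from the
        // trace would return false in the failing environment.
        System.out.println(resolves("localhost"));
    }
}
```

Running such a check at startup (or in a Dropwizard health check, which issue "Health checks not implemented" above notes is missing) would distinguish DNS misconfiguration from Spark-side failures.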
docker should NAT outgoing traffic
java.net.ConnectException: Operation timed out (Connection timed out)
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/40/
ai.dog.bowl.client.spark.rest.FailedSparkRequestException: org.apache.http.conn.HttpHostConnectException: Connect to spark-master.spark.development.feedeo.io:6066 [spark-master.spark.development.feedeo.io/172.16.1.37] failed: Operation timed out (Connection timed out)
ai.dog.bowl.client.spark.rest.HttpRequestUtil.executeHttpMethodAndGetResponse (HttpRequestUtil.java:29)
ai.dog.bowl.client.spark.rest.JobSubmitRequestSpecificationImpl.submit (JobSubmitRequestSpecificationImpl.java:144)
ai.dog.bowl.client.spark.SparkClient.submit (SparkClient.java:71)
ai.dog.bowl.App.lambda$new$0 (App.java:57)
ai.dog.bowl.client.firebase.queue.Task.process (Task.java:95)
ai.dog.bowl.client.firebase.queue.QueueTask.run (QueueTask.java:110)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:617)
java.lang.Thread.run (Thread.java:745)
org.apache.http.conn.HttpHostConnectException: Connect to spark-master.spark.development.feedeo.io:6066 [spark-master.spark.development.feedeo.io/172.16.1.37] failed: Operation timed out (Connection timed out)
org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect (DefaultHttpClientConnectionOperator.java:151)
org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect (PoolingHttpClientConnectionManager.java:353)
org.apache.http.impl.execchain.MainClientExec.establishRoute (MainClientExec.java:380)
org.apache.http.impl.execchain.MainClientExec.execute (MainClientExec.java:236)
org.apache.http.impl.execchain.ProtocolExec.execute (ProtocolExec.java:184)
org.apache.http.impl.execchain.RetryExec.execute (RetryExec.java:88)
org.apache.http.impl.execchain.RedirectExec.execute (RedirectExec.java:110)
org.apache.http.impl.client.InternalHttpClient.doExecute (InternalHttpClient.java:184)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:71)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:220)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:164)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:139)
ai.dog.bowl.client.spark.rest.HttpRequestUtil.executeHttpMethodAndGetResponse (HttpRequestUtil.java:20)
ai.dog.bowl.client.spark.rest.JobSubmitRequestSpecificationImpl.submit (JobSubmitRequestSpecificationImpl.java:144)
ai.dog.bowl.client.spark.SparkClient.submit (SparkClient.java:71)
ai.dog.bowl.App.lambda$new$0 (App.java:57)
ai.dog.bowl.client.firebase.queue.Task.process (Task.java:95)
ai.dog.bowl.client.firebase.queue.QueueTask.run (QueueTask.java:110)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:617)
java.lang.Thread.run (Thread.java:745)
java.net.ConnectException: Operation timed out (Connection timed out)
java.net.PlainSocketImpl.socketConnect (Unknown source)
java.net.AbstractPlainSocketImpl.doConnect (AbstractPlainSocketImpl.java:350)
java.net.AbstractPlainSocketImpl.connectToAddress (AbstractPlainSocketImpl.java:206)
java.net.AbstractPlainSocketImpl.connect (AbstractPlainSocketImpl.java:188)
java.net.SocksSocketImpl.connect (SocksSocketImpl.java:392)
java.net.Socket.connect (Socket.java:589)
org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket (PlainConnectionSocketFactory.java:74)
org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect (DefaultHttpClientConnectionOperator.java:134)
org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect (PoolingHttpClientConnectionManager.java:353)
org.apache.http.impl.execchain.MainClientExec.establishRoute (MainClientExec.java:380)
org.apache.http.impl.execchain.MainClientExec.execute (MainClientExec.java:236)
org.apache.http.impl.execchain.ProtocolExec.execute (ProtocolExec.java:184)
org.apache.http.impl.execchain.RetryExec.execute (RetryExec.java:88)
org.apache.http.impl.execchain.RedirectExec.execute (RedirectExec.java:110)
org.apache.http.impl.client.InternalHttpClient.doExecute (InternalHttpClient.java:184)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:71)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:220)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:164)
org.apache.http.impl.client.CloseableHttpClient.execute (CloseableHttpClient.java:139)
ai.dog.bowl.client.spark.rest.HttpRequestUtil.executeHttpMethodAndGetResponse (HttpRequestUtil.java:20)
ai.dog.bowl.client.spark.rest.JobSubmitRequestSpecificationImpl.submit (JobSubmitRequestSpecificationImpl.java:144)
ai.dog.bowl.client.spark.SparkClient.submit (SparkClient.java:71)
ai.dog.bowl.App.lambda$new$0 (App.java:57)
ai.dog.bowl.client.firebase.queue.Task.process (Task.java:95)
ai.dog.bowl.client.firebase.queue.QueueTask.run (QueueTask.java:110)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:617)
java.lang.Thread.run (Thread.java:745)
DateTimeParseException: Text '1488067200' could not be parsed at index 0
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/39/
Traceback (most recent call last):
File "java.lang.Thread.java", line 745, in run
File "java.util.concurrent.ThreadPoolExecutor$Worker.java", line 617, in run
File "java.util.concurrent.ThreadPoolExecutor.java", line 1142, in runWorker
File "org.apache.spark.executor.Executor$TaskRunner.java", line 274, in run
File "org.apache.spark.scheduler.Task.java", line 86, in run
File "org.apache.spark.scheduler.ResultTask.java", line 70, in runTask
File "org.apache.spark.SparkContext$$anonfun$runJob$5.java", line 1917, in apply
File "org.apache.spark.SparkContext$$anonfun$runJob$5.java", line 1917, in apply
File "org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$27.java", line 894, in apply
File "org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$27.java", line 894, in apply
File "org.apache.spark.InterruptibleIterator.java", line 28, in foreach
File "scala.collection.Iterator$class.java", line 893, in foreach
File "org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1.java", line 350, in apply
File "org.apache.spark.api.java.JavaRDDLike$$anonfun$foreach$1.java", line 350, in apply
File "ai.dog.bowl.job.PerformancePresenceStatsUpdate.java", line 52, in lambda$main$f0a8df0b$1
File "ai.dog.bowl.stats.presence.UpdatePresenceStats.java", line 59, in updateEmployeeStats
File "ai.dog.bowl.repository.firebase.StatsFirebaseRestRepository.java", line 34, in retrieveAllTimePeriodEndDate
File "java.time.ZonedDateTime.java", line 597, in parse
File "java.time.format.DateTimeFormatter.java", line 1851, in parse
File "java.time.format.DateTimeFormatter.java", line 1949, in parseResolved0
DateTimeParseException: Text '1488067200' could not be parsed at index 0
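The trace shows `ZonedDateTime.parse` being handed the string `'1488067200'`, which is a Unix epoch timestamp (seconds), not the ISO-8601 text `parse()` expects, hence the failure at index 0. A minimal sketch of the likely fix, assuming the Firebase field stores epoch seconds (the helper name is illustrative, not the repository's actual method):

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class EpochParse {
    // ZonedDateTime.parse("1488067200") throws DateTimeParseException
    // because it expects ISO-8601 text such as "2017-02-26T00:00:00Z".
    // Epoch-second values must go through Instant instead.
    static ZonedDateTime fromEpochSeconds(String raw) {
        long epochSeconds = Long.parseLong(raw.trim());
        return Instant.ofEpochSecond(epochSeconds).atZone(ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        // 1488067200 is 2017-02-26T00:00Z
        System.out.println(fromEpochSeconds("1488067200"));
    }
}
```

If the field can hold either representation, the retrieval code would need to branch on whether the text is all digits before choosing the parser.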
org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=namenode.hadoop.development.feedeo.io/172.16.1.88:8020]
View details in Rollbar: https://rollbar.com/hfreire/dogbowl/items/13/
org.eclipse.jetty.util.MultiException: Multiple exceptions
org.eclipse.jetty.server.Server.doStart (Server.java:329)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:60)
org.apache.hadoop.net.ConnectTimeoutException: Call From b266f7c299f9/172.17.0.4 to namenode.hadoop.development.feedeo.io:8020 failed on socket timeout exception: org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=namenode.hadoop.development.feedeo.io/172.16.1.88:8020]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
sun.reflect.NativeConstructorAccessorImpl.newInstance0 (Unknown source)
sun.reflect.NativeConstructorAccessorImpl.newInstance (NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance (DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance (Constructor.java:423)
org.apache.hadoop.net.NetUtils.wrapWithMessage (NetUtils.java:792)
org.apache.hadoop.net.NetUtils.wrapException (NetUtils.java:751)
org.apache.hadoop.ipc.Client.call (Client.java:1479)
org.apache.hadoop.ipc.Client.call (Client.java:1412)
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke (ProtobufRpcEngine.java:229)
com.sun.proxy.$Proxy41.getFileInfo (Unknown source)
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo (ClientNamenodeProtocolTranslatorPB.java:771)
sun.reflect.NativeMethodAccessorImpl.invoke0 (Unknown source)
sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke (Method.java:498)
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod (RetryInvocationHandler.java:191)
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke (RetryInvocationHandler.java:102)
com.sun.proxy.$Proxy42.getFileInfo (Unknown source)
org.apache.hadoop.hdfs.DFSClient.getFileInfo (DFSClient.java:2108)
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall (DistributedFileSystem.java:1305)
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall (DistributedFileSystem.java:1301)
org.apache.hadoop.fs.FileSystemLinkResolver.resolve (FileSystemLinkResolver.java:81)
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus (DistributedFileSystem.java:1317)
org.apache.hadoop.fs.FileSystem.exists (FileSystem.java:1424)
ai.dog.bowl.client.hdfs.HDFSClient.start (HDFSClient.java:43)
io.dropwizard.lifecycle.JettyManaged.doStart (JettyManaged.java:27)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
org.eclipse.jetty.util.component.ContainerLifeCycle.start (ContainerLifeCycle.java:132)
org.eclipse.jetty.server.Server.start (Server.java:387)
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart (ContainerLifeCycle.java:114)
org.eclipse.jetty.server.handler.AbstractHandler.doStart (AbstractHandler.java:61)
org.eclipse.jetty.server.Server.doStart (Server.java:354)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java:77)
io.dropwizard.cli.Cli.run (Cli.java:70)
io.dropwizard.Application.run (Application.java:80)
ai.dog.bowl.App.main (App.java:60)
org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=namenode.hadoop.development.feedeo.io/172.16.1.88:8020]
org.apache.hadoop.net.NetUtils.connect (NetUtils.java:534)
org.apache.hadoop.net.NetUtils.connect (NetUtils.java:495)
org.apache.hadoop.ipc.Client$Connection.setupConnection (Client.java:614)
org.apache.hadoop.ipc.Client$Connection.setupIOstreams (Client.java:712)
org.apache.hadoop.ipc.Client$Connection.access$2900 (Client.java:375)
org.apache.hadoop.ipc.Client.getConnection (Client.java:1528)
org.apache.hadoop.ipc.Client.call (Client.java:1451)
org.apache.hadoop.ipc.Client.call (Client.java:1412)
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke (ProtobufRpcEngine.java:229)
com.sun.proxy.$Proxy41.getFileInfo (Unknown source)
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo (ClientNamenodeProtocolTranslatorPB.java:771)
sun.reflect.NativeMethodAccessorImpl.invoke0 (Unknown source)
sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke (Method.java:498)
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod (RetryInvocationHandler.java:191)
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke (RetryInvocationHandler.java:102)
com.sun.proxy.$Proxy42.getFileInfo (Unknown source)
org.apache.hadoop.hdfs.DFSClient.getFileInfo (DFSClient.java:2108)
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall (DistributedFileSystem.java:1305)
org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall (DistributedFileSystem.java:1301)
org.apache.hadoop.fs.FileSystemLinkResolver.resolve (FileSystemLinkResolver.java:81)
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus (DistributedFileSystem.java:1317)
org.apache.hadoop.fs.FileSystem.exists (FileSystem.java:1424)
ai.dog.bowl.client.hdfs.HDFSClient.start (HDFSClient.java:43)
io.dropwizard.lifecycle.JettyManaged.doStart (JettyManaged.java:27)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
org.eclipse.jetty.util.component.ContainerLifeCycle.start (ContainerLifeCycle.java:132)
org.eclipse.jetty.server.Server.start (Server.java:387)
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart (ContainerLifeCycle.java:114)
org.eclipse.jetty.server.handler.AbstractHandler.doStart (AbstractHandler.java:61)
org.eclipse.jetty.server.Server.doStart (Server.java:354)
org.eclipse.jetty.util.component.AbstractLifeCycle.start (AbstractLifeCycle.java:68)
io.dropwizard.cli.ServerCommand.run (ServerCommand.java:43)
io.dropwizard.cli.EnvironmentCommand.run (EnvironmentCommand.java:41)
io.dropwizard.cli.ConfiguredCommand.run (ConfiguredCommand.java