
Comments (3)

velvia commented on May 5, 2024

I didn't look through all the logs, but one suspicious line is:

[2015-01-27 13:14:00,470] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$c] - Loading object no.such.class$ using loader spark.jobserver.util.ContextURLClassLoader@7bbe1c14
[2015-01-27 13:14:00,472] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$c] - Loading class no.such.class using loader spark.jobserver.util.ContextURLClassLoader@7bbe1c14

What was the command or request you sent?
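Also, note the `java.lang.Throwable: No input.string config param` further down in your log: that is the word-count job's own config validation rejecting the request, not a SparkContext crash. Roughly (sketched from memory of the SparkJob API — check the signatures in your checkout):

```scala
import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValidation, SparkJobValid, SparkJobInvalid}
import scala.util.Try

object WordCountSketch extends SparkJob {
  // Runs before the job; rejects requests that don't supply input.string
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    Try(config.getString("input.string"))
      .map(_ => SparkJobValid)
      .getOrElse(SparkJobInvalid("No input.string config param"))

  override def runJob(sc: SparkContext, config: Config): Any =
    sc.parallelize(config.getString("input.string").split(" ").toSeq)
      .countByValue()
}
```

So if the test harness (or your request) isn't passing an `input.string` setting, you'd see exactly that Throwable even though the context itself came up fine.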

On Tue, Jan 27, 2015 at 6:23 AM, jamborta [email protected] wrote:

hi all,

I am trying to run server_deploy.sh, and it seems to fail to create the
SparkContext in the first test, using spark.jobserver.WordCountExample.
I'm not sure what the problem might be, as I can run that example separately.

here is my job-server-test.log:

[2015-01-27 13:13:44,630] INFO .jobserver.JobManagerActor [] [akka://test/user/$a] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:13:44,869] INFO k.jobserver.JobStatusActor [] [akka://test/user/$a/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:13:44,872] INFO k.jobserver.JobResultActor [] [akka://test/user/$a/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:13:45,035] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:13:45,035] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:13:45,040] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:13:45,051] INFO .jobserver.JobManagerActor [] [akka://test/user/$b] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:13:45,055] INFO k.jobserver.JobResultActor [] [akka://test/user/$b/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:13:45,059] INFO k.jobserver.JobStatusActor [] [akka://test/user/$b/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:13:45,229] WARN rg.apache.spark.util.Utils [] [akka://test/user/$b] - Your hostname, tamas-laptop resolves to a loopback address: 127.0.1.1; using 10.1.3.213 instead (on interface eth0)
[2015-01-27 13:13:45,230] WARN rg.apache.spark.util.Utils [] [akka://test/user/$b] - Set SPARK_LOCAL_IP if you need to bind to another address
[2015-01-27 13:13:45,609] INFO ache.spark.SecurityManager [] [akka://test/user/$b] - Changing view acls to: tja01
[2015-01-27 13:13:45,610] INFO ache.spark.SecurityManager [] [akka://test/user/$b] - Changing modify acls to: tja01
[2015-01-27 13:13:45,611] INFO ache.spark.SecurityManager [] [akka://test/user/$b] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:13:45,908] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$b] - Slf4jLogger started
[2015-01-27 13:13:45,999] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:13:46,450] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://[email protected]:34516]
[2015-01-27 13:13:46,463] INFO rg.apache.spark.util.Utils [] [akka://test/user/$b] - Successfully started service 'sparkDriver' on port 34516.
[2015-01-27 13:13:46,498] INFO org.apache.spark.SparkEnv [] [akka://test/user/$b] - Registering MapOutputTracker
[2015-01-27 13:13:46,525] INFO org.apache.spark.SparkEnv [] [akka://test/user/$b] - Registering BlockManagerMaster
[2015-01-27 13:13:46,560] INFO k.storage.DiskBlockManager [] [akka://test/user/$b] - Created local directory at /tmp/spark-local-20150127131346-c001
[2015-01-27 13:13:46,572] INFO .spark.storage.MemoryStore [] [akka://test/user/$b] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:13:47,301] WARN doop.util.NativeCodeLoader [] [akka://test/user/$b] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[2015-01-27 13:13:47,548] INFO pache.spark.HttpFileServer [] [akka://test/user/$b] - HTTP File server directory is /tmp/spark-34bbc817-ea02-4871-924d-dfb34763e456
[2015-01-27 13:13:47,564] INFO rg.apache.spark.HttpServer [] [akka://test/user/$b] - Starting HTTP Server
[2015-01-27 13:13:47,799] INFO clipse.jetty.server.Server [] [akka://test/user/$b] - jetty-8.1.14.v20131031
[2015-01-27 13:13:47,825] INFO y.server.AbstractConnector [] [akka://test/user/$b] - Started [email protected]:60616
[2015-01-27 13:13:47,826] INFO rg.apache.spark.util.Utils [] [akka://test/user/$b] - Successfully started service 'HTTP file server' on port 60616.
[2015-01-27 13:13:53,012] INFO clipse.jetty.server.Server [] [akka://test/user/$b] - jetty-8.1.14.v20131031
[2015-01-27 13:13:53,034] INFO y.server.AbstractConnector [] [akka://test/user/$b] - Started [email protected]:34008
[2015-01-27 13:13:53,034] INFO rg.apache.spark.util.Utils [] [akka://test/user/$b] - Successfully started service 'SparkUI' on port 34008.
[2015-01-27 13:13:53,040] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$b] - Started SparkUI at http://10.1.3.213:34008
[2015-01-27 13:13:53,294] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://[email protected]:34516/user/HeartbeatReceiver
[2015-01-27 13:13:53,528] INFO .NettyBlockTransferService [] [akka://test/user/$b] - Server created on 46782
[2015-01-27 13:13:53,532] INFO storage.BlockManagerMaster [] [akka://test/user/$b] - Trying to register BlockManager
[2015-01-27 13:13:53,535] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:46782 with 681.8 MB RAM, BlockManagerId(, localhost, 46782)
[2015-01-27 13:13:53,540] INFO storage.BlockManagerMaster [] [akka://test/user/$b] - Registered BlockManager
[2015-01-27 13:13:53,851] INFO .jobserver.RddManagerActor [] [akka://test/user/$b/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:13:53,853] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:13:53,853] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:13:53,854] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:13:53,854] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:13:53,870] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:13:53,871] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:13:53,871] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:13:53,871] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:13:53,871] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:13:53,875] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:13:53,932] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://10.1.3.213:34008
[2015-01-27 13:13:53,935] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:13:54,996] INFO apOutputTrackerMasterActor [] [akka://test/user/$b] - MapOutputTrackerActor stopped!
[2015-01-27 13:13:55,082] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:13:55,083] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:13:55,085] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:13:55,099] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:13:55,099] INFO rovider$RemotingTerminator [] [akka.tcp://[email protected]:34516/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:13:55,100] INFO .jobserver.JobManagerActor [] [akka://test/user/$c] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:13:55,101] INFO rovider$RemotingTerminator [] [akka.tcp://[email protected]:34516/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:13:55,102] INFO k.jobserver.JobStatusActor [] [akka://test/user/$c/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:13:55,102] INFO k.jobserver.JobResultActor [] [akka://test/user/$c/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:13:55,137] INFO ache.spark.SecurityManager [] [akka://test/user/$c] - Changing view acls to: tja01
[2015-01-27 13:13:55,137] INFO ache.spark.SecurityManager [] [akka://test/user/$c] - Changing modify acls to: tja01
[2015-01-27 13:13:55,138] INFO ache.spark.SecurityManager [] [akka://test/user/$c] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:13:55,169] INFO rovider$RemotingTerminator [] [akka.tcp://[email protected]:34516/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:13:55,249] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$c] - Slf4jLogger started
[2015-01-27 13:13:55,259] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:13:55,280] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:49266]
[2015-01-27 13:13:55,282] INFO rg.apache.spark.util.Utils [] [akka://test/user/$c] - Successfully started service 'sparkDriver' on port 49266.
[2015-01-27 13:13:55,283] INFO org.apache.spark.SparkEnv [] [akka://test/user/$c] - Registering MapOutputTracker
[2015-01-27 13:13:55,285] INFO org.apache.spark.SparkEnv [] [akka://test/user/$c] - Registering BlockManagerMaster
[2015-01-27 13:13:55,287] INFO k.storage.DiskBlockManager [] [akka://test/user/$c] - Created local directory at /tmp/spark-local-20150127131355-fede
[2015-01-27 13:13:55,288] INFO .spark.storage.MemoryStore [] [akka://test/user/$c] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:13:55,290] INFO pache.spark.HttpFileServer [] [akka://test/user/$c] - HTTP File server directory is /tmp/spark-443848ba-82ea-4c59-bc86-16a71dc8bd65
[2015-01-27 13:13:55,291] INFO rg.apache.spark.HttpServer [] [akka://test/user/$c] - Starting HTTP Server
[2015-01-27 13:13:55,292] INFO clipse.jetty.server.Server [] [akka://test/user/$c] - jetty-8.1.14.v20131031
[2015-01-27 13:13:55,299] INFO y.server.AbstractConnector [] [akka://test/user/$c] - Started [email protected]:58253
[2015-01-27 13:13:55,299] INFO rg.apache.spark.util.Utils [] [akka://test/user/$c] - Successfully started service 'HTTP file server' on port 58253.
[2015-01-27 13:14:00,325] INFO clipse.jetty.server.Server [] [akka://test/user/$c] - jetty-8.1.14.v20131031
[2015-01-27 13:14:00,333] INFO y.server.AbstractConnector [] [akka://test/user/$c] - Started [email protected]:58755
[2015-01-27 13:14:00,334] INFO rg.apache.spark.util.Utils [] [akka://test/user/$c] - Successfully started service 'SparkUI' on port 58755.
[2015-01-27 13:14:00,334] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$c] - Started SparkUI at http://localhost:58755
[2015-01-27 13:14:00,410] INFO pache.spark.util.AkkaUtils [] [akka://test/user/$c] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:49266/user/HeartbeatReceiver
[2015-01-27 13:14:00,417] INFO .NettyBlockTransferService [] [akka://test/user/$c] - Server created on 55801
[2015-01-27 13:14:00,417] INFO storage.BlockManagerMaster [] [akka://test/user/$c] - Trying to register BlockManager
[2015-01-27 13:14:00,418] INFO ge.BlockManagerMasterActor [] [akka://test/user/$c] - Registering block manager localhost:55801 with 681.8 MB RAM, BlockManagerId(, localhost, 55801)
[2015-01-27 13:14:00,418] INFO storage.BlockManagerMaster [] [akka://test/user/$c] - Registered BlockManager
[2015-01-27 13:14:00,431] INFO .jobserver.RddManagerActor [] [akka://test/user/$c/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:00,433] INFO .jobserver.JobManagerActor [] [akka://test/user/$c] - Loading class no.such.class for app notajar
[2015-01-27 13:14:00,462] INFO .apache.spark.SparkContext [] [akka://test/user/$c] - Added JAR /tmp/InMemoryDAO722024764359244436.jar at http://10.1.3.213:58253/jars/InMemoryDAO722024764359244436.jar with timestamp 1422364440461
[2015-01-27 13:14:00,469] INFO util.ContextURLClassLoader [] [akka://test/user/$c] - Added URL file:/tmp/InMemoryDAO722024764359244436.jar to ContextURLClassLoader
[2015-01-27 13:14:00,470] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$c] - Loading object no.such.class$ using loader spark.jobserver.util.ContextURLClassLoader@7bbe1c14
[2015-01-27 13:14:00,472] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$c] - Loading class no.such.class using loader spark.jobserver.util.ContextURLClassLoader@7bbe1c14
[2015-01-27 13:14:00,473] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:00,473] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:00,473] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:00,473] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:00,484] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:00,485] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:00,485] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:00,485] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:00,486] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:00,486] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:00,486] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:00,486] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:00,487] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:00,487] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:00,487] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:00,487] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:00,488] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:00,488] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:00,488] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:00,489] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:00,489] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:00,489] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:00,490] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:00,491] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:00,491] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:00,491] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:00,491] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:00,543] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:58755
[2015-01-27 13:14:00,544] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:01,603] INFO apOutputTrackerMasterActor [] [akka://test/user/$c] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:01,616] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:01,616] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:01,617] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:01,620] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:01,622] INFO .jobserver.JobManagerActor [] [akka://test/user/$d] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:01,623] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:49266/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:01,625] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:49266/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:01,627] INFO k.jobserver.JobStatusActor [] [akka://test/user/$d/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:01,628] INFO k.jobserver.JobResultActor [] [akka://test/user/$d/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:01,636] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:49266/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:01,646] INFO ache.spark.SecurityManager [] [akka://test/user/$d] - Changing view acls to: tja01
[2015-01-27 13:14:01,663] INFO ache.spark.SecurityManager [] [akka://test/user/$d] - Changing modify acls to: tja01
[2015-01-27 13:14:01,663] INFO ache.spark.SecurityManager [] [akka://test/user/$d] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:01,800] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$d] - Slf4jLogger started
[2015-01-27 13:14:01,816] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:01,864] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:44175]
[2015-01-27 13:14:01,865] INFO rg.apache.spark.util.Utils [] [akka://test/user/$d] - Successfully started service 'sparkDriver' on port 44175.
[2015-01-27 13:14:01,866] INFO org.apache.spark.SparkEnv [] [akka://test/user/$d] - Registering MapOutputTracker
[2015-01-27 13:14:01,868] INFO org.apache.spark.SparkEnv [] [akka://test/user/$d] - Registering BlockManagerMaster
[2015-01-27 13:14:01,869] INFO k.storage.DiskBlockManager [] [akka://test/user/$d] - Created local directory at /tmp/spark-local-20150127131401-3d1f
[2015-01-27 13:14:01,870] INFO .spark.storage.MemoryStore [] [akka://test/user/$d] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:01,872] INFO pache.spark.HttpFileServer [] [akka://test/user/$d] - HTTP File server directory is /tmp/spark-98b99407-490b-4beb-9da9-3c753cd2e8b9
[2015-01-27 13:14:01,872] INFO rg.apache.spark.HttpServer [] [akka://test/user/$d] - Starting HTTP Server
[2015-01-27 13:14:01,874] INFO clipse.jetty.server.Server [] [akka://test/user/$d] - jetty-8.1.14.v20131031
[2015-01-27 13:14:01,882] INFO y.server.AbstractConnector [] [akka://test/user/$d] - Started [email protected]:48214
[2015-01-27 13:14:01,882] INFO rg.apache.spark.util.Utils [] [akka://test/user/$d] - Successfully started service 'HTTP file server' on port 48214.
[2015-01-27 13:14:06,911] INFO clipse.jetty.server.Server [] [akka://test/user/$d] - jetty-8.1.14.v20131031
[2015-01-27 13:14:06,919] INFO y.server.AbstractConnector [] [akka://test/user/$d] - Started [email protected]:52772
[2015-01-27 13:14:06,919] INFO rg.apache.spark.util.Utils [] [akka://test/user/$d] - Successfully started service 'SparkUI' on port 52772.
[2015-01-27 13:14:06,920] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$d] - Started SparkUI at http://localhost:52772
[2015-01-27 13:14:06,980] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:44175/user/HeartbeatReceiver
[2015-01-27 13:14:06,981] INFO .NettyBlockTransferService [] [akka://test/user/$d] - Server created on 49782
[2015-01-27 13:14:06,981] INFO storage.BlockManagerMaster [] [akka://test/user/$d] - Trying to register BlockManager
[2015-01-27 13:14:06,982] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:49782 with 681.8 MB RAM, BlockManagerId(, localhost, 49782)
[2015-01-27 13:14:06,982] INFO storage.BlockManagerMaster [] [akka://test/user/$d] - Registered BlockManager
[2015-01-27 13:14:06,991] INFO .jobserver.RddManagerActor [] [akka://test/user/$d/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:06,992] INFO .jobserver.JobManagerActor [] [akka://test/user/$d] - Loading class spark.jobserver.WordCountExample for app demo
[2015-01-27 13:14:06,993] INFO .apache.spark.SparkContext [] [akka://test/user/$d] - Added JAR /tmp/InMemoryDAO8320471003519538213.jar at http://10.1.3.213:48214/jars/InMemoryDAO8320471003519538213.jar with timestamp 1422364446993
[2015-01-27 13:14:07,000] INFO util.ContextURLClassLoader [] [akka://test/user/$d] - Added URL file:/tmp/InMemoryDAO8320471003519538213.jar to ContextURLClassLoader
[2015-01-27 13:14:07,000] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$d] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@237ac531
[2015-01-27 13:14:07,006] INFO .jobserver.JobManagerActor [] [akka://test/user/$d] - Starting Spark job 1f1b4a8c-edde-49c0-855d-238efb9ff8f0 [spark.jobserver.WordCountExample]...
[2015-01-27 13:14:07,006] INFO k.jobserver.JobResultActor [] [akka://test/user/$d/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID 1f1b4a8c-edde-49c0-855d-238efb9ff8f0
[2015-01-27 13:14:07,007] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:07,008] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:07,009] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:07,009] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:07,010] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:07,019] WARN .jobserver.JobManagerActor [] [] - Exception from job 1f1b4a8c-edde-49c0-855d-238efb9ff8f0:
java.lang.Throwable: No input.string config param
at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:213)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:07,023] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:07,024] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:07,025] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:07,025] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:07,025] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:07,026] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:07,026] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:07,026] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:07,026] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:07,027] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:07,027] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:07,027] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:07,028] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:07,028] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:07,028] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:07,029] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:07,029] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:07,029] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:07,029] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:07,030] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:07,030] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:07,031] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:07,031] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:07,082] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:52772
[2015-01-27 13:14:07,083] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:08,135] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:08,138] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:08,139] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:08,139] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:08,141] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:08,142] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:44175/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:08,143] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:44175/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:08,150] INFO .jobserver.JobManagerActor [] [akka://test/user/$e] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:08,152] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:44175/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:08,153] INFO k.jobserver.JobStatusActor [] [akka://test/user/$e/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:08,154] INFO k.jobserver.JobResultActor [] [akka://test/user/$e/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:08,172] INFO ache.spark.SecurityManager [] [akka://test/user/$e] - Changing view acls to: tja01
[2015-01-27 13:14:08,172] INFO ache.spark.SecurityManager [] [akka://test/user/$e] - Changing modify acls to: tja01
[2015-01-27 13:14:08,172] INFO ache.spark.SecurityManager [] [akka://test/user/$e] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:08,279] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$e] - Slf4jLogger started
[2015-01-27 13:14:08,287] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:08,302] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:56015]
[2015-01-27 13:14:08,303] INFO rg.apache.spark.util.Utils [] [akka://test/user/$e] - Successfully started service 'sparkDriver' on port 56015.
[2015-01-27 13:14:08,304] INFO org.apache.spark.SparkEnv [] [akka://test/user/$e] - Registering MapOutputTracker
[2015-01-27 13:14:08,305] INFO org.apache.spark.SparkEnv [] [akka://test/user/$e] - Registering BlockManagerMaster
[2015-01-27 13:14:08,306] INFO k.storage.DiskBlockManager [] [akka://test/user/$e] - Created local directory at /tmp/spark-local-20150127131408-2dbe
[2015-01-27 13:14:08,307] INFO .spark.storage.MemoryStore [] [akka://test/user/$e] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:08,309] INFO pache.spark.HttpFileServer [] [akka://test/user/$e] - HTTP File server directory is /tmp/spark-ef2db7f2-dfce-4e9d-8ac5-aa9f6386f6bf
[2015-01-27 13:14:08,309] INFO rg.apache.spark.HttpServer [] [akka://test/user/$e] - Starting HTTP Server
[2015-01-27 13:14:08,311] INFO clipse.jetty.server.Server [] [akka://test/user/$e] - jetty-8.1.14.v20131031
[2015-01-27 13:14:08,312] INFO y.server.AbstractConnector [] [akka://test/user/$e] - Started [email protected]:43668
[2015-01-27 13:14:08,313] INFO rg.apache.spark.util.Utils [] [akka://test/user/$e] - Successfully started service 'HTTP file server' on port 43668.
[2015-01-27 13:14:13,331] INFO clipse.jetty.server.Server [] [akka://test/user/$e] - jetty-8.1.14.v20131031
[2015-01-27 13:14:13,342] INFO y.server.AbstractConnector [] [akka://test/user/$e] - Started [email protected]:43117
[2015-01-27 13:14:13,343] INFO rg.apache.spark.util.Utils [] [akka://test/user/$e] - Successfully started service 'SparkUI' on port 43117.
[2015-01-27 13:14:13,343] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$e] - Started SparkUI at http://localhost:43117
[2015-01-27 13:14:13,408] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:56015/user/HeartbeatReceiver
[2015-01-27 13:14:13,411] INFO .NettyBlockTransferService [] [akka://test/user/$e] - Server created on 36598
[2015-01-27 13:14:13,411] INFO storage.BlockManagerMaster [] [akka://test/user/$e] - Trying to register BlockManager
[2015-01-27 13:14:13,412] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:36598 with 681.8 MB RAM, BlockManagerId(, localhost, 36598)
[2015-01-27 13:14:13,412] INFO storage.BlockManagerMaster [] [akka://test/user/$e] - Registered BlockManager
[2015-01-27 13:14:13,419] INFO .jobserver.RddManagerActor [] [akka://test/user/$e/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:13,420] INFO .jobserver.JobManagerActor [] [akka://test/user/$e] - Loading class spark.jobserver.WordCountExample for app demo
[2015-01-27 13:14:13,423] INFO .apache.spark.SparkContext [] [akka://test/user/$e] - Added JAR /tmp/InMemoryDAO2098972855628514058.jar at http://10.1.3.213:43668/jars/InMemoryDAO2098972855628514058.jar with timestamp 1422364453423
[2015-01-27 13:14:13,429] INFO util.ContextURLClassLoader [] [akka://test/user/$e] - Added URL file:/tmp/InMemoryDAO2098972855628514058.jar to ContextURLClassLoader
[2015-01-27 13:14:13,429] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$e] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@23f9a8ff
[2015-01-27 13:14:13,433] INFO .jobserver.JobManagerActor [] [akka://test/user/$e] - Starting Spark job 6bd35ede-445c-4303-95c8-61f8e3d6689b [spark.jobserver.WordCountExample]...
[2015-01-27 13:14:13,433] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:13,433] INFO k.jobserver.JobResultActor [] [akka://test/user/$e/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID 6bd35ede-445c-4303-95c8-61f8e3d6689b
[2015-01-27 13:14:13,437] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:13,437] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:13,437] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:13,438] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:13,449] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:13,449] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:13,450] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:13,450] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:13,450] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:13,450] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:13,451] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:13,451] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:13,451] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:13,451] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:13,452] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:13,452] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:13,452] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:13,452] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:13,453] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:13,453] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:13,453] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:13,453] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:13,507] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:43117
[2015-01-27 13:14:13,507] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:13,570] WARN .jobserver.JobManagerActor [] [] - Exception from job 6bd35ede-445c-4303-95c8-61f8e3d6689b:
org.apache.spark.SparkException: SparkContext has been shutdown
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1277)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1300)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1314)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1328)
at org.apache.spark.rdd.RDD.collect(RDD.scala:780)
at spark.jobserver.WordCountExample$.runJob(WordCountExample.scala:32)
at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:219)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:14,561] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:14,565] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:14,565] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:14,566] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:14,566] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:14,568] INFO .jobserver.JobManagerActor [] [akka://test/user/$f] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:14,569] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:56015/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:14,573] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:56015/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:14,576] INFO k.jobserver.JobStatusActor [] [akka://test/user/$f/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:14,576] INFO k.jobserver.JobResultActor [] [akka://test/user/$f/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:14,581] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:56015/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:14,593] INFO ache.spark.SecurityManager [] [akka://test/user/$f] - Changing view acls to: tja01
[2015-01-27 13:14:14,593] INFO ache.spark.SecurityManager [] [akka://test/user/$f] - Changing modify acls to: tja01
[2015-01-27 13:14:14,593] INFO ache.spark.SecurityManager [] [akka://test/user/$f] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:14,635] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$f] - Slf4jLogger started
[2015-01-27 13:14:14,642] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:14,656] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:42388]
[2015-01-27 13:14:14,657] INFO rg.apache.spark.util.Utils [] [akka://test/user/$f] - Successfully started service 'sparkDriver' on port 42388.
[2015-01-27 13:14:14,657] INFO org.apache.spark.SparkEnv [] [akka://test/user/$f] - Registering MapOutputTracker
[2015-01-27 13:14:14,658] INFO org.apache.spark.SparkEnv [] [akka://test/user/$f] - Registering BlockManagerMaster
[2015-01-27 13:14:14,659] INFO k.storage.DiskBlockManager [] [akka://test/user/$f] - Created local directory at /tmp/spark-local-20150127131414-bc68
[2015-01-27 13:14:14,659] INFO .spark.storage.MemoryStore [] [akka://test/user/$f] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:14,660] INFO pache.spark.HttpFileServer [] [akka://test/user/$f] - HTTP File server directory is /tmp/spark-50d4eb9b-ac3c-43d9-ba59-dbd30c5576bc
[2015-01-27 13:14:14,660] INFO rg.apache.spark.HttpServer [] [akka://test/user/$f] - Starting HTTP Server
[2015-01-27 13:14:14,661] INFO clipse.jetty.server.Server [] [akka://test/user/$f] - jetty-8.1.14.v20131031
[2015-01-27 13:14:14,662] INFO y.server.AbstractConnector [] [akka://test/user/$f] - Started [email protected]:37978
[2015-01-27 13:14:14,662] INFO rg.apache.spark.util.Utils [] [akka://test/user/$f] - Successfully started service 'HTTP file server' on port 37978.
[2015-01-27 13:14:19,675] INFO clipse.jetty.server.Server [] [akka://test/user/$f] - jetty-8.1.14.v20131031
[2015-01-27 13:14:19,691] INFO y.server.AbstractConnector [] [akka://test/user/$f] - Started [email protected]:48028
[2015-01-27 13:14:19,692] INFO rg.apache.spark.util.Utils [] [akka://test/user/$f] - Successfully started service 'SparkUI' on port 48028.
[2015-01-27 13:14:19,692] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$f] - Started SparkUI at http://localhost:48028
[2015-01-27 13:14:19,754] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:42388/user/HeartbeatReceiver
[2015-01-27 13:14:19,768] INFO .NettyBlockTransferService [] [akka://test/user/$f] - Server created on 34558
[2015-01-27 13:14:19,768] INFO storage.BlockManagerMaster [] [akka://test/user/$f] - Trying to register BlockManager
[2015-01-27 13:14:19,769] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:34558 with 681.8 MB RAM, BlockManagerId(, localhost, 34558)
[2015-01-27 13:14:19,769] INFO storage.BlockManagerMaster [] [akka://test/user/$f] - Registered BlockManager
[2015-01-27 13:14:19,779] INFO .jobserver.JobManagerActor [] [akka://test/user/$f] - Loading class spark.jobserver.WordCountExample for app demo
[2015-01-27 13:14:19,780] INFO .jobserver.RddManagerActor [] [akka://test/user/$f/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:19,788] INFO .apache.spark.SparkContext [] [akka://test/user/$f] - Added JAR /tmp/InMemoryDAO6787955189121296071.jar at http://10.1.3.213:37978/jars/InMemoryDAO6787955189121296071.jar with timestamp 1422364459787
[2015-01-27 13:14:19,793] INFO util.ContextURLClassLoader [] [akka://test/user/$f] - Added URL file:/tmp/InMemoryDAO6787955189121296071.jar to ContextURLClassLoader
[2015-01-27 13:14:19,794] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$f] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@6fdc750b
[2015-01-27 13:14:19,796] INFO .jobserver.JobManagerActor [] [akka://test/user/$f] - Starting Spark job dea82364-5c96-4f85-bc0b-333ebd3cf3d4 [spark.jobserver.WordCountExample]...
[2015-01-27 13:14:19,796] INFO k.jobserver.JobResultActor [] [akka://test/user/$f/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID dea82364-5c96-4f85-bc0b-333ebd3cf3d4
[2015-01-27 13:14:19,796] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:19,796] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:19,796] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:19,796] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:19,797] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:19,814] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:19,873] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:48028
[2015-01-27 13:14:19,873] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:19,877] INFO .apache.spark.SparkContext [] [] - Starting job: collect at WordCountExample.scala:32
[2015-01-27 13:14:19,878] WARN .jobserver.JobManagerActor [] [] - Exception from job dea82364-5c96-4f85-bc0b-333ebd3cf3d4:
java.lang.NullPointerException
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1282)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1300)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1314)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1328)
at org.apache.spark.rdd.RDD.collect(RDD.scala:780)
at spark.jobserver.WordCountExample$.runJob(WordCountExample.scala:32)
at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:219)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:20,936] INFO apOutputTrackerMasterActor [] [akka://test/user/$f] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:20,940] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:20,941] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:20,942] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:20,943] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:20,946] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:42388/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:20,947] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:42388/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:20,951] INFO .jobserver.JobManagerActor [] [akka://test/user/$g] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:20,954] INFO k.jobserver.JobStatusActor [] [akka://test/user/$g/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:20,954] INFO k.jobserver.JobResultActor [] [akka://test/user/$g/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:20,962] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:42388/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:20,966] INFO ache.spark.SecurityManager [] [akka://test/user/$g] - Changing view acls to: tja01
[2015-01-27 13:14:20,966] INFO ache.spark.SecurityManager [] [akka://test/user/$g] - Changing modify acls to: tja01
[2015-01-27 13:14:20,967] INFO ache.spark.SecurityManager [] [akka://test/user/$g] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:21,020] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$g] - Slf4jLogger started
[2015-01-27 13:14:21,026] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:21,035] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:33466]
[2015-01-27 13:14:21,036] INFO rg.apache.spark.util.Utils [] [akka://test/user/$g] - Successfully started service 'sparkDriver' on port 33466.
[2015-01-27 13:14:21,036] INFO org.apache.spark.SparkEnv [] [akka://test/user/$g] - Registering MapOutputTracker
[2015-01-27 13:14:21,037] INFO org.apache.spark.SparkEnv [] [akka://test/user/$g] - Registering BlockManagerMaster
[2015-01-27 13:14:21,038] INFO k.storage.DiskBlockManager [] [akka://test/user/$g] - Created local directory at /tmp/spark-local-20150127131421-1f64
[2015-01-27 13:14:21,038] INFO .spark.storage.MemoryStore [] [akka://test/user/$g] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:21,040] INFO pache.spark.HttpFileServer [] [akka://test/user/$g] - HTTP File server directory is /tmp/spark-a4cc300f-faf3-4065-9763-b07ba8f68cc9
[2015-01-27 13:14:21,040] INFO rg.apache.spark.HttpServer [] [akka://test/user/$g] - Starting HTTP Server
[2015-01-27 13:14:21,041] INFO clipse.jetty.server.Server [] [akka://test/user/$g] - jetty-8.1.14.v20131031
[2015-01-27 13:14:21,042] INFO y.server.AbstractConnector [] [akka://test/user/$g] - Started [email protected]:50655
[2015-01-27 13:14:21,042] INFO rg.apache.spark.util.Utils [] [akka://test/user/$g] - Successfully started service 'HTTP file server' on port 50655.
[2015-01-27 13:14:26,056] INFO clipse.jetty.server.Server [] [akka://test/user/$g] - jetty-8.1.14.v20131031
[2015-01-27 13:14:26,066] INFO y.server.AbstractConnector [] [akka://test/user/$g] - Started [email protected]:49996
[2015-01-27 13:14:26,073] INFO rg.apache.spark.util.Utils [] [akka://test/user/$g] - Successfully started service 'SparkUI' on port 49996.
[2015-01-27 13:14:26,073] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$g] - Started SparkUI at http://localhost:49996
[2015-01-27 13:14:26,107] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:33466/user/HeartbeatReceiver
[2015-01-27 13:14:26,110] INFO .NettyBlockTransferService [] [akka://test/user/$g] - Server created on 45721
[2015-01-27 13:14:26,110] INFO storage.BlockManagerMaster [] [akka://test/user/$g] - Trying to register BlockManager
[2015-01-27 13:14:26,111] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:45721 with 681.8 MB RAM, BlockManagerId(, localhost, 45721)
[2015-01-27 13:14:26,112] INFO storage.BlockManagerMaster [] [akka://test/user/$g] - Registered BlockManager
[2015-01-27 13:14:26,118] INFO .jobserver.RddManagerActor [] [akka://test/user/$g/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:26,118] INFO .jobserver.JobManagerActor [] [akka://test/user/$g] - Loading class spark.jobserver.WordCountExample for app demo
[2015-01-27 13:14:26,119] INFO .apache.spark.SparkContext [] [akka://test/user/$g] - Added JAR /tmp/InMemoryDAO8452775048902259754.jar at http://10.1.3.213:50655/jars/InMemoryDAO8452775048902259754.jar with timestamp 1422364466119
[2015-01-27 13:14:26,123] INFO util.ContextURLClassLoader [] [akka://test/user/$g] - Added URL file:/tmp/InMemoryDAO8452775048902259754.jar to ContextURLClassLoader
[2015-01-27 13:14:26,123] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$g] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@4eb7a276
[2015-01-27 13:14:26,126] INFO .jobserver.JobManagerActor [] [akka://test/user/$g] - Starting Spark job f22f07f7-1c5c-4c8e-848b-513d8ba83a21 [spark.jobserver.WordCountExample]...
[2015-01-27 13:14:26,126] INFO k.jobserver.JobResultActor [] [akka://test/user/$g/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID f22f07f7-1c5c-4c8e-848b-513d8ba83a21
[2015-01-27 13:14:26,126] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:26,127] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:26,131] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:26,132] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:26,132] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:26,143] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:26,143] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:26,143] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:26,147] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:26,147] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:26,148] INFO .apache.spark.SparkContext [] [] - Starting job: collect at WordCountExample.scala:32
[2015-01-27 13:14:26,168] INFO ark.scheduler.DAGScheduler [] [] - Registering RDD 1 (map at WordCountExample.scala:32)
[2015-01-27 13:14:26,170] INFO ark.scheduler.DAGScheduler [] [] - Got job 0 (collect at WordCountExample.scala:32) with 4 output partitions (allowLocal=false)
[2015-01-27 13:14:26,171] INFO ark.scheduler.DAGScheduler [] [] - Final stage: Stage 1(collect at WordCountExample.scala:32)
[2015-01-27 13:14:26,172] INFO ark.scheduler.DAGScheduler [] [] - Parents of final stage: List(Stage 0)
[2015-01-27 13:14:26,177] INFO ark.scheduler.DAGScheduler [] [] - Missing parents: List(Stage 0)
[2015-01-27 13:14:26,198] INFO ark.scheduler.DAGScheduler [] [] - Submitting Stage 0 (MappedRDD[1] at map at WordCountExample.scala:32), which has no missing parents
[2015-01-27 13:14:26,198] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:49996
[2015-01-27 13:14:26,199] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:26,351] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(2384) called with curMem=0, maxMem=714866688
[2015-01-27 13:14:26,354] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0 stored as values in memory (estimated size 2.3 KB, free 681.7 MB)
[2015-01-27 13:14:26,398] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(1716) called with curMem=2384, maxMem=714866688
[2015-01-27 13:14:26,398] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0_piece0 stored as bytes in memory (estimated size 1716.0 B, free 681.7 MB)
[2015-01-27 13:14:26,401] INFO k.storage.BlockManagerInfo [] [] - Added broadcast_0_piece0 in memory on localhost:45721 (size: 1716.0 B, free: 681.7 MB)
[2015-01-27 13:14:26,402] INFO storage.BlockManagerMaster [] [] - Updated info of block broadcast_0_piece0
[2015-01-27 13:14:26,404] INFO .apache.spark.SparkContext [] [] - Created broadcast 0 from broadcast at DAGScheduler.scala:838
[2015-01-27 13:14:26,425] INFO ark.scheduler.DAGScheduler [] [] - Submitting 4 missing tasks from Stage 0 (MappedRDD[1] at map at WordCountExample.scala:32)
[2015-01-27 13:14:26,427] INFO cheduler.TaskSchedulerImpl [] [] - Adding task set 0.0 with 4 tasks
[2015-01-27 13:14:26,451] INFO ark.scheduler.DAGScheduler [] [] - Job 0 failed: collect at WordCountExample.scala:32, took 0.301746 s
[2015-01-27 13:14:26,451] WARN .jobserver.JobManagerActor [] [] - Exception from job f22f07f7-1c5c-4c8e-848b-513d8ba83a21:
org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:702)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:701)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:701)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.postStop(DAGScheduler.scala:1428)
at akka.actor.Actor$class.aroun

from spark-jobserver.

jamborta avatar jamborta commented on May 5, 2024

This is just from running the deploy script, which probably runs the WordCountExample as a test during sbt assembly.


zeitos avatar zeitos commented on May 5, 2024

I'm going to close this issue. @jamborta, feel free to reopen or file another one if it's still not working with the current version.
