I am trying to run server_deploy.sh, and it seems to fail to create the SparkContext in the first test that uses spark.jobserver.WordCountExample. I'm not sure what the problem might be, since I can run that example separately.
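For reference, the example I can run on its own is spark.jobserver.WordCountExample. Paraphrased from memory (a sketch, not a verbatim copy of the jobserver sources), it looks roughly like this; both the "No input.string config param" message and the "collect at WordCountExample.scala:32" frames in the log below trace back to it:

    import com.typesafe.config.Config
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._   // pairRDD implicits for reduceByKey (Spark 1.2)
    import scala.util.Try
    import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

    object WordCountExample extends SparkJob {
      // validate() rejects a job submitted without input.string; the jobserver
      // surfaces that as the "No input.string config param" Throwable seen below
      override def validate(sc: SparkContext, config: Config): SparkJobValidation =
        Try(config.getString("input.string"))
          .map(_ => SparkJobValid)
          .getOrElse(SparkJobInvalid("No input.string config param"))

      // runJob() does the actual word count; its collect() is the
      // "collect at WordCountExample.scala:32" the stack traces point at
      override def runJob(sc: SparkContext, config: Config): Any =
        sc.parallelize(config.getString("input.string").split(" ").toSeq)
          .map(word => (word, 1))
          .reduceByKey(_ + _)
          .collect()
          .toMap
    }

The full test log follows: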
[2015-01-27 13:13:44,630] INFO .jobserver.JobManagerActor [] [akka://test/user/$a] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:13:44,869] INFO k.jobserver.JobStatusActor [] [akka://test/user/$a/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:13:44,872] INFO k.jobserver.JobResultActor [] [akka://test/user/$a/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:13:45,035] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:13:45,035] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:13:45,040] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:13:45,051] INFO .jobserver.JobManagerActor [] [akka://test/user/$b] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:13:45,055] INFO k.jobserver.JobResultActor [] [akka://test/user/$b/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:13:45,059] INFO k.jobserver.JobStatusActor [] [akka://test/user/$b/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:13:45,229] WARN rg.apache.spark.util.Utils [] [akka://test/user/$b] - Your hostname, tamas-laptop resolves to a loopback address: 127.0.1.1; using 10.1.3.213 instead (on interface eth0)
[2015-01-27 13:13:45,230] WARN rg.apache.spark.util.Utils [] [akka://test/user/$b] - Set SPARK_LOCAL_IP if you need to bind to another address
[2015-01-27 13:13:45,609] INFO ache.spark.SecurityManager [] [akka://test/user/$b] - Changing view acls to: tja01
[2015-01-27 13:13:45,610] INFO ache.spark.SecurityManager [] [akka://test/user/$b] - Changing modify acls to: tja01
[2015-01-27 13:13:45,611] INFO ache.spark.SecurityManager [] [akka://test/user/$b] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:13:45,908] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$b] - Slf4jLogger started
[2015-01-27 13:13:45,999] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:13:46,450] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://[email protected]:34516]
[2015-01-27 13:13:46,463] INFO rg.apache.spark.util.Utils [] [akka://test/user/$b] - Successfully started service 'sparkDriver' on port 34516.
[2015-01-27 13:13:46,498] INFO org.apache.spark.SparkEnv [] [akka://test/user/$b] - Registering MapOutputTracker
[2015-01-27 13:13:46,525] INFO org.apache.spark.SparkEnv [] [akka://test/user/$b] - Registering BlockManagerMaster
[2015-01-27 13:13:46,560] INFO k.storage.DiskBlockManager [] [akka://test/user/$b] - Created local directory at /tmp/spark-local-20150127131346-c001
[2015-01-27 13:13:46,572] INFO .spark.storage.MemoryStore [] [akka://test/user/$b] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:13:47,301] WARN doop.util.NativeCodeLoader [] [akka://test/user/$b] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[2015-01-27 13:13:47,548] INFO pache.spark.HttpFileServer [] [akka://test/user/$b] - HTTP File server directory is /tmp/spark-34bbc817-ea02-4871-924d-dfb34763e456
[2015-01-27 13:13:47,564] INFO rg.apache.spark.HttpServer [] [akka://test/user/$b] - Starting HTTP Server
[2015-01-27 13:13:47,799] INFO clipse.jetty.server.Server [] [akka://test/user/$b] - jetty-8.1.14.v20131031
[2015-01-27 13:13:47,825] INFO y.server.AbstractConnector [] [akka://test/user/$b] - Started SocketConnector@0.0.0.0:60616
[2015-01-27 13:13:47,826] INFO rg.apache.spark.util.Utils [] [akka://test/user/$b] - Successfully started service 'HTTP file server' on port 60616.
[2015-01-27 13:13:53,012] INFO clipse.jetty.server.Server [] [akka://test/user/$b] - jetty-8.1.14.v20131031
[2015-01-27 13:13:53,034] INFO y.server.AbstractConnector [] [akka://test/user/$b] - Started SelectChannelConnector@0.0.0.0:34008
[2015-01-27 13:13:53,034] INFO rg.apache.spark.util.Utils [] [akka://test/user/$b] - Successfully started service 'SparkUI' on port 34008.
[2015-01-27 13:13:53,040] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$b] - Started SparkUI at http://10.1.3.213:34008
[2015-01-27 13:13:53,294] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://[email protected]:34516/user/HeartbeatReceiver
[2015-01-27 13:13:53,528] INFO .NettyBlockTransferService [] [akka://test/user/$b] - Server created on 46782
[2015-01-27 13:13:53,532] INFO storage.BlockManagerMaster [] [akka://test/user/$b] - Trying to register BlockManager
[2015-01-27 13:13:53,535] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:46782 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 46782)
[2015-01-27 13:13:53,540] INFO storage.BlockManagerMaster [] [akka://test/user/$b] - Registered BlockManager
[2015-01-27 13:13:53,851] INFO .jobserver.RddManagerActor [] [akka://test/user/$b/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:13:53,853] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:13:53,853] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:13:53,854] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:13:53,854] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:13:53,870] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:13:53,871] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:13:53,871] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:13:53,871] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:13:53,871] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:13:53,872] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:13:53,873] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:13:53,874] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:13:53,875] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:13:53,932] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://10.1.3.213:34008
[2015-01-27 13:13:53,935] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:13:54,996] INFO apOutputTrackerMasterActor [] [akka://test/user/$b] - MapOutputTrackerActor stopped!
[2015-01-27 13:13:55,082] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:13:55,083] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:13:55,085] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:13:55,099] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:13:55,099] INFO rovider$RemotingTerminator [] [akka.tcp://[email protected]:34516/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:13:55,100] INFO .jobserver.JobManagerActor [] [akka://test/user/$c] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:13:55,101] INFO rovider$RemotingTerminator [] [akka.tcp://[email protected]:34516/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:13:55,102] INFO k.jobserver.JobStatusActor [] [akka://test/user/$c/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:13:55,102] INFO k.jobserver.JobResultActor [] [akka://test/user/$c/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:13:55,137] INFO ache.spark.SecurityManager [] [akka://test/user/$c] - Changing view acls to: tja01
[2015-01-27 13:13:55,137] INFO ache.spark.SecurityManager [] [akka://test/user/$c] - Changing modify acls to: tja01
[2015-01-27 13:13:55,138] INFO ache.spark.SecurityManager [] [akka://test/user/$c] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:13:55,169] INFO rovider$RemotingTerminator [] [akka.tcp://[email protected]:34516/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:13:55,249] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$c] - Slf4jLogger started
[2015-01-27 13:13:55,259] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:13:55,280] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:49266]
[2015-01-27 13:13:55,282] INFO rg.apache.spark.util.Utils [] [akka://test/user/$c] - Successfully started service 'sparkDriver' on port 49266.
[2015-01-27 13:13:55,283] INFO org.apache.spark.SparkEnv [] [akka://test/user/$c] - Registering MapOutputTracker
[2015-01-27 13:13:55,285] INFO org.apache.spark.SparkEnv [] [akka://test/user/$c] - Registering BlockManagerMaster
[2015-01-27 13:13:55,287] INFO k.storage.DiskBlockManager [] [akka://test/user/$c] - Created local directory at /tmp/spark-local-20150127131355-fede
[2015-01-27 13:13:55,288] INFO .spark.storage.MemoryStore [] [akka://test/user/$c] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:13:55,290] INFO pache.spark.HttpFileServer [] [akka://test/user/$c] - HTTP File server directory is /tmp/spark-443848ba-82ea-4c59-bc86-16a71dc8bd65
[2015-01-27 13:13:55,291] INFO rg.apache.spark.HttpServer [] [akka://test/user/$c] - Starting HTTP Server
[2015-01-27 13:13:55,292] INFO clipse.jetty.server.Server [] [akka://test/user/$c] - jetty-8.1.14.v20131031
[2015-01-27 13:13:55,299] INFO y.server.AbstractConnector [] [akka://test/user/$c] - Started SocketConnector@0.0.0.0:58253
[2015-01-27 13:13:55,299] INFO rg.apache.spark.util.Utils [] [akka://test/user/$c] - Successfully started service 'HTTP file server' on port 58253.
[2015-01-27 13:14:00,325] INFO clipse.jetty.server.Server [] [akka://test/user/$c] - jetty-8.1.14.v20131031
[2015-01-27 13:14:00,333] INFO y.server.AbstractConnector [] [akka://test/user/$c] - Started SelectChannelConnector@0.0.0.0:58755
[2015-01-27 13:14:00,334] INFO rg.apache.spark.util.Utils [] [akka://test/user/$c] - Successfully started service 'SparkUI' on port 58755.
[2015-01-27 13:14:00,334] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$c] - Started SparkUI at http://localhost:58755
[2015-01-27 13:14:00,410] INFO pache.spark.util.AkkaUtils [] [akka://test/user/$c] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:49266/user/HeartbeatReceiver
[2015-01-27 13:14:00,417] INFO .NettyBlockTransferService [] [akka://test/user/$c] - Server created on 55801
[2015-01-27 13:14:00,417] INFO storage.BlockManagerMaster [] [akka://test/user/$c] - Trying to register BlockManager
[2015-01-27 13:14:00,418] INFO ge.BlockManagerMasterActor [] [akka://test/user/$c] - Registering block manager localhost:55801 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 55801)
[2015-01-27 13:14:00,418] INFO storage.BlockManagerMaster [] [akka://test/user/$c] - Registered BlockManager
[2015-01-27 13:14:00,431] INFO .jobserver.RddManagerActor [] [akka://test/user/$c/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:00,433] INFO .jobserver.JobManagerActor [] [akka://test/user/$c] - Loading class no.such.class for app notajar
[2015-01-27 13:14:00,462] INFO .apache.spark.SparkContext [] [akka://test/user/$c] - Added JAR /tmp/InMemoryDAO722024764359244436.jar at http://10.1.3.213:58253/jars/InMemoryDAO722024764359244436.jar with timestamp 1422364440461
[2015-01-27 13:14:00,469] INFO util.ContextURLClassLoader [] [akka://test/user/$c] - Added URL file:/tmp/InMemoryDAO722024764359244436.jar to ContextURLClassLoader
[2015-01-27 13:14:00,470] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$c] - Loading object no.such.class$ using loader spark.jobserver.util.ContextURLClassLoader@7bbe1c14
[2015-01-27 13:14:00,472] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$c] - Loading class no.such.class using loader spark.jobserver.util.ContextURLClassLoader@7bbe1c14
[2015-01-27 13:14:00,473] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:00,473] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:00,473] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:00,473] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:00,484] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:00,485] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:00,485] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:00,485] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:00,486] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:00,486] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:00,486] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:00,486] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:00,487] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:00,487] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:00,487] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:00,487] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:00,488] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:00,488] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:00,488] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:00,489] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:00,489] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:00,489] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:00,490] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:00,491] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:00,491] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:00,491] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:00,491] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:00,543] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:58755
[2015-01-27 13:14:00,544] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:01,603] INFO apOutputTrackerMasterActor [] [akka://test/user/$c] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:01,616] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:01,616] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:01,617] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:01,620] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:01,622] INFO .jobserver.JobManagerActor [] [akka://test/user/$d] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:01,623] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:49266/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:01,625] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:49266/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:01,627] INFO k.jobserver.JobStatusActor [] [akka://test/user/$d/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:01,628] INFO k.jobserver.JobResultActor [] [akka://test/user/$d/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:01,636] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:49266/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:01,646] INFO ache.spark.SecurityManager [] [akka://test/user/$d] - Changing view acls to: tja01
[2015-01-27 13:14:01,663] INFO ache.spark.SecurityManager [] [akka://test/user/$d] - Changing modify acls to: tja01
[2015-01-27 13:14:01,663] INFO ache.spark.SecurityManager [] [akka://test/user/$d] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:01,800] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$d] - Slf4jLogger started
[2015-01-27 13:14:01,816] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:01,864] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:44175]
[2015-01-27 13:14:01,865] INFO rg.apache.spark.util.Utils [] [akka://test/user/$d] - Successfully started service 'sparkDriver' on port 44175.
[2015-01-27 13:14:01,866] INFO org.apache.spark.SparkEnv [] [akka://test/user/$d] - Registering MapOutputTracker
[2015-01-27 13:14:01,868] INFO org.apache.spark.SparkEnv [] [akka://test/user/$d] - Registering BlockManagerMaster
[2015-01-27 13:14:01,869] INFO k.storage.DiskBlockManager [] [akka://test/user/$d] - Created local directory at /tmp/spark-local-20150127131401-3d1f
[2015-01-27 13:14:01,870] INFO .spark.storage.MemoryStore [] [akka://test/user/$d] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:01,872] INFO pache.spark.HttpFileServer [] [akka://test/user/$d] - HTTP File server directory is /tmp/spark-98b99407-490b-4beb-9da9-3c753cd2e8b9
[2015-01-27 13:14:01,872] INFO rg.apache.spark.HttpServer [] [akka://test/user/$d] - Starting HTTP Server
[2015-01-27 13:14:01,874] INFO clipse.jetty.server.Server [] [akka://test/user/$d] - jetty-8.1.14.v20131031
[2015-01-27 13:14:01,882] INFO y.server.AbstractConnector [] [akka://test/user/$d] - Started SocketConnector@0.0.0.0:48214
[2015-01-27 13:14:01,882] INFO rg.apache.spark.util.Utils [] [akka://test/user/$d] - Successfully started service 'HTTP file server' on port 48214.
[2015-01-27 13:14:06,911] INFO clipse.jetty.server.Server [] [akka://test/user/$d] - jetty-8.1.14.v20131031
[2015-01-27 13:14:06,919] INFO y.server.AbstractConnector [] [akka://test/user/$d] - Started SelectChannelConnector@0.0.0.0:52772
[2015-01-27 13:14:06,919] INFO rg.apache.spark.util.Utils [] [akka://test/user/$d] - Successfully started service 'SparkUI' on port 52772.
[2015-01-27 13:14:06,920] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$d] - Started SparkUI at http://localhost:52772
[2015-01-27 13:14:06,980] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:44175/user/HeartbeatReceiver
[2015-01-27 13:14:06,981] INFO .NettyBlockTransferService [] [akka://test/user/$d] - Server created on 49782
[2015-01-27 13:14:06,981] INFO storage.BlockManagerMaster [] [akka://test/user/$d] - Trying to register BlockManager
[2015-01-27 13:14:06,982] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:49782 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 49782)
[2015-01-27 13:14:06,982] INFO storage.BlockManagerMaster [] [akka://test/user/$d] - Registered BlockManager
[2015-01-27 13:14:06,991] INFO .jobserver.RddManagerActor [] [akka://test/user/$d/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:06,992] INFO .jobserver.JobManagerActor [] [akka://test/user/$d] - Loading class spark.jobserver.WordCountExample for app demo
[2015-01-27 13:14:06,993] INFO .apache.spark.SparkContext [] [akka://test/user/$d] - Added JAR /tmp/InMemoryDAO8320471003519538213.jar at http://10.1.3.213:48214/jars/InMemoryDAO8320471003519538213.jar with timestamp 1422364446993
[2015-01-27 13:14:07,000] INFO util.ContextURLClassLoader [] [akka://test/user/$d] - Added URL file:/tmp/InMemoryDAO8320471003519538213.jar to ContextURLClassLoader
[2015-01-27 13:14:07,000] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$d] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@237ac531
[2015-01-27 13:14:07,006] INFO .jobserver.JobManagerActor [] [akka://test/user/$d] - Starting Spark job 1f1b4a8c-edde-49c0-855d-238efb9ff8f0 [spark.jobserver.WordCountExample]...
[2015-01-27 13:14:07,006] INFO k.jobserver.JobResultActor [] [akka://test/user/$d/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID 1f1b4a8c-edde-49c0-855d-238efb9ff8f0
[2015-01-27 13:14:07,007] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:07,008] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:07,009] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:07,009] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:07,010] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:07,019] WARN .jobserver.JobManagerActor [] [] - Exception from job 1f1b4a8c-edde-49c0-855d-238efb9ff8f0:
java.lang.Throwable: No input.string config param
at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:213)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:07,023] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:07,024] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:07,025] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:07,025] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:07,025] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:07,026] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:07,026] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:07,026] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:07,026] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:07,027] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:07,027] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:07,027] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:07,028] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:07,028] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:07,028] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:07,029] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:07,029] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:07,029] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:07,029] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:07,030] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:07,030] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:07,031] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:07,031] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:07,082] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:52772
[2015-01-27 13:14:07,083] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:08,135] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:08,138] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:08,139] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:08,139] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:08,141] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:08,142] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:44175/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:08,143] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:44175/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:08,150] INFO .jobserver.JobManagerActor [] [akka://test/user/$e] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:08,152] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:44175/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:08,153] INFO k.jobserver.JobStatusActor [] [akka://test/user/$e/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:08,154] INFO k.jobserver.JobResultActor [] [akka://test/user/$e/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:08,172] INFO ache.spark.SecurityManager [] [akka://test/user/$e] - Changing view acls to: tja01
[2015-01-27 13:14:08,172] INFO ache.spark.SecurityManager [] [akka://test/user/$e] - Changing modify acls to: tja01
[2015-01-27 13:14:08,172] INFO ache.spark.SecurityManager [] [akka://test/user/$e] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:08,279] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$e] - Slf4jLogger started
[2015-01-27 13:14:08,287] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:08,302] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:56015]
[2015-01-27 13:14:08,303] INFO rg.apache.spark.util.Utils [] [akka://test/user/$e] - Successfully started service 'sparkDriver' on port 56015.
[2015-01-27 13:14:08,304] INFO org.apache.spark.SparkEnv [] [akka://test/user/$e] - Registering MapOutputTracker
[2015-01-27 13:14:08,305] INFO org.apache.spark.SparkEnv [] [akka://test/user/$e] - Registering BlockManagerMaster
[2015-01-27 13:14:08,306] INFO k.storage.DiskBlockManager [] [akka://test/user/$e] - Created local directory at /tmp/spark-local-20150127131408-2dbe
[2015-01-27 13:14:08,307] INFO .spark.storage.MemoryStore [] [akka://test/user/$e] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:08,309] INFO pache.spark.HttpFileServer [] [akka://test/user/$e] - HTTP File server directory is /tmp/spark-ef2db7f2-dfce-4e9d-8ac5-aa9f6386f6bf
[2015-01-27 13:14:08,309] INFO rg.apache.spark.HttpServer [] [akka://test/user/$e] - Starting HTTP Server
[2015-01-27 13:14:08,311] INFO clipse.jetty.server.Server [] [akka://test/user/$e] - jetty-8.1.14.v20131031
[2015-01-27 13:14:08,312] INFO y.server.AbstractConnector [] [akka://test/user/$e] - Started SocketConnector@0.0.0.0:43668
[2015-01-27 13:14:08,313] INFO rg.apache.spark.util.Utils [] [akka://test/user/$e] - Successfully started service 'HTTP file server' on port 43668.
[2015-01-27 13:14:13,331] INFO clipse.jetty.server.Server [] [akka://test/user/$e] - jetty-8.1.14.v20131031
[2015-01-27 13:14:13,342] INFO y.server.AbstractConnector [] [akka://test/user/$e] - Started SelectChannelConnector@0.0.0.0:43117
[2015-01-27 13:14:13,343] INFO rg.apache.spark.util.Utils [] [akka://test/user/$e] - Successfully started service 'SparkUI' on port 43117.
[2015-01-27 13:14:13,343] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$e] - Started SparkUI at http://localhost:43117
[2015-01-27 13:14:13,408] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:56015/user/HeartbeatReceiver
[2015-01-27 13:14:13,411] INFO .NettyBlockTransferService [] [akka://test/user/$e] - Server created on 36598
[2015-01-27 13:14:13,411] INFO storage.BlockManagerMaster [] [akka://test/user/$e] - Trying to register BlockManager
[2015-01-27 13:14:13,412] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:36598 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 36598)
[2015-01-27 13:14:13,412] INFO storage.BlockManagerMaster [] [akka://test/user/$e] - Registered BlockManager
[2015-01-27 13:14:13,419] INFO .jobserver.RddManagerActor [] [akka://test/user/$e/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:13,420] INFO .jobserver.JobManagerActor [] [akka://test/user/$e] - Loading class spark.jobserver.WordCountExample for app demo
[2015-01-27 13:14:13,423] INFO .apache.spark.SparkContext [] [akka://test/user/$e] - Added JAR /tmp/InMemoryDAO2098972855628514058.jar at http://10.1.3.213:43668/jars/InMemoryDAO2098972855628514058.jar with timestamp 1422364453423
[2015-01-27 13:14:13,429] INFO util.ContextURLClassLoader [] [akka://test/user/$e] - Added URL file:/tmp/InMemoryDAO2098972855628514058.jar to ContextURLClassLoader
[2015-01-27 13:14:13,429] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$e] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@23f9a8ff
[2015-01-27 13:14:13,433] INFO .jobserver.JobManagerActor [] [akka://test/user/$e] - Starting Spark job 6bd35ede-445c-4303-95c8-61f8e3d6689b [spark.jobserver.WordCountExample]...
[2015-01-27 13:14:13,433] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:13,433] INFO k.jobserver.JobResultActor [] [akka://test/user/$e/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID 6bd35ede-445c-4303-95c8-61f8e3d6689b
[2015-01-27 13:14:13,437] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:13,437] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:13,437] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:13,438] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:13,449] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:13,449] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:13,450] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:13,450] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:13,450] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:13,450] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:13,451] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:13,451] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:13,451] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:13,451] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:13,452] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:13,452] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:13,452] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:13,452] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:13,453] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:13,453] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:13,453] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:13,453] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:13,454] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:13,507] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:43117
[2015-01-27 13:14:13,507] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:13,570] WARN .jobserver.JobManagerActor [] [] - Exception from job 6bd35ede-445c-4303-95c8-61f8e3d6689b:
org.apache.spark.SparkException: SparkContext has been shutdown
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1277)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1300)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1314)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1328)
at org.apache.spark.rdd.RDD.collect(RDD.scala:780)
at spark.jobserver.WordCountExample$.runJob(WordCountExample.scala:32)
at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:219)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:14,561] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:14,565] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:14,565] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:14,566] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:14,566] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:14,568] INFO .jobserver.JobManagerActor [] [akka://test/user/$f] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:14,569] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:56015/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:14,573] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:56015/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:14,576] INFO k.jobserver.JobStatusActor [] [akka://test/user/$f/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:14,576] INFO k.jobserver.JobResultActor [] [akka://test/user/$f/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:14,581] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:56015/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:14,593] INFO ache.spark.SecurityManager [] [akka://test/user/$f] - Changing view acls to: tja01
[2015-01-27 13:14:14,593] INFO ache.spark.SecurityManager [] [akka://test/user/$f] - Changing modify acls to: tja01
[2015-01-27 13:14:14,593] INFO ache.spark.SecurityManager [] [akka://test/user/$f] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:14,635] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$f] - Slf4jLogger started
[2015-01-27 13:14:14,642] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:14,656] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:42388]
[2015-01-27 13:14:14,657] INFO rg.apache.spark.util.Utils [] [akka://test/user/$f] - Successfully started service 'sparkDriver' on port 42388.
[2015-01-27 13:14:14,657] INFO org.apache.spark.SparkEnv [] [akka://test/user/$f] - Registering MapOutputTracker
[2015-01-27 13:14:14,658] INFO org.apache.spark.SparkEnv [] [akka://test/user/$f] - Registering BlockManagerMaster
[2015-01-27 13:14:14,659] INFO k.storage.DiskBlockManager [] [akka://test/user/$f] - Created local directory at /tmp/spark-local-20150127131414-bc68
[2015-01-27 13:14:14,659] INFO .spark.storage.MemoryStore [] [akka://test/user/$f] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:14,660] INFO pache.spark.HttpFileServer [] [akka://test/user/$f] - HTTP File server directory is /tmp/spark-50d4eb9b-ac3c-43d9-ba59-dbd30c5576bc
[2015-01-27 13:14:14,660] INFO rg.apache.spark.HttpServer [] [akka://test/user/$f] - Starting HTTP Server
[2015-01-27 13:14:14,661] INFO clipse.jetty.server.Server [] [akka://test/user/$f] - jetty-8.1.14.v20131031
[2015-01-27 13:14:14,662] INFO y.server.AbstractConnector [] [akka://test/user/$f] - Started SocketConnector@0.0.0.0:37978
[2015-01-27 13:14:14,662] INFO rg.apache.spark.util.Utils [] [akka://test/user/$f] - Successfully started service 'HTTP file server' on port 37978.
[2015-01-27 13:14:19,675] INFO clipse.jetty.server.Server [] [akka://test/user/$f] - jetty-8.1.14.v20131031
[2015-01-27 13:14:19,691] INFO y.server.AbstractConnector [] [akka://test/user/$f] - Started SelectChannelConnector@0.0.0.0:48028
[2015-01-27 13:14:19,692] INFO rg.apache.spark.util.Utils [] [akka://test/user/$f] - Successfully started service 'SparkUI' on port 48028.
[2015-01-27 13:14:19,692] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$f] - Started SparkUI at http://localhost:48028
[2015-01-27 13:14:19,754] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:42388/user/HeartbeatReceiver
[2015-01-27 13:14:19,768] INFO .NettyBlockTransferService [] [akka://test/user/$f] - Server created on 34558
[2015-01-27 13:14:19,768] INFO storage.BlockManagerMaster [] [akka://test/user/$f] - Trying to register BlockManager
[2015-01-27 13:14:19,769] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:34558 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 34558)
[2015-01-27 13:14:19,769] INFO storage.BlockManagerMaster [] [akka://test/user/$f] - Registered BlockManager
[2015-01-27 13:14:19,779] INFO .jobserver.JobManagerActor [] [akka://test/user/$f] - Loading class spark.jobserver.WordCountExample for app demo
[2015-01-27 13:14:19,780] INFO .jobserver.RddManagerActor [] [akka://test/user/$f/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:19,788] INFO .apache.spark.SparkContext [] [akka://test/user/$f] - Added JAR /tmp/InMemoryDAO6787955189121296071.jar at http://10.1.3.213:37978/jars/InMemoryDAO6787955189121296071.jar with timestamp 1422364459787
[2015-01-27 13:14:19,793] INFO util.ContextURLClassLoader [] [akka://test/user/$f] - Added URL file:/tmp/InMemoryDAO6787955189121296071.jar to ContextURLClassLoader
[2015-01-27 13:14:19,794] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$f] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@6fdc750b
[2015-01-27 13:14:19,796] INFO .jobserver.JobManagerActor [] [akka://test/user/$f] - Starting Spark job dea82364-5c96-4f85-bc0b-333ebd3cf3d4 [spark.jobserver.WordCountExample]...
[2015-01-27 13:14:19,796] INFO k.jobserver.JobResultActor [] [akka://test/user/$f/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID dea82364-5c96-4f85-bc0b-333ebd3cf3d4
[2015-01-27 13:14:19,796] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:19,796] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:19,796] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:19,796] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:19,797] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:19,810] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:19,811] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:19,812] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:19,813] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:19,814] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:19,873] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:48028
[2015-01-27 13:14:19,873] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:19,877] INFO .apache.spark.SparkContext [] [] - Starting job: collect at WordCountExample.scala:32
[2015-01-27 13:14:19,878] WARN .jobserver.JobManagerActor [] [] - Exception from job dea82364-5c96-4f85-bc0b-333ebd3cf3d4:
java.lang.NullPointerException
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1282)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1300)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1314)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1328)
at org.apache.spark.rdd.RDD.collect(RDD.scala:780)
at spark.jobserver.WordCountExample$.runJob(WordCountExample.scala:32)
at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:219)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
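What strikes me is that in both WordCountExample runs above, the actors log "Shutting down" right after "Starting job future thread", so the job's collect() races the context teardown: one run fails with "SparkContext has been shutdown", the next with a NullPointerException from SparkContext.runJob. A minimal standalone sketch of that race as I understand it (hypothetical code, not from the jobserver):

    import org.apache.spark.{SparkConf, SparkContext}
    import scala.concurrent.{Await, Future}
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.duration._

    object ShutdownRace extends App {
      val sc = new SparkContext(new SparkConf().setMaster("local[4]").setAppName("race"))
      // Start the job on a future, like the jobserver's "job future thread"...
      val job = Future { sc.parallelize(1 to 1000).map(_ * 2).collect() }
      // ...then stop the context underneath it, like "Shutting down SparkContext test".
      sc.stop()
      // Depending on timing this should fail the same two ways the log shows:
      // "SparkContext has been shutdown" or an NPE inside SparkContext.runJob.
      Await.ready(job, 30.seconds)
      println(job.value)
    }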
[2015-01-27 13:14:20,936] INFO apOutputTrackerMasterActor [] [akka://test/user/$f] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:20,940] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:20,941] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:20,942] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:20,943] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:20,946] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:42388/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:20,947] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:42388/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:20,951] INFO .jobserver.JobManagerActor [] [akka://test/user/$g] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:20,954] INFO k.jobserver.JobStatusActor [] [akka://test/user/$g/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:20,954] INFO k.jobserver.JobResultActor [] [akka://test/user/$g/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:20,962] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:42388/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:20,966] INFO ache.spark.SecurityManager [] [akka://test/user/$g] - Changing view acls to: tja01
[2015-01-27 13:14:20,966] INFO ache.spark.SecurityManager [] [akka://test/user/$g] - Changing modify acls to: tja01
[2015-01-27 13:14:20,967] INFO ache.spark.SecurityManager [] [akka://test/user/$g] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:21,020] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$g] - Slf4jLogger started
[2015-01-27 13:14:21,026] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:21,035] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:33466]
[2015-01-27 13:14:21,036] INFO rg.apache.spark.util.Utils [] [akka://test/user/$g] - Successfully started service 'sparkDriver' on port 33466.
[2015-01-27 13:14:21,036] INFO org.apache.spark.SparkEnv [] [akka://test/user/$g] - Registering MapOutputTracker
[2015-01-27 13:14:21,037] INFO org.apache.spark.SparkEnv [] [akka://test/user/$g] - Registering BlockManagerMaster
[2015-01-27 13:14:21,038] INFO k.storage.DiskBlockManager [] [akka://test/user/$g] - Created local directory at /tmp/spark-local-20150127131421-1f64
[2015-01-27 13:14:21,038] INFO .spark.storage.MemoryStore [] [akka://test/user/$g] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:21,040] INFO pache.spark.HttpFileServer [] [akka://test/user/$g] - HTTP File server directory is /tmp/spark-a4cc300f-faf3-4065-9763-b07ba8f68cc9
[2015-01-27 13:14:21,040] INFO rg.apache.spark.HttpServer [] [akka://test/user/$g] - Starting HTTP Server
[2015-01-27 13:14:21,041] INFO clipse.jetty.server.Server [] [akka://test/user/$g] - jetty-8.1.14.v20131031
[2015-01-27 13:14:21,042] INFO y.server.AbstractConnector [] [akka://test/user/$g] - Started SocketConnector@0.0.0.0:50655
[2015-01-27 13:14:21,042] INFO rg.apache.spark.util.Utils [] [akka://test/user/$g] - Successfully started service 'HTTP file server' on port 50655.
[2015-01-27 13:14:26,056] INFO clipse.jetty.server.Server [] [akka://test/user/$g] - jetty-8.1.14.v20131031
[2015-01-27 13:14:26,066] INFO y.server.AbstractConnector [] [akka://test/user/$g] - Started SelectChannelConnector@0.0.0.0:49996
[2015-01-27 13:14:26,073] INFO rg.apache.spark.util.Utils [] [akka://test/user/$g] - Successfully started service 'SparkUI' on port 49996.
[2015-01-27 13:14:26,073] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$g] - Started SparkUI at http://localhost:49996
[2015-01-27 13:14:26,107] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:33466/user/HeartbeatReceiver
[2015-01-27 13:14:26,110] INFO .NettyBlockTransferService [] [akka://test/user/$g] - Server created on 45721
[2015-01-27 13:14:26,110] INFO storage.BlockManagerMaster [] [akka://test/user/$g] - Trying to register BlockManager
[2015-01-27 13:14:26,111] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:45721 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 45721)
[2015-01-27 13:14:26,112] INFO storage.BlockManagerMaster [] [akka://test/user/$g] - Registered BlockManager
[2015-01-27 13:14:26,118] INFO .jobserver.RddManagerActor [] [akka://test/user/$g/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:26,118] INFO .jobserver.JobManagerActor [] [akka://test/user/$g] - Loading class spark.jobserver.WordCountExample for app demo
[2015-01-27 13:14:26,119] INFO .apache.spark.SparkContext [] [akka://test/user/$g] - Added JAR /tmp/InMemoryDAO8452775048902259754.jar at http://10.1.3.213:50655/jars/InMemoryDAO8452775048902259754.jar with timestamp 1422364466119
[2015-01-27 13:14:26,123] INFO util.ContextURLClassLoader [] [akka://test/user/$g] - Added URL file:/tmp/InMemoryDAO8452775048902259754.jar to ContextURLClassLoader
[2015-01-27 13:14:26,123] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$g] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@4eb7a276
[2015-01-27 13:14:26,126] INFO .jobserver.JobManagerActor [] [akka://test/user/$g] - Starting Spark job f22f07f7-1c5c-4c8e-848b-513d8ba83a21 [spark.jobserver.WordCountExample]...
[2015-01-27 13:14:26,126] INFO k.jobserver.JobResultActor [] [akka://test/user/$g/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID f22f07f7-1c5c-4c8e-848b-513d8ba83a21
[2015-01-27 13:14:26,126] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:26,127] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:26,131] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:26,132] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:26,132] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:26,143] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:26,143] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:26,143] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:26,144] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:26,145] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:26,146] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:26,147] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:26,147] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:26,148] INFO .apache.spark.SparkContext [] [] - Starting job: collect at WordCountExample.scala:32
[2015-01-27 13:14:26,168] INFO ark.scheduler.DAGScheduler [] [] - Registering RDD 1 (map at WordCountExample.scala:32)
[2015-01-27 13:14:26,170] INFO ark.scheduler.DAGScheduler [] [] - Got job 0 (collect at WordCountExample.scala:32) with 4 output partitions (allowLocal=false)
[2015-01-27 13:14:26,171] INFO ark.scheduler.DAGScheduler [] [] - Final stage: Stage 1(collect at WordCountExample.scala:32)
[2015-01-27 13:14:26,172] INFO ark.scheduler.DAGScheduler [] [] - Parents of final stage: List(Stage 0)
[2015-01-27 13:14:26,177] INFO ark.scheduler.DAGScheduler [] [] - Missing parents: List(Stage 0)
[2015-01-27 13:14:26,198] INFO ark.scheduler.DAGScheduler [] [] - Submitting Stage 0 (MappedRDD[1] at map at WordCountExample.scala:32), which has no missing parents
[2015-01-27 13:14:26,198] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:49996
[2015-01-27 13:14:26,199] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:26,351] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(2384) called with curMem=0, maxMem=714866688
[2015-01-27 13:14:26,354] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0 stored as values in memory (estimated size 2.3 KB, free 681.7 MB)
[2015-01-27 13:14:26,398] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(1716) called with curMem=2384, maxMem=714866688
[2015-01-27 13:14:26,398] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0_piece0 stored as bytes in memory (estimated size 1716.0 B, free 681.7 MB)
[2015-01-27 13:14:26,401] INFO k.storage.BlockManagerInfo [] [] - Added broadcast_0_piece0 in memory on localhost:45721 (size: 1716.0 B, free: 681.7 MB)
[2015-01-27 13:14:26,402] INFO storage.BlockManagerMaster [] [] - Updated info of block broadcast_0_piece0
[2015-01-27 13:14:26,404] INFO .apache.spark.SparkContext [] [] - Created broadcast 0 from broadcast at DAGScheduler.scala:838
[2015-01-27 13:14:26,425] INFO ark.scheduler.DAGScheduler [] [] - Submitting 4 missing tasks from Stage 0 (MappedRDD[1] at map at WordCountExample.scala:32)
[2015-01-27 13:14:26,427] INFO cheduler.TaskSchedulerImpl [] [] - Adding task set 0.0 with 4 tasks
[2015-01-27 13:14:26,451] INFO ark.scheduler.DAGScheduler [] [] - Job 0 failed: collect at WordCountExample.scala:32, took 0.301746 s
[2015-01-27 13:14:26,451] WARN .jobserver.JobManagerActor [] [] - Exception from job f22f07f7-1c5c-4c8e-848b-513d8ba83a21:
org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:702)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:701)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:701)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.postStop(DAGScheduler.scala:1428)
at akka.actor.Actor$class.aroundPostStop(Actor.scala:475)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundPostStop(DAGScheduler.scala:1375)
at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
at akka.actor.ActorCell.terminate(ActorCell.scala:369)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:241)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
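
Note the ordering in this block: "Shutting down SparkContext test" is logged at 13:14:26,132, before the collect job is even submitted at 13:14:26,148, so the job future is racing the context teardown, and the exception above is exactly what that race produces. A minimal sketch of the same race, outside the job server entirely (the object name and the toy job are mine):

    import org.apache.spark.{SparkConf, SparkContext}
    import scala.concurrent.Future
    import scala.concurrent.ExecutionContext.Implicits.global

    object ShutdownRace {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setMaster("local[4]").setAppName("test"))
        // start the job asynchronously, roughly the way the job server runs job futures
        val job = Future { sc.parallelize(1 to 100000).map(_ * 2).collect() }
        sc.stop() // tear the context down while the job may still be running
        job.onComplete(result => println(result)) // typically Failure(... Job cancelled because SparkContext was shut down)
        Thread.sleep(2000) // let the callback fire before the JVM exits
      }
    }
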
[2015-01-27 13:14:26,476] INFO k.scheduler.TaskSetManager [] [] - Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1330 bytes)
[2015-01-27 13:14:26,480] INFO k.scheduler.TaskSetManager [] [] - Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1337 bytes)
[2015-01-27 13:14:26,481] INFO k.scheduler.TaskSetManager [] [] - Starting task 2.0 in stage 0.0 (TID 2, localhost, PROCESS_LOCAL, 1340 bytes)
[2015-01-27 13:14:26,482] INFO k.scheduler.TaskSetManager [] [] - Starting task 3.0 in stage 0.0 (TID 3, localhost, PROCESS_LOCAL, 1337 bytes)
[2015-01-27 13:14:26,494] ERROR ka.actor.OneForOneStrategy [] [akka://sparkDriver/user/LocalBackendActor] - Task org.apache.spark.executor.Executor$TaskRunner@656a0389 rejected from java.util.concurrent.ThreadPoolExecutor@130e4b63[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.executor.Executor$TaskRunner@656a0389 rejected from java.util.concurrent.ThreadPoolExecutor@130e4b63[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
at org.apache.spark.executor.Executor.launchTask(Executor.scala:128)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$reviveOffers$1.apply(LocalBackend.scala:78)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$reviveOffers$1.apply(LocalBackend.scala:76)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.scheduler.local.LocalActor.reviveOffers(LocalBackend.scala:76)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$receiveWithLogging$1.applyOrElse(LocalBackend.scala:58)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.spark.scheduler.local.LocalActor.aroundReceive(LocalBackend.scala:43)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:26,495] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:33466/user/HeartbeatReceiver
[2015-01-27 13:14:26,503] ERROR ka.actor.OneForOneStrategy [] [akka://sparkDriver/user/LocalBackendActor] - Actor not found for: ActorSelection[Anchor(akka://sparkDriver/), Path(/user/HeartbeatReceiver)]
akka.actor.PostRestartException: exception post restart (class java.util.concurrent.RejectedExecutionException)
at akka.actor.dungeon.FaultHandling$$anonfun$6.apply(FaultHandling.scala:249)
at akka.actor.dungeon.FaultHandling$$anonfun$6.apply(FaultHandling.scala:247)
at akka.actor.dungeon.FaultHandling$$anonfun$handleNonFatalOrInterruptedException$1.applyOrElse(FaultHandling.scala:302)
at akka.actor.dungeon.FaultHandling$$anonfun$handleNonFatalOrInterruptedException$1.applyOrElse(FaultHandling.scala:297)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at akka.actor.dungeon.FaultHandling$class.finishRecreate(FaultHandling.scala:247)
at akka.actor.dungeon.FaultHandling$class.faultRecreate(FaultHandling.scala:76)
at akka.actor.ActorCell.faultRecreate(ActorCell.scala:369)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:459)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka://sparkDriver/), Path(/user/HeartbeatReceiver)]
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
at scala.concurrent.impl.Promise$DefaultPromise.scala$concurrent$impl$Promise$DefaultPromise$$dispatchOrAddCallback(Promise.scala:280)
at scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:270)
at akka.actor.ActorSelection.resolveOne(ActorSelection.scala:63)
at akka.actor.ActorSelection.resolveOne(ActorSelection.scala:80)
at org.apache.spark.util.AkkaUtils$.makeDriverRef(AkkaUtils.scala:213)
at org.apache.spark.executor.Executor.startDriverHeartbeater(Executor.scala:369)
at org.apache.spark.executor.Executor.<init>(Executor.scala:122)
at org.apache.spark.scheduler.local.LocalActor.<init>(LocalBackend.scala:53)
at org.apache.spark.scheduler.local.LocalBackend$$anonfun$start$1.apply(LocalBackend.scala:96)
at org.apache.spark.scheduler.local.LocalBackend$$anonfun$start$1.apply(LocalBackend.scala:96)
at akka.actor.TypedCreatorFunctionConsumer.produce(Props.scala:343)
at akka.actor.Props.newActor(Props.scala:252)
at akka.actor.ActorCell.newActor(ActorCell.scala:552)
at akka.actor.dungeon.FaultHandling$class.finishRecreate(FaultHandling.scala:234)
... 11 more
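
The ActorNotFound here looks secondary: after the RejectedExecutionException, the LocalBackend actor is restarted, and the new Executor's heartbeater tries to resolve /user/HeartbeatReceiver on a driver actor system that is already terminating, so the lookup fails. The mechanism in isolation is just Akka (system and path names below are placeholders):

    import akka.actor.{ActorNotFound, ActorSystem}
    import akka.util.Timeout
    import scala.concurrent.Await
    import scala.concurrent.duration._

    object ResolveMissing {
      def main(args: Array[String]): Unit = {
        implicit val timeout: Timeout = Timeout(5.seconds)
        val system = ActorSystem("sparkDriver")
        val sel = system.actorSelection("/user/HeartbeatReceiver") // nothing lives at this path
        try Await.result(sel.resolveOne(), 6.seconds) // fails: no actor to identify
        catch { case e: ActorNotFound => println(e.getMessage) }
        system.shutdown()
      }
    }
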
[2015-01-27 13:14:27,261] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:27,268] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:27,268] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:27,269] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:27,270] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:27,274] INFO .jobserver.JobManagerActor [] [akka://test/user/$h] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:27,274] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:33466/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:27,275] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:33466/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:27,279] INFO k.jobserver.JobStatusActor [] [akka://test/user/$h/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:27,280] INFO k.jobserver.JobResultActor [] [akka://test/user/$h/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:27,287] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:33466/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:27,290] INFO ache.spark.SecurityManager [] [akka://test/user/$h] - Changing view acls to: tja01
[2015-01-27 13:14:27,290] INFO ache.spark.SecurityManager [] [akka://test/user/$h] - Changing modify acls to: tja01
[2015-01-27 13:14:27,290] INFO ache.spark.SecurityManager [] [akka://test/user/$h] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:27,378] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$h] - Slf4jLogger started
[2015-01-27 13:14:27,385] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:27,399] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:36349]
[2015-01-27 13:14:27,400] INFO rg.apache.spark.util.Utils [] [akka://test/user/$h] - Successfully started service 'sparkDriver' on port 36349.
[2015-01-27 13:14:27,401] INFO org.apache.spark.SparkEnv [] [akka://test/user/$h] - Registering MapOutputTracker
[2015-01-27 13:14:27,402] INFO org.apache.spark.SparkEnv [] [akka://test/user/$h] - Registering BlockManagerMaster
[2015-01-27 13:14:27,403] INFO k.storage.DiskBlockManager [] [akka://test/user/$h] - Created local directory at /tmp/spark-local-20150127131427-db39
[2015-01-27 13:14:27,403] INFO .spark.storage.MemoryStore [] [akka://test/user/$h] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:27,404] INFO pache.spark.HttpFileServer [] [akka://test/user/$h] - HTTP File server directory is /tmp/spark-bc54f317-76a1-4fc5-b3b3-7681d6a8010a
[2015-01-27 13:14:27,404] INFO rg.apache.spark.HttpServer [] [akka://test/user/$h] - Starting HTTP Server
[2015-01-27 13:14:27,405] INFO clipse.jetty.server.Server [] [akka://test/user/$h] - jetty-8.1.14.v20131031
[2015-01-27 13:14:27,410] INFO y.server.AbstractConnector [] [akka://test/user/$h] - Started SocketConnector@0.0.0.0:42366
[2015-01-27 13:14:27,410] INFO rg.apache.spark.util.Utils [] [akka://test/user/$h] - Successfully started service 'HTTP file server' on port 42366.
[2015-01-27 13:14:32,426] INFO clipse.jetty.server.Server [] [akka://test/user/$h] - jetty-8.1.14.v20131031
[2015-01-27 13:14:32,440] INFO y.server.AbstractConnector [] [akka://test/user/$h] - Started SelectChannelConnector@0.0.0.0:51422
[2015-01-27 13:14:32,440] INFO rg.apache.spark.util.Utils [] [akka://test/user/$h] - Successfully started service 'SparkUI' on port 51422.
[2015-01-27 13:14:32,441] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$h] - Started SparkUI at http://localhost:51422
[2015-01-27 13:14:32,481] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:36349/user/HeartbeatReceiver
[2015-01-27 13:14:32,511] INFO .NettyBlockTransferService [] [akka://test/user/$h] - Server created on 56071
[2015-01-27 13:14:32,512] INFO storage.BlockManagerMaster [] [akka://test/user/$h] - Trying to register BlockManager
[2015-01-27 13:14:32,512] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:56071 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 56071)
[2015-01-27 13:14:32,512] INFO storage.BlockManagerMaster [] [akka://test/user/$h] - Registered BlockManager
[2015-01-27 13:14:32,518] INFO .jobserver.RddManagerActor [] [akka://test/user/$h/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:32,518] INFO .jobserver.JobManagerActor [] [akka://test/user/$h] - Loading class spark.jobserver.WordCountExample for app demo
[2015-01-27 13:14:32,519] INFO .apache.spark.SparkContext [] [akka://test/user/$h] - Added JAR /tmp/InMemoryDAO9098106076186307131.jar at http://10.1.3.213:42366/jars/InMemoryDAO9098106076186307131.jar with timestamp 1422364472519
[2015-01-27 13:14:32,523] INFO util.ContextURLClassLoader [] [akka://test/user/$h] - Added URL file:/tmp/InMemoryDAO9098106076186307131.jar to ContextURLClassLoader
[2015-01-27 13:14:32,523] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$h] - Loading object spark.jobserver.WordCountExample$ using loader spark.jobserver.util.ContextURLClassLoader@45ea5f99
[2015-01-27 13:14:32,525] INFO .jobserver.JobManagerActor [] [akka://test/user/$h] - Starting Spark job 251b8900-7e2b-48df-a14f-0356d7fbe72d [spark.jobserver.WordCountExample]...
[2015-01-27 13:14:32,525] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:32,525] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:32,526] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:32,529] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:32,529] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:32,540] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:32,540] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:32,541] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:32,543] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:32,543] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:32,544] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:32,544] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:32,544] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:32,545] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:32,545] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:32,545] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:32,546] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:32,546] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:32,546] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:32,546] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:32,547] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:32,547] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:32,547] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:32,548] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:32,548] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:32,549] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:32,549] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:32,549] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:32,550] INFO .apache.spark.SparkContext [] [] - Starting job: collect at WordCountExample.scala:32
[2015-01-27 13:14:32,551] INFO ark.scheduler.DAGScheduler [] [] - Registering RDD 1 (map at WordCountExample.scala:32)
[2015-01-27 13:14:32,552] INFO ark.scheduler.DAGScheduler [] [] - Got job 0 (collect at WordCountExample.scala:32) with 4 output partitions (allowLocal=false)
[2015-01-27 13:14:32,552] INFO ark.scheduler.DAGScheduler [] [] - Final stage: Stage 1(collect at WordCountExample.scala:32)
[2015-01-27 13:14:32,552] INFO ark.scheduler.DAGScheduler [] [] - Parents of final stage: List(Stage 0)
[2015-01-27 13:14:32,554] INFO ark.scheduler.DAGScheduler [] [] - Missing parents: List(Stage 0)
[2015-01-27 13:14:32,556] INFO ark.scheduler.DAGScheduler [] [] - Submitting Stage 0 (MappedRDD[1] at map at WordCountExample.scala:32), which has no missing parents
[2015-01-27 13:14:32,559] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(2384) called with curMem=0, maxMem=714866688
[2015-01-27 13:14:32,560] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0 stored as values in memory (estimated size 2.3 KB, free 681.7 MB)
[2015-01-27 13:14:32,562] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(1716) called with curMem=2384, maxMem=714866688
[2015-01-27 13:14:32,563] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0_piece0 stored as bytes in memory (estimated size 1716.0 B, free 681.7 MB)
[2015-01-27 13:14:32,564] INFO k.storage.BlockManagerInfo [] [] - Added broadcast_0_piece0 in memory on localhost:56071 (size: 1716.0 B, free: 681.7 MB)
[2015-01-27 13:14:32,565] INFO storage.BlockManagerMaster [] [] - Updated info of block broadcast_0_piece0
[2015-01-27 13:14:32,566] INFO .apache.spark.SparkContext [] [] - Created broadcast 0 from broadcast at DAGScheduler.scala:838
[2015-01-27 13:14:32,568] INFO ark.scheduler.DAGScheduler [] [] - Submitting 4 missing tasks from Stage 0 (MappedRDD[1] at map at WordCountExample.scala:32)
[2015-01-27 13:14:32,568] INFO cheduler.TaskSchedulerImpl [] [] - Adding task set 0.0 with 4 tasks
[2015-01-27 13:14:32,572] INFO k.scheduler.TaskSetManager [] [] - Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1330 bytes)
[2015-01-27 13:14:32,573] INFO k.scheduler.TaskSetManager [] [] - Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1337 bytes)
[2015-01-27 13:14:32,574] INFO k.scheduler.TaskSetManager [] [] - Starting task 2.0 in stage 0.0 (TID 2, localhost, PROCESS_LOCAL, 1340 bytes)
[2015-01-27 13:14:32,576] INFO k.scheduler.TaskSetManager [] [] - Starting task 3.0 in stage 0.0 (TID 3, localhost, PROCESS_LOCAL, 1337 bytes)
[2015-01-27 13:14:32,582] INFO he.spark.executor.Executor [] [] - Running task 0.0 in stage 0.0 (TID 0)
[2015-01-27 13:14:32,583] INFO he.spark.executor.Executor [] [] - Running task 1.0 in stage 0.0 (TID 1)
[2015-01-27 13:14:32,583] INFO he.spark.executor.Executor [] [] - Running task 2.0 in stage 0.0 (TID 2)
[2015-01-27 13:14:32,585] INFO he.spark.executor.Executor [] [] - Running task 3.0 in stage 0.0 (TID 3)
[2015-01-27 13:14:32,592] INFO he.spark.executor.Executor [] [] - Fetching http://10.1.3.213:42366/jars/InMemoryDAO9098106076186307131.jar with timestamp 1422364472519
[2015-01-27 13:14:32,600] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:51422
[2015-01-27 13:14:32,601] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:32,602] INFO ark.scheduler.DAGScheduler [] [] - Job 0 failed: collect at WordCountExample.scala:32, took 0.051083 s
[2015-01-27 13:14:32,602] WARN .jobserver.JobManagerActor [] [] - Exception from job 251b8900-7e2b-48df-a14f-0356d7fbe72d:
org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:702)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:701)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:701)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.postStop(DAGScheduler.scala:1428)
at akka.actor.Actor$class.aroundPostStop(Actor.scala:475)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundPostStop(DAGScheduler.scala:1375)
at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
at akka.actor.ActorCell.terminate(ActorCell.scala:369)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:32,624] INFO rg.apache.spark.util.Utils [] [] - Fetching http://10.1.3.213:42366/jars/InMemoryDAO9098106076186307131.jar to /tmp/fetchFileTemp167125581953179899.tmp
[2015-01-27 13:14:32,732] INFO he.spark.executor.Executor [] [] - Adding file:/tmp/spark-1e3520d7-33c3-4152-a51f-fe5cbb931e0c/InMemoryDAO9098106076186307131.jar to class loader
[2015-01-27 13:14:32,832] INFO he.spark.executor.Executor [] [] - Finished task 2.0 in stage 0.0 (TID 2). 840 bytes result sent to driver
[2015-01-27 13:14:32,832] INFO he.spark.executor.Executor [] [] - Finished task 3.0 in stage 0.0 (TID 3). 840 bytes result sent to driver
[2015-01-27 13:14:32,832] INFO he.spark.executor.Executor [] [] - Finished task 0.0 in stage 0.0 (TID 0). 840 bytes result sent to driver
[2015-01-27 13:14:32,832] INFO he.spark.executor.Executor [] [] - Finished task 1.0 in stage 0.0 (TID 1). 840 bytes result sent to driver
[2015-01-27 13:14:32,834] ERROR cheduler.TaskSchedulerImpl [] [] - Exception in statusUpdate
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.scheduler.TaskResultGetter$$anon$2@5df7a803 rejected from java.util.concurrent.ThreadPoolExecutor@2e378d27[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
at org.apache.spark.scheduler.TaskResultGetter.enqueueSuccessfulTask(TaskResultGetter.scala:47)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:301)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:298)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.TaskSchedulerImpl.liftedTree2$1(TaskSchedulerImpl.scala:298)
at org.apache.spark.scheduler.TaskSchedulerImpl.statusUpdate(TaskSchedulerImpl.scala:283)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$receiveWithLogging$1.applyOrElse(LocalBackend.scala:61)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.spark.scheduler.local.LocalActor.aroundReceive(LocalBackend.scala:43)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:32,837] ERROR cheduler.TaskSchedulerImpl [] [] - Exception in statusUpdate
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.scheduler.TaskResultGetter$$anon$2@4ac62bc0 rejected from java.util.concurrent.ThreadPoolExecutor@2e378d27[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
at org.apache.spark.scheduler.TaskResultGetter.enqueueSuccessfulTask(TaskResultGetter.scala:47)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:301)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:298)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.TaskSchedulerImpl.liftedTree2$1(TaskSchedulerImpl.scala:298)
at org.apache.spark.scheduler.TaskSchedulerImpl.statusUpdate(TaskSchedulerImpl.scala:283)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$receiveWithLogging$1.applyOrElse(LocalBackend.scala:61)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.spark.scheduler.local.LocalActor.aroundReceive(LocalBackend.scala:43)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:32,838] ERROR cheduler.TaskSchedulerImpl [] [] - Exception in statusUpdate
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.scheduler.TaskResultGetter$$anon$2@66ce9af7 rejected from java.util.concurrent.ThreadPoolExecutor@2e378d27[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
at org.apache.spark.scheduler.TaskResultGetter.enqueueSuccessfulTask(TaskResultGetter.scala:47)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:301)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:298)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.TaskSchedulerImpl.liftedTree2$1(TaskSchedulerImpl.scala:298)
at org.apache.spark.scheduler.TaskSchedulerImpl.statusUpdate(TaskSchedulerImpl.scala:283)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$receiveWithLogging$1.applyOrElse(LocalBackend.scala:61)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.spark.scheduler.local.LocalActor.aroundReceive(LocalBackend.scala:43)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:14:32,839] ERROR cheduler.TaskSchedulerImpl [] [] - Exception in statusUpdate
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.scheduler.TaskResultGetter$$anon$2@1f9bbca8 rejected from java.util.concurrent.ThreadPoolExecutor@2e378d27[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
at org.apache.spark.scheduler.TaskResultGetter.enqueueSuccessfulTask(TaskResultGetter.scala:47)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:301)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:298)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.TaskSchedulerImpl.liftedTree2$1(TaskSchedulerImpl.scala:298)
at org.apache.spark.scheduler.TaskSchedulerImpl.statusUpdate(TaskSchedulerImpl.scala:283)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$receiveWithLogging$1.applyOrElse(LocalBackend.scala:61)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.spark.scheduler.local.LocalActor.aroundReceive(LocalBackend.scala:43)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
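
These four "Exception in statusUpdate" errors are the same mechanism repeated once per task: the tasks actually finish, but the TaskResultGetter's thread pool was shut down along with the context, and a terminated ThreadPoolExecutor's default AbortPolicy rejects anything submitted to it. Reduced to plain JDK behaviour:

    import java.util.concurrent.{Executors, RejectedExecutionException}

    object RejectedAfterShutdown {
      def main(args: Array[String]): Unit = {
        val pool = Executors.newFixedThreadPool(4)
        pool.shutdownNow() // once shut down, the default AbortPolicy rejects new work
        try pool.execute(new Runnable { def run(): Unit = () })
        catch { case e: RejectedExecutionException => println(e.getMessage) }
      }
    }
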
[2015-01-27 13:14:33,654] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:33,660] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:33,660] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:33,661] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:33,665] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:33,665] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:36349/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:33,666] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:36349/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:33,668] INFO .jobserver.JobManagerActor [] [akka://test/user/$i] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:33,669] INFO k.jobserver.JobStatusActor [] [akka://test/user/$i/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:33,670] INFO k.jobserver.JobResultActor [] [akka://test/user/$i/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:33,678] INFO ache.spark.SecurityManager [] [akka://test/user/$i] - Changing view acls to: tja01
[2015-01-27 13:14:33,679] INFO ache.spark.SecurityManager [] [akka://test/user/$i] - Changing modify acls to: tja01
[2015-01-27 13:14:33,679] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:36349/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:33,679] INFO ache.spark.SecurityManager [] [akka://test/user/$i] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:33,738] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$i] - Slf4jLogger started
[2015-01-27 13:14:33,745] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:33,758] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:33782]
[2015-01-27 13:14:33,759] INFO rg.apache.spark.util.Utils [] [akka://test/user/$i] - Successfully started service 'sparkDriver' on port 33782.
[2015-01-27 13:14:33,763] INFO org.apache.spark.SparkEnv [] [akka://test/user/$i] - Registering MapOutputTracker
[2015-01-27 13:14:33,764] INFO org.apache.spark.SparkEnv [] [akka://test/user/$i] - Registering BlockManagerMaster
[2015-01-27 13:14:33,765] INFO k.storage.DiskBlockManager [] [akka://test/user/$i] - Created local directory at /tmp/spark-local-20150127131433-710d
[2015-01-27 13:14:33,765] INFO .spark.storage.MemoryStore [] [akka://test/user/$i] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:33,767] INFO pache.spark.HttpFileServer [] [akka://test/user/$i] - HTTP File server directory is /tmp/spark-caa38a12-b787-41e3-aaf5-fdbdc1d8e77d
[2015-01-27 13:14:33,767] INFO rg.apache.spark.HttpServer [] [akka://test/user/$i] - Starting HTTP Server
[2015-01-27 13:14:33,768] INFO clipse.jetty.server.Server [] [akka://test/user/$i] - jetty-8.1.14.v20131031
[2015-01-27 13:14:33,795] INFO y.server.AbstractConnector [] [akka://test/user/$i] - Started SocketConnector@0.0.0.0:36798
[2015-01-27 13:14:33,795] INFO rg.apache.spark.util.Utils [] [akka://test/user/$i] - Successfully started service 'HTTP file server' on port 36798.
[2015-01-27 13:14:38,843] INFO clipse.jetty.server.Server [] [akka://test/user/$i] - jetty-8.1.14.v20131031
[2015-01-27 13:14:38,854] INFO y.server.AbstractConnector [] [akka://test/user/$i] - Started SelectChannelConnector@0.0.0.0:44998
[2015-01-27 13:14:38,854] INFO rg.apache.spark.util.Utils [] [akka://test/user/$i] - Successfully started service 'SparkUI' on port 44998.
[2015-01-27 13:14:38,854] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$i] - Started SparkUI at http://localhost:44998
[2015-01-27 13:14:38,873] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:33782/user/HeartbeatReceiver
[2015-01-27 13:14:38,875] INFO .NettyBlockTransferService [] [akka://test/user/$i] - Server created on 38722
[2015-01-27 13:14:38,875] INFO storage.BlockManagerMaster [] [akka://test/user/$i] - Trying to register BlockManager
[2015-01-27 13:14:38,875] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:38722 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 38722)
[2015-01-27 13:14:38,876] INFO storage.BlockManagerMaster [] [akka://test/user/$i] - Registered BlockManager
[2015-01-27 13:14:38,880] INFO .jobserver.RddManagerActor [] [akka://test/user/$i/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:38,881] INFO .jobserver.JobManagerActor [] [akka://test/user/$i] - Loading class spark.jobserver.MyErrorJob for app demo
[2015-01-27 13:14:38,881] INFO .apache.spark.SparkContext [] [akka://test/user/$i] - Added JAR /tmp/InMemoryDAO7796836459895580557.jar at http://10.1.3.213:36798/jars/InMemoryDAO7796836459895580557.jar with timestamp 1422364478881
[2015-01-27 13:14:38,884] INFO util.ContextURLClassLoader [] [akka://test/user/$i] - Added URL file:/tmp/InMemoryDAO7796836459895580557.jar to ContextURLClassLoader
[2015-01-27 13:14:38,884] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$i] - Loading object spark.jobserver.MyErrorJob$ using loader spark.jobserver.util.ContextURLClassLoader@7af89f7a
[2015-01-27 13:14:38,885] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$i] - Loading class spark.jobserver.MyErrorJob using loader spark.jobserver.util.ContextURLClassLoader@7af89f7a
[2015-01-27 13:14:38,887] INFO .jobserver.JobManagerActor [] [akka://test/user/$i] - Starting Spark job 6950182c-d954-4168-bf4f-17dcb284ecbb [spark.jobserver.MyErrorJob]...
[2015-01-27 13:14:38,887] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:38,888] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:38,888] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:38,888] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:38,888] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:38,890] WARN .jobserver.JobManagerActor [] [] - Exception from job 6950182c-d954-4168-bf4f-17dcb284ecbb:
java.lang.IllegalArgumentException: Foobar
at spark.jobserver.MyErrorJob.runJob(SparkTestJobs.scala:14)
at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:219)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
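
This particular failure is expected, as far as I can tell: the trace points at spark.jobserver.MyErrorJob.runJob(SparkTestJobs.scala:14), a test fixture that throws on purpose, so the WARN here is the error-path test doing its job. Something like this shape, sketched against the SparkJob API (only the class and method names come from the trace; the body is my guess):

    import com.typesafe.config.Config
    import org.apache.spark.SparkContext
    import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

    object MyErrorJob extends SparkJob {
      def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
      def runJob(sc: SparkContext, config: Config): Any =
        throw new IllegalArgumentException("Foobar") // the "Foobar" in the trace above
    }
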
[2015-01-27 13:14:38,899] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:38,900] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:38,900] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:38,901] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:38,901] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:38,902] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:38,902] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:38,903] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:38,903] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:38,904] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:38,904] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:38,904] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:38,905] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:38,905] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:38,906] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:38,906] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:38,907] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:38,907] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:38,908] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:38,908] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:38,909] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:38,909] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:38,910] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:38,961] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:44998
[2015-01-27 13:14:38,962] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:40,015] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:40,020] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:40,021] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:40,022] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:40,024] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:40,026] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:33782/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:40,027] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:33782/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:40,028] INFO .jobserver.JobManagerActor [] [akka://test/user/$j] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:40,029] INFO k.jobserver.JobStatusActor [] [akka://test/user/$j/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:40,029] INFO k.jobserver.JobResultActor [] [akka://test/user/$j/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:40,035] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:33782/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:40,036] INFO ache.spark.SecurityManager [] [akka://test/user/$j] - Changing view acls to: tja01
[2015-01-27 13:14:40,036] INFO ache.spark.SecurityManager [] [akka://test/user/$j] - Changing modify acls to: tja01
[2015-01-27 13:14:40,036] INFO ache.spark.SecurityManager [] [akka://test/user/$j] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:40,079] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$j] - Slf4jLogger started
[2015-01-27 13:14:40,085] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:40,093] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:36268]
[2015-01-27 13:14:40,094] INFO rg.apache.spark.util.Utils [] [akka://test/user/$j] - Successfully started service 'sparkDriver' on port 36268.
[2015-01-27 13:14:40,094] INFO org.apache.spark.SparkEnv [] [akka://test/user/$j] - Registering MapOutputTracker
[2015-01-27 13:14:40,095] INFO org.apache.spark.SparkEnv [] [akka://test/user/$j] - Registering BlockManagerMaster
[2015-01-27 13:14:40,096] INFO k.storage.DiskBlockManager [] [akka://test/user/$j] - Created local directory at /tmp/spark-local-20150127131440-ca6f
[2015-01-27 13:14:40,096] INFO .spark.storage.MemoryStore [] [akka://test/user/$j] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:40,097] INFO pache.spark.HttpFileServer [] [akka://test/user/$j] - HTTP File server directory is /tmp/spark-f9795fb2-7372-49d7-95d7-4dec42b9af62
[2015-01-27 13:14:40,097] INFO rg.apache.spark.HttpServer [] [akka://test/user/$j] - Starting HTTP Server
[2015-01-27 13:14:40,098] INFO clipse.jetty.server.Server [] [akka://test/user/$j] - jetty-8.1.14.v20131031
[2015-01-27 13:14:40,099] INFO y.server.AbstractConnector [] [akka://test/user/$j] - Started SocketConnector@0.0.0.0:43259
[2015-01-27 13:14:40,099] INFO rg.apache.spark.util.Utils [] [akka://test/user/$j] - Successfully started service 'HTTP file server' on port 43259.
[2015-01-27 13:14:45,111] INFO clipse.jetty.server.Server [] [akka://test/user/$j] - jetty-8.1.14.v20131031
[2015-01-27 13:14:45,117] INFO y.server.AbstractConnector [] [akka://test/user/$j] - Started SelectChannelConnector@0.0.0.0:35186
[2015-01-27 13:14:45,117] INFO rg.apache.spark.util.Utils [] [akka://test/user/$j] - Successfully started service 'SparkUI' on port 35186.
[2015-01-27 13:14:45,117] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$j] - Started SparkUI at http://localhost:35186
[2015-01-27 13:14:45,138] INFO pache.spark.util.AkkaUtils [] [akka://test/user/$j] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:36268/user/HeartbeatReceiver
[2015-01-27 13:14:45,143] INFO .NettyBlockTransferService [] [akka://test/user/$j] - Server created on 41968
[2015-01-27 13:14:45,143] INFO storage.BlockManagerMaster [] [akka://test/user/$j] - Trying to register BlockManager
[2015-01-27 13:14:45,143] INFO ge.BlockManagerMasterActor [] [akka://test/user/$j] - Registering block manager localhost:41968 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 41968)
[2015-01-27 13:14:45,144] INFO storage.BlockManagerMaster [] [akka://test/user/$j] - Registered BlockManager
[2015-01-27 13:14:45,148] INFO .jobserver.RddManagerActor [] [akka://test/user/$j/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:45,148] INFO .jobserver.JobManagerActor [] [akka://test/user/$j] - Loading class spark.jobserver.ConfigCheckerJob for app demo
[2015-01-27 13:14:45,149] INFO .apache.spark.SparkContext [] [akka://test/user/$j] - Added JAR /tmp/InMemoryDAO9202477903903725816.jar at http://10.1.3.213:43259/jars/InMemoryDAO9202477903903725816.jar with timestamp 1422364485149
[2015-01-27 13:14:45,152] INFO util.ContextURLClassLoader [] [akka://test/user/$j] - Added URL file:/tmp/InMemoryDAO9202477903903725816.jar to ContextURLClassLoader
[2015-01-27 13:14:45,152] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$j] - Loading object spark.jobserver.ConfigCheckerJob$ using loader spark.jobserver.util.ContextURLClassLoader@600b9fdc
[2015-01-27 13:14:45,153] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$j] - Loading class spark.jobserver.ConfigCheckerJob using loader spark.jobserver.util.ContextURLClassLoader@600b9fdc
[2015-01-27 13:14:45,154] INFO .jobserver.JobManagerActor [] [akka://test/user/$j] - Starting Spark job 4c1e37e3-fee8-4603-bdf7-e1607ec0c5c0 [spark.jobserver.ConfigCheckerJob]...
[2015-01-27 13:14:45,154] INFO k.jobserver.JobResultActor [] [akka://test/user/$j/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID 4c1e37e3-fee8-4603-bdf7-e1607ec0c5c0
[2015-01-27 13:14:45,154] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:45,155] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:45,155] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:45,156] INFO k.jobserver.JobStatusActor [] [akka://test/user/$j/status-actor] - Job 4c1e37e3-fee8-4603-bdf7-e1607ec0c5c0 started
[2015-01-27 13:14:45,159] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:45,160] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:45,170] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:45,171] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:45,171] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:45,171] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:45,171] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:45,171] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:45,171] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:45,171] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:45,172] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:45,172] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:45,172] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:45,172] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:45,172] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:45,172] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:45,172] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:45,172] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:45,173] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:45,173] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:45,173] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:45,173] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:45,173] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:45,173] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:45,173] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:45,224] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:35186
[2015-01-27 13:14:45,224] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:46,277] INFO apOutputTrackerMasterActor [] [akka://test/user/$j] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:46,281] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:46,281] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:46,282] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:46,283] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:46,284] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:36268/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:46,285] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:36268/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:46,287] INFO .jobserver.JobManagerActor [] [akka://test/user/$k] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:46,288] INFO k.jobserver.JobStatusActor [] [akka://test/user/$k/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:46,288] INFO k.jobserver.JobResultActor [] [akka://test/user/$k/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:46,294] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:36268/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:46,296] INFO ache.spark.SecurityManager [] [akka://test/user/$k] - Changing view acls to: tja01
[2015-01-27 13:14:46,296] INFO ache.spark.SecurityManager [] [akka://test/user/$k] - Changing modify acls to: tja01
[2015-01-27 13:14:46,296] INFO ache.spark.SecurityManager [] [akka://test/user/$k] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:46,334] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$k] - Slf4jLogger started
[2015-01-27 13:14:46,342] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:46,357] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:57257]
[2015-01-27 13:14:46,358] INFO rg.apache.spark.util.Utils [] [akka://test/user/$k] - Successfully started service 'sparkDriver' on port 57257.
[2015-01-27 13:14:46,358] INFO org.apache.spark.SparkEnv [] [akka://test/user/$k] - Registering MapOutputTracker
[2015-01-27 13:14:46,359] INFO org.apache.spark.SparkEnv [] [akka://test/user/$k] - Registering BlockManagerMaster
[2015-01-27 13:14:46,361] INFO k.storage.DiskBlockManager [] [akka://test/user/$k] - Created local directory at /tmp/spark-local-20150127131446-cb64
[2015-01-27 13:14:46,361] INFO .spark.storage.MemoryStore [] [akka://test/user/$k] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:46,362] INFO pache.spark.HttpFileServer [] [akka://test/user/$k] - HTTP File server directory is /tmp/spark-2b07d776-fab3-47d9-8173-03ca0decbee5
[2015-01-27 13:14:46,362] INFO rg.apache.spark.HttpServer [] [akka://test/user/$k] - Starting HTTP Server
[2015-01-27 13:14:46,363] INFO clipse.jetty.server.Server [] [akka://test/user/$k] - jetty-8.1.14.v20131031
[2015-01-27 13:14:46,372] INFO y.server.AbstractConnector [] [akka://test/user/$k] - Started SocketConnector@0.0.0.0:57449
[2015-01-27 13:14:46,372] INFO rg.apache.spark.util.Utils [] [akka://test/user/$k] - Successfully started service 'HTTP file server' on port 57449.
[2015-01-27 13:14:51,431] INFO clipse.jetty.server.Server [] [akka://test/user/$k] - jetty-8.1.14.v20131031
[2015-01-27 13:14:51,448] INFO y.server.AbstractConnector [] [akka://test/user/$k] - Started SelectChannelConnector@0.0.0.0:38517
[2015-01-27 13:14:51,448] INFO rg.apache.spark.util.Utils [] [akka://test/user/$k] - Successfully started service 'SparkUI' on port 38517.
[2015-01-27 13:14:51,448] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$k] - Started SparkUI at http://localhost:38517
[2015-01-27 13:14:51,491] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:57257/user/HeartbeatReceiver
[2015-01-27 13:14:51,499] INFO .NettyBlockTransferService [] [akka://test/user/$k] - Server created on 40291
[2015-01-27 13:14:51,500] INFO storage.BlockManagerMaster [] [akka://test/user/$k] - Trying to register BlockManager
[2015-01-27 13:14:51,500] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:40291 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 40291)
[2015-01-27 13:14:51,501] INFO storage.BlockManagerMaster [] [akka://test/user/$k] - Registered BlockManager
[2015-01-27 13:14:51,510] INFO .jobserver.RddManagerActor [] [akka://test/user/$k/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:51,510] INFO .jobserver.JobManagerActor [] [akka://test/user/$k] - Loading class spark.jobserver.ZookeeperJob for app demo
[2015-01-27 13:14:51,511] INFO .apache.spark.SparkContext [] [akka://test/user/$k] - Added JAR /tmp/InMemoryDAO575330615886762321.jar at http://10.1.3.213:57449/jars/InMemoryDAO575330615886762321.jar with timestamp 1422364491511
[2015-01-27 13:14:51,514] INFO util.ContextURLClassLoader [] [akka://test/user/$k] - Added URL file:/tmp/InMemoryDAO575330615886762321.jar to ContextURLClassLoader
[2015-01-27 13:14:51,514] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$k] - Loading object spark.jobserver.ZookeeperJob$ using loader spark.jobserver.util.ContextURLClassLoader@3129517
[2015-01-27 13:14:51,515] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$k] - Loading class spark.jobserver.ZookeeperJob using loader spark.jobserver.util.ContextURLClassLoader@3129517
[2015-01-27 13:14:51,517] INFO .jobserver.JobManagerActor [] [akka://test/user/$k] - Starting Spark job 1e0b5ae7-83e6-4f73-bda1-5f5ffa34f537 [spark.jobserver.ZookeeperJob]...
[2015-01-27 13:14:51,517] INFO k.jobserver.JobResultActor [] [akka://test/user/$k/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID 1e0b5ae7-83e6-4f73-bda1-5f5ffa34f537
[2015-01-27 13:14:51,517] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:51,517] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:51,517] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:51,518] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:51,519] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:51,529] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:51,530] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:51,530] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:51,530] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:51,530] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:51,530] INFO .apache.spark.SparkContext [] [] - Starting job: collect at SparkTestJobs.scala:74
[2015-01-27 13:14:51,530] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:51,530] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:51,530] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:51,531] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:51,531] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:51,531] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:51,531] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:51,531] INFO ark.scheduler.DAGScheduler [] [] - Got job 0 (collect at SparkTestJobs.scala:74) with 4 output partitions (allowLocal=false)
[2015-01-27 13:14:51,531] INFO ark.scheduler.DAGScheduler [] [] - Final stage: Stage 0(collect at SparkTestJobs.scala:74)
[2015-01-27 13:14:51,531] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:51,531] INFO ark.scheduler.DAGScheduler [] [] - Parents of final stage: List()
[2015-01-27 13:14:51,531] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:51,531] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:51,531] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:51,532] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:51,532] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:51,532] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:51,532] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:51,532] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:51,532] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:51,532] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:51,532] INFO ark.scheduler.DAGScheduler [] [] - Missing parents: List()
[2015-01-27 13:14:51,533] INFO ark.scheduler.DAGScheduler [] [] - Submitting Stage 0 (FilteredRDD[1] at filter at SparkTestJobs.scala:74), which has no missing parents
[2015-01-27 13:14:51,535] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(1680) called with curMem=0, maxMem=714866688
[2015-01-27 13:14:51,536] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0 stored as values in memory (estimated size 1680.0 B, free 681.7 MB)
[2015-01-27 13:14:51,537] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(1219) called with curMem=1680, maxMem=714866688
[2015-01-27 13:14:51,538] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0_piece0 stored as bytes in memory (estimated size 1219.0 B, free 681.7 MB)
[2015-01-27 13:14:51,539] INFO k.storage.BlockManagerInfo [] [] - Added broadcast_0_piece0 in memory on localhost:40291 (size: 1219.0 B, free: 681.7 MB)
[2015-01-27 13:14:51,539] INFO storage.BlockManagerMaster [] [] - Updated info of block broadcast_0_piece0
[2015-01-27 13:14:51,540] INFO .apache.spark.SparkContext [] [] - Created broadcast 0 from broadcast at DAGScheduler.scala:838
[2015-01-27 13:14:51,543] INFO ark.scheduler.DAGScheduler [] [] - Submitting 4 missing tasks from Stage 0 (FilteredRDD[1] at filter at SparkTestJobs.scala:74)
[2015-01-27 13:14:51,544] INFO cheduler.TaskSchedulerImpl [] [] - Adding task set 0.0 with 4 tasks
[2015-01-27 13:14:51,545] INFO k.scheduler.TaskSetManager [] [] - Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1340 bytes)
[2015-01-27 13:14:51,546] INFO k.scheduler.TaskSetManager [] [] - Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1397 bytes)
[2015-01-27 13:14:51,547] INFO k.scheduler.TaskSetManager [] [] - Starting task 2.0 in stage 0.0 (TID 2, localhost, PROCESS_LOCAL, 1397 bytes)
[2015-01-27 13:14:51,548] INFO k.scheduler.TaskSetManager [] [] - Starting task 3.0 in stage 0.0 (TID 3, localhost, PROCESS_LOCAL, 1399 bytes)
[2015-01-27 13:14:51,548] INFO he.spark.executor.Executor [] [] - Running task 0.0 in stage 0.0 (TID 0)
[2015-01-27 13:14:51,548] INFO he.spark.executor.Executor [] [] - Running task 1.0 in stage 0.0 (TID 1)
[2015-01-27 13:14:51,549] INFO he.spark.executor.Executor [] [] - Fetching http://10.1.3.213:57449/jars/InMemoryDAO575330615886762321.jar with timestamp 1422364491511
[2015-01-27 13:14:51,549] INFO he.spark.executor.Executor [] [] - Running task 2.0 in stage 0.0 (TID 2)
[2015-01-27 13:14:51,550] INFO he.spark.executor.Executor [] [] - Running task 3.0 in stage 0.0 (TID 3)
[2015-01-27 13:14:51,563] INFO rg.apache.spark.util.Utils [] [] - Fetching http://10.1.3.213:57449/jars/InMemoryDAO575330615886762321.jar to /tmp/fetchFileTemp3338585275882279297.tmp
[2015-01-27 13:14:51,570] INFO he.spark.executor.Executor [] [] - Adding file:/tmp/spark-10e2421c-dfd6-4e05-b5dd-8dc0d291dfb3/InMemoryDAO575330615886762321.jar to class loader
[2015-01-27 13:14:51,578] INFO he.spark.executor.Executor [] [] - Finished task 2.0 in stage 0.0 (TID 2). 617 bytes result sent to driver
[2015-01-27 13:14:51,579] INFO he.spark.executor.Executor [] [] - Finished task 1.0 in stage 0.0 (TID 1). 617 bytes result sent to driver
[2015-01-27 13:14:51,580] INFO he.spark.executor.Executor [] [] - Finished task 0.0 in stage 0.0 (TID 0). 617 bytes result sent to driver
[2015-01-27 13:14:51,580] INFO he.spark.executor.Executor [] [] - Finished task 3.0 in stage 0.0 (TID 3). 692 bytes result sent to driver
[2015-01-27 13:14:51,584] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:38517
[2015-01-27 13:14:51,584] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:51,585] INFO ark.scheduler.DAGScheduler [] [] - Job 0 failed: collect at SparkTestJobs.scala:74, took 0.054222 s
[2015-01-27 13:14:51,585] WARN .jobserver.JobManagerActor [] [] - Exception from job 1e0b5ae7-83e6-4f73-bda1-5f5ffa34f537:
org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:702)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:701)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:701)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.postStop(DAGScheduler.scala:1428)
at akka.actor.Actor$class.aroundPostStop(Actor.scala:475)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundPostStop(DAGScheduler.scala:1375)
at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
at akka.actor.ActorCell.terminate(ActorCell.scala:369)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
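This trace is the actual symptom, and it repeats for every job in the run: the JobManagerActor starts the job on a separate "job future thread", but the context is torn down a few milliseconds later ("Shutting down SparkContext test"), so the DAGScheduler cancels the in-flight collect. The same race can be shown outside the job server with a minimal sketch, assuming a plain local-mode context (object name and timings invented for illustration):

import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

object ShutdownRaceSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local[4]").setAppName("test"))
    // Run the action asynchronously, like the job server's job future thread.
    val job = Future { sc.parallelize(1 to 100, 4).filter(_ % 2 == 0).collect() }
    job.failed.foreach(t => println("Exception from job: " + t))
    // Stopping the context while the action is still in flight produces
    // "Job cancelled because SparkContext was shut down".
    sc.stop()
    Thread.sleep(2000) // let the failure callback print before the JVM exits
  }
}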
[2015-01-27 13:14:51,588] INFO k.scheduler.TaskSetManager [] [] - Finished task 3.0 in stage 0.0 (TID 3) in 35 ms on localhost (1/4)
[2015-01-27 13:14:51,588] INFO k.scheduler.TaskSetManager [] [] - Finished task 1.0 in stage 0.0 (TID 1) in 43 ms on localhost (2/4)
[2015-01-27 13:14:51,589] INFO k.scheduler.TaskSetManager [] [] - Finished task 2.0 in stage 0.0 (TID 2) in 43 ms on localhost (3/4)
[2015-01-27 13:14:51,590] INFO k.scheduler.TaskSetManager [] [] - Finished task 0.0 in stage 0.0 (TID 0) in 45 ms on localhost (4/4)
[2015-01-27 13:14:51,591] INFO cheduler.TaskSchedulerImpl [] [] - Removed TaskSet 0.0, whose tasks have all completed, from pool
[2015-01-27 13:14:52,637] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:52,640] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:52,640] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:52,641] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:52,642] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:52,642] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:57257/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:52,643] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:57257/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:52,643] INFO .jobserver.JobManagerActor [] [akka://test/user/$l] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:52,647] INFO k.jobserver.JobStatusActor [] [akka://test/user/$l/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:52,647] INFO k.jobserver.JobResultActor [] [akka://test/user/$l/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:52,654] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:57257/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:52,657] INFO ache.spark.SecurityManager [] [akka://test/user/$l] - Changing view acls to: tja01
[2015-01-27 13:14:52,658] INFO ache.spark.SecurityManager [] [akka://test/user/$l] - Changing modify acls to: tja01
[2015-01-27 13:14:52,658] INFO ache.spark.SecurityManager [] [akka://test/user/$l] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:52,688] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$l] - Slf4jLogger started
[2015-01-27 13:14:52,694] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:52,702] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:46813]
[2015-01-27 13:14:52,703] INFO rg.apache.spark.util.Utils [] [akka://test/user/$l] - Successfully started service 'sparkDriver' on port 46813.
[2015-01-27 13:14:52,704] INFO org.apache.spark.SparkEnv [] [akka://test/user/$l] - Registering MapOutputTracker
[2015-01-27 13:14:52,704] INFO org.apache.spark.SparkEnv [] [akka://test/user/$l] - Registering BlockManagerMaster
[2015-01-27 13:14:52,705] INFO k.storage.DiskBlockManager [] [akka://test/user/$l] - Created local directory at /tmp/spark-local-20150127131452-6998
[2015-01-27 13:14:52,705] INFO .spark.storage.MemoryStore [] [akka://test/user/$l] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:52,706] INFO pache.spark.HttpFileServer [] [akka://test/user/$l] - HTTP File server directory is /tmp/spark-549dfa56-7df5-477f-9e46-9a66cf7aa198
[2015-01-27 13:14:52,706] INFO rg.apache.spark.HttpServer [] [akka://test/user/$l] - Starting HTTP Server
[2015-01-27 13:14:52,707] INFO clipse.jetty.server.Server [] [akka://test/user/$l] - jetty-8.1.14.v20131031
[2015-01-27 13:14:52,708] INFO y.server.AbstractConnector [] [akka://test/user/$l] - Started SocketConnector@0.0.0.0:45111
[2015-01-27 13:14:52,708] INFO rg.apache.spark.util.Utils [] [akka://test/user/$l] - Successfully started service 'HTTP file server' on port 45111.
[2015-01-27 13:14:57,722] INFO clipse.jetty.server.Server [] [akka://test/user/$l] - jetty-8.1.14.v20131031
[2015-01-27 13:14:57,734] INFO y.server.AbstractConnector [] [akka://test/user/$l] - Started SelectChannelConnector@0.0.0.0:55198
[2015-01-27 13:14:57,734] INFO rg.apache.spark.util.Utils [] [akka://test/user/$l] - Successfully started service 'SparkUI' on port 55198.
[2015-01-27 13:14:57,735] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$l] - Started SparkUI at http://localhost:55198
[2015-01-27 13:14:57,775] INFO pache.spark.util.AkkaUtils [] [akka://test/user/$l] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:46813/user/HeartbeatReceiver
[2015-01-27 13:14:57,777] INFO .NettyBlockTransferService [] [akka://test/user/$l] - Server created on 35147
[2015-01-27 13:14:57,777] INFO storage.BlockManagerMaster [] [akka://test/user/$l] - Trying to register BlockManager
[2015-01-27 13:14:57,778] INFO ge.BlockManagerMasterActor [] [akka://test/user/$l] - Registering block manager localhost:35147 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 35147)
[2015-01-27 13:14:57,778] INFO storage.BlockManagerMaster [] [akka://test/user/$l] - Registered BlockManager
[2015-01-27 13:14:57,782] INFO .jobserver.RddManagerActor [] [akka://test/user/$l/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:14:57,783] INFO .jobserver.JobManagerActor [] [akka://test/user/$l] - Loading class spark.jobserver.SleepJob for app demo
[2015-01-27 13:14:57,783] INFO .apache.spark.SparkContext [] [akka://test/user/$l] - Added JAR /tmp/InMemoryDAO4651919385015759310.jar at http://10.1.3.213:45111/jars/InMemoryDAO4651919385015759310.jar with timestamp 1422364497783
[2015-01-27 13:14:57,786] INFO util.ContextURLClassLoader [] [akka://test/user/$l] - Added URL file:/tmp/InMemoryDAO4651919385015759310.jar to ContextURLClassLoader
[2015-01-27 13:14:57,786] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$l] - Loading object spark.jobserver.SleepJob$ using loader spark.jobserver.util.ContextURLClassLoader@2691238a
[2015-01-27 13:14:57,787] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$l] - Loading class spark.jobserver.SleepJob using loader spark.jobserver.util.ContextURLClassLoader@2691238a
[2015-01-27 13:14:57,788] INFO k.jobserver.JobResultActor [] [akka://test/user/$l/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID 7805b79d-4ecf-4f84-93cf-3e0a42b39843
[2015-01-27 13:14:57,789] INFO .jobserver.JobManagerActor [] [akka://test/user/$l] - Starting Spark job 7805b79d-4ecf-4f84-93cf-3e0a42b39843 [spark.jobserver.SleepJob]...
[2015-01-27 13:14:57,789] INFO .jobserver.JobManagerActor [] [akka://test/user/$l] - Loading class spark.jobserver.SleepJob for app demo
[2015-01-27 13:14:57,789] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:57,789] INFO .jobserver.JobManagerActor [] [akka://test/user/$l] - Starting Spark job 4319e2b4-e13c-4c3a-857a-e5a94b88b225 [spark.jobserver.SleepJob]...
[2015-01-27 13:14:57,789] INFO k.jobserver.JobResultActor [] [akka://test/user/$l/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID 4319e2b4-e13c-4c3a-857a-e5a94b88b225
[2015-01-27 13:14:57,789] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:14:57,790] INFO .jobserver.JobManagerActor [] [akka://test/user/$l] - Loading class spark.jobserver.SleepJob for app demo
[2015-01-27 13:14:57,790] INFO .jobserver.JobManagerActor [] [akka://test/user/$l] - Starting Spark job 7d45a9fb-4120-47e3-884c-f9836530aad2 [spark.jobserver.SleepJob]...
[2015-01-27 13:14:57,791] INFO k.jobserver.JobResultActor [] [akka://test/user/$l/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID 7d45a9fb-4120-47e3-884c-f9836530aad2
[2015-01-27 13:14:57,792] INFO k.jobserver.JobStatusActor [] [akka://test/user/$l/status-actor] - Job 7805b79d-4ecf-4f84-93cf-3e0a42b39843 started
[2015-01-27 13:14:57,792] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:14:57,792] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:14:57,793] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:14:57,793] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:14:57,804] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:14:57,804] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:14:57,805] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:14:57,805] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:14:57,806] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:14:57,806] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:14:57,806] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:14:57,807] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:14:57,807] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:14:57,807] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:14:57,807] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:14:57,807] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:14:57,807] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:14:57,807] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:14:57,807] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:14:57,808] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:14:57,808] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:14:57,808] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:14:57,808] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:14:57,808] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:14:57,808] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:14:57,808] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:14:57,808] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:14:57,860] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:55198
[2015-01-27 13:14:57,860] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:14:58,913] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:14:58,917] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:14:58,918] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:14:58,921] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:14:58,922] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:14:58,926] INFO .jobserver.JobManagerActor [] [akka://test/user/$m] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:14:58,929] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:46813/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:14:58,930] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:46813/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:14:58,946] INFO k.jobserver.JobStatusActor [] [akka://test/user/$m/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:14:58,947] INFO k.jobserver.JobResultActor [] [akka://test/user/$m/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:14:58,961] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:46813/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:14:58,966] INFO ache.spark.SecurityManager [] [akka://test/user/$m] - Changing view acls to: tja01
[2015-01-27 13:14:58,967] INFO ache.spark.SecurityManager [] [akka://test/user/$m] - Changing modify acls to: tja01
[2015-01-27 13:14:58,967] INFO ache.spark.SecurityManager [] [akka://test/user/$m] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:14:59,002] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$m] - Slf4jLogger started
[2015-01-27 13:14:59,008] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:14:59,015] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:44346]
[2015-01-27 13:14:59,016] INFO rg.apache.spark.util.Utils [] [akka://test/user/$m] - Successfully started service 'sparkDriver' on port 44346.
[2015-01-27 13:14:59,016] INFO org.apache.spark.SparkEnv [] [akka://test/user/$m] - Registering MapOutputTracker
[2015-01-27 13:14:59,017] INFO org.apache.spark.SparkEnv [] [akka://test/user/$m] - Registering BlockManagerMaster
[2015-01-27 13:14:59,018] INFO k.storage.DiskBlockManager [] [akka://test/user/$m] - Created local directory at /tmp/spark-local-20150127131459-34a1
[2015-01-27 13:14:59,018] INFO .spark.storage.MemoryStore [] [akka://test/user/$m] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:14:59,019] INFO pache.spark.HttpFileServer [] [akka://test/user/$m] - HTTP File server directory is /tmp/spark-596dd07f-798d-408a-8851-11d23a582ba6
[2015-01-27 13:14:59,019] INFO rg.apache.spark.HttpServer [] [akka://test/user/$m] - Starting HTTP Server
[2015-01-27 13:14:59,019] INFO clipse.jetty.server.Server [] [akka://test/user/$m] - jetty-8.1.14.v20131031
[2015-01-27 13:14:59,020] INFO y.server.AbstractConnector [] [akka://test/user/$m] - Started SocketConnector@0.0.0.0:46487
[2015-01-27 13:14:59,021] INFO rg.apache.spark.util.Utils [] [akka://test/user/$m] - Successfully started service 'HTTP file server' on port 46487.
[2015-01-27 13:15:04,034] INFO clipse.jetty.server.Server [] [akka://test/user/$m] - jetty-8.1.14.v20131031
[2015-01-27 13:15:04,046] INFO y.server.AbstractConnector [] [akka://test/user/$m] - Started SelectChannelConnector@0.0.0.0:39195
[2015-01-27 13:15:04,046] INFO rg.apache.spark.util.Utils [] [akka://test/user/$m] - Successfully started service 'SparkUI' on port 39195.
[2015-01-27 13:15:04,047] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$m] - Started SparkUI at http://localhost:39195
[2015-01-27 13:15:04,077] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:44346/user/HeartbeatReceiver
[2015-01-27 13:15:04,080] INFO .NettyBlockTransferService [] [akka://test/user/$m] - Server created on 55389
[2015-01-27 13:15:04,080] INFO storage.BlockManagerMaster [] [akka://test/user/$m] - Trying to register BlockManager
[2015-01-27 13:15:04,081] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:55389 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 55389)
[2015-01-27 13:15:04,081] INFO storage.BlockManagerMaster [] [akka://test/user/$m] - Registered BlockManager
[2015-01-27 13:15:04,085] INFO .jobserver.RddManagerActor [] [akka://test/user/$m/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:15:04,086] INFO .jobserver.JobManagerActor [] [akka://test/user/$m] - Loading class spark.jobserver.SimpleObjectJob for app demo
[2015-01-27 13:15:04,086] INFO .apache.spark.SparkContext [] [akka://test/user/$m] - Added JAR /tmp/InMemoryDAO7626494765007423818.jar at http://10.1.3.213:46487/jars/InMemoryDAO7626494765007423818.jar with timestamp 1422364504086
[2015-01-27 13:15:04,089] INFO util.ContextURLClassLoader [] [akka://test/user/$m] - Added URL file:/tmp/InMemoryDAO7626494765007423818.jar to ContextURLClassLoader
[2015-01-27 13:15:04,089] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$m] - Loading object spark.jobserver.SimpleObjectJob$ using loader spark.jobserver.util.ContextURLClassLoader@6a0245ff
[2015-01-27 13:15:04,091] INFO .jobserver.JobManagerActor [] [akka://test/user/$m] - Starting Spark job a6b4c092-f5cb-44a4-ad4f-f04cf1a9d903 [spark.jobserver.SimpleObjectJob]...
[2015-01-27 13:15:04,091] INFO k.jobserver.JobResultActor [] [akka://test/user/$m/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID a6b4c092-f5cb-44a4-ad4f-f04cf1a9d903
[2015-01-27 13:15:04,091] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:15:04,092] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:15:04,092] INFO k.jobserver.JobStatusActor [] [akka://test/user/$m/status-actor] - Job a6b4c092-f5cb-44a4-ad4f-f04cf1a9d903 started
[2015-01-27 13:15:04,092] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:15:04,092] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:15:04,092] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:15:04,104] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:15:04,105] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:15:04,105] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:15:04,106] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:15:04,106] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:15:04,106] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:15:04,106] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:15:04,107] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:15:04,107] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:15:04,107] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:15:04,107] INFO .apache.spark.SparkContext [] [] - Starting job: collect at SparkTestJobs.scala:81
[2015-01-27 13:15:04,107] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:15:04,108] INFO ark.scheduler.DAGScheduler [] [] - Got job 0 (collect at SparkTestJobs.scala:81) with 4 output partitions (allowLocal=false)
[2015-01-27 13:15:04,108] INFO ark.scheduler.DAGScheduler [] [] - Final stage: Stage 0(collect at SparkTestJobs.scala:81)
[2015-01-27 13:15:04,108] INFO ark.scheduler.DAGScheduler [] [] - Parents of final stage: List()
[2015-01-27 13:15:04,108] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:15:04,108] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:15:04,109] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:15:04,109] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:15:04,109] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:15:04,109] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:15:04,109] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:15:04,109] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:15:04,109] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:15:04,109] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:15:04,109] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:15:04,110] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:15:04,110] INFO ark.scheduler.DAGScheduler [] [] - Missing parents: List()
[2015-01-27 13:15:04,110] INFO ark.scheduler.DAGScheduler [] [] - Submitting Stage 0 (ParallelCollectionRDD[0] at parallelize at SparkTestJobs.scala:80), which has no missing parents
[2015-01-27 13:15:04,112] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(1128) called with curMem=0, maxMem=714866688
[2015-01-27 13:15:04,112] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0 stored as values in memory (estimated size 1128.0 B, free 681.7 MB)
[2015-01-27 13:15:04,114] INFO .spark.storage.MemoryStore [] [] - ensureFreeSpace(872) called with curMem=1128, maxMem=714866688
[2015-01-27 13:15:04,114] INFO .spark.storage.MemoryStore [] [] - Block broadcast_0_piece0 stored as bytes in memory (estimated size 872.0 B, free 681.7 MB)
[2015-01-27 13:15:04,115] INFO k.storage.BlockManagerInfo [] [akka://test/user/$m] - Added broadcast_0_piece0 in memory on localhost:55389 (size: 872.0 B, free: 681.7 MB)
[2015-01-27 13:15:04,115] INFO storage.BlockManagerMaster [] [] - Updated info of block broadcast_0_piece0
[2015-01-27 13:15:04,116] INFO .apache.spark.SparkContext [] [] - Created broadcast 0 from broadcast at DAGScheduler.scala:838
[2015-01-27 13:15:04,119] INFO ark.scheduler.DAGScheduler [] [] - Submitting 4 missing tasks from Stage 0 (ParallelCollectionRDD[0] at parallelize at SparkTestJobs.scala:80)
[2015-01-27 13:15:04,119] INFO cheduler.TaskSchedulerImpl [] [] - Adding task set 0.0 with 4 tasks
[2015-01-27 13:15:04,123] INFO k.scheduler.TaskSetManager [] [akka://test/user/$m] - Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1273 bytes)
[2015-01-27 13:15:04,124] INFO k.scheduler.TaskSetManager [] [akka://test/user/$m] - Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1277 bytes)
[2015-01-27 13:15:04,125] INFO k.scheduler.TaskSetManager [] [akka://test/user/$m] - Starting task 2.0 in stage 0.0 (TID 2, localhost, PROCESS_LOCAL, 1277 bytes)
[2015-01-27 13:15:04,126] INFO k.scheduler.TaskSetManager [] [akka://test/user/$m] - Starting task 3.0 in stage 0.0 (TID 3, localhost, PROCESS_LOCAL, 1277 bytes)
[2015-01-27 13:15:04,126] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Running task 0.0 in stage 0.0 (TID 0)
[2015-01-27 13:15:04,126] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Running task 1.0 in stage 0.0 (TID 1)
[2015-01-27 13:15:04,126] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Fetching http://10.1.3.213:46487/jars/InMemoryDAO7626494765007423818.jar with timestamp 1422364504086
[2015-01-27 13:15:04,127] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Running task 3.0 in stage 0.0 (TID 3)
[2015-01-27 13:15:04,127] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Running task 2.0 in stage 0.0 (TID 2)
[2015-01-27 13:15:04,146] INFO rg.apache.spark.util.Utils [] [akka://test/user/$m] - Fetching http://10.1.3.213:46487/jars/InMemoryDAO7626494765007423818.jar to /tmp/fetchFileTemp1421123736807285146.tmp
[2015-01-27 13:15:04,150] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Adding file:/tmp/spark-d89424f0-9e65-4956-b565-56eff909606e/InMemoryDAO7626494765007423818.jar to class loader
[2015-01-27 13:15:04,161] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Finished task 0.0 in stage 0.0 (TID 0). 594 bytes result sent to driver
[2015-01-27 13:15:04,161] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Finished task 3.0 in stage 0.0 (TID 3). 598 bytes result sent to driver
[2015-01-27 13:15:04,161] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Finished task 1.0 in stage 0.0 (TID 1). 598 bytes result sent to driver
[2015-01-27 13:15:04,161] INFO he.spark.executor.Executor [] [akka://test/user/$m] - Finished task 2.0 in stage 0.0 (TID 2). 598 bytes result sent to driver
[2015-01-27 13:15:04,163] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:39195
[2015-01-27 13:15:04,163] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:15:04,163] INFO k.scheduler.TaskSetManager [] [akka://test/user/$m] - Finished task 0.0 in stage 0.0 (TID 0) in 40 ms on localhost (1/4)
[2015-01-27 13:15:04,164] ERROR cheduler.TaskSchedulerImpl [] [akka://test/user/$m] - Exception in statusUpdate
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.scheduler.TaskResultGetter$$anon$2@f4a7f52 rejected from java.util.concurrent.ThreadPoolExecutor@52dcd527[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 1]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
at org.apache.spark.scheduler.TaskResultGetter.enqueueSuccessfulTask(TaskResultGetter.scala:47)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:301)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:298)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.TaskSchedulerImpl.liftedTree2$1(TaskSchedulerImpl.scala:298)
at org.apache.spark.scheduler.TaskSchedulerImpl.statusUpdate(TaskSchedulerImpl.scala:283)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$receiveWithLogging$1.applyOrElse(LocalBackend.scala:61)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.spark.scheduler.local.LocalActor.aroundReceive(LocalBackend.scala:43)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
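The ERROR above looks alarming but is secondary fallout from the same premature shutdown, not an independent failure: the tasks finish after SparkContext.stop() has already begun, and TaskSchedulerImpl.statusUpdate then tries to hand the finished task's result to the TaskResultGetter thread pool, which is already in the "Shutting down" state. Any JDK thread pool rejects work submitted after shutdown(), as this generic illustration (not job server code) shows:

import java.util.concurrent.{Executors, RejectedExecutionException}

object RejectedAfterShutdownSketch {
  def main(args: Array[String]): Unit = {
    val pool = Executors.newFixedThreadPool(1)
    // One long-ish task keeps the single worker busy.
    pool.execute(new Runnable { def run(): Unit = Thread.sleep(100) })
    pool.shutdown() // pool is now "Shutting down", like the TaskResultGetter pool above
    try pool.execute(new Runnable { def run(): Unit = println("never runs") })
    catch { case e: RejectedExecutionException => println("rejected: " + e) }
  }
}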
[2015-01-27 13:15:04,166] ERROR cheduler.TaskSchedulerImpl [] [akka://test/user/$m] - Exception in statusUpdate
java.util.concurrent.RejectedExecutionException: Task org.apache.spark.scheduler.TaskResultGetter$$anon$2@7341d044 rejected from java.util.concurrent.ThreadPoolExecutor@52dcd527[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 1]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
at org.apache.spark.scheduler.TaskResultGetter.enqueueSuccessfulTask(TaskResultGetter.scala:47)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:301)
at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$liftedTree2$1$1.apply(TaskSchedulerImpl.scala:298)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.TaskSchedulerImpl.liftedTree2$1(TaskSchedulerImpl.scala:298)
at org.apache.spark.scheduler.TaskSchedulerImpl.statusUpdate(TaskSchedulerImpl.scala:283)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$receiveWithLogging$1.applyOrElse(LocalBackend.scala:61)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.spark.scheduler.local.LocalActor.aroundReceive(LocalBackend.scala:43)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:15:04,167] INFO ark.scheduler.DAGScheduler [] [] - Job 0 failed: collect at SparkTestJobs.scala:81, took 0.059633 s
[2015-01-27 13:15:04,167] WARN .jobserver.JobManagerActor [] [] - Exception from job a6b4c092-f5cb-44a4-ad4f-f04cf1a9d903:
org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:702)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:701)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:701)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.postStop(DAGScheduler.scala:1428)
at akka.actor.Actor$class.aroundPostStop(Actor.scala:475)
at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundPostStop(DAGScheduler.scala:1375)
at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
at akka.actor.ActorCell.terminate(ActorCell.scala:369)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:241)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[2015-01-27 13:15:04,169] INFO k.scheduler.TaskSetManager [] [akka://test/user/$m] - Finished task 3.0 in stage 0.0 (TID 3) in 43 ms on localhost (2/4)
[2015-01-27 13:15:05,216] INFO apOutputTrackerMasterActor [] [akka://test/user/$m] - MapOutputTrackerActor stopped!
[2015-01-27 13:15:05,220] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:15:05,221] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:15:05,222] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:15:05,223] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:15:05,226] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:44346/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:15:05,227] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:44346/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:15:05,228] INFO .jobserver.JobManagerActor [] [akka://test/user/$n] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:15:05,230] INFO k.jobserver.JobStatusActor [] [akka://test/user/$n/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:15:05,230] INFO k.jobserver.JobResultActor [] [akka://test/user/$n/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:15:05,239] INFO ache.spark.SecurityManager [] [akka://test/user/$n] - Changing view acls to: tja01
[2015-01-27 13:15:05,239] INFO ache.spark.SecurityManager [] [akka://test/user/$n] - Changing modify acls to: tja01
[2015-01-27 13:15:05,239] INFO ache.spark.SecurityManager [] [akka://test/user/$n] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:15:05,247] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:44346/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:15:05,268] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$n] - Slf4jLogger started
[2015-01-27 13:15:05,272] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:15:05,280] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:60401]
[2015-01-27 13:15:05,280] INFO rg.apache.spark.util.Utils [] [akka://test/user/$n] - Successfully started service 'sparkDriver' on port 60401.
[2015-01-27 13:15:05,281] INFO org.apache.spark.SparkEnv [] [akka://test/user/$n] - Registering MapOutputTracker
[2015-01-27 13:15:05,281] INFO org.apache.spark.SparkEnv [] [akka://test/user/$n] - Registering BlockManagerMaster
[2015-01-27 13:15:05,282] INFO k.storage.DiskBlockManager [] [akka://test/user/$n] - Created local directory at /tmp/spark-local-20150127131505-8cc1
[2015-01-27 13:15:05,282] INFO .spark.storage.MemoryStore [] [akka://test/user/$n] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:15:05,283] INFO pache.spark.HttpFileServer [] [akka://test/user/$n] - HTTP File server directory is /tmp/spark-14aad482-5263-4167-b7a5-b17c2083c756
[2015-01-27 13:15:05,283] INFO rg.apache.spark.HttpServer [] [akka://test/user/$n] - Starting HTTP Server
[2015-01-27 13:15:05,284] INFO clipse.jetty.server.Server [] [akka://test/user/$n] - jetty-8.1.14.v20131031
[2015-01-27 13:15:05,285] INFO y.server.AbstractConnector [] [akka://test/user/$n] - Started SocketConnector@0.0.0.0:58628
[2015-01-27 13:15:05,285] INFO rg.apache.spark.util.Utils [] [akka://test/user/$n] - Successfully started service 'HTTP file server' on port 58628.
[2015-01-27 13:15:10,299] INFO clipse.jetty.server.Server [] [akka://test/user/$n] - jetty-8.1.14.v20131031
[2015-01-27 13:15:10,305] INFO y.server.AbstractConnector [] [akka://test/user/$n] - Started SelectChannelConnector@0.0.0.0:40894
[2015-01-27 13:15:10,305] INFO rg.apache.spark.util.Utils [] [akka://test/user/$n] - Successfully started service 'SparkUI' on port 40894.
[2015-01-27 13:15:10,306] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$n] - Started SparkUI at http://localhost:40894
[2015-01-27 13:15:10,332] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:60401/user/HeartbeatReceiver
[2015-01-27 13:15:10,334] INFO .NettyBlockTransferService [] [akka://test/user/$n] - Server created on 59467
[2015-01-27 13:15:10,334] INFO storage.BlockManagerMaster [] [akka://test/user/$n] - Trying to register BlockManager
[2015-01-27 13:15:10,335] INFO ge.BlockManagerMasterActor [] [akka://test/user/$n] - Registering block manager localhost:59467 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 59467)
[2015-01-27 13:15:10,335] INFO storage.BlockManagerMaster [] [akka://test/user/$n] - Registered BlockManager
[2015-01-27 13:15:10,339] INFO .jobserver.RddManagerActor [] [akka://test/user/$n/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:15:10,340] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:15:10,340] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:15:10,341] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:15:10,341] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:15:10,357] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:15:10,357] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:15:10,357] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:15:10,358] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:15:10,358] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:15:10,358] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:15:10,358] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:15:10,358] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:15:10,358] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:15:10,359] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:15:10,359] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:15:10,359] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:15:10,359] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:15:10,359] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:15:10,359] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:15:10,360] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:15:10,360] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:15:10,360] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:15:10,360] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:15:10,360] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:15:10,361] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:15:10,361] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:15:10,361] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:15:10,412] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:40894
[2015-01-27 13:15:10,413] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:15:11,466] INFO apOutputTrackerMasterActor [] [akka://test/user/$n] - MapOutputTrackerActor stopped!
[2015-01-27 13:15:11,471] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:15:11,472] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:15:11,473] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:15:11,473] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:15:11,475] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:60401/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:15:11,475] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:60401/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:15:11,476] INFO .jobserver.JobManagerActor [] [akka://test/user/$o] - Starting actor spark.jobserver.JobManagerActor
[2015-01-27 13:15:11,477] INFO k.jobserver.JobStatusActor [] [akka://test/user/$o/status-actor] - Starting actor spark.jobserver.JobStatusActor
[2015-01-27 13:15:11,478] INFO k.jobserver.JobResultActor [] [akka://test/user/$o/result-actor] - Starting actor spark.jobserver.JobResultActor
[2015-01-27 13:15:11,491] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:60401/system/remoting-terminator] - Remoting shut down.
[2015-01-27 13:15:11,494] INFO ache.spark.SecurityManager [] [akka://test/user/$o] - Changing view acls to: tja01
[2015-01-27 13:15:11,494] INFO ache.spark.SecurityManager [] [akka://test/user/$o] - Changing modify acls to: tja01
[2015-01-27 13:15:11,494] INFO ache.spark.SecurityManager [] [akka://test/user/$o] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tja01); users with modify permissions: Set(tja01)
[2015-01-27 13:15:11,526] INFO ka.event.slf4j.Slf4jLogger [] [akka://test/user/$o] - Slf4jLogger started
[2015-01-27 13:15:11,530] INFO Remoting [] [Remoting] - Starting remoting
[2015-01-27 13:15:11,537] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:49686]
[2015-01-27 13:15:11,537] INFO rg.apache.spark.util.Utils [] [akka://test/user/$o] - Successfully started service 'sparkDriver' on port 49686.
[2015-01-27 13:15:11,538] INFO org.apache.spark.SparkEnv [] [akka://test/user/$o] - Registering MapOutputTracker
[2015-01-27 13:15:11,538] INFO org.apache.spark.SparkEnv [] [akka://test/user/$o] - Registering BlockManagerMaster
[2015-01-27 13:15:11,539] INFO k.storage.DiskBlockManager [] [akka://test/user/$o] - Created local directory at /tmp/spark-local-20150127131511-863c
[2015-01-27 13:15:11,539] INFO .spark.storage.MemoryStore [] [akka://test/user/$o] - MemoryStore started with capacity 681.8 MB
[2015-01-27 13:15:11,540] INFO pache.spark.HttpFileServer [] [akka://test/user/$o] - HTTP File server directory is /tmp/spark-8b9fef95-4f46-43e8-83bc-5bd23eeb11e1
[2015-01-27 13:15:11,540] INFO rg.apache.spark.HttpServer [] [akka://test/user/$o] - Starting HTTP Server
[2015-01-27 13:15:11,541] INFO clipse.jetty.server.Server [] [akka://test/user/$o] - jetty-8.1.14.v20131031
[2015-01-27 13:15:11,542] INFO y.server.AbstractConnector [] [akka://test/user/$o] - Started SocketConnector@0.0.0.0:33339
[2015-01-27 13:15:11,542] INFO rg.apache.spark.util.Utils [] [akka://test/user/$o] - Successfully started service 'HTTP file server' on port 33339.
[2015-01-27 13:15:16,556] INFO clipse.jetty.server.Server [] [akka://test/user/$o] - jetty-8.1.14.v20131031
[2015-01-27 13:15:16,563] INFO y.server.AbstractConnector [] [akka://test/user/$o] - Started SelectChannelConnector@0.0.0.0:57192
[2015-01-27 13:15:16,563] INFO rg.apache.spark.util.Utils [] [akka://test/user/$o] - Successfully started service 'SparkUI' on port 57192.
[2015-01-27 13:15:16,564] INFO rg.apache.spark.ui.SparkUI [] [akka://test/user/$o] - Started SparkUI at http://localhost:57192
[2015-01-27 13:15:16,641] INFO pache.spark.util.AkkaUtils [] [] - Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:49686/user/HeartbeatReceiver
[2015-01-27 13:15:16,646] INFO .NettyBlockTransferService [] [akka://test/user/$o] - Server created on 34437
[2015-01-27 13:15:16,646] INFO storage.BlockManagerMaster [] [akka://test/user/$o] - Trying to register BlockManager
[2015-01-27 13:15:16,646] INFO ge.BlockManagerMasterActor [] [] - Registering block manager localhost:34437 with 681.8 MB RAM, BlockManagerId(<driver>, localhost, 34437)
[2015-01-27 13:15:16,647] INFO storage.BlockManagerMaster [] [akka://test/user/$o] - Registered BlockManager
[2015-01-27 13:15:16,651] INFO .jobserver.RddManagerActor [] [akka://test/user/$o/rdd-manager-actor] - Starting actor spark.jobserver.RddManagerActor
[2015-01-27 13:15:16,651] INFO .jobserver.JobManagerActor [] [akka://test/user/$o] - Loading class spark.jobserver.CacheRddByNameJob for app demo
[2015-01-27 13:15:16,652] INFO .apache.spark.SparkContext [] [akka://test/user/$o] - Added JAR /tmp/InMemoryDAO5654339605225412024.jar at http://10.1.3.213:33339/jars/InMemoryDAO5654339605225412024.jar with timestamp 1422364516652
[2015-01-27 13:15:16,654] INFO util.ContextURLClassLoader [] [akka://test/user/$o] - Added URL file:/tmp/InMemoryDAO5654339605225412024.jar to ContextURLClassLoader
[2015-01-27 13:15:16,654] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$o] - Loading object spark.jobserver.CacheRddByNameJob$ using loader spark.jobserver.util.ContextURLClassLoader@27dec5ab
[2015-01-27 13:15:16,655] INFO spark.jobserver.JarUtils$ [] [akka://test/user/$o] - Loading class spark.jobserver.CacheRddByNameJob using loader spark.jobserver.util.ContextURLClassLoader@27dec5ab
[2015-01-27 13:15:16,656] INFO .jobserver.JobManagerActor [] [akka://test/user/$o] - Starting Spark job f52f0dd3-874b-48e3-9d59-a5b9d312adb3 [spark.jobserver.CacheRddByNameJob]...
[2015-01-27 13:15:16,657] INFO k.jobserver.JobResultActor [] [akka://test/user/$o/result-actor] - Added receiver Actor[akka://test/system/testActor1#-2027067696] to subscriber list for JobID f52f0dd3-874b-48e3-9d59-a5b9d312adb3
[2015-01-27 13:15:16,657] INFO .jobserver.JobManagerActor [] [] - Starting job future thread
[2015-01-27 13:15:16,657] WARN .jobserver.RddManagerActor [] [] - Shutting down spark.jobserver.RddManagerActor
[2015-01-27 13:15:16,657] WARN k.jobserver.JobResultActor [] [] - Shutting down spark.jobserver.JobResultActor
[2015-01-27 13:15:16,657] WARN k.jobserver.JobStatusActor [] [] - Shutting down spark.jobserver.JobStatusActor
[2015-01-27 13:15:16,658] INFO .jobserver.JobManagerActor [] [] - Shutting down SparkContext test
[2015-01-27 13:15:16,663] WARN .jobserver.JobManagerActor [] [] - Exception from job f52f0dd3-874b-48e3-9d59-a5b9d312adb3:
akka.pattern.AskTimeoutException: Recipient[Actor[akka://test/user/$o/rdd-manager-actor#1992115446]] had already been terminated.
at akka.pattern.AskableActorRef$.ask$extension(AskSupport.scala:132)
at spark.jobserver.JobServerNamedRdds.getOrElseCreate(JobServerNamedRdds.scala:24)
at spark.jobserver.CacheRddByNameJob.runJob(SparkTestJobs.scala:57)
at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:219)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
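That AskTimeoutException doesn't look like a real timeout to me: Akka's ask pattern fails the future immediately with "had already been terminated" when the target actor is already stopped, and the WARN lines just above show the rdd-manager-actor being shut down right as the job future starts. So the call from spark.jobserver.CacheRddByNameJob into JobServerNamedRdds.getOrElseCreate is asking an actor that no longer exists. A minimal sketch that reproduces the same failure outside the jobserver (plain Akka 2.x of that era; EchoActor/TerminatedAskDemo and the timeout values are made up for illustration):

import akka.actor.{Actor, ActorSystem, Props}
import akka.pattern.ask
import akka.util.Timeout
import scala.concurrent.Await
import scala.concurrent.duration._

class EchoActor extends Actor {
  def receive = { case msg => sender() ! msg }
}

object TerminatedAskDemo extends App {
  val system = ActorSystem("demo")
  val echo = system.actorOf(Props[EchoActor], "echo")

  system.stop(echo)   // terminate the actor before asking it
  Thread.sleep(500)   // crude: give the stop time to complete

  implicit val timeout = Timeout(5.seconds)
  // If the ref is already marked terminated, this fails at once with
  // AskTimeoutException("Recipient[...] had already been terminated."),
  // the exact message in the trace above; if the stop hasn't registered
  // yet, the message goes to dead letters and the same exception type
  // fires after the 5-second timeout instead.
  Await.result(echo ? "hello", 10.seconds)
  system.shutdown()
}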
[2015-01-27 13:15:16,669] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
[2015-01-27 13:15:16,670] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/,null}
[2015-01-27 13:15:16,670] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/static,null}
[2015-01-27 13:15:16,670] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2015-01-27 13:15:16,670] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
[2015-01-27 13:15:16,670] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors/json,null}
[2015-01-27 13:15:16,670] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/executors,null}
[2015-01-27 13:15:16,670] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment/json,null}
[2015-01-27 13:15:16,670] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/environment,null}
[2015-01-27 13:15:16,671] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
[2015-01-27 13:15:16,671] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
[2015-01-27 13:15:16,671] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage/json,null}
[2015-01-27 13:15:16,671] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/storage,null}
[2015-01-27 13:15:16,671] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
[2015-01-27 13:15:16,671] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
[2015-01-27 13:15:16,671] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
[2015-01-27 13:15:16,671] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
[2015-01-27 13:15:16,671] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages/json,null}
[2015-01-27 13:15:16,672] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/stages,null}
[2015-01-27 13:15:16,672] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
[2015-01-27 13:15:16,672] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
[2015-01-27 13:15:16,672] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
[2015-01-27 13:15:16,672] INFO ver.handler.ContextHandler [] [] - stopped o.e.j.s.ServletContextHandler{/jobs,null}
[2015-01-27 13:15:16,724] INFO rg.apache.spark.ui.SparkUI [] [] - Stopped Spark web UI at http://localhost:57192
[2015-01-27 13:15:16,724] INFO ark.scheduler.DAGScheduler [] [] - Stopping DAGScheduler
[2015-01-27 13:15:17,777] INFO apOutputTrackerMasterActor [] [] - MapOutputTrackerActor stopped!
[2015-01-27 13:15:17,781] INFO .spark.storage.MemoryStore [] [] - MemoryStore cleared
[2015-01-27 13:15:17,781] INFO spark.storage.BlockManager [] [] - BlockManager stopped
[2015-01-27 13:15:17,782] INFO storage.BlockManagerMaster [] [] - BlockManagerMaster stopped
[2015-01-27 13:15:17,783] INFO .apache.spark.SparkContext [] [] - Successfully stopped SparkContext
[2015-01-27 13:15:17,788] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:49686/system/remoting-terminator] - Shutting down remote daemon.
[2015-01-27 13:15:17,789] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:49686/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2015-01-27 13:15:17,799] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriver@localhost:49686/system/remoting-terminator] - Remoting shut down.
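So every context in this run ends the same way: the JobStatusActor/JobResultActor/RddManagerActor get shut down and "Shutting down SparkContext test" is logged while the job future is still (or only just) running, and the first ask into a torn-down actor surfaces as the exception above. I don't yet know why the tests tear the actors down that early, but in the meantime a defensive wrapper around the ask would at least keep the job from blowing up; a minimal sketch, not jobserver API, safeAsk is my own name:

import akka.actor.ActorRef
import akka.pattern.{ask, AskTimeoutException}
import akka.util.Timeout
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

// Turn an ask against a dead (or unresponsive) manager actor into a
// None instead of a thrown exception, so the caller can decide what
// to do when the actor has already been terminated.
def safeAsk(ref: ActorRef, msg: Any)(implicit t: Timeout): Future[Option[Any]] =
  (ref ? msg).map(Some(_)).recover { case _: AskTimeoutException => None }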