Comments (11)
Mmmmh, could you run the master using play 2.2.6?
This problem should go away (at least I'm not facing it). However, I'm sad to hit another problem while running a Spark job when the notebook is launched by sbt/play run (reported in #57):
java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -2062608324514658839, local class serialVersionUID = 5081326844987135632
Luckily, this problem doesn't happen when using a play dist or, similarly, the docker image.
Maybe one of those options would fit your needs (see https://github.com/andypetrella/spark-notebook/releases); it'll also be easier than compiling the whole shebang.
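To illustrate the InvalidClassException quoted above: Java computes the serialVersionUID of scala.Option from the scala-library jar on the local classpath, so a writer and a reader running different Scala versions disagree on it and deserialization fails. A minimal sketch to inspect the locally computed UID (the SerialUidCheck helper is made up for illustration, it is not notebook code):

```scala
// Hypothetical helper: the serialVersionUID used on the wire is computed
// from the *local* scala-library, so two JVMs with different Scala
// versions disagree on scala.Option and throw InvalidClassException.
import java.io.ObjectStreamClass

object SerialUidCheck {
  // UID the local classpath would use when (de)serializing `cls`.
  def uidOf(cls: Class[_]): Long =
    ObjectStreamClass.lookup(cls).getSerialVersionUID

  def main(args: Array[String]): Unit =
    println(s"local scala.Option UID: ${uidOf(classOf[Option[_]])}")
}
```

Running this on both JVMs (the sbt-launched notebook and the Spark executors) would show the two UIDs from the error message.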
from spark-notebook.
> Mmmmh, could you run the master using play 2.2.6?
I always run Play apps with sbt and never had problems. Let me clarify my problem: the notebook opens without any problems, but I get a lot of Akka dead-letter errors and "Failed to initialize compiler: object scala.runtime in compiler mirror not found" when I open any example and try to run it (or part of it).
Yep, I had this runtime problem (which is mainly a classpath problem: clashes between sbt's classpath and the project's one), so I changed the classpath creation in the repl. Normally, it shouldn't be a problem anymore, at least running play the way I do; perhaps running sbt has yet another weird behavior.
To make things clear: the notebook opens, but starting the repl in the "remote" process fails, hence the notebook cannot work.
In any case, I've already had nightmares and a couple of endless nights spent on this serialization problem that only occurs when sbt is part of the game...
I see the same problem if I set
export SBT_SCALA_VERSION=2.10.4
before launching play 2.2.6.
To fix it, I changed line 98 in Repl.scala to
val gurls = urls(loader).distinct //.filter(!_.contains("sbt/"))
But then I still have issue #57 :-(
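For context, urls(loader) in Repl.scala walks the classloader chain collecting jar URLs, and the commented-out filter above is what dropped sbt's own jars. A rough, self-contained sketch of that idea (the object and method names are mine, not the actual Repl.scala code):

```scala
// Sketch of the classpath-collection idea discussed above; names are
// illustrative, not copied from spark-notebook's Repl.scala.
import java.net.URLClassLoader

object ClasspathSketch {
  // Collect every URL visible from `loader` up through its parents.
  def urls(loader: ClassLoader): List[String] = loader match {
    case null              => Nil
    case u: URLClassLoader => u.getURLs.map(_.toString).toList ++ urls(u.getParent)
    case other             => urls(other.getParent)
  }

  // The variant from the comment above: dedupe, and optionally drop
  // entries coming from sbt's launcher so they can't shadow project jars.
  def replClasspath(loader: ClassLoader, dropSbt: Boolean): List[String] = {
    val gurls = urls(loader).distinct
    if (dropSbt) gurls.filter(!_.contains("sbt/")) else gurls
  }
}
```

With dropSbt = true, any jar whose path contains "sbt/" is excluded, which is exactly what the commented-out filter did.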
Oh, strange. I must say that classpath/classloader management is a painful mess within SBT.
I need to figure it out; I'll try to poke some SBT guys about that.
On Wed, Dec 31, 2014, 22:00, MartinWeindel [email protected] wrote:
Guys, just to add to the note here: I'm on my way to (apparently) sorting this crap out; this is the branch: https://github.com/andypetrella/spark-notebook/tree/remote-process-fix.
I'm not far off, I just need to tune the akka conf to boot several actor systems without conflicts.
There we go: v0.1.3* on current master allows Spark to be used even while developing.
I still have to deploy it now (S3 and Docker).
I git-cloned current master and started it with sbt, and the bug is still there. Here is my log after opening the adam example:
SLF4J: Found binding in [jar:file:/home/antonkulaga/.ivy2/cache/ch.qos.logback/logback-classic/jars/logback-classic-1.0.13.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/antonkulaga/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
[INFO] [01/13/2015 11:42:50.249] [NotebookServer-akka.actor.default-dispatcher-6] [akka://NotebookServer/user/$b/$a] ReplCalculator preStart
INIT SCRIPT
<function0>
Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
[ERROR] [01/13/2015 11:42:52.221] [NotebookServer-akka.actor.default-dispatcher-3] [akka://NotebookServer/user/$b/$a] assertion failed: null
akka.actor.ActorInitializationException: exception during creation
at akka.actor.ActorInitializationException$.apply(Actor.scala:218)
at akka.actor.ActorCell.create(ActorCell.scala:578)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:425)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
at akka.dispatch.Mailbox.run(Mailbox.scala:218)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.AssertionError: assertion failed: null
at scala.Predef$.assert(Predef.scala:179)
at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:203)
at org.apache.spark.repl.HackSparkILoop$$anonfun$process$1.apply$mcZ$sp(HackSparkILoop.scala:89)
at org.apache.spark.repl.HackSparkILoop$$anonfun$process$1.apply(HackSparkILoop.scala:43)
at org.apache.spark.repl.HackSparkILoop$$anonfun$process$1.apply(HackSparkILoop.scala:43)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.HackSparkILoop.process(HackSparkILoop.scala:43)
at notebook.kernel.Repl.<init>(Repl.scala:121)
at notebook.client.ReplCalculator$$anonfun$notebook$client$ReplCalculator$$repl$1.apply(ReplCalculator.scala:58)
at notebook.client.ReplCalculator$$anonfun$notebook$client$ReplCalculator$$repl$1.apply(ReplCalculator.scala:57)
at scala.Option.getOrElse(Option.scala:120)
at notebook.client.ReplCalculator.notebook$client$ReplCalculator$$repl(ReplCalculator.scala:57)
at notebook.client.ReplCalculator.notebook$client$ReplCalculator$$eval$1(ReplCalculator.scala:178)
at notebook.client.ReplCalculator$$anonfun$preStartLogic$2.apply(ReplCalculator.scala:195)
at notebook.client.ReplCalculator$$anonfun$preStartLogic$2.apply(ReplCalculator.scala:189)
at scala.collection.immutable.List.foreach(List.scala:318)
at notebook.client.ReplCalculator.preStartLogic(ReplCalculator.scala:189)
at notebook.client.ReplCalculator.preStart(ReplCalculator.scala:200)
at akka.actor.ActorCell.create(ActorCell.scala:562)
... 9 more
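The error above even prints its own workaround. As a hedged sketch (assuming scala-compiler is on the classpath; this is the generic scala.tools.nsc API, not the notebook's actual Repl code), enabling usejavacp programmatically looks like:

```scala
// Sketch of the workaround the error message itself suggests; the
// UseJavaCp object is illustrative, not part of spark-notebook.
import scala.tools.nsc.Settings

object UseJavaCp {
  // Build compiler settings with usejavacp enabled, so the embedded
  // compiler sees the JVM's java classpath and the compiler mirror
  // can resolve scala.runtime.
  def compilerSettings(): Settings = {
    val settings = new Settings
    settings.usejavacp.value = true
    settings
  }
}
```

These settings would then be passed to the interpreter (SparkIMain / ILoop) before it is initialized; whether that alone fixes the sbt case is exactly what this thread is chasing.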
Mmmmh, let me check what happens with sbt... although I'd be surprised if they behave differently.
Of course, you have no local changes, right?
I'll get back soon.
Thanks for reporting!
Yop @antonkulaga, this is very strange. I cannot reproduce it. I'm trying with the Simple Spark example, which should exhibit the same error (just in case).
I tested on three different machines, including one that freshly cloned the master and ran it using SBT :-/.
I'll poke you on Hangouts, so it'll be easier to discuss (comments are too async ;))
Thanks @antonkulaga for figuring that out. Looks like we can face this problem, but "cleaning" the target with sbt clean will resolve it.
However, it's rare: it only occurs when an old (< 0.1.3) version of the notebook was built and classes remain in target.