
Comments (11)

andypetrella commented on May 12, 2024

Mmmmh, could you run the master using Play 2.2.6?

This problem should go away (at least I'm not facing it). However, I'm sad to be facing another problem while running a Spark job when the notebook is launched via sbt/play run (reported in #57):

java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -2062608324514658839, local class serialVersionUID = 5081326844987135632

Luckily, this problem does not happen when using a Play dist or, similarly, the Docker image.

Maybe one of these options would fit your needs (see https://github.com/andypetrella/spark-notebook/releases); it'll also be easier than compiling the whole shebang.
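For background on the InvalidClassException above: it happens when the serializing and deserializing JVMs load different builds of the same class (here, scala.Option from two different scala-library versions), so each side computes a different serialVersionUID. A minimal sketch to inspect the UID your local classpath yields, using the standard java.io.ObjectStreamClass API:

```scala
import java.io.ObjectStreamClass

object CheckSerialVersion {
  def main(args: Array[String]): Unit = {
    // The serialVersionUID the *local* classpath computes for scala.Option.
    // If the remote process (e.g. the Spark executor) prints a different
    // value, the two JVMs are loading incompatible scala-library builds and
    // deserialization fails with java.io.InvalidClassException, as above.
    val desc = ObjectStreamClass.lookup(classOf[Option[_]])
    println(s"scala.Option serialVersionUID = ${desc.getSerialVersionUID}")
  }
}
```

Running this on both sides (driver and remote process) is a quick way to confirm a Scala version mismatch before digging into the classpath.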

from spark-notebook.

antonkulaga commented on May 12, 2024

Mmmmh, could you run the master using play2.2.6?

I always run Play apps with sbt and have never had problems. Let me clarify my problem: the notebook opens without any issue, but I get a lot of Akka dead-letter errors and "Failed to initialize compiler: object scala.runtime in compiler mirror not found" when I open any example and try to run it (or part of it).
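The "object scala.runtime in compiler mirror not found" error generally means the embedded compiler cannot see scala-library on its own classpath. When embedding the Scala REPL, the common workaround is the one the error message itself suggests: enable usejavacp so the compiler reuses the host JVM's classpath. A minimal sketch, assuming scala-compiler 2.10.x is on the classpath:

```scala
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

object EmbeddedRepl {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    // Reuse the host JVM's classpath so scala-library (and thus
    // scala.runtime) is visible in the compiler mirror.
    settings.usejavacp.value = true

    val interpreter = new IMain(settings)
    interpreter.interpret("""println("hello from the embedded REPL")""")
  }
}
```

This does not help when the surrounding classpath itself is polluted (the sbt-launcher clash discussed below), but it rules out the simpler cause first.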


andypetrella commented on May 12, 2024

Yep, I had this runtime problem (which is mainly a classpath problem → clashes between sbt's classpath and the project's), so I changed the classpath creation in the REPL. Normally it shouldn't be a problem anymore, at least running Play the way I do; perhaps running sbt has yet another weird behavior.

To make things clear: the notebook opens, but starting the REPL in the "remote" process fails, hence the notebook cannot work.

In any case, I've already had nightmares and a couple of endless nights spent on this serialization problem, which only occurs when sbt is part of the game...


MartinWeindel commented on May 12, 2024

I see the same problem if I set

export SBT_SCALA_VERSION=2.10.4

before launching Play 2.2.6.

To fix it, I changed line 98 in Repl.scala to

val gurls = urls(loader).distinct //.filter(!_.contains("sbt/"))

But then I still have issue #57 :-(
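For context on that one-liner: the idea is to collect the URLs of the current classloader chain and drop sbt's own jars, which otherwise shadow the project's Scala library. The `urls` helper below is an illustrative reconstruction (not the actual spark-notebook code), assuming Java 7/8 where the application classloader is a URLClassLoader:

```scala
import java.net.{URL, URLClassLoader}

object ClasspathUrls {
  // Walk the classloader chain and collect every URL it exposes.
  def urls(cl: ClassLoader): List[URL] = cl match {
    case null              => Nil
    case u: URLClassLoader => u.getURLs.toList ++ urls(u.getParent)
    case other             => urls(other.getParent)
  }

  def main(args: Array[String]): Unit = {
    val loader = getClass.getClassLoader
    // Deduplicate, and drop sbt launcher jars to avoid classpath clashes
    // between sbt's Scala and the project's Scala.
    val gurls = urls(loader).distinct.filterNot(_.toString.contains("sbt/"))
    gurls.foreach(println)
  }
}
```

Whether the `filter` should stay (as in the original line 98) or be commented out, as in the change above, is exactly the trade-off being debated in this thread.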


andypetrella commented on May 12, 2024

Oh, strange. I must say that classpath/classloader management is a painful mess within SBT 😐

I need to figure it out; I'll try to poke some SBT folks about it.

On Wed, Dec 31, 2014, 22:00, MartinWeindel wrote: (quoting his comment above)


andypetrella commented on May 12, 2024

Guys, just to add a note here: I'm on the way to (apparently) solving this; here is the branch: https://github.com/andypetrella/spark-notebook/tree/remote-process-fix.

I'm not far off; I just need to tune the Akka conf to boot several actor systems without conflicts.
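Booting several actor systems side by side mainly means avoiding fixed remoting ports. A common trick (a sketch, assuming Akka 2.2/2.3 remoting and its `akka.remote.netty.tcp.port` key; the system names here are illustrative) is to set the port to 0 so each system binds to any free port:

```scala
import akka.actor.ActorSystem
import com.typesafe.config.ConfigFactory

object MultiSystemBoot {
  def main(args: Array[String]): Unit = {
    // port = 0 lets each system bind to any free port, so several systems
    // can boot in the same JVM / on the same host without clashing.
    val conf = ConfigFactory.parseString(
      "akka.remote.netty.tcp.port = 0"
    ).withFallback(ConfigFactory.load())

    val serverSystem = ActorSystem("NotebookServer", conf)
    val replSystem   = ActorSystem("RemoteRepl", conf)
  }
}
```

The actual ports can then be exchanged out of band (e.g. the remote process reports its bound address back to the server).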


andypetrella commented on May 12, 2024

There we go: v0.1.3* on the current master allows Spark to be used even while developing.

I still have to deploy it now (S3 and Docker).


antonkulaga commented on May 12, 2024

I cloned the current master, started it with sbt, and the bug is still there. Here is my log after opening the ADAM example:

SLF4J: Found binding in [jar:file:/home/antonkulaga/.ivy2/cache/ch.qos.logback/logback-classic/jars/logback-classic-1.0.13.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/antonkulaga/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
[INFO] [01/13/2015 11:42:50.249] [NotebookServer-akka.actor.default-dispatcher-6] [akka://NotebookServer/user/$b/$a] ReplCalculator preStart
 INIT SCRIPT 
<function0>

Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.

Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
[ERROR] [01/13/2015 11:42:52.221] [NotebookServer-akka.actor.default-dispatcher-3] [akka://NotebookServer/user/$b/$a] assertion failed: null
akka.actor.ActorInitializationException: exception during creation
    at akka.actor.ActorInitializationException$.apply(Actor.scala:218)
    at akka.actor.ActorCell.create(ActorCell.scala:578)
    at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:425)
    at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
    at akka.dispatch.Mailbox.run(Mailbox.scala:218)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.AssertionError: assertion failed: null
    at scala.Predef$.assert(Predef.scala:179)
    at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:203)
    at org.apache.spark.repl.HackSparkILoop$$anonfun$process$1.apply$mcZ$sp(HackSparkILoop.scala:89)
    at org.apache.spark.repl.HackSparkILoop$$anonfun$process$1.apply(HackSparkILoop.scala:43)
    at org.apache.spark.repl.HackSparkILoop$$anonfun$process$1.apply(HackSparkILoop.scala:43)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.HackSparkILoop.process(HackSparkILoop.scala:43)
    at notebook.kernel.Repl.<init>(Repl.scala:121)
    at notebook.client.ReplCalculator$$anonfun$notebook$client$ReplCalculator$$repl$1.apply(ReplCalculator.scala:58)
    at notebook.client.ReplCalculator$$anonfun$notebook$client$ReplCalculator$$repl$1.apply(ReplCalculator.scala:57)
    at scala.Option.getOrElse(Option.scala:120)
    at notebook.client.ReplCalculator.notebook$client$ReplCalculator$$repl(ReplCalculator.scala:57)
    at notebook.client.ReplCalculator.notebook$client$ReplCalculator$$eval$1(ReplCalculator.scala:178)
    at notebook.client.ReplCalculator$$anonfun$preStartLogic$2.apply(ReplCalculator.scala:195)
    at notebook.client.ReplCalculator$$anonfun$preStartLogic$2.apply(ReplCalculator.scala:189)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at notebook.client.ReplCalculator.preStartLogic(ReplCalculator.scala:189)
    at notebook.client.ReplCalculator.preStart(ReplCalculator.scala:200)
    at akka.actor.ActorCell.create(ActorCell.scala:562)
    ... 9 more


andypetrella commented on May 12, 2024

Mmmmh, let me check what happens with sbt... although I'd be surprised if it behaved differently.
Of course, you have no local changes, right?
I'll get back soon.
Thanks for reporting!


andypetrella commented on May 12, 2024

Yop @antonkulaga, this is very strange. I cannot reproduce it. I'm trying with the Simple Spark example, which should hit the same error (just in case).
I tested on three different machines, including one that freshly cloned the master and ran it using SBT :-/.

I'll ping you on Hangouts, so it'll be easier to discuss (comments are too async ;))


andypetrella commented on May 12, 2024

Thanks @antonkulaga for figuring that out. Looks like we can hit this problem, but cleaning the target with sbt clean will resolve it.

However, it's rare: it occurs when an old (< 0.1.3) version of the notebook was built and stale classes remain in target.

