call sc.stop() in accordance with spark 0.8 #5
AustinBGibbons wants to merge 2 commits into amplab:master from
Conversation
For these test suites, shouldn't the afterEach() method in the LocalSparkContext trait take care of stopping these contexts? SparkContext.stop() is idempotent and an extra call wouldn't cause problems, though.
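For context, a `LocalSparkContext`-style trait along these lines typically stops the context in `afterEach()`. This is a minimal sketch assuming ScalaTest's `BeforeAndAfterEach`; the trait actually used in this repo may differ in detail:

```scala
import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterEach, Suite}

// Sketch of a LocalSparkContext-style trait (illustrative, not the
// repo's exact code): each test suite mixing this in gets its context
// stopped after every test.
trait LocalSparkContext extends BeforeAndAfterEach { self: Suite =>
  @transient var sc: SparkContext = _

  override def afterEach() {
    if (sc != null) {
      // SparkContext.stop() is idempotent, so an extra call elsewhere
      // in a test is harmless.
      sc.stop()
      sc = null
    }
    // Clear the driver port so the next suite can bind a fresh one.
    System.clearProperty("spark.driver.port")
    super.afterEach()
  }
}
```

Because `stop()` is idempotent, tests that also call `sc.stop()` explicitly do not conflict with this cleanup.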
You're right: after playing with it some more, it appears to fail non-deterministically. Adding sc.stop() here only appeared to mitigate the problem by reducing the number of times it fails. I'm not sure what the root issue is (maybe sbt/sbt test runs suites in parallel?), but I'll set this aside for now and check later.
Thanks for looking at this, Austin. In general, I'll plan to add an MLContext.stop and an MLContext.broadcast. I'm also wondering whether an sbt upgrade will fix the non-determinism issue (though I haven't observed it myself). I'll plan to push those in a bit.
Yeah, this issue only arose when swapping in Spark master, so it might not ever bear fruit.
I think that SBT runs tests in parallel by default; Spark's SparkBuild.scala contains a line to disable parallel tests:

```scala
// Only allow one test at a time, even across projects, since they run in the same JVM
concurrentRestrictions in Global += Tags.limit(Tags.Test, 1),
```

We might want to do this here, too.
Hullo,

This pull request adds a corresponding `sc.stop()` for each `val sc = new SparkContext(...)` to avoid "address already in use" errors that come up running `./sbt/sbt test` against the current version of Spark.
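The shape of the fix can be sketched as follows. The suite and test names here are hypothetical, not taken from the diff; the `try/finally` wrapping is one robust variant that guarantees the context is stopped even when an assertion fails:

```scala
import org.apache.spark.SparkContext
import org.scalatest.FunSuite

// Illustrative sketch: a test that creates its own SparkContext stops it
// when done, releasing the driver port so the next suite can bind it.
class ExampleSuite extends FunSuite {
  test("example that creates its own context") {
    val sc = new SparkContext("local", "test")
    try {
      val result = sc.parallelize(1 to 10).reduce(_ + _)
      assert(result === 55)
    } finally {
      sc.stop() // avoids "address already in use" in subsequent suites
    }
  }
}
```

Without the `stop()`, each leaked context keeps its driver port bound for the rest of the JVM's life, which is why later suites in the same `./sbt/sbt test` run fail to start.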