
call sc.stop() in accordance with spark 0.8 #5

Open

AustinBGibbons wants to merge 2 commits into amplab:master from AustinBGibbons:sc-stop

Conversation

@AustinBGibbons

Hullo,

This pull request adds a corresponding sc.stop() for each val sc = new SparkContext(...) to avoid "address already in use" errors that come up when running ./sbt/sbt test against the current version of Spark.
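
For illustration, a minimal sketch of the pattern (the suite and test names here are hypothetical, not code from this repo): each test that creates its own SparkContext stops it before returning, so later suites can rebind the driver's ports.

import org.scalatest.FunSuite
import org.apache.spark.SparkContext

// Hypothetical test suite showing the sc.stop() pattern this PR applies.
class ExampleSuite extends FunSuite {
  test("count on a local context") {
    val sc = new SparkContext("local", "ExampleSuite")
    try {
      assert(sc.parallelize(1 to 10).count() == 10)
    } finally {
      sc.stop()  // release the driver's port so later suites don't hit "address already in use"
    }
  }
}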


Member

For these test suites, shouldn't the afterEach() method in the LocalSparkContext trait take care of stopping these contexts? SparkContext.stop() is idempotent and an extra call wouldn't cause problems, though.
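
For context, a LocalSparkContext-style trait looks roughly like this (a sketch, not the project's exact source): afterEach() stops whatever context the test left behind, so individual suites don't need their own sc.stop() calls.

import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterEach, Suite}

// Rough sketch of a LocalSparkContext-style mixin for ScalaTest suites.
trait LocalSparkContext extends BeforeAndAfterEach { self: Suite =>
  @transient var sc: SparkContext = _

  override def afterEach(): Unit = {
    if (sc != null) {
      sc.stop()
      sc = null
    }
    // Spark's own test helper also clears the driver port between tests.
    System.clearProperty("spark.driver.port")
    super.afterEach()
  }
}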

Author

You're right, after playing with it some more it appears to fail non-deterministically. Adding sc.stop() here only masked the problem by reducing how often it fails. I'm not sure what the root issue is (maybe ./sbt/sbt test runs suites in parallel?), but I'll shelve this for now and check back later.

@etrain
Contributor

etrain commented Nov 7, 2013

Thanks for looking at this, Austin. In general, I'll plan to add an MLContext.stop and an MLContext.broadcast.

I'm also wondering if an sbt upgrade will fix the non-determinism issue (though I haven't observed it myself). I'll plan to push those in a bit.

@AustinBGibbons
Author

Yeah, this issue only arose when swapping in Spark master, so it might never bear fruit.

@JoshRosen
Member

I think that SBT runs tests in parallel by default; Spark's SparkBuild.scala contains a line to disable parallel tests:

// Only allow one test at a time, even across projects, since they run in the same JVM
concurrentRestrictions in Global += Tags.limit(Tags.Test, 1),

We might want to do this here, too.
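
If we go that route, the setting would slot into this project's build definition along these lines (the object and project names below are illustrative, not this repo's actual build file):

import sbt._
import Keys._

// Illustrative Build.scala fragment; the concurrentRestrictions line is the only
// relevant part, copied from Spark's SparkBuild.scala.
object ExampleBuild extends Build {
  lazy val root = Project("root", file("."), settings = Defaults.defaultSettings ++ Seq(
    // Only allow one test at a time, even across projects, since they run in the same JVM
    concurrentRestrictions in Global += Tags.limit(Tags.Test, 1)
  ))
}

Setting parallelExecution in Test := false would also serialize tests within a single project, but the Global restriction is what prevents tests in different sub-projects from running concurrently in the same JVM.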

