- A helper method creates the test data
- Utilities that calculate the contents of a DataFrame are separated out
- Each utility has its own tests
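The test layout above can be sketched as follows. This is a minimal illustration of the pattern only, using plain Scala collections rather than Spark DataFrames so it stays self-contained; all names (`WordStats`, `makeTestData`, and so on) are hypothetical, not part of this project.

```scala
// Hypothetical utility, separated from the test so it can be reused:
// computes the average word length in a collection of words.
object WordStats {
  def averageLength(words: Seq[String]): Double =
    if (words.isEmpty) 0.0
    else words.map(_.length).sum.toDouble / words.size
}

// Hypothetical test object with a helper method that creates test data,
// mirroring the "helper method creates test data" note above.
object WordStatsCheck {
  private def makeTestData(): Seq[String] = Seq("spark", "scala", "sbt")

  def main(args: Array[String]): Unit = {
    val words = makeTestData()
    // (5 + 5 + 3) / 3 = 13 / 3
    assert(math.abs(WordStats.averageLength(words) - 13.0 / 3.0) < 1e-9)
    assert(WordStats.averageLength(Seq.empty) == 0.0)
  }
}
```

In a real ScalaTest suite each utility would get its own spec class, but the separation of data creation, computation, and assertion is the same.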
- SBT or Activator is installed. This example uses the activator command.
- Spark 1.6.2
- Scala 2.11.6
- ScalaTest 2.2.4
- ScalaCheck 1.12.2
$ activator run
$ activator test
This example includes the configuration for sbt-assembly, so you can run the assembly task:
$ activator assembly
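A typical sbt-assembly setup for a Spark project of this era looks like the sketch below. The plugin version and merge rules are illustrative, not necessarily what this project uses; check the project's own build files.

```scala
// project/plugins.sbt (version is illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt: mark Spark as "provided" so it is not bundled into the fat jar,
// since the cluster supplies it at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2" % "provided"

// Resolve duplicate files when merging dependency jars (rules illustrative).
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```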
The following static analyzers are configured in build.sbt:

- Usage: runs automatically during compilation and during evaluation in the console
- sbt-scapegoat (https://github.com/sksamuel/sbt-scapegoat)
  - Usage: runs automatically during compilation
  - Open target/scala-2.11/scapegoat.xml or target/scala-2.11/scapegoat.html
- scalastyle
  - Usage: sbt scalastyle
  - Open target/scalastyle-result.xml
The check levels are all set to "warn"; change them to "error" if you want code changes to be rejected when integrating with CI tools.
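Changing a check's level is done per rule in scalastyle-config.xml. A sketch, assuming the default config file; the rule shown is just one example:

```xml
<!-- scalastyle-config.xml: raise this rule from "warn" to "error"
     so violations fail the build (rule chosen for illustration) -->
<check level="error" class="org.scalastyle.scalariform.NullChecker" enabled="true"/>
```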
This issue was originally mentioned in Sample Project for Spark 1.3.0 with Scala 2.11.6, a sample Spark application template.
Currently, test and run fork a JVM. This is necessary because SBT's classloader does not work well with Spark and the Spark shell.
However, sbt console does not recognize the fork key at the moment, so it may throw ScalaReflectionException or similar errors.
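The forking described above is enabled in build.sbt roughly as follows. This is a sketch of the standard sbt settings, not necessarily this project's exact build definition:

```scala
// build.sbt: fork a separate JVM for `run` and `test` so Spark does not
// clash with SBT's classloader.
fork in run := true
fork in Test := true
// Note: `sbt console` ignores the fork setting, which is why
// ScalaReflectionException and similar errors can still occur there.
```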
- dobachi ([email protected])
- Jianshi Huang ([email protected]; [email protected])