Scalatest Maven Plugin "no tests performed"

I am trying to use ScalaTest and spark-testing-base with Maven to test a Spark integration. The Spark job reads in a CSV file, validates the results, and inserts the data into a database. I validate by feeding it files of a known format and checking whether and how they fail. This particular test simply verifies that validation passes. Unfortunately, ScalaTest cannot find my tests.

The relevant plugin configuration from my pom:

 <plugin>
     <groupId>org.apache.maven.plugins</groupId>
     <artifactId>maven-surefire-plugin</artifactId>
     <configuration>
         <skipTests>true</skipTests>
     </configuration>
 </plugin>
 <!-- enable scalatest -->
 <plugin>
     <groupId>org.scalatest</groupId>
     <artifactId>scalatest-maven-plugin</artifactId>
     <version>1.0</version>
     <configuration>
         <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
         <wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>
     </configuration>
     <executions>
         <execution>
             <id>test</id>
             <goals>
                 <goal>test</goal>
             </goals>
         </execution>
     </executions>
 </plugin>

And here is the test class:

 import com.holdenkarau.spark.testing.SharedSparkContext
 import org.apache.spark.sql.{DataFrame, DataFrameReader, SQLContext}
 import org.scalatest.{BeforeAndAfter, FlatSpec, Matchers}

 import scala.util.{Failure, Success, Try}

 class ProficiencySchemaITest extends FlatSpec with Matchers with SharedSparkContext with BeforeAndAfter {
     private var schemaStrategy: SchemaStrategy = _
     private var dataReader: DataFrameReader = _

     before {
         val sqlContext = new SQLContext(sc)
         import sqlContext.implicits._

         val dataInReader = sqlContext.read.format("com.databricks.spark.csv")
             .option("header", "true")
             .option("nullValue", "")
         schemaStrategy = SchemaStrategyChooser("dim_state_test_proficiency")
         dataReader = schemaStrategy.applySchema(dataInReader)
     }

     "Proficiency Validation" should "pass with the CSV file proficiency-valid.csv" in {
         val dataIn = dataReader.load("src/test/resources/proficiency-valid.csv")

         val valid: Try[DataFrame] = Try(schemaStrategy.validateCsv(dataIn))
         valid match {
             case Success(_) => ()
             case Failure(e) => fail("Validation failed on what should have been a clean file", e)
         }
     }
 }
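(As an aside, the Success/Failure match in the test wraps a throwing call in Try. A minimal, self-contained sketch of that pattern, where validate is a hypothetical stand-in for schemaStrategy.validateCsv:)

```scala
import scala.util.{Failure, Success, Try}

object TryPatternSketch {
  // Hypothetical stand-in for schemaStrategy.validateCsv: any method
  // that throws on bad input can be wrapped in Try the same way.
  def validate(row: String): String =
    if (row.nonEmpty) row else throw new IllegalArgumentException("empty row")

  def main(args: Array[String]): Unit = {
    val result: Try[String] = Try(validate("a,b,c"))
    result match {
      case Success(v) => println(s"valid: $v")      // taken for a clean row
      case Failure(e) => println(s"invalid: ${e.getMessage}")
    }
  }
}
```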

When I run mvn test, it cannot find any tests and prints this message:

 [INFO] --- scalatest-maven-plugin:1.0:test (test) @ load-csv-into-db ---
 Discovery starting.
 Discovery completed in 54 milliseconds.
 Run starting. Expected test count is: 0
 DiscoverySuite:
 Run completed in 133 milliseconds.
 Total number of tests run: 0
 Suites: completed 1, aborted 0
 Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
 No tests were executed.

UPDATE
Using:

 <suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest</suites> 

Instead of:

 <wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites> 

I can run this test. Obviously, this is not ideal. It may be that wildcardSuites is broken; I am going to open a ticket on GitHub and see what happens.
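For reference, the full plugin block with the workaround applied might look like this (a sketch based on the configuration above; the explicit suites entry replaces the package wildcard):

```xml
<plugin>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest-maven-plugin</artifactId>
    <version>1.0</version>
    <configuration>
        <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
        <!-- Workaround: name each suite class explicitly
             rather than using a package wildcard -->
        <suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest</suites>
    </configuration>
    <executions>
        <execution>
            <id>test</id>
            <goals>
                <goal>test</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```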

2 answers

Perhaps this is because there are spaces in the path to the project. Remove the spaces from the project path and the tests can be detected successfully. I hope this helps.


Try excluding junit as a transitive dependency. That worked for me. An example is below, but note that the Scala and Spark versions are specific to my environment.

  <dependency>
      <groupId>com.holdenkarau</groupId>
      <artifactId>spark-testing-base_2.10</artifactId>
      <version>1.5.0_0.6.0</version>
      <scope>test</scope>
      <exclusions>
          <!-- junit is not compatible with scalatest -->
          <exclusion>
              <groupId>junit</groupId>
              <artifactId>junit</artifactId>
          </exclusion>
      </exclusions>
  </dependency>
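To confirm the exclusion took effect, you can inspect the resolved dependency tree (the -Dincludes filter is standard maven-dependency-plugin usage; junit should no longer appear under spark-testing-base):

```shell
# Show where junit enters the dependency graph, if anywhere
mvn dependency:tree -Dincludes=junit:junit
```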
