Spark Framework Unit Tests

I work with the Spark web framework (http://sparkjava.com — I point this out since there are several things called "Spark") and am building a RESTful API with it.

My employer's standards require us to write a series of unit tests that run automatically once a day to confirm that applications are still working.

Spark is easy to test manually with a tool like Postman, but I haven't found good examples of JUnit tests written against Spark, or even of programmatic HTTP requests made against it.

Has anyone done this before? Is it possible?

4 answers

I had the same requirement as you, and I found a way to make it work. Digging through the Spark source code, I found two useful classes:

  • SparkTestUtil: this class wraps Apache HttpClient and provides methods for issuing different HTTP requests against a local web server (running on localhost), with a configurable port (set in the constructor) and relative path (passed to the request methods).
  • ServletTest: this class runs a Jetty instance on a local port, with an application context and the relative path of the directory where the WEB-INF/web.xml deployment descriptor can be found. That web.xml is used to simulate a web application. It then uses SparkTestUtil to make HTTP requests against the simulated application and validates the results.

This is what I did: I created a JUnit test class that implements the SparkApplication interface. In its init() method I create and initialize the "controller" (my application class) that is responsible for answering HTTP requests. In a method annotated with @BeforeClass, I start a Jetty instance configured with a web.xml that references the JUnit test class as the SparkApplication, and I create the SparkTestUtil instance.

JUnit test class

    package com.test;

    import org.eclipse.jetty.server.Connector;
    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.server.ServerConnector;
    import org.eclipse.jetty.webapp.WebAppContext;
    import org.junit.AfterClass;
    import org.junit.BeforeClass;
    import spark.servlet.SparkApplication;
    // SparkTestUtil comes from Spark's test sources; copy it into your test
    // tree and import it from wherever you place it.

    public class ControllerTest implements SparkApplication {

        private static final int PORT = 9191; // any free local port (arbitrary choice)

        private static SparkTestUtil sparkTestUtil;
        private static Server webServer;

        @Override
        public void init() {
            // Creates the application class that registers the Spark routes
            new Controller(...);
        }

        @BeforeClass
        public static void beforeClass() throws Exception {
            sparkTestUtil = new SparkTestUtil(PORT);

            // Embedded Jetty serving the test webapp described by web.xml below
            webServer = new Server();
            ServerConnector connector = new ServerConnector(webServer);
            connector.setPort(PORT);
            webServer.setConnectors(new Connector[] {connector});
            WebAppContext bb = new WebAppContext();
            bb.setServer(webServer);
            bb.setContextPath("/");
            bb.setWar("src/test/webapp/");
            webServer.setHandler(bb);
            webServer.start();
            (...)
        }

        @AfterClass
        public static void afterClass() throws Exception {
            webServer.stop();
            (...)
        }
    }

src/test/webapp/WEB-INF/web.xml file

    <!DOCTYPE web-app PUBLIC
        "-//Sun Microsystems, Inc.//DTD Web Application 2.3//EN"
        "http://java.sun.com/dtd/web-app_2_3.dtd">
    <web-app>
        <display-name>Archetype Created Web Application</display-name>
        <filter>
            <filter-name>SparkFilter</filter-name>
            <filter-class>spark.servlet.SparkFilter</filter-class>
            <init-param>
                <param-name>applicationClass</param-name>
                <param-value>com.test.ControllerTest</param-value>
            </init-param>
        </filter>
        <filter-mapping>
            <filter-name>SparkFilter</filter-name>
            <url-pattern>/*</url-pattern>
        </filter-mapping>
    </web-app>
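With this setup in place, a test method can use sparkTestUtil to hit the simulated application. Here is a minimal sketch, assuming the doMethod(...) helper and UrlResponse holder that SparkTestUtil provides in Spark's test sources; the /ping endpoint and "pong" body are made-up examples:

    // Inside ControllerTest. Assumes SparkTestUtil.doMethod(method, path, body)
    // returns a UrlResponse with public status/body fields, as in Spark's
    // test sources. The /ping route and "pong" body are hypothetical.
    @Test
    public void pingReturnsPong() throws Exception {
        SparkTestUtil.UrlResponse response = sparkTestUtil.doMethod("GET", "/ping", null);
        Assert.assertEquals(200, response.status);
        Assert.assertEquals("pong", response.body);
    }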

It can be improved, but I think it is a good starting point. Maybe this setup could be extracted into a reusable test component?

Hope this is helpful!


We have developed a small library that facilitates unit testing of Spark controllers/endpoints.

The source is available on GitHub.

Version 1.1.3 is also published in the Maven Central Repository:

    <dependency>
        <groupId>com.despegar</groupId>
        <artifactId>spark-test</artifactId>
        <version>1.1.3</version>
        <scope>test</scope>
    </dependency>
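Usage might look roughly like the sketch below. This is based on my reading of the project's README and is not verified against the current version: the SparkServer rule, testServer.get(...), testServer.execute(...), and the /ping endpoint are all assumptions.

    // Rough sketch of spark-test usage per the project's README; class and
    // method names are assumptions, and /ping -> "pong" is a made-up route.
    public class PingControllerTest {

        public static class TestApplication implements SparkApplication {
            @Override
            public void init() {
                Spark.get("/ping", (request, response) -> "pong");
            }
        }

        @ClassRule
        public static SparkServer<TestApplication> testServer =
                new SparkServer<>(TestApplication.class, 4567);

        @Test
        public void pingReturnsPong() throws Exception {
            GetMethod get = testServer.get("/ping", false);
            HttpResponse response = testServer.execute(get);
            assertEquals(200, response.code());
            assertEquals("pong", new String(response.body()));
        }
    }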

Here is my solution. You just need to add the Apache HttpClient and JUnit dependencies.

    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpclient</artifactId>
        <version>4.5.2</version>
    </dependency>

    import spark.Spark;

    public class SparkServer {
        public static void main(String[] args) {
            Spark.port(8888);
            Spark.threadPool(1000, 1000, 60000);
            Spark.get("/ping", (req, res) -> "pong");
        }
    }

    import static org.junit.Assert.assertEquals;

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;

    import spark.Spark;

    public class SparkTest {

        @Before
        public void setup() {
            SparkServer.main(null);
            Spark.awaitInitialization(); // block until the embedded server is ready
        }

        @After
        public void tearDown() throws Exception {
            Thread.sleep(1000); // give in-flight requests a moment to finish
            Spark.stop();
        }

        @Test
        public void test() throws IOException {
            CloseableHttpClient httpClient = HttpClients.custom().build();
            HttpGet httpGet = new HttpGet("http://localhost:8888/ping");
            CloseableHttpResponse response = httpClient.execute(httpGet);
            int statusCode = response.getStatusLine().getStatusCode();

            // Read the response body into a string
            BufferedReader rd = new BufferedReader(
                    new InputStreamReader(response.getEntity().getContent()));
            StringBuilder result = new StringBuilder();
            String line;
            while ((line = rd.readLine()) != null) {
                result.append(line);
            }

            assertEquals(200, statusCode);
            assertEquals("pong", result.toString());
        }
    }

Another approach is to create a class that implements Route for each path or route. For example, if you have the following route:

 get("maintenance/task", (req, response) -> {....}); 

Then replace the (req, response) -> {....} lambda with a class that implements Route.

For example:

    public class YourRoute implements Route {
        public Object handle(Request request, Response response) throws Exception {
            ....
        }
    }

The route registration then becomes:

 get("maintenance/task", new YourRoute()); 

You can then unit test the YourRoute class directly with JUnit.
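For instance, a direct unit test of handle() could stub Spark's Request and Response with Mockito (a library choice I'm adding here; the "taskId" parameter and the expected return value are hypothetical):

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import org.junit.Test;
    import spark.Request;
    import spark.Response;

    public class YourRouteTest {

        @Test
        public void handleReturnsExpectedResult() throws Exception {
            // Stub Spark's Request/Response instead of starting a server.
            Request request = mock(Request.class);
            Response response = mock(Response.class);
            when(request.queryParams("taskId")).thenReturn("42"); // hypothetical parameter

            Object result = new YourRoute().handle(request, response);

            assertEquals("expected result for task 42", result); // hypothetical expectation
        }
    }

This keeps the route logic testable without binding a port, at the cost of not exercising Spark's routing itself.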


