Does Java Spark have any support for dependency injection or IoC containers?

Coming from .NET, I'm well aware that micro web frameworks such as NancyFX and Web API are designed with IoC containers in mind.

In similar Ruby frameworks such as Sinatra (NancyFX is based on Sinatra), it looks like you have the option to inject dependencies.

From what I can see, since Java Spark applications run from the main method, it looks like you cannot pass in your dependencies or an IoC container.

    import static spark.Spark.get;

    public class HelloWorld {
        public static void main(String[] args) {
            get("/hello", (req, res) -> "Hello World");
        }
    }

I find it hard to see how such a framework can be useful without supporting this.

If this framework doesn't support it, is there another lightweight framework (Spring is not lightweight, from what I remember, but maybe everything has changed) that does?

5 answers

Spring can be easily integrated with Spark. For example:

    public interface Spark {
        /**
         * Adds filters, routes, exceptions, websockets and others.
         */
        void register();
    }

    @Configuration
    public class SparkConfiguration {

        @Autowired(required = false)
        private List<Spark> sparks = new ArrayList<>();

        @Bean
        CommandLineRunner sparkRunner() {
            return args -> sparks.stream().forEach(spark -> spark.register());
        }
    }

    @Component
    public class HelloSpark implements Spark {

        @Autowired
        private HelloWorldService helloWorldService;

        @Override
        public void register() {
            get("/hello", (request, response) -> helloWorldService.hello());
        }
    }
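HelloWorldService is not shown in the answer; a minimal sketch, assuming it is an ordinary Spring bean (the method body here is hypothetical), could look like this:

    @Service
    public class HelloWorldService {
        public String hello() {
            return "Hello World from a Spring bean"; // hypothetical return value
        }
    }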

You can find more at https://github.com/pmackowski/spring-boot-spark-java


It is fairly simple to use Guice with Java Spark. Basically, you need to extend SparkFilter as follows to create a Guice injector:

    public class SparkGuiceFilter extends SparkFilter {

        private Injector injector = null;

        @Override
        protected SparkApplication[] getApplications(final FilterConfig filterConfig) throws ServletException {
            final SparkApplication[] applications = super.getApplications(filterConfig);

            if (this.injector == null) {
                this.injector = Guice.createInjector(new MainModule());
            }

            if (applications != null && applications.length != 0) {
                for (SparkApplication application : applications) {
                    this.injector.injectMembers(application);
                }
            }
            return applications;
        }
    }
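MainModule is not shown either; a minimal placeholder module, assuming you rely mostly on Guice's just-in-time bindings, might be:

    import com.google.inject.AbstractModule;

    public class MainModule extends AbstractModule {
        @Override
        protected void configure() {
            // Explicit bindings go here if needed, e.g. (hypothetical):
            // bind(SomeService.class).to(SomeServiceImpl.class);
        }
    }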

Then you need a web.xml, and you run the Spark application as a regular WAR using Jetty or any other servlet container:

    <web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xmlns="http://java.sun.com/xml/ns/javaee"
             xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd"
             version="3.0">

        <filter>
            <filter-name>SparkGuiceFilter</filter-name>
            <filter-class>com.devng.spark.guice.SparkGuiceFilter</filter-class>
            <init-param>
                <param-name>applicationClass</param-name>
                <param-value>com.devng.spark.SparkApp</param-value>
            </init-param>
        </filter>

        <filter-mapping>
            <filter-name>SparkGuiceFilter</filter-name>
            <url-pattern>/*</url-pattern>
        </filter-mapping>
    </web-app>

However, there are some limitations to this approach. You cannot use request or session scopes with Guice this way. If you don't need them, you're good to go; otherwise you need to integrate the Guice Servlet extensions and add a GuiceFilter to your web.xml, as described in the official Guice documentation. You also need to make sure you use the same injector instance in GuiceFilter and SparkGuiceFilter, which means defining a GuiceServletContextListener in your web.xml, as described there.
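A minimal sketch of that sharing, assuming the listener publishes the injector as a ServletContext attribute keyed by Injector.class.getName() (which is what GuiceServletContextListener does), and reusing the MainModule placeholder from above:

    import com.google.inject.Guice;
    import com.google.inject.Injector;
    import com.google.inject.servlet.GuiceServletContextListener;
    import com.google.inject.servlet.ServletModule;

    // Registered as a <listener> in web.xml; creates the single shared injector.
    public class AppContextListener extends GuiceServletContextListener {
        @Override
        protected Injector getInjector() {
            // ServletModule enables request/session scopes for GuiceFilter.
            return Guice.createInjector(new ServletModule(), new MainModule());
        }
    }

SparkGuiceFilter would then reuse that injector instead of creating its own:

    // In SparkGuiceFilter.getApplications(...), look up the injector that the
    // listener stored under the Injector.class.getName() attribute:
    if (this.injector == null) {
        this.injector = (Injector) filterConfig.getServletContext()
                .getAttribute(Injector.class.getName());
    }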

You can find a fully working example in my GitHub repo here: https://github.com/devng/demo/tree/master/sparkjava-guice


I'm actually experimenting with Spark and Guice, and as far as I can see, using dependency injection with the two is very simple, at least as of today (August 2017).

All you have to do is the following:

    import static spark.Spark.get;

    import com.google.inject.Guice;
    import com.google.inject.Injector;

    public class MySparkApplication {

        public static void main(String[] args) {
            Injector injector = Guice.createInjector();
            SomeClass someClass = injector.getInstance(SomeClass.class);

            get("/hello", (req, res) -> someClass.getSomeString());
        }
    }

It really is that simple. I just followed the Guice getting-started tutorial and it works: when I start Spark and open http://localhost:4567 in my browser, the string returned from my method is displayed.
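For reference, SomeClass needs no Guice module or binding annotations here: Guice can create it through a just-in-time binding as long as it has a public no-argument (or @Inject-annotated) constructor. A hypothetical SomeClass:

    public class SomeClass {
        public String getSomeString() {
            return "Hello from Guice"; // hypothetical return value
        }
    }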


I've been working with Spark lately, and it doesn't include an IoC provider out of the box; however, you can easily plug in Spring or Guice core, and that is a straightforward solution.

All you have to do is add the dependency to Maven and start using it.
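For Guice, for example, the coordinates are com.google.inject:guice (the version numbers below are illustrative only; pick whatever is current for your project):

    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.7.2</version>
    </dependency>
    <dependency>
        <groupId>com.google.inject</groupId>
        <artifactId>guice</artifactId>
        <version>4.1.0</version>
    </dependency>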

Alternatively, you can look at Ninja, which is a full-stack framework and includes Guice and JPA/Hibernate out of the box.


Standalone IoC with Guice. It works with very little code ;) Link: https://github.com/Romain-P/SparkJava-JFast

No Guice module is required by default; classes annotated with @Binding are detected automatically via package scanning.

    public class Main {
        public static void main(String[] args) {
            /* give a class as argument for package scanning from its path recursively */
            Injector injector = SparkApplication.init(Application.class);
            injector.getInstance(Application.class).initialize();
        }
    }

    @Binding
    public class Application {

        @Inject Service http;
        @Inject HelloController helloController;

        public void initialize() {
            http.port(8080);
            http.get("/hello", (req, res) -> helloController.hello());
        }
    }

    @Binding
    public class HelloController {

        @Inject HelloService service;

        public Object hello() {
            // business logic
            return service.hello();
        }
    }

    @Binding
    @Slf4j
    public class HelloService {

        public Object hello() {
            log.info("hello");
            return new Object();
        }
    }
