Spring Batch skips records

I have a Spring Batch job (3.0.4.RELEASE) configured to index some records in Solr. I do not see any exceptions, and the job reports that it completed successfully, yet sometimes it processes only half of the records in the database. I have configured a JdbcCursorItemReader and a custom writer, as shown below.

    <batch:job-repository id="jobRepository" data-source="dataSource"
                          transaction-manager="transactionManager"
                          isolation-level-for-create="DEFAULT"
                          table-prefix="BATCH_" max-varchar-length="1000"/>

    <bean id="myItemReader" class="org.springframework.batch.item.database.JdbcCursorItemReader" scope="step">
        <property name="dataSource" ref="dataSource"/>
        <property name="sql" value="SELECT id FROM my_item ORDER BY id asc"/>
        <property name="rowMapper" ref="myItemIdRowMapper"/>
    </bean>

    <bean id="taskExecutor" class="org.springframework.core.task.SimpleAsyncTaskExecutor">
        <property name="concurrencyLimit" value="4"/>
    </bean>

    <bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
        <property name="jobRepository" ref="jobRepository"/>
        <property name="taskExecutor" ref="taskExecutor"/>
    </bean>

    <batch:job id="myItemIndexJob" restartable="true" job-repository="jobRepository">
        <batch:listeners>
            <batch:listener ref="executionListener"/>
        </batch:listeners>
        <batch:step id="myItemSolrIndexStep" allow-start-if-complete="true">
            <batch:tasklet>
                <batch:chunk reader="myItemReader" writer="myItemSolrWriter" commit-interval="50"/>
            </batch:tasklet>
        </batch:step>
    </batch:job>

The row mapper simply returns the id from the result set:

    public final class MyItemRowMapper implements RowMapper<Integer> {
        @Override
        public Integer mapRow(ResultSet rs, int rowNum) throws SQLException {
            return rs.getInt("id");
        }
    }

And the writer passes the ids to a service that indexes them in Solr. The writer does not modify MyItem:

    public class MyItemSolrWriter implements ItemWriter<Integer> {
        @Override
        public void write(List<? extends Integer> myItemIds) throws Exception {
            service.index(Lists.newArrayList(myItemIds));
        }
    }
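As a debugging aid (not part of the original job), one way to rule out the writer silently receiving fewer ids than expected is to wrap it in a counting decorator and compare the tally against the table's row count after the run. A minimal stand-alone sketch; the `Writer` interface here is a hypothetical stand-in for Spring Batch's `ItemWriter<Integer>`, and the delegate would be the real `MyItemSolrWriter`:

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;

// Minimal stand-in for Spring Batch's ItemWriter<Integer>.
interface Writer {
    void write(List<Integer> ids) throws Exception;
}

// Decorator that tallies every id passed to write() before delegating.
class CountingWriter implements Writer {
    private final Writer delegate;
    private final AtomicLong total = new AtomicLong();

    CountingWriter(Writer delegate) {
        this.delegate = delegate;
    }

    @Override
    public void write(List<Integer> ids) throws Exception {
        total.addAndGet(ids.size()); // count the chunk before indexing it
        delegate.write(ids);
    }

    long total() {
        return total.get();
    }
}
```

Comparing this running total with `SELECT COUNT(*) FROM my_item` after the job finishes shows whether the shortfall happens before or inside the writer.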

This is the step_execution row in the job repository for that run; it shows that nothing was skipped, but that only 279086 rows were read and written:

    | step_name           | start_time              | end_time                | status    | read_count | filter_count | write_count | read_skip_count | write_skip_count | exit_message |
    |---------------------|-------------------------|-------------------------|-----------|------------|--------------|-------------|-----------------|------------------|--------------|
    | myItemSolrIndexStep | 2016-05-12 10:07:01.994 | 2016-05-12 10:09:07.303 | COMPLETED | 279086     | 0            | 279086      | 0               | 0                |              |

And the corresponding step_execution_context row indicates that the reader read 573937 rows:

    {"map":[{"entry":[
        {"string":"JdbcCursorItemReader.read.count","int":573937},
        {"string":["batch.taskletType","org.springframework.batch.core.step.item.ChunkOrientedTasklet"]},
        {"string":["batch.stepType","org.springframework.batch.core.step.tasklet.TaskletStep"]}
    ]}]}

A subsequent run processes the full 573937 records, so this only happens intermittently. Any ideas on what might cause this, or how to debug the problem?
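One concrete debugging step (my own suggestion, not from the job above): dump the `id` column from the database and the stored ids from Solr (e.g. with an `fl=id` query), then diff the two sets to see exactly which records went missing and whether they form a contiguous range. A self-contained sketch of that diff; `IndexDiff` and `missingIds` are hypothetical names:

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.TreeSet;

class IndexDiff {
    // Returns, in ascending order, the ids present in the database
    // but absent from the Solr index.
    static List<Integer> missingIds(Collection<Integer> dbIds, Collection<Integer> solrIds) {
        TreeSet<Integer> missing = new TreeSet<>(dbIds);
        missing.removeAll(solrIds);
        return new ArrayList<>(missing);
    }
}
```

If the missing ids turn out to be the tail of the `ORDER BY id asc` result set, that would point at the cursor being closed early; scattered gaps would point elsewhere.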

If necessary, I can provide more detailed information.

Tags: java, spring, spring-batch