How do you update a value in Quartz JobDataMap?

I am using quartz-scheduler 1.8.5. I created a Job that implements StatefulJob. I plan to use SimpleTrigger and StdSchedulerFactory.

It seems that I have to update both the Trigger's JobDataMap and the JobDetail's JobDataMap in order to change the JobDataMap seen inside the job. I'm trying to understand why updating both is necessary. I noticed that JobDataMap tracks a dirty flag; maybe I need to save it explicitly, or do something else?

I think I will have to delve into the Quartz source code to really understand what is going on here, but I thought I would be lazy and ask first. Thanks in advance for any insight into the inner workings of JobDataMap!

Here is my job:

    public class HelloJob implements StatefulJob {

        public HelloJob() {
        }

        public void execute(JobExecutionContext context) throws JobExecutionException {
            int count = context.getMergedJobDataMap().getInt("count");
            int count2 = context.getJobDetail().getJobDataMap().getInt("count");
            //int count3 = context.getTrigger().getJobDataMap().getInt("count");
            System.err.println("HelloJob is executing. Count: '" + count + "', '" + count2 + "'");

            // The count only gets updated if I update both the Trigger and
            // JobDetail DataMaps. If I only update the JobDetail, it doesn't persist.
            context.getTrigger().getJobDataMap().put("count", count++);
            context.getJobDetail().getJobDataMap().put("count", count++);

            // This has no effect inside the job, but it works outside the job
            try {
                context.getScheduler().addJob(context.getJobDetail(), true);
            } catch (SchedulerException e) {
                e.printStackTrace();
            }

            // These don't seem to persist between executions
            //context.put("count", count++);
            //context.getMergedJobDataMap().put("count", count++);
        }
    }

This is how I schedule the job:

    try {
        // Define the job and tie it to our HelloJob class
        JobDetail job = new JobDetail(JOB_NAME, JOB_GROUP_NAME, HelloJob.class);
        job.getJobDataMap().put("count", 1);

        // Trigger the job to run now, and then every so often
        Trigger trigger = new SimpleTrigger("myTrigger", "group1",
                SimpleTrigger.REPEAT_INDEFINITELY, howOften);

        // Tell Quartz to schedule the job using our trigger
        sched.scheduleJob(job, trigger);
        return job;
    } catch (SchedulerException e) {
        throw e;
    }

Update:

It seems that I have to put the value into the JobDetail JobDataMap twice for it to be saved. This works:

    public class HelloJob implements StatefulJob {

        public HelloJob() {
        }

        public void execute(JobExecutionContext context) throws JobExecutionException {
            int count = (Integer) context.getMergedJobDataMap().get("count");
            System.err.println("HelloJob is executing. Count: '" + count
                    + "', and is the job stateful? " + context.getJobDetail().isStateful());
            context.getJobDetail().getJobDataMap().put("count", count++);
            context.getJobDetail().getJobDataMap().put("count", count++);
        }
    }

That seems like a bug, maybe? Or is there some step I'm missing to tell the JobDetail to flush the contents of my JobDataMap to the JobStore?

4 answers

I think your problem is using the postfix ++ operator - when you do:

 context.getJobDetail().getJobDataMap().put("count", count++); 

you set the value in the map to count, and only then increment count.

It seems to me that you wanted:

 context.getJobDetail().getJobDataMap().put("count", ++count); 

which would only need to be done once.
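To see the difference without any Quartz dependency, here is a minimal sketch using a plain `java.util.HashMap` in place of the JobDataMap (the map and variable names are illustrative, not from the question's code):

    import java.util.HashMap;
    import java.util.Map;

    public class PostfixDemo {
        public static void main(String[] args) {
            Map<String, Integer> map = new HashMap<>();

            int count = 1;
            // Postfix: the OLD value (1) is stored, then count becomes 2.
            map.put("count", count++);
            System.out.println(map.get("count")); // prints 1

            count = 1;
            // Prefix: count becomes 2 first, then 2 is stored.
            map.put("count", ++count);
            System.out.println(map.get("count")); // prints 2
        }
    }

This also explains why putting the value twice "worked" in the update: the second `put("count", count++)` stores the value after the first postfix increment had already bumped `count`.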


As you may know, in Quartz the trigger and the job are separate entities, rather than being combined as they are in some other schedulers. This lets you attach values to the data map at the trigger level instead of the job level, and so on.

I think the idea is that it lets you run the same underlying job with different sets of data, while still keeping some common data at the job level.


As scpritch76 answered, the job and the trigger are separate, so there can be many triggers (schedules) for a given job.

A job can have some basic set of properties in JobDataMap, and then triggers can provide additional properties (or override basic properties) for specific job executions in their JobDataMaps.
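As a minimal sketch of that idea (plain Java, not Quartz's actual implementation; the `merge` helper and key names are hypothetical), `getMergedJobDataMap()` conceptually starts from the job-level map and then lets the trigger-level entries win:

    import java.util.HashMap;
    import java.util.Map;

    public class MergedMapSketch {
        // Hypothetical helper mirroring the idea behind getMergedJobDataMap():
        // copy the job's entries first, then overlay the trigger's entries.
        static Map<String, Object> merge(Map<String, Object> jobData,
                                         Map<String, Object> triggerData) {
            Map<String, Object> merged = new HashMap<>(jobData);
            merged.putAll(triggerData); // trigger values override job values
            return merged;
        }

        public static void main(String[] args) {
            Map<String, Object> jobData = new HashMap<>();
            jobData.put("timeout", 30);    // common, job-level default
            jobData.put("target", "all");

            Map<String, Object> triggerData = new HashMap<>();
            triggerData.put("target", "eu-only"); // per-schedule override

            Map<String, Object> merged = merge(jobData, triggerData);
            System.out.println(merged.get("timeout")); // 30 (from the job)
            System.out.println(merged.get("target"));  // eu-only (trigger wins)
        }
    }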

    @PersistJobDataAfterExecution
    @DisallowConcurrentExecution
    public class DynamicTestJob implements Job {

        private static final Logger log = LoggerFactory.getLogger(DynamicTestJob.class);

        @Override
        public void execute(final JobExecutionContext context) {
            final JobDataMap jobDataMap = context.getJobDetail().getJobDataMap();
            Integer counter = (Integer) jobDataMap.get("counter");
            if (counter == null) {
                counter = 1;
            } else {
                counter++;
            }
            jobDataMap.put("counter", counter);
            System.out.println(counter);
        }
    }