How to start the batch from a REST service? The afterJob runs, but the @EnableBatchProcessing job only runs after deployment
I followed the tutorial https://spring.io/guides/gs/batch-processing/, which basically reads a spreadsheet and inserts the data into a database once the application is started. I want to execute the reading process every time a REST service is invoked, so I added a controller to this application. To call the job I followed the suggestion in "How to trigger a job using a rest web service?", and this is where my issue appears: whenever I call the REST service, only the afterJob method runs. I read "how to select which spring batch job to run based on application argument - spring boot java config" (which has some similarity to my problem) and the excellent blog post it points to, but I am still stuck. I want "public ItemReader reader()" to be called after the REST service is invoked, and I expect it to follow the same flow as when the application is started through main(), that is, the same flow as when Spring Boot deploys the application. I think my confusion lies with @EnableBatchProcessing or JobExecutionListenerSupport, but I am really stuck.
Here are the most important snippets.
Controller
@Autowired
JobLauncher jobLauncher;

@Autowired
Job job;

@RequestMapping("/runit")
public void handle() throws Exception {
    JobParameters jobParameters =
            new JobParametersBuilder()
                    .addLong("time", System.currentTimeMillis())
                    .toJobParameters();
    jobLauncher.run(job, jobParameters);
}
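For context, the complete controller class looks roughly like the sketch below; the class name JobLaunchController and the returned status string are illustrative additions, not part of the snippet above.

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class JobLaunchController {   // class name is illustrative

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    @RequestMapping("/runit")
    public String handle() throws Exception {
        // Unique "time" parameter per request so every call creates a new JobInstance
        JobParameters jobParameters = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();
        JobExecution execution = jobLauncher.run(job, jobParameters);
        return execution.getStatus().toString();
    }
}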
Listener
@Component
public class JobCompletionNotificationListener extends JobExecutionListenerSupport {

    @Override
    public void afterJob(JobExecution jobExecution) {
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            // verify the results here (body omitted)
        }
    }
}
Batch configuration
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    // tag::readerwriterprocessor[]
    @Bean
    public ItemReader<Person> reader() {
        FlatFileItemReader<Person> reader = new FlatFileItemReader<Person>();
        reader.setResource(new ClassPathResource("sample-data.csv"));
        reader.setLineMapper(new DefaultLineMapper<Person>() {{
            setLineTokenizer(new DelimitedLineTokenizer() {{
                setNames(new String[] { "firstName", "lastName" });
            }});
            setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
                setTargetType(Person.class);
            }});
        }});
        return reader;
    }

    @Bean
    public ItemProcessor<Person, Person> processor() {
        return new PersonItemProcessor();
    }

    @Bean
    public ItemWriter<Person> writer(DataSource dataSource) {
        JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
        writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Person>());
        writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
        writer.setDataSource(dataSource);
        return writer;
    }

    @Bean
    public Job importUserJob(JobBuilderFactory jobs, Step s1, JobExecutionListener listener) {
        return jobs.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .flow(s1)
                .end()
                .build();
    }

    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory, ItemReader<Person> reader,
                      ItemWriter<Person> writer, ItemProcessor<Person, Person> processor) {
        return stepBuilderFactory.get("step1")
                .<Person, Person> chunk(10)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }
}
application.properties
# SPRING BATCH (BatchDatabaseInitializer)
spring.batch.job.enabled=false
spring.batch.initializer.enabled=false
After digging into this, the real "issue" is that the ItemReader is created once, rather than once per REST call (that is, once per job execution). To address this, make the ItemReader step scoped. That gives you a new instance of the reader for each run, and it also allows you to inject the file name from the job parameters.
In code terms, change this:
@Bean
public ItemReader<Person> reader() {
to this:
@Bean
@StepScope
public FlatFileItemReader<Person> reader() {
The reason for the return type change is that Spring Batch automatically registers ItemStreams for you. However, when using @StepScope, we only see the return type you define. In this case, ItemReader does not extend/implement ItemStream, so we don't know that what you are returning is something we should register automatically. By returning FlatFileItemReader, we can see all of its interfaces/etc. and can apply the "magic" for you.
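Putting that together, the step-scoped reader might look roughly like the sketch below. The 'file' job parameter and the FileSystemResource fallback are only an illustration of injecting the file name from job parameters; they are not part of the original configuration.

@Bean
@StepScope
public FlatFileItemReader<Person> reader(
        @Value("#{jobParameters['file']}") String file) {   // 'file' is an illustrative parameter name
    FlatFileItemReader<Person> reader = new FlatFileItemReader<Person>();
    // Fall back to the bundled sample data when no file parameter is supplied
    reader.setResource(file != null
            ? new FileSystemResource(file)
            : new ClassPathResource("sample-data.csv"));
    reader.setLineMapper(new DefaultLineMapper<Person>() {{
        setLineTokenizer(new DelimitedLineTokenizer() {{
            setNames(new String[] { "firstName", "lastName" });
        }});
        setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
            setTargetType(Person.class);
        }});
    }});
    return reader;
}

Because the bean is step scoped, a fresh reader is created for every job execution, so each call to the /runit endpoint re-reads the file instead of reusing the single reader that was built at startup.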