@jonasurbano
Created January 27, 2018 14:43
Spring Batch gives us these ItemReader implementations out of the box:
AbstractItemCountingItemStreamItemReader
AmqpItemReader
FlatFileItemReader
HibernateCursorItemReader
HibernatePagingItemReader
IbatisPagingItemReader
ItemReaderAdapter
JdbcCursorItemReader
JdbcPagingItemReader
JmsItemReader
JpaPagingItemReader
ListItemReader
MongoItemReader
Neo4jItemReader
RepositoryItemReader
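All of these readers share the same minimal contract: read() returns the next item, or null once the input is exhausted. A plain-Java sketch of that contract follows, with a list-backed reader similar in spirit to ListItemReader; the interface and class names here are illustrative stand-ins, not the actual Spring Batch API:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Illustrative stand-in for org.springframework.batch.item.ItemReader<T>.
interface SimpleItemReader<T> {
    T read(); // returns null when the input is exhausted
}

// List-backed reader, similar in spirit to Spring Batch's ListItemReader.
class SimpleListItemReader<T> implements SimpleItemReader<T> {
    private final Iterator<T> iterator;

    SimpleListItemReader(List<T> items) {
        this.iterator = items.iterator();
    }

    @Override
    public T read() {
        return iterator.hasNext() ? iterator.next() : null;
    }
}

public class ReaderDemo {
    public static void main(String[] args) {
        SimpleItemReader<String> reader =
                new SimpleListItemReader<>(Arrays.asList("a", "b", "c"));
        String item;
        // The step's chunk loop drives the reader exactly like this.
        while ((item = reader.read()) != null) {
            System.out.println(item);
        }
    }
}
```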
Spring Batch gives us these ItemWriter implementations out of the box:
AbstractItemStreamItemWriter
AmqpItemWriter
CompositeItemWriter
FlatFileItemWriter
GemfireItemWriter
HibernateItemWriter
IbatisBatchItemWriter
ItemWriterAdapter
JdbcBatchItemWriter
JmsItemWriter
JpaItemWriter
MimeMessageItemWriter
MongoItemWriter
Neo4jItemWriter
RepositoryItemWriter
StaxEventItemWriter
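Unlike a reader, a writer receives a whole chunk of items per call, which is what lets implementations like JdbcBatchItemWriter batch their work. A plain-Java sketch of that contract, with illustrative names rather than the real Spring Batch interfaces:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative stand-in for org.springframework.batch.item.ItemWriter<T>:
// the framework passes a whole chunk per call, not one item at a time.
interface SimpleItemWriter<T> {
    void write(List<? extends T> chunk);
}

// Collects everything it receives; a real writer would flush to a
// database, file, queue, etc. in one batched operation per chunk.
class CollectingWriter<T> implements SimpleItemWriter<T> {
    final List<T> written = new ArrayList<>();

    @Override
    public void write(List<? extends T> chunk) {
        written.addAll(chunk);
    }
}

public class WriterDemo {
    public static void main(String[] args) {
        CollectingWriter<String> writer = new CollectingWriter<>();
        writer.write(Arrays.asList("a", "b")); // first chunk
        writer.write(Arrays.asList("c"));      // second (partial) chunk
        System.out.println(writer.written);
    }
}
```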
A sample StepExecution summary after a run with skips and rollbacks:
id=1,
version=3,
name=step1,
status=COMPLETED,
exitStatus=COMPLETED,
readCount=3,
filterCount=0,
writeCount=2,
readSkipCount=1,
writeSkipCount=1,
processSkipCount=0,
commitCount=2,
rollbackCount=2
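The counters above come from the chunk loop: each read increments readCount, each item handed to the writer increments writeCount, and each chunk transaction increments commitCount. A hypothetical plain-Java simulation of that bookkeeping (ignoring skips and rollbacks, which is why the numbers differ from the dump above):

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Hypothetical simulation of how a chunk-oriented step accumulates its
// StepExecution counters. Not Spring Batch internals.
public class StepCounters {
    int readCount, writeCount, commitCount;

    void run(List<String> input, int chunkSize) {
        Iterator<String> reader = input.iterator();
        while (reader.hasNext()) {
            int inChunk = 0;
            // Read up to chunkSize items into the current chunk.
            while (inChunk < chunkSize && reader.hasNext()) {
                reader.next();
                readCount++;
                inChunk++;
            }
            writeCount += inChunk; // the writer gets the whole chunk
            commitCount++;         // one transaction commit per chunk
        }
    }

    public static void main(String[] args) {
        StepCounters step = new StepCounters();
        step.run(Arrays.asList("a", "b", "c"), 2); // 3 items, chunk size 2
        System.out.println("readCount=" + step.readCount
                + ", writeCount=" + step.writeCount
                + ", commitCount=" + step.commitCount);
    }
}
```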
# Download and run Spring Cloud Data Flow locally:
wget http://repo.spring.io/release/org/springframework/cloud/spring-cloud-dataflow-server-local/1.2.3.RELEASE/spring-cloud-dataflow-server-local-1.2.3.RELEASE.jar
java -jar spring-cloud-dataflow-server-local-1.2.3.RELEASE.jar
# Download and run Spring Cloud Data Flow Shell locally:
wget http://repo.spring.io/release/org/springframework/cloud/spring-cloud-dataflow-shell/1.2.3.RELEASE/spring-cloud-dataflow-shell-1.2.3.RELEASE.jar
java -jar spring-cloud-dataflow-shell-1.2.3.RELEASE.jar
# Register the application:
app register --name batch-demo --type task --uri file:///Users/jurbano/Desktop/batch-demo-0.0.1-SNAPSHOT.jar
# The common case is to register the application with a Maven URI (maven://) so that Spring Cloud Data Flow downloads the artifact from the repository. Other application types are sources and sinks.
# Create and run the job:
task create myjob --definition batch-demo
task launch myjob
@Bean
public Job job(Step step1) throws Exception {
    return jobBuilderFactory.get("job1")
            .incrementer(new RunIdIncrementer())
            .start(step1)
            .build();
}

@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .tasklet(new Tasklet() {
                @Override
                public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
                    System.out.println("Reading...");
                    return RepeatStatus.FINISHED; // null is also treated as FINISHED, but be explicit
                }
            })
            .build();
}
@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .<Person, Person>chunk(2)
            .reader(csvReader())
            .writer(writer())
            .faultTolerant()
            .skipLimit(2)
            .skip(FlatFileParseException.class)
            .skip(FlatFileFormatException.class)
            .skip(IllegalArgumentException.class)
            .listener(skipListener())
            .build();
}
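The fault-tolerant configuration above means: an item that throws one of the listed exceptions is skipped rather than failing the step, but only up to skipLimit(2) skips; one more and the step fails. A plain-Java sketch of that bookkeeping follows; the process method and its parsing logic are hypothetical stand-ins, not Spring Batch internals:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of skip-limit bookkeeping: items whose processing
// throws a "skippable" exception are dropped until skipLimit is exceeded.
public class SkipLimitDemo {
    static int process(List<String> lines, int skipLimit) {
        int skipped = 0, written = 0;
        for (String line : lines) {
            try {
                Integer.parseInt(line); // stands in for flat-file parsing
                written++;
            } catch (NumberFormatException e) { // a "skippable" exception
                skipped++;
                if (skipped > skipLimit) {
                    // Spring Batch would fail the step at this point.
                    throw new IllegalStateException("skip limit exceeded", e);
                }
                // A registered SkipListener would be notified here.
            }
        }
        return written;
    }

    public static void main(String[] args) {
        // Two bad lines are tolerated under skipLimit(2); good lines are written.
        System.out.println(process(Arrays.asList("1", "x", "2", "y"), 2)); // prints 2
    }
}
```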