I have a Spring Batch job that retrieves some data from the DB using Hibernate (only once per job) and then uses that data to insert other data into the DB, still using Hibernate.
I have a Session object that I try to pass from CustomJobExecutionListener to my writer class in the Spring Batch job, so that the same session is used both for reading and for inserting, in order to avoid this exception: org.springframework.orm.hibernate4.HibernateSystemException: Illegal attempt to associate a collection with two open sessions.
With the current setup, the session gets closed:
ERROR spi.SqlExceptionHelper - PooledConnection has already been closed.
If I keep the same code but do not call session.flush(), no exception occurs, but nothing gets written to the DB.
This is the job setup:
<batch:step id="...">
    <batch:partition step="customLoadFile" partitioner="customFilePartitioner">
        <batch:handler grid-size="8" task-executor="customJobTaskExecutor"/>
    </batch:partition>
</batch:step>

<batch:step id="customLoadFile">
    <batch:tasklet transaction-manager="customTransactionManager">
        <batch:chunk reader="customFileReader" writer="customFileWriter" commit-interval="5"/>
    </batch:tasklet>
</batch:step>

<bean id="customFileWriter" class="com.example.batch.CustomItemWriter">
    <property name="customService" ref="customService"/>
</bean>

<bean id="customJobExecutionListener" class="com.example.batch.CustomJobExecutionListener">
    <constructor-arg ref="customFileWriter"/>
    <constructor-arg ref="customSessionFactory"/>
</bean>
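For context, customTransactionManager and customSessionFactory are wired roughly like this (a minimal sketch; the exact bean definitions in my project may differ, and the mapping configuration is omitted):

```xml
<bean id="customSessionFactory"
      class="org.springframework.orm.hibernate4.LocalSessionFactoryBean">
    <property name="dataSource" ref="dataSource"/>
    <!-- entity mapping / packagesToScan configuration omitted -->
</bean>

<!-- HibernateTransactionManager binds a Session to each Spring-managed
     transaction, i.e. to each chunk transaction of the tasklet above -->
<bean id="customTransactionManager"
      class="org.springframework.orm.hibernate4.HibernateTransactionManager">
    <property name="sessionFactory" ref="customSessionFactory"/>
</bean>
```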
where
public class CustomJobExecutionListener implements JobExecutionListener {

    private final CustomItemWriter customItemWriter;
    private final SessionFactory customSessionFactory;

    public CustomJobExecutionListener(CustomItemWriter customItemWriter, SessionFactory customSessionFactory) {
        this.customItemWriter = customItemWriter;
        this.customSessionFactory = customSessionFactory;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        Session session = customSessionFactory.openSession();
        customItemWriter.populateDataFromDb(session);
    }

    @Override
    public void afterJob(JobExecution jobExecution) {}
}

public class CustomItemWriter implements ItemWriter<CustomLogModel> {

    private CustomService customService;
    private Session session;

    // public so the listener can call it
    public void populateDataFromDb(Session session) {
        this.session = session;
        StopWatch stopWatch = new StopWatch();
        stopWatch.start();
        System.out.println("Retrieving all db info.");
        List<Object[]> data = customService.getAllData(session);
        data.forEach(record -> {
            ArticleDomain articleSupplement = (ArticleDomain) record[2];
            // Do something with articleSupplement
        });
        stopWatch.stop();
        System.out.println("Finished retrieving data.");
    }

    @Override
    public void write(List<? extends CustomLogModel> items) throws Exception {
        StopWatch stopWatch = new StopWatch();
        stopWatch.start();
        for (CustomLogModel item : items) {
            customService.insertData(item, findArticleDomainByFileName(item.getFileName()), session);
        }
        stopWatch.stop();
        System.out.println("Finished inserting data for a batch of " + items.size() + " items.");
    }

    private ArticleDomain findArticleDomainByFileName(String fileName) {
        // Implementation here
    }

    public void setCustomService(CustomService customService) {
        this.customService = customService;
    }
}

public class CustomService {

    @Transactional(readOnly = true)
    public List<Object[]> getAllData(Session session) {
        List<Object[]> dbData = session.createQuery(
                "select i.fileName, i.date, a " +
                "from ArticleDomain a " +
                "left join fetch a.ipList i ")
            .list();
        return dbData;
    }

    @Transactional
    public void insertData(CustomLogModel logModel, ArticleDomain articleDomain, Session session) {
        try {
            if (articleDomain.ipList && !articleDomain.ipList.empty) {
                articleDomain?.ipList?.last()?.isLast = false
            }
            def ipRecord = new IpRecord(articleId: logModel.articleId,
                                        fileName: logModel.fileName,
                                        failReason: logModel.failReason,
                                        ...)
            articleDomain.addToIpList(ipRecord)
            session.save(articleDomain)
            session.flush()
        } catch (Exception ex) {
            log.error("Error", ex)
        }
    }
}
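One alternative I am considering, but have not verified (sketch only; it assumes the writer has the SessionFactory injected instead of holding the Session opened in beforeJob): obtain the transaction-bound session inside write() via getCurrentSession(), so the inserts run in the session that customTransactionManager opens for each chunk, and flushing/committing is left to the transaction manager.

```java
// Sketch: customSessionFactory would be injected into the writer.
// getCurrentSession() returns the Session bound to the current
// Spring-managed chunk transaction, instead of a manually opened one.
@Override
public void write(List<? extends CustomLogModel> items) throws Exception {
    Session currentSession = customSessionFactory.getCurrentSession();
    for (CustomLogModel item : items) {
        customService.insertData(item, findArticleDomainByFileName(item.getFileName()), currentSession);
    }
    // no explicit flush(): the chunk transaction commit flushes the session
}
```

I am unsure whether the entities loaded in beforeJob (in a different, manually opened session) can safely be re-associated with this session, which is essentially my question.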
An ArticleDomain object has a list of IpRecords; different IpRecords can share the same fileName but have different process dates.