Scenario
Our application should export all books, together with a summary of their chapters, in XML format. We have a total of 100,000 books with 15 chapters each. It is simply impossible to load all of these entities into memory at once.
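The examples below assume a Book–Chapter mapping roughly along the following lines. This is only a minimal sketch: apart from the chapters collection and the chapter content used later, the field names (such as title) are assumptions, not part of the original example.

import java.util.ArrayList;
import java.util.List;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;

@Entity
class Book {
    @Id
    @GeneratedValue
    private Long id;

    private String title; // assumed field, used only for illustration

    // lazy by default; the export query fetches it eagerly with "left join fetch"
    @OneToMany(mappedBy = "book")
    private List<Chapter> chapters = new ArrayList<>();

    public String getTitle() {
        return title;
    }

    public List<Chapter> getChapters() {
        return chapters;
    }
}

@Entity
class Chapter {
    @Id
    @GeneratedValue
    private Long id;

    private String content;

    @ManyToOne(fetch = FetchType.LAZY)
    private Book book;

    public String getContent() {
        return content;
    }
}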
Solution
Instead of a normal query we are going to use a ScrollableResults. It loads the data step by step as we iterate through the result. From time to time, we call session.clear() to remove the entities from the session. This allows the garbage collector to reclaim them and keeps our memory consumption bounded.
ScrollableResults results = session
        .createQuery("select b from Book b left join fetch b.chapters")
        .scroll(ScrollMode.FORWARD_ONLY);
while (results.next()) {
    Book book = (Book) results.get(0);

    /* display entities and collection entries in the session */
    log.debug(session.getStatistics().toString());

    // create XML for book
    for (Chapter chapter : book.getChapters()) {
        chapter.getContent();
        // create XML for chapter
    }
    session.clear();
}
results.close();
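The "create XML" steps are deliberately left open in the loop above. One way to fill them in without undoing the memory savings is a streaming writer, for example the StAX XMLStreamWriter from the JDK, so that the XML is written out incrementally instead of being built up in memory. The following is only a sketch of that idea; the element names and the getTitle() accessor are assumptions from the mapping sketch above, not part of the original example.

import java.io.Writer;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamWriter;

// Hypothetical helper illustrating the "create XML" steps of the loop.
class BookXmlWriter {

    private final XMLStreamWriter xml;

    BookXmlWriter(Writer out) throws XMLStreamException {
        // StAX streams the document to the writer, so the XML never has to fit in memory either
        this.xml = XMLOutputFactory.newInstance().createXMLStreamWriter(out);
        xml.writeStartDocument();
        xml.writeStartElement("books");
    }

    // called once per book inside the ScrollableResults loop
    void write(Book book) throws XMLStreamException {
        xml.writeStartElement("book");
        xml.writeStartElement("title");
        xml.writeCharacters(book.getTitle());
        xml.writeEndElement();
        for (Chapter chapter : book.getChapters()) {
            xml.writeStartElement("chapter");
            xml.writeCharacters(chapter.getContent());
            xml.writeEndElement();
        }
        xml.writeEndElement();
    }

    // called once after the loop, before results.close()
    void close() throws XMLStreamException {
        xml.writeEndElement();
        xml.writeEndDocument();
        xml.close();
    }
}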
For debugging purposes, I am printing the number of entities in the session using session.getStatistics().toString().
Pitfall warning
If you change entities, you must call session.flush() before calling session.clear(). Clearing the session detaches all entities, and any pending SQL statements that have not yet been sent to the database are discarded; flush() sends them first.
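As an illustration of this pitfall, the sketch below modifies each book inside the export loop, for example to flag it as exported. The exported property and its setExported() setter are hypothetical and only serve to show the flush-before-clear order.

ScrollableResults results = session
        .createQuery("select b from Book b left join fetch b.chapters")
        .scroll(ScrollMode.FORWARD_ONLY);
while (results.next()) {
    Book book = (Book) results.get(0);

    // ... create XML for the book and its chapters ...

    book.setExported(true);   // hypothetical setter; the entity is now dirty
    session.flush();          // send the pending UPDATE statements to the database
    session.clear();          // only then detach everything to free memory
}
results.close();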