[eclipselink-users] Lazy loading scenario

Hi all

I have a user screen (thick client; Swing) that presents a list of sound recording entities, including metadata such as date and time. The recordings range up to 211kb in size (average 30kb), and there can be up to 3k of them presented at a time. I can't fully load them in one go, as I get OOM errors. To solve this, I use lazy loading (@Basic(fetch = FetchType.LAZY)) on the sound data attribute of the entity. I've recently upgraded from EclipseLink 1.0.2 to 2.1.2 and am seeing some strange behaviour.
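For reference, the mapping looks roughly like this (class and field names are simplified for illustration; the lazy @Basic on the sound data is the part that matters):

import java.util.Date;

import javax.persistence.Basic;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.Lob;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;

@Entity
public class SoundRecording {

    @Id
    private Long id;

    @Temporal(TemporalType.TIMESTAMP)
    private Date recordedAt; // metadata shown in the list

    @Lob
    @Basic(fetch = FetchType.LAZY) // fetched only on first access
    private byte[] soundData;

    public byte[] getSoundData() {
        return soundData; // first call triggers the lazy fetch
    }

    // other getters/setters omitted
}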

What happens in the code is this:
Load:
  Get all records (the sound data is lazy loaded, so memory use is minimal)
  entityManager.clear() to detach all records from the persistence context
User selects a record:
  entityManager.merge() the selected recording, inserting it in place of the detached object in the list
  User listens to the recording which fetches the sound data
User selects a different recording:
  entityManager.clear() to detach all records from the persistence context - yes, again. If I didn't do this, all recordings would end up loaded by the time the user had worked through the screen, and the same memory issues would be present.
  entityManager.merge() the selected recording, inserting it in place of the detached object in the list
  User listens to the recording which fetches the sound data
and so on...
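In code, each selection therefore boils down to something like this (class, method, and field names are illustrative):

import java.util.List;

import javax.persistence.EntityManager;

public class RecordingScreen {

    // Hypothetical playback helper, just for the sketch.
    interface SoundPlayer { void play(byte[] data); }

    private EntityManager entityManager;
    private List<SoundRecording> recordings; // backs the Swing list model
    private SoundPlayer player;

    public void onRecordingSelected(int index) {
        // Detach everything so previously played sound data is no
        // longer held by the persistence context.
        entityManager.clear();

        // Re-attach just the selected recording and swap the managed
        // copy into the list in place of the detached one.
        SoundRecording managed = entityManager.merge(recordings.get(index));
        recordings.set(index, managed);

        // Playing it triggers the lazy fetch of the sound data.
        player.play(managed.getSoundData());
    }
}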

This worked fine in 1.0.2. It felt clunky, but seemed to be OK.

In 2.1.2 it no longer works. Looking into it, EclipseLink seems to be trying to insert a new instance of a related singleton reference object, which generates a duplicate ID exception. My interpretation is that clear() detaches all entities, including the reference object, and the subsequent merge() then treats that detached reference as a new object, so a fresh instance is created at commit. That seems perfectly reasonable behaviour, but it breaks my current implementation, and it didn't happen in 1.0.2.
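Expressed as code, my reading of the failure is this (RecordingType stands in for the shared reference object, and I'm assuming a resource-local transaction):

import javax.persistence.EntityManager;

void selectAndCommit(EntityManager em, SoundRecording selected) {
    em.clear(); // detaches everything, including the shared RecordingType

    em.getTransaction().begin();
    // merge() reaches the now-detached RecordingType via the relationship...
    SoundRecording managed = em.merge(selected);
    // ...and at commit 2.1.2 apparently treats it as new, attempting an
    // INSERT that collides with the existing row: duplicate ID exception.
    em.getTransaction().commit();
}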

So:
1) Is there a way to detach individual entities (or their fields) after they have been accessed/modified/committed? (A sketch of what I mean follows below.)
2) Does anybody know what changed to affect this? Was it a regression that's since been fixed, maybe?
3) Or should I be completely rethinking the pattern I'm using here?
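For question 1, what I have in mind is something like the following, if I'm reading the JPA 2.0 API right (I believe EntityManager#detach is new in 2.0):

import javax.persistence.EntityManager;

void releaseRecording(EntityManager em, SoundRecording finished) {
    // Detach only this entity; the rest of the persistence context,
    // including the shared reference object, stays attached.
    em.detach(finished);
}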

Thanks a lot in advance
Brad
