[eclipselink-users] too many inserts - increased use of heap memory, better way?

Hi,

When an operation in our application is triggered, we have to perform around
400 transactions.

Each transaction inserts around 2049 objects into the database; there are very
few SQL queries.
We run 30 transactions in parallel at a time until all 400 transactions
complete. We do this with MDBs, where each MDB is responsible for one
transaction. We limit the concurrency to 30 using the max-threads-constraint
of the work manager assigned to the MDB.
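
In case it is useful, here is a simplified sketch of what each MDB does (the
queue, entity, and persistence unit names are just placeholders):

import javax.ejb.ActivationConfigProperty;
import javax.ejb.MessageDriven;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@MessageDriven(activationConfig = {
    @ActivationConfigProperty(propertyName = "destinationType",
                              propertyValue = "javax.jms.Queue")
})
public class InsertBatchMDB implements MessageListener {

    @PersistenceContext(unitName = "myPU") // placeholder unit name
    private EntityManager em;

    public void onMessage(Message message) {
        // One container-managed JTA transaction per message, ~2049 inserts.
        for (int i = 0; i < 2049; i++) {
            MyEntity entity = new MyEntity(); // placeholder entity
            // ... populate the entity from the message ...
            em.persist(entity);
        }
        // All persisted instances stay managed in the persistence context
        // until the transaction commits.
    }
}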

Everything works correctly, but heap memory usage keeps increasing.

We are using EclipseLink for database inserts, updates, and queries.
I was wondering if there are better ways of doing this. Is there any tuning in
EclipseLink we can try?
Please let us know.

1. We set the default cache type to Weak in persistence.xml (<property
name="eclipselink.cache.type.default" value="Weak"/>); see the fragment after
this list. This improved performance, but memory is still an issue.
2. We use the following JVM options: -d64 -Xms20g -Xmx20g -XX:PermSize=512m
-XX:MaxPermSize=1012m -XX:+UseConcMarkSweepGC -XX:+UseParNewGC
-XX:+UseCompressedOops
3. We are running on WebLogic Server.
4. GC log attached: http://old.nabble.com/file/p29223044/gc.log
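
For reference, the relevant part of our persistence.xml looks roughly like
this (the unit name and data source are simplified):

<persistence-unit name="myPU" transaction-type="JTA">
    <jta-data-source>jdbc/myDS</jta-data-source>
    <properties>
        <!-- default cache type set to Weak, as mentioned in point 1 -->
        <property name="eclipselink.cache.type.default" value="Weak"/>
    </properties>
</persistence-unit>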



