Bug 552009

Summary: NegativeArraySizeException parsing huge heap dump
Product: [Tools] MAT
Reporter: Andrew Johnson <andrew_johnson>
Component: Core
Assignee: Project Inbox <mat.core-inbox>
Status: CLOSED MOVED
QA Contact:
Severity: normal
Priority: P3
CC: roy.sunny.zhang007
Version: 1.9
Target Milestone: ---
Hardware: All
OS: All
Whiteboard:

Description Andrew Johnson CLA 2019-10-10 10:29:06 EDT
From the MAT dev mailing list: https://www.eclipse.org/lists/mat-dev/msg00589.html

Hi All,

It seems that even if the number of objects is less than 2 billion, we still have some issues when using MAT. I encountered the following error log when trying to open a heap dump with over 1 billion objects...

!ENTRY org.eclipse.mat.ui 1 0 2019-10-09 00:09:01.674
!MESSAGE Heap D:\tmp\19674.dump contains 1,104,267,590 objects

!ENTRY org.eclipse.core.jobs 4 2 2019-10-09 02:02:42.929
!MESSAGE An internal error occurred during: "Parsing heap dump from 'D:\tmp\19674.dump'".
!STACK 0
java.lang.NegativeArraySizeException
at java.io.ObjectOutputStream$HandleTable.growEntries(ObjectOutputStream.java:2347)
at java.io.ObjectOutputStream$HandleTable.assign(ObjectOutputStream.java:2276)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1428)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.eclipse.mat.collect.HashMapIntObject.writeObject(HashMapIntObject.java:438)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1140)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.eclipse.mat.parser.internal.SnapshotImpl.create(SnapshotImpl.java:215)
at org.eclipse.mat.parser.internal.SnapshotImplBuilder.create(SnapshotImplBuilder.java:95)
at org.eclipse.mat.parser.internal.SnapshotFactoryImpl.parse(SnapshotFactoryImpl.java:235)
at org.eclipse.mat.parser.internal.SnapshotFactoryImpl.openSnapshot(SnapshotFactoryImpl.java:126)
at org.eclipse.mat.snapshot.SnapshotFactory.openSnapshot(SnapshotFactory.java:147)
at org.eclipse.mat.ui.snapshot.ParseHeapDumpJob.run(ParseHeapDumpJob.java:83)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:60)

Thanks,
Roy
Comment 1 Andrew Johnson CLA 2019-10-10 12:34:56 EDT
For the record, could you please confirm the MAT version and the JVM version?

The failure looks to be in JVM class library code, in ObjectOutputStream, where the table holding the mapping of objects to handle IDs is extended. This code appears to double the size of the array each time.
The MAT code is writing out the GC roots as a HashMapIntObject<XGCRootInfo[]>.
The heap dump's 1,104,267,590 objects is just over 2^30 (1,073,741,824).

Perhaps the dump has very many GC roots (is "keep unreachable objects" set?).
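
The overflow mechanism described above can be sketched in a few lines. This is a minimal illustration, not the actual JDK source: it assumes HandleTable grows its backing array by roughly doubling its length, as the stack trace at growEntries suggests. Once the current length exceeds 2^30, doubling a Java int wraps past Integer.MAX_VALUE to a negative value, and allocating an array with that length throws NegativeArraySizeException.

```java
public class HandleTableOverflowSketch {
    public static void main(String[] args) {
        // Suppose the handle table's entry array has already grown to
        // 2^30 entries (the dump has ~1.1 billion objects, just over 2^30).
        int currentLength = 1 << 30;          // 1,073,741,824

        // A doubling growth policy computes the new length with int
        // arithmetic, which silently overflows past Integer.MAX_VALUE.
        int newLength = currentLength << 1;   // wraps to -2,147,483,648

        System.out.println("new length = " + newLength);

        // Attempting the allocation reproduces the reported failure.
        try {
            Object[] entries = new Object[newLength];
        } catch (NegativeArraySizeException e) {
            System.out.println("caught: " + e);
        }
    }
}
```

In other words, once a serialized object graph assigns more than about 2^30 handles, any plain doubling strategy in the handle table overflows, regardless of available heap.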
Comment 3 Eclipse Webmaster CLA 2024-05-08 15:43:27 EDT
This issue has been migrated to https://github.com/eclipse-mat/org.eclipse.mat/issues/29.