Question

Is there a guideline for estimating the amount of memory consumed by a BigDecimal?

I'm looking for something similar to these guidelines for estimating String memory usage.


Solution

If you look at the fields in the source for BigDecimal, you'll find:

BigDecimal:
  long intCompact +8 bytes
  int precision +4 bytes
  int scale +4 bytes
  String stringCache +?
  BigInteger intVal +?

BigInteger:
  int bitCount +4 bytes
  int bitLength +4 bytes
  int firstNonzeroIntNum +4 bytes
  int lowestSetBit +4 bytes
  int signum +4 bytes
  int[] mag +?
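
If you want to confirm this layout on your own JVM, one option is the OpenJDK JOL (Java Object Layout) tool. A minimal sketch, assuming the org.openjdk.jol:jol-core dependency is on the classpath (the class name is invented for illustration):

    import java.math.BigDecimal;
    import java.math.BigInteger;
    import org.openjdk.jol.info.ClassLayout;

    public class LayoutCheck {
        public static void main(String[] args) {
            // Prints each instance field with its offset and size,
            // plus the object header and padding the estimate below ignores.
            System.out.println(ClassLayout.parseClass(BigDecimal.class).toPrintable());
            System.out.println(ClassLayout.parseClass(BigInteger.class).toPrintable());
        }
    }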

The comment for stringCache says

Used to store the canonical string representation, if computed.

Assuming you never call .toString(), stringCache stays null and contributes nothing beyond the reference itself. Hence BigDecimal is (8+4+4) = 16 bytes + BigInteger.

BigInteger itself is 4+4+4+4+4=20 bytes + mag.

20+16 gives a total of 36 bytes plus the magnitude, which the int[] mag array stores using the minimum number of bits necessary to represent the full integer. A number n needs about log2(n) bits, which can be converted to bytes. So you should be using about:

36 + Ceiling(log2(n)/8.0) bytes

(Note that this doesn't include the object headers, references, and alignment padding that your linked String guideline accounts for, but it should give you a good general idea.)
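
As a quick sanity check, here is a minimal sketch of that estimate in Java (the class and method names are invented for illustration); it uses BigInteger.bitLength() to approximate log2(n):

    import java.math.BigDecimal;
    import java.math.BigInteger;

    public class BigDecimalFootprint {
        // Rough estimate: 36 fixed bytes for the BigDecimal + BigInteger
        // fields, plus the magnitude rounded up to whole bytes. Object
        // headers, references, and padding are ignored, as in the text.
        static long estimateBytes(BigDecimal d) {
            BigInteger n = d.unscaledValue().abs();
            int bits = n.bitLength();    // ~ log2(n)
            return 36 + (bits + 7) / 8;  // Ceiling(log2(n)/8.0)
        }

        public static void main(String[] args) {
            System.out.println(estimateBytes(new BigDecimal("3.14159")));               // small significand
            System.out.println(estimateBytes(new BigDecimal(BigInteger.TEN.pow(100)))); // ~333-bit magnitude
        }
    }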

OTHER TIPS

If you dig into the internals of BigDecimal, you'll see that it uses a compact representation when the significand fits in a long (<= Long.MAX_VALUE in magnitude): the value is kept in the intCompact field and no BigInteger is allocated at all. Hence, the memory usage can vary depending on the actual values you're representing.
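
A hedged refinement of the earlier estimate that accounts for this (the class and method names are again invented, and the fits-in-a-long check is approximated with bitLength()):

    import java.math.BigDecimal;
    import java.math.BigInteger;

    public class CompactAwareEstimate {
        // Sketch: when the unscaled value fits in a long, BigDecimal keeps it
        // in the long intCompact field and intVal stays null, so no BigInteger
        // (and no int[] mag) is allocated. (BigDecimal reserves Long.MIN_VALUE
        // as an internal sentinel; that corner case is ignored here.)
        static long estimateBytes(BigDecimal d) {
            BigInteger n = d.unscaledValue();
            if (n.bitLength() <= 63) {  // fits in a signed long
                return 16;              // intCompact + precision + scale only
            }
            return 36 + (n.abs().bitLength() + 7) / 8;
        }
    }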

Licensed under: CC-BY-SA with attribution