What is the best way to add BigDecimals in Hadoop?


I need to add BigDecimal values in Hadoop. I'm currently using Apache Pig's BigDecimalWritable, but Pig seems to be completely outdated.

<dependency>
  <groupId>org.apache.pig</groupId>
  <artifactId>pig</artifactId>
  <version>0.17.0</version>
</dependency>

This version is five years old!
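One way to avoid depending on Pig or Hive at all is to serialize the BigDecimal yourself, the same way a custom Hadoop Writable would in its write()/readFields() methods: store the unscaled value's bytes plus the scale. This is a minimal sketch using only java.io; the class and method names are hypothetical, not from any library.

```java
import java.io.*;
import java.math.BigDecimal;
import java.math.BigInteger;

// Sketch of the serialization core of a hand-rolled BigDecimalWritable
// (hypothetical name): a BigDecimal is fully described by its unscaled
// BigInteger value and an int scale, so we write exactly those two parts.
public class BigDecimalWritableSketch {
    static void write(DataOutput out, BigDecimal v) throws IOException {
        byte[] unscaled = v.unscaledValue().toByteArray();
        out.writeInt(unscaled.length); // length prefix so read() knows how many bytes
        out.write(unscaled);
        out.writeInt(v.scale());
    }

    static BigDecimal read(DataInput in) throws IOException {
        byte[] unscaled = new byte[in.readInt()];
        in.readFully(unscaled);
        return new BigDecimal(new BigInteger(unscaled), in.readInt());
    }

    public static void main(String[] args) throws IOException {
        // Round-trip a value through the byte representation, then add to it.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        write(new DataOutputStream(buf), new BigDecimal("12.345"));
        BigDecimal back = read(new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));
        System.out.println(back.add(new BigDecimal("0.655"))); // prints 13.000
    }
}
```

Wrapping these two methods in a class that implements org.apache.hadoop.io.Writable would give you a decimal key/value type with no Pig or Hive dependency.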


Answered by BuckBazooka:

I now use instead:

<dependency>
  <groupId>org.apache.parquet</groupId>
  <artifactId>parquet-hive-storage-handler</artifactId>
  <version>1.11.2</version>
</dependency>

<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-serde</artifactId>
  <version>4.0.0-alpha-1</version>
  <exclusions>
    <exclusion>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-slf4j-impl</artifactId>
    </exclusion>
  </exclusions>
</dependency>

Parquet provides a BigDecimalWritable and is much more actively maintained.
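Whichever dependency supplies the Writable, the reduce-side arithmetic is the same: unwrap each value to a plain java.math.BigDecimal and accumulate with add(), which keeps exact precision (unlike summing doubles). A minimal sketch, with a hypothetical class name and a plain List standing in for a reducer's Iterable of values:

```java
import java.math.BigDecimal;
import java.util.List;

// Sketch: exact summation as a reducer would do it, after unwrapping
// each decimal Writable to its BigDecimal payload.
public class DecimalSumSketch {
    static BigDecimal sum(List<BigDecimal> values) {
        // reduce() starts at ZERO and folds in each value exactly
        return values.stream().reduce(BigDecimal.ZERO, BigDecimal::add);
    }

    public static void main(String[] args) {
        List<BigDecimal> values = List.of(
            new BigDecimal("1.10"), new BigDecimal("2.20"), new BigDecimal("3.30"));
        System.out.println(sum(values)); // prints 6.60
    }
}
```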

Is there a better solution?