After division, I get a wrong value:
...
.withColumn("percentage", regexp_replace(lit("10.62%"), "%", "").cast("double") / 100)
...
Expected value: 0.1062
Received value: 0.10619999999999
What's the problem with it? Is there a solution that doesn't require rounding?
This is not related to Spark; you can try that division in pretty much any programming language and you will get the same result.
In most programming languages, floating-point arithmetic is based on the IEEE 754 standard, and a value like 10.62 has no exact binary representation, so the result of the division is off by a tiny amount.
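As a quick illustration, here is a plain-Scala sketch (no Spark involved; the values are just the ones from the question):

```scala
// Ordinary Doubles show the same surprise, because 10.62 cannot be
// represented exactly in binary (IEEE 754) floating point.
val pct = "10.62%".stripSuffix("%").toDouble
println(pct / 100)   // typically prints 0.10619999999999999, not 0.1062
println(0.1 + 0.2)   // the classic example: 0.30000000000000004
```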
Here's a clearer explanation of how the IEEE 754 standard works:
I'm afraid that the only solution is rounding.
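For reference, this is roughly what the rounding approach looks like in Spark. It is only a sketch: the local SparkSession setup, the dummy one-row DataFrame, and the scale of 4 are assumptions; the expression itself comes from the question.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{lit, regexp_replace, round}

val spark = SparkSession.builder().master("local[*]").appName("round-demo").getOrCreate()
import spark.implicits._

// round(..., 4) trims the binary noise back to the 4 decimal places we care about
Seq(1).toDF("id")
  .withColumn("percentage",
    round(regexp_replace(lit("10.62%"), "%", "").cast("double") / 100, 4))
  .show()  // percentage = 0.1062
```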
Another possible solution is to use BigDecimal. You do have to declare a precision, but only for the input; you don't have to care about the precision of the result:
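A minimal sketch of that idea, first with plain BigDecimal and then with a decimal cast on the Spark column (the `decimal(10,2)` precision is an assumption for the input; Spark derives the precision and scale of the division result on its own):

```scala
// Plain Scala: the precision is fixed only by how the input string is written
val exact = BigDecimal("10.62") / 100
println(exact)   // 0.1062, exact decimal arithmetic

// Spark equivalent: cast the input to a decimal type instead of double
// (assumes the same SparkSession and implicits as the previous snippet)
import org.apache.spark.sql.functions.{lit, regexp_replace}

Seq(1).toDF("id")
  .withColumn("percentage",
    regexp_replace(lit("10.62%"), "%", "").cast("decimal(10,2)") / 100)
  .show()  // exactly 0.1062 (possibly displayed with trailing zeros)
```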