Xcode 12 and earlier could handle decimals with many more digits than Xcode 13 can. The code below produces very different results on Xcode 12 and Xcode 13:
import Foundation

let formatter = NumberFormatter()
formatter.minimumFractionDigits = 1
formatter.maximumFractionDigits = 16
formatter.usesGroupingSeparator = false
formatter.string(from: NSDecimalNumber(string: "1234567890123456.1234567890123456"))
// Xcode 12: "1234567890123456.1234567890123456"
// Xcode 13: "1234567890123460.0"
Interestingly, switching to usesSignificantDigits alleviated the issue (see the sketch below), but my feature needs the maximumFractionDigits behaviour rather than significant digits.
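For reference, this is roughly the significant-digits variant I mean; the variable name and the digit count of 32 are just illustrative, not exact values from my feature:

let sigFormatter = NumberFormatter()
sigFormatter.usesSignificantDigits = true
sigFormatter.maximumSignificantDigits = 32   // illustrative: large enough to cover every digit in the test value
sigFormatter.usesGroupingSeparator = false
sigFormatter.string(from: NSDecimalNumber(string: "1234567890123456.1234567890123456"))
// In my testing this avoided the truncation seen with maximumFractionDigits on Xcode 13.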
Did Apple reimplement something that changes this behaviour? What can we do to get back the higher precision of the Xcode 12 formatter?