I want to print a decimal number with an error of at most 10^-6.
What does "error at most 10^-6" mean?
E.g., for the value 10.35652383, which of the following two options is correct?
1) 10.356523 or 2) 10.356524
Since printf("%.6lf") rounds to 6 decimal places, it will print 10.356524.
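A minimal C sketch showing that rounding behavior (the variable name x is just illustrative):

    #include <stdio.h>

    int main(void) {
        double x = 10.35652383;
        // %.6lf rounds to 6 decimal places: prints 10.356524
        printf("%.6lf\n", x);
        return 0;
    }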
Let's calculate both absolute and relative errors:
It gives:
Option 1: absolute error = |10.35652383 - 10.356523| = 8.3×10^-7, relative error ≈ 8.0×10^-8
Option 2: absolute error = |10.35652383 - 10.356524| = 1.7×10^-7, relative error ≈ 1.6×10^-8
So both absolute and relative errors are below 10^-6, i.e. below 0.000001, but the second value is closer.
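To verify, here is a small C sketch that computes both errors for each candidate; the acceptance test mirrors the usual "absolute or relative error at most 10^-6" rule, and the names are illustrative:

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double exact = 10.35652383;
        double candidates[] = {10.356523, 10.356524};
        for (int i = 0; i < 2; i++) {
            double abs_err = fabs(candidates[i] - exact);  // absolute error
            double rel_err = abs_err / fabs(exact);        // relative error
            // Accept if either error is at most 10^-6
            printf("%d) abs = %.2e, rel = %.2e, accepted = %s\n",
                   i + 1, abs_err, rel_err,
                   (abs_err <= 1e-6 || rel_err <= 1e-6) ? "yes" : "no");
        }
        return 0;
    }

Both candidates are accepted; option 2 simply has the smaller error.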