I have a table with a `real` datatype column in SQL Server:
CREATE TABLE Table_1(f_int int, f_string nchar(10), f_real real);
INSERT INTO Table_1 (f_int, f_string, f_real) VALUES (42000000, 'myString2', 40003.16);
When I execute `select f_real from Table_1;` through the ODBC API (from a C++ application), I get the value 40003.160156250000 instead of 40003.16.
I have bound the result column like this:
struct ColValInfo
{
    ColValInfo() {
        pValue = NULL;
    }
    SQLPOINTER pValue;
    SQLINTEGER StrLen_or_Ind;
};

ColValInfo* m_pColValInfo;
m_pColValInfo = new ColValInfo[numColumns];
m_pColValInfo[0].pValue = (SQLPOINTER) new char[50];

SQLBindCol(hstmt, 1, SQL_C_DOUBLE, m_pColValInfo[0].pValue,
           siz, &(m_pColValInfo[0].StrLen_or_Ind));
Note: `SQLPOINTER` is a typedef of `void *` and `SQLINTEGER` is a typedef of `long`. I am reading the data like this:
double data = *(double *)m_pColValInfo[0].pValue;
I am using ODBC Driver 13 to communicate with the database. Since the driver takes care of converting the data from the SQL type to the native type specified in the SQLBindCol call, I want to know how I am getting 40003.160156250000 when I have 40003.16 in my database. I also changed the column type to decimal(16,2) to see if that makes any difference, but it makes none.
Could someone please let me know where the conversion is going wrong?