I am currently working on a Go project where I need to generate a CSV file that Excel can open correctly with its default settings. The requirement is that the downloaded file must display correctly in Excel, since that is where our clients actually work with the data. By default, Excel interprets CSV files as ANSI encoded, while Go uses UTF-8 as its native string encoding. The solution seemed simple: convert the output using the `golang.org/x/text/encoding/charmap` package's `Windows1252` encoding, which corresponds to ANSI on Western-European Windows systems. However, this did not work as expected.
Here is my `Download` method:

```go
func Download(invoices []*domainInvoice.ProjectInvoice, sites map[string]*sites.Site) ([]byte, error) {
	data := make([]*ListItemDownload, 0, len(invoices))
	for _, invoice := range invoices {
		data = append(data, newDownloadInvoiceList(invoice, sites[invoice.Invoiced[0].SiteID]))
	}

	csvFile, err := csv.WriteCSVWithComma(data)
	if err != nil {
		return nil, err
	}

	ansi, err := charmap.Windows1252.NewEncoder().Bytes(csvFile)
	if err != nil {
		return nil, err
	}

	return ansi, nil
}
```
It is called after querying the database for the set of records to be downloaded; the content is written to CSV format by the `csv.WriteCSVWithComma` helper, which simply returns the CSV data as a `[]byte`.
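For context, a hypothetical stand-in for that helper (the real one is internal to our project), built on the standard library's `encoding/csv`, would look roughly like this; the row data is made up:

```go
package main

import (
	"bytes"
	"encoding/csv"
	"fmt"
)

// writeCSVWithComma is a hypothetical stand-in for the project's
// csv.WriteCSVWithComma helper: it renders rows to comma-separated
// bytes using the standard library's csv.Writer.
func writeCSVWithComma(rows [][]string) ([]byte, error) {
	var buf bytes.Buffer
	w := csv.NewWriter(&buf) // comma is the writer's default delimiter
	if err := w.WriteAll(rows); err != nil { // WriteAll also flushes
		return nil, err
	}
	return buf.Bytes(), nil
}

func main() {
	out, err := writeCSVWithComma([][]string{{"id", "site"}, {"1", "café"}})
	if err != nil {
		panic(err)
	}
	fmt.Printf("%q\n", out)
}
```

The point is that, as a Go program, the helper's output is plain UTF-8 bytes before the `charmap` conversion is applied.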
In my mind, encoding the final result to ANSI should work, but it doesn't: special characters such as accented letters still come out garbled when the file is opened in Excel.
Am I missing something here? Or is this purely an Excel-side problem that cannot be solved with encodings in the backend?
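The only other backend-side approach I have seen suggested is skipping the ANSI conversion and instead prepending a UTF-8 byte-order mark, which Excel uses to auto-detect UTF-8. A minimal sketch of that idea (the `withBOM` helper is mine, for illustration):

```go
package main

import "fmt"

// utf8BOM is the UTF-8 byte-order mark; Excel treats a CSV that
// starts with these three bytes as UTF-8 rather than ANSI.
var utf8BOM = []byte{0xEF, 0xBB, 0xBF}

// withBOM returns a new slice with the BOM prepended to the CSV payload.
func withBOM(csvData []byte) []byte {
	return append(append([]byte{}, utf8BOM...), csvData...)
}

func main() {
	out := withBOM([]byte("id,name\n1,café\n"))
	fmt.Printf("% X\n", out[:3]) // the first three bytes are the BOM
}
```

Is that the recommended route, or can the Windows-1252 conversion above be made to work on its own?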