I am trying to load a huge data set from the database.
package main

import (
    "database/sql"
    "log"

    _ "github.com/go-sql-driver/mysql" // driver registers itself via its init function
)

type Tag struct {
    ID   int
    Name string
}

func main() {
    db, err := sql.Open("mysql", "root:pass1@tcp(127.0.0.1:3306)/tuts")
    if err != nil {
        log.Fatal(err) // db may be nil here, so don't continue (or defer Close) after a failed Open
    }
    defer db.Close()

    results, err := db.Query("SELECT id, name FROM tags")
    if err != nil {
        panic(err.Error())
    }
    defer results.Close()

    for results.Next() {
        var tag Tag
        if err := results.Scan(&tag.ID, &tag.Name); err != nil {
            panic(err.Error())
        }
        log.Println(tag.Name) // not Printf: Printf would treat tag.Name as a format string
    }
    if err := results.Err(); err != nil {
        panic(err.Error()) // surfaces errors that ended the iteration early
    }
}
Does the program load all the records into memory in a single shot? Or is there a way to specify a fetch size so that the program loads only n rows at a time? Assuming there are a million rows in the database, I would like to fetch 1000 records each time.
This will work just fine for a single row or for millions of rows. Most SQL drivers read results in batches: they fetch data from the server as needed and keep RAM usage low and roughly constant.
For example, if you select 1000 rows, the driver may load only the first 100 into RAM. As you call Next() and reach, say, the 50th row, it fetches the next hundred (rows 101 to 200) behind the scenes.
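So with database/sql you do not need to set a fetch size: db.Query returns a cursor-like *sql.Rows, and rows are streamed as you iterate. If you still want explicit 1000-row batches (for example, to checkpoint progress between batches), one common approach is keyset pagination. The sketch below assumes the tags table has an auto-increment integer id (as the SELECT in the question suggests); the fetchBatch helper is my own illustration, not part of database/sql.

package main

import (
    "database/sql"
    "log"

    _ "github.com/go-sql-driver/mysql"
)

type Tag struct {
    ID   int
    Name string
}

// fetchBatch returns up to limit rows whose id is greater than afterID.
// Seeking on the primary key stays fast on large tables, because the
// server never has to scan and discard skipped rows.
func fetchBatch(db *sql.DB, afterID, limit int) ([]Tag, error) {
    rows, err := db.Query(
        "SELECT id, name FROM tags WHERE id > ? ORDER BY id LIMIT ?",
        afterID, limit)
    if err != nil {
        return nil, err
    }
    defer rows.Close()

    var tags []Tag
    for rows.Next() {
        var t Tag
        if err := rows.Scan(&t.ID, &t.Name); err != nil {
            return nil, err
        }
        tags = append(tags, t)
    }
    return tags, rows.Err()
}

func main() {
    db, err := sql.Open("mysql", "root:pass1@tcp(127.0.0.1:3306)/tuts")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    afterID := 0
    for {
        batch, err := fetchBatch(db, afterID, 1000)
        if err != nil {
            log.Fatal(err)
        }
        if len(batch) == 0 {
            break // no more rows
        }
        for _, t := range batch {
            log.Println(t.Name)
        }
        afterID = batch[len(batch)-1].ID // resume after the last row seen
    }
}

Note the WHERE id > ? condition rather than OFFSET: with OFFSET the server must walk past all skipped rows on every query, so later pages of a million-row table get progressively slower.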