Original post

Hello community,

I am not sure whether I understand this correctly. The documentation for Go's database/sql package suggests that DB.Query returns a Rows value, and that Rows uses a SQL cursor under the hood. My program recently ran a query that should have returned several GB of rows, yet its memory consumption stayed very low, which matches the documentation.

With a large result set (~1e8 rows), my program spends a lot of time waiting for the next row, using only 40-50 % CPU while iterating. Is it possible to fetch batches of rows, or the whole result set at once, instead of going through the cursor row by row? That would let me reduce the network and database round-trip overhead while iterating over the results.
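To illustrate what I mean by batching: something like the sketch below, where one goroutine scans rows into fixed-size slices and hands them to the consumer, so processing one batch can overlap with fetching the next. The rowSource interface and fakeRows type are my own placeholders (they just mimic the Next/Scan shape of *sql.Rows) so the example runs without a database:

```go
package main

import "fmt"

// rowSource abstracts the part of *sql.Rows this example uses,
// so the batching logic can be shown without a live database.
type rowSource interface {
	Next() bool
	Scan(dest ...any) error
}

// fakeRows simulates a driver cursor over int values.
type fakeRows struct {
	vals []int
	i    int
}

func (f *fakeRows) Next() bool {
	if f.i >= len(f.vals) {
		return false
	}
	f.i++
	return true
}

func (f *fakeRows) Scan(dest ...any) error {
	*dest[0].(*int) = f.vals[f.i-1]
	return nil
}

// readBatches scans rows into slices of up to n values and sends
// them on a channel; the consumer can process one batch while the
// goroutine is already scanning the next.
func readBatches(rows rowSource, n int) <-chan []int {
	out := make(chan []int)
	go func() {
		defer close(out)
		batch := make([]int, 0, n)
		for rows.Next() {
			var v int
			if err := rows.Scan(&v); err != nil {
				return
			}
			batch = append(batch, v)
			if len(batch) == n {
				out <- batch
				batch = make([]int, 0, n)
			}
		}
		if len(batch) > 0 {
			out <- batch
		}
	}()
	return out
}

func main() {
	rows := &fakeRows{vals: []int{1, 2, 3, 4, 5}}
	for b := range readBatches(rows, 2) {
		fmt.Println(b)
	}
}
```

This only overlaps fetching with processing, though; each row still crosses the cursor one at a time, so it does not remove the per-row network round trips I am asking about.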

Thanks a lot!