A golang package for fetching big chunks of rows from a postgres database using a cursor.
```go
package main

import (
	"context"
	"fmt"

	cursoriterator "github.com/Eun/go-pgx-cursor-iterator/v2"
	"github.com/jackc/pgx/v5/pgxpool"
)

type User struct {
	Name string `db:"name"`
	Role string `db:"role"`
}

func main() {
	ctx := context.Background()
	pool, err := pgxpool.New(ctx, "example-connection-url")
	if err != nil {
		panic(err)
	}

	values := make([]User, 1000)
	iter, err := cursoriterator.NewCursorIterator(pool, values, "SELECT * FROM users WHERE role = $1", "Guest")
	if err != nil {
		panic(err)
	}
	defer iter.Close(ctx)

	for iter.Next(ctx) {
		fmt.Printf("Name: %s\n", values[iter.ValueIndex()].Name)
	}

	if err := iter.Error(); err != nil {
		panic(err)
	}
}
```
On the first `Next()` call, the iterator starts a transaction, declares the cursor, and fetches the first chunk of rows. Subsequent `Next()` calls first consume the rows that were already fetched; once all (pre)fetched rows have been iterated over, the iterator fetches the next chunk and repeats the process.
With a chunk/batch size of 100 items and 250 rows in the database, the iterator performs 3 fetches: rows 1–100, rows 101–200, and rows 201–250.
The passed-in `values` slice is used as storage, so don't rely on its contents except to read the row you are currently iterating over. In particular, avoid keeping references to items in the `values` slice: each item will most likely be overwritten sooner or later (depending on the slice's size).