I am using the gqlgen package for Go and I am trying to implement dataloaders. This is the dataloader library I am using: https://github.com/graph-gophers/dataloader
I did everything as in the tutorial, but the batch function does not receive all keys as a single slice. In Node.js, if I do something like
loader.load(1)
loader.load(3)
loader.load(3)
then in the batch function I will receive the keys as an array [1, 3]. In Go, however, I am getting a completely different result.
This is my field resolver:
func (r *newsResolver) CreatedBy(ctx context.Context, obj *models.News) (*models.User, error) {
    loader := NewDataLoader()
    // StringKey is a convenience type that wraps a string so it implements the `Key` interface.
    thunk := loader.Load(context.TODO(), dataloader.StringKey(obj.CreatedById))
    result, err := thunk()
    if err != nil {
        // handle data error
    }
    log.Printf("value: %#v", result)
    return nil, nil
}
And this is my dataloader function:
func NewDataLoader() *dataloader.Loader {
    batchFn := func(ctx context.Context, keys dataloader.Keys) []*dataloader.Result {
        var results []*dataloader.Result
        fmt.Println(keys)
        results = append(results, &dataloader.Result{Data: keys})
        return results
    }
    loader := dataloader.NewBatchedLoader(batchFn)
    return loader
}
However, when I print the keys inside the batch function, I get a slice containing a single key each time, something like
[2]
[1]
[3]
[1]
[3]
whereas I expected to get something like [2, 1, 3].
You are creating a new dataloader (loader := NewDataLoader()) every time you use it, which renders it useless: each resolver call gets its own loader, so there is nothing to batch across. You should use the same dataloader for the whole request, e.g. by storing it in the request context. Some tutorials I've found: