Elixir: Using Absinthe to query Dgraph, a graph database. GraphQL to GraphQL+- mapping


I am using Absinthe to build a GraphQL API. The datastore is Dgraph, which uses GraphQL+- as its query language. It is similar to GraphQL but not identical.

This would in theory put me in a wonderful situation. A GraphQL query like

query {
  user {
    id
    username
    posts {
      title
      text
      comments {
        text
      }
    }
  }
}

could also be just one query in Dgraph. It would look almost identical:

{
  users(func: has(type_user))
  {
    id
    username
    posts {
      title
      text
      comments {
        text
      }
    }
  }
}

This power of graph databases to load complex relations in one go is something I would like to use. The problem is just: in Absinthe the schema is supposed to be composable. The schema would have one :user object with a :posts field that is a list_of(:post), then a :post object, and so on.

To help prevent N+1 queries you would use dataloader or batch loading.
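For reference, the usual Dataloader wiring looks roughly like this. This is a minimal sketch; the `MyApp.Blog` source module and its `data/0` function are illustrative, not from the original post:

```elixir
defmodule MyAppWeb.Schema do
  use Absinthe.Schema
  import Absinthe.Resolution.Helpers, only: [dataloader: 1]

  object :post do
    field :title, :string
    field :text, :string
  end

  object :user do
    field :id, :id
    field :username, :string
    # Batched: posts for all users in this request are loaded together,
    # avoiding an N+1 query pattern.
    field :posts, list_of(:post), resolve: dataloader(MyApp.Blog)
  end

  query do
    field :users, list_of(:user) do
      resolve fn _parent, _args, _resolution ->
        # Placeholder root resolver; the real one would hit the datastore.
        {:ok, []}
      end
    end
  end

  # Put a Dataloader into the Absinthe context for each request.
  def context(ctx) do
    loader =
      Dataloader.new()
      |> Dataloader.add_source(MyApp.Blog, MyApp.Blog.data())

    Map.put(ctx, :loader, loader)
  end

  # Enable the Dataloader middleware.
  def plugins do
    [Absinthe.Middleware.Dataloader] ++ Absinthe.Plugin.defaults()
  end
end
```

Note that each batched field still becomes its own datastore query, which is exactly what I want to avoid with Dgraph.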

Now I could just load everything in one go. I could for example write a resolver that does just that:

defmodule MyApp.Resolvers.User do

  alias MyApp.Users

  def users(_, _args, _) do
    {:ok, Users.all()}
  end

end

And here is the user context that actually queries the database:

defmodule MyApp.Users do

  alias MyApp.Users.User

  def all do
    query = """
      {
        users(func: has(type_user))
        {
          id
          username
          posts {
            title
            text
            comments {
              text
            }
          }
        }
      }
    """

    # conn/0 (not shown) returns the ExDgraph connection
    case ExDgraph.query(conn(), query) do
      {:ok, msg} ->
        %{result: %{users: users}} = msg
        Enum.map(users, fn x -> struct(User, x) end)

      {:error, _error} ->
        []
    end
  end

end

The issue here is that I overfetch: I ALWAYS query everything, even if I only want a list of users. This works, but it is not very good performance-wise. And I lose the composability.

A solution would be if I had access to the query inside the resolver, so I could see which fields are requested. I could then use pattern matching to build the Dgraph query and send it off. I could even have one central resolver and many query builders. But I would need to hook into the incoming query and parse it directly.

Is something like that possible? Any idea where I could find something interesting to solve this? Maybe with Absinthe middleware?

Thanks!

1 Answer

I think I found a solution. Let’s take a resolver that gets you a list of users. Something like this:

object :user_queries do
  field :users, list_of(:user) do
    resolve fn _parent, _, resolution ->
      IO.inspect(resolution.definition.selections)
      {:ok, Users.all()}
    end
  end
end

If you hit it with a nested query like:

{
  users {
    id
    name
    posts {
      id
      title
      text
      comments {
        id
        text
      }
    }
  }
}

you will see a complex nested map in the terminal, and it contains all the info we need. The selections contain the requested fields; each field has a name, and if it selects a nested object, it in turn contains a list of selections.

Now we only have to recurse our way down the rabbit hole, collect the info for our Dgraph query, and build it accordingly. And since the incoming query has already passed Absinthe's validation, this looks like a good option.
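A rough sketch of such a recursive builder might look like this. It only pattern-matches on the `name` and `selections` keys each selection carries, hard-codes the `users` root function for brevity, and ignores arguments, aliases and fragments:

```elixir
defmodule MyApp.DgraphQueryBuilder do
  @moduledoc """
  Illustrative sketch: turns a list of Absinthe selections
  (each with a name and nested selections) into a Dgraph query string.
  """

  # Entry point: wrap the rendered fields in the hard-coded root function.
  def build(selections) do
    """
    {
      users(func: has(type_user)) {
    #{render(selections, 4)}
      }
    }
    """
  end

  defp render(selections, indent) do
    selections
    |> Enum.map(&render_field(&1, indent))
    |> Enum.join("\n")
  end

  # Leaf field: just emit its name.
  defp render_field(%{name: name, selections: []}, indent) do
    pad(indent) <> name
  end

  # Nested field: recurse into its own selections.
  defp render_field(%{name: name, selections: children}, indent) do
    pad(indent) <> name <> " {\n" <>
      render(children, indent + 2) <> "\n" <> pad(indent) <> "}"
  end

  defp pad(n), do: String.duplicate(" ", n)
end
```

Because it matches on plain `%{name: _, selections: _}` shapes, it works on Absinthe's field structs as well as on plain maps in a test.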

It will be interesting to see whether an approach like this conflicts with the optimizations Absinthe ships with, such as in-memory caching of already fetched objects…