PostGraphile automatically detects tables, columns, indexes, relationships, views, types, functions, comments and more. It builds a GraphQL server that is highly intelligent about your data, and that automatically updates itself without restarting when you modify your database.

PostGraphile provisions, sets up and tears down a PostgreSQL client automatically for each GraphQL query

  • Setup involves beginning a transaction and setting the relevant session variables, e.g. using your JWT or the pgSettings function

Smart Comments

  • in Postgres, we can attach user-friendly remarks, called comments, to tables (and other database objects) using the COMMENT statement
  • PostGraphile leverages this feature: adding specially formatted "smart comments" as comments on a table lets us alter how it is exposed in the GraphQL schema
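A minimal sketch (the `post` table is an invented example; `@name` and `@omit` are real smart-comment directives):

```sql
-- Hypothetical `post` table: @name renames it in the GraphQL schema,
-- @omit hides the update and delete mutations for it.
COMMENT ON TABLE post IS E'@name article\n@omit update,delete';
```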


Computed Column vs. Custom Query

  • We can differentiate between computed columns and custom queries by observing that a computed column must accept the table it belongs to as its first argument; it follows that to create a computed column we need a table to attach it to.
  • Computed columns are expansive: they let us augment our existing tables with derived data that isn't actually stored in them.

Computed column

  • def - a pseudo-column that we can attach to a table and that will automatically be reflected in the GraphQL schema
  • a function returning setof nugget will be exposed as a connection
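A minimal sketch, assuming a `nugget` table with `id` and `title` columns (both assumptions):

```sql
-- Computed column: the function name is prefixed with the table name,
-- the first argument is the table's row type, and it is marked STABLE.
-- PostGraphile exposes it as a `displayName` field on each Nugget node.
CREATE FUNCTION nugget_display_name(n nugget) RETURNS text AS $$
  SELECT n.title || ' (#' || n.id || ')';
$$ LANGUAGE sql STABLE;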

Custom Queries

  • similar to computed columns, but instead of the function being callable as a field on a node (ex. the buckets field on nugget), the function is callable at the root level
  • ex. all_nuggets_with_bucket_id function
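A sketch of that example (the `nugget` table and its `bucket_id` column are assumptions):

```sql
-- Custom query: no row-type first argument, so it is exposed at the root
-- of the schema as `allNuggetsWithBucketId`. STABLE marks it as a query.
CREATE FUNCTION all_nuggets_with_bucket_id(b_id int)
RETURNS SETOF nugget AS $$
  SELECT * FROM nugget WHERE bucket_id = b_id;
$$ LANGUAGE sql STABLE;
```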


JWTs

  • PostGraphile can generate JWTs easily from inside your PostgreSQL schema.
    • to do this, we define a jwtToken composite type and pass its identifier to the jwtPgTypeIdentifier option; every time that type is returned from a Postgres function, it will be signed with the jwtSecret and returned as a JWT as part of the GraphQL response.
  • when the server receives a JWT from the request's Authorization header with a payload like:

    {
      "aud": "postgraphile",
      "role": "app_user",
      "user_id": 27
    }

it will automatically run this code:

set local role 'app_user';
set local jwt.claims.aud to 'postgraphile';
set local jwt.claims.role to 'app_user';
set local jwt.claims.user_id to '27';
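A sketch of the composite type mentioned above (the schema and field names are illustrative, not from the source):

```sql
-- Composite type whose identifier is passed to jwtPgTypeIdentifier
-- (e.g. app_public.jwt_token). When a function returns this type,
-- PostGraphile signs the row's fields into a JWT with jwtSecret.
CREATE TYPE app_public.jwt_token AS (
  aud text,
  role text,
  user_id integer
);
```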

Exposing HTTP requests to Postgres

  • pgSettings lets us expose request data (e.g. JWT claims) to Postgres via current_setting(...) while having access to the request
    • this function fires on each request, and everything returned by it will be applied to current_setting (with set_config)
    • ex. we can read the userId from the request and make it available via current_setting(...)
    • pgSettings can be an async function
  • instead of passing a static object, we can pass a pgSettings function that will get executed on each request.
  • Everything returned by pgSettings is applied to the current session with set_config($key, $value, true)
  • You can use pgSettings to define variables that your Postgres functions/policies depend on
    • When adding variables for your own usage, the keys must contain either one or two periods (.)
      • Variables without periods will be interpreted as internal Postgres settings, such as role, and will be applied by Postgres
      • All settings are automatically reset when the transaction completes
    • Here's an example of switching the user into the Postgres 'visitor' role, and applying the user.id application setting

    pgSettings: async req => ({
        'role': 'visitor',
        'user.id': req.user ? req.user.id : undefined,
    }),
  • role is overridden after pgSettings is applied
    • But only if pgDefaultRole is set or there's a role property in the claims of the JWT
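To make the set_config mapping concrete, here is a simplified sketch of how a pgSettings result could translate into set_config calls (this is an illustration of the behavior described above, not PostGraphile's actual internal code):

```javascript
// Each key/value pair returned by pgSettings becomes a
// set_config($1, $2, true) call, scoped to the current transaction.
const settingsToSetConfigCalls = (settings) =>
  Object.entries(settings)
    .filter(([, value]) => value !== undefined) // undefined values are skipped
    .map(([key, value]) => ({
      text: "select set_config($1, $2, true)",
      values: [key, String(value)],
    }));

const calls = settingsToSetConfigCalls({
  role: "visitor",        // internal Postgres setting (no period)
  "user.id": 27,          // custom setting: the key needs a period
  "user.email": undefined // dropped
});
// calls[0].values → ['role', 'visitor']
```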

Exposing HTTP requests to resolvers

  • additionalGraphQLContextFromRequest is an optionally asynchronous function that has access to the req and res objects from your HTTP library (e.g. Express)
    • The result returned from the function is merged into the GraphQL context object, which is passed as the third argument to every GraphQL resolver
    • additionalGraphQLContextFromRequest lets us perform asynchronous actions if we need to, for example looking up the current user in the database with Passport. Once that is done, we return an object from this function that is merged into the existing context object so the resolvers can access it.
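A minimal sketch of such a function (`req.user` and the `currentUser` key are assumptions for illustration):

```javascript
// Optionally-async: whatever this returns is merged into the GraphQL
// context, the third argument to every resolver.
const additionalGraphQLContextFromRequest = async (req, res) => {
  // anything asynchronous (e.g. a Passport user lookup) could happen here
  const currentUser = req.user || null;
  return { currentUser };
};
```

It would then be passed as part of the options object given to the postgraphile middleware.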

Graphile Worker

  • allows you to run jobs (e.g. sending emails, performing calculations, generating PDFs, etc.) "in the background" so that your HTTP response/application code is not held up

Extending Graphql Schema (makeExtendSchemaPlugin)

  • when we use makeExtendSchemaPlugin, we can define types and resolvers that will get merged into the existing ones generated by Postgraphile
  • the callback returns an object with 2 keys:
    1. typeDefs
    2. resolvers
      • an object whose keys are GraphQL types, each mapping to an object of field-name/resolver-function pairs
  • the build argument is supplied to the makeExtendSchemaPlugin callback, and it contains lots of information and helpers defined by various plugins
    • includes the introspection results, inflection functions, and SQL helper (build.pgSql, an instance of pg-sql2, a query builder)
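A sketch of the callback's return shape (the `serverTime` field is invented; in real use the typeDefs string is wrapped with `gql` from graphile-utils, shown here as plain data so the shape is visible without the library):

```javascript
// The callback receives `build` and returns { typeDefs, resolvers }.
const extendSchemaCallback = (build) => ({
  // 1. typeDefs: SDL describing the new types/fields (normally gql`...`)
  typeDefs: `
    extend type Query {
      serverTime: String
    }
  `,
  // 2. resolvers: keys are GraphQL types; values map field -> resolver
  resolvers: {
    Query: {
      serverTime: () => new Date().toISOString(),
    },
  },
});
```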

Custom mutations/queries

  • By default Postgres assumes a function may mutate the database (functions are VOLATILE unless declared otherwise), so PostGraphile exposes it as a mutation. If we want the function to show up as a query, we need to mark it stable.
  • when naming custom functions that return data to the user, we should name them as if they were simple properties on an object. We don't want to name one something like getUsers; instead, we simply call it users. This reads more naturally when querying via GraphQL in GraphiQL.
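A sketch combining both points (the `app_user` table is an assumption):

```sql
-- Named `users`, not `get_users`, so the generated root field reads
-- naturally in GraphQL; STABLE makes PostGraphile expose it as a query
-- rather than a mutation.
CREATE FUNCTION users() RETURNS SETOF app_user AS $$
  SELECT * FROM app_user;
$$ LANGUAGE sql STABLE;
```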


  • if we are using a separate rootPgPool, PostGraphile doesn't know about it, as we don't pass it to the postgraphile middleware; instead, it is used in the additionalGraphQLContextFromRequest callback.

Scalar Types

PostGraphile generates the following scalar types:

  • BigFloat, BigInt, BitString, Boolean, CidrAddress, Date, Datetime, Float, Int, InternetAddress, Interval, JSON, KeyValueHash, MacAddress, MacAddress8, String, Time, UUID


Quality Repos

graphile/starter - lots of really high-quality code; check out how some of the lower-level db config functions work

Quality Docs

lots of overall good info, including high-level guidance on setting up the SQL

  1. Pubsub
  2. Schema
  3. Utils