Schema management howto

A few tips on how to manage your database schema with rel.

Schema definition strategy

Rel lets you decide where you want to define your database schema. You can define it either internally, with an OCaml Rel.Schema.t value, or externally, using any other tool you see fit. It is also possible, to some extent, to move from one strategy to the other.

An internal OCaml definition can always be switched to an external one. The Rel_sql.create_schema function generates legible SQL data definitions from a Rel.Schema.t value in the dialect of your database management system (DBMS). Should you ever need to move away from rel, this always works.
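For example, dumping the SQL for a schema value could be sketched as follows. This is only a sketch: the Rel_sqlite3.dialect value and the Rel_sql.Stmt.src accessor are assumed names, to be checked against the Rel version you use.

```ocaml
(* Hedged sketch: print the SQL data definitions for a schema.
   [Rel_sqlite3.dialect] and [Rel_sql.Stmt.src] are assumed
   names; check them against the Rel version you use. *)
let dump_sql (schema : Rel.Schema.t) =
  let stmts = Rel_sql.create_schema Rel_sqlite3.dialect schema in
  List.iter (fun st -> print_endline (Rel_sql.Stmt.src st)) stmts
```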

An external schema definition can be moved to an OCaml one by connecting to your database and generating the OCaml code for a Rel.Schema.t value representing it. The rel command line tool or (e.g.) Rel_sqlite3.schema_of_db and Rel.Schema.pp_ocaml can be used for that. However, if the external schema definition uses features that cannot be expressed by a Rel.Schema.t value, it is not possible to faithfully represent and regenerate the external schema from the generated OCaml schema – it remains nevertheless useful to query the database with rel.
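The library route could look like the following sketch. Rel_sqlite3.schema_of_db and Rel.Schema.pp_ocaml are the functions mentioned above, but how the database handle [db] is obtained, the [`Both] selector and the error handling shape are assumptions to verify against your Rel version.

```ocaml
(* Hedged sketch: print an OCaml source representation of a live
   SQLite database schema. [db] is an already opened Rel_sqlite3
   handle; the [`Both] selector and the error handling shape are
   assumptions to verify against your Rel version. *)
let print_ocaml_schema db =
  match Rel_sqlite3.schema_of_db db with
  | Error e -> prerr_endline (Rel_sqlite3.Error.message e)
  | Ok schema -> Format.printf "%a@." (Rel.Schema.pp_ocaml `Both) schema
```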

Ultimately, which solution you choose depends on the degree of control you need over your DBMS. Since Rel.Schema.t abstracts over DBMSs and their differences, you may find yourself limited by sticking to an internal OCaml definition, though using raw SQL statements can bring you a long way.

Schema changes

The goal of a schema change is to make the live schema of a database instance (which can be captured, for example, with Rel_sqlite3.schema_of_db) coincide with the application's schema, that is, the schema your software assumes to hold.

Rel provides support to compute the changes between a source and a destination Rel.Schema.t value. Table and column renames, however, need to be provided manually. The rel changes command or the Rel.Schema.changes function performs this.

This allows you to compute the changes needed to bring your live database to the schema you need.

These changes can be turned into SQL data definition statements with the Rel_sql.schema_changes function; applied to the source schema, they bring it to the destination schema. Note that this only handles structural changes to the database: you may need to provide additional statements to handle data migrations. You should also always have a careful look at these steps and possibly tweak them, especially if you do this with externally defined schemas.
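Put together, a migration could be sketched like this. The labelled arguments and the exact result shapes of Rel.Schema.changes and Rel_sql.schema_changes are assumptions to check against your Rel version, as are Rel_sqlite3.dialect and Rel_sql.Stmt.src.

```ocaml
(* Hedged sketch: compute the changes from the live schema [src]
   to the application schema [dst] and print the corresponding
   SQL statements. Renames must be given explicitly; the labelled
   arguments and result shapes are assumptions to check against
   your Rel version. *)
let print_migration ~src ~dst =
  match Rel.Schema.changes ~src ~dst () with
  | Error e -> prerr_endline e
  | Ok changes ->
      Rel_sql.schema_changes Rel_sqlite3.dialect changes
      |> List.iter (fun st -> print_endline (Rel_sql.Stmt.src st))
```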

Single live database instance or development mode

Released schemas

In this case we need to version the schema. The way to do this is DBMS dependent. Here are a few ways:

The application keeps the latest version of the schema and the diff steps to go from each earlier version to the next, up to the latest one.

Before interacting with the database:

Schema conventions

There's more than one way to model your database in OCaml with Rel. The following defines a simple convention you can follow.

These conventions are followed by the rel tool when it outputs the OCaml code needed to support interaction with an externally defined database schema, except for the naming conventions, which respect those found in the existing schema.


Tables representation

Given a table named n with columns c0, c1, … define a module N (n capitalized) for it. This module should have

Once you have modelled your tables and gathered them into a schema value with Rel.Schema.v, you can use Rel_sql.create_schema to output the corresponding schema in SQL's data definition language.


Consider a person table which has three columns: id, first_name and last_name. The following interface represents such a table according to the convention.

(** Persons. *)
module Person : sig

  type id = int
  (** The type for person identifiers. *)

  type t
  (** The type for persons. *)

  val v : id:id -> first_name:string -> last_name:string -> t
  (** [v ~id ~first_name ~last_name] is a person with given attributes.
      See accessors for semantics. *)

  val row : int -> string -> string -> t
  (** [row] is unlabelled {!v}. *)

  val id : t -> id
  (** [id p] is the unique identifier of [p]. *)

  val first_name : t -> string
  (** [first_name p] is the first name of [p]. *)

  val last_name : t -> string
  (** [last_name p] is the last name of [p]. *)

  (** {1:table Table} *)

  open Rel

  val id' : (t, id) Col.t
  (** [id'] is the {!id} column. *)

  val first_name' : (t, string) Col.t
  (** [first_name'] is the {!first_name} column. *)

  val last_name' : (t, string) Col.t
  (** [last_name'] is the {!last_name} column. *)

  val table : t Table.t
  (** [table] is the person table. *)
end
The simplest way of implementing this signature is by using OCaml records. For example:

module Person = struct
  type id = int
  type t = { id : id; first_name : string; last_name : string }

  let v ~id ~first_name ~last_name = { id; first_name; last_name }
  let row id first_name last_name = { id; first_name; last_name }

  let id r = r.id
  let first_name r = r.first_name
  let last_name r = r.last_name

  open Rel

  let id' = Col.v "id" ~params:[Col.Primary_key] Type.Int id
  let first_name' = Col.v "first_name" Type.Text first_name
  let last_name' = Col.v "last_name" Type.Text last_name

  let table =
    Table.v "person" Row.(unit row * id' * first_name' * last_name')
end
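Once the table is defined, it can be gathered into a schema value and its SQL output as described above. In the following sketch the ~tables label, the Rel.Table.V packing, Rel_sqlite3.dialect and Rel_sql.Stmt.src are assumptions to check against your Rel version.

```ocaml
(* Hedged sketch: gather the person table in a schema and print
   its SQL definition. The [~tables] label, [Rel.Table.V] packing,
   [Rel_sqlite3.dialect] and [Rel_sql.Stmt.src] are assumptions
   to check against your Rel version. *)
let schema = Rel.Schema.v ~tables:[Rel.Table.V Person.table] ()
let () =
  Rel_sql.create_schema Rel_sqlite3.dialect schema
  |> List.iter (fun st -> print_endline (Rel_sql.Stmt.src st))
```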

Alternate direction

An alternate direction could be to define abstract types for tables but have them be generic column maps (i.e. heterogeneous dictionaries). This could entail less boilerplate and less GC pressure, since we do pack Rel.Col.values at the IO boundary anyway.

Unsupported DBMS features

A few DBMS features that would be nice to have but are not supported at the moment.

For SQLite3.