Publishing Rust types to a TypeScript frontend

Common libraries and projects used to share types from a Rust web backend to a TypeScript frontend
rust typescript 2022-09-13

When building in a strongly-typed language such as Rust, it's a shame to have to throw this strictness away, especially when dealing with the wild west of frontend stacks. The bigger the project, the more you want to keep things cohesive.

This cohesion evaporates quickly when you need to pass language barriers. While normal JavaScript does not have types, we can utilise TypeScript to share this type information and hopefully maintain some cohesion between languages.

The scenario we're targeting here is writing a web backend in Rust and having those types/methods easily available to a TypeScript-enabled frontend. This article describes a few common methods for bridging the two languages using currently available tools and libraries. Most of them share similar characteristics, but each approach has its own set of tradeoffs.

This blog will primarily use actix-web as the example web backend, as it's the one I'm most familiar with, but the libraries used here do have support for other backends.

Derive Macros

Strictly speaking, all these methods use derive macros in some way, but what I am describing here is using derive macros to publish structs/objects as typescript definitions. These don't cover functions/methods or HTTP requests (which will be discussed later on).

Rust does not have runtime reflection built into the language & so relies on decorating types via macros to provide some sort of approximate "reflection".
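To make that concrete, here is a hand-rolled sketch of the kind of code such a derive macro effectively writes for you. The trait here is purely illustrative, not ts-rs's or schemars' actual API:

```rust
// A hand-rolled stand-in for what a `#[derive(...)]` macro generates.
// This trait is an illustration only, not any library's real interface.
trait TypeScriptDef {
    fn ts_decl() -> String;
}

#[allow(dead_code)]
struct User {
    user_id: i32,
    first_name: String,
    last_name: String,
}

// A derive macro would generate an impl like this from the struct's fields:
impl TypeScriptDef for User {
    fn ts_decl() -> String {
        "export interface User { user_id: number; first_name: string; last_name: string; }"
            .to_string()
    }
}
```

The real macros do the same thing at compile time: walk the struct's fields and emit an implementation that knows how to describe the type.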

In terms of the two main derive macro libraries to choose from, we have ts-rs and schemars. Both of these libraries are integrated tightly with serde.

ts-rs

The ts-rs library will allow you to output typescript types for your existing objects:

#[derive(TS)]
#[ts(export)]
struct User {
    user_id: i32,
    first_name: String,
    last_name: String,
}

The #[ts(export)] attribute will export this type when you run cargo test on your crate.

By default this will be in the bindings/ directory (configurable via the #[ts(export_to = "...")] attribute), with a separate typescript file per object type:

export interface User {
  user_id: number;
  first_name: string;
  last_name: string;
}

Alternatively, you can use the TS::decl() method to export it as a String.

fn main() {
    println!("{}", User::decl());
}
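On the frontend side, the generated binding can then be imported like any other module. Here's a sketch of typed usage; the /api/user endpoint is an assumption, and the interface is repeated inline for self-containment (normally you'd import it from bindings/User.ts):

```typescript
// The interface as generated by ts-rs (normally imported from bindings/User.ts).
interface User {
  user_id: number;
  first_name: string;
  last_name: string;
}

// A typed fetch helper; the /api/user/:id endpoint is an assumption.
async function fetchUser(id: number): Promise<User> {
  const res = await fetch(`/api/user/${id}`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return (await res.json()) as User;
}
```

Any field renamed or removed on the Rust side now surfaces as a compile error in the frontend, rather than a runtime surprise.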

schemars

Now, schemars does not export to typescript directly, but will export to JSON Schema which can be consumed & converted into typescript with the help of conversion tools.

Working similarly to ts-rs, you can derive JsonSchema on your objects:

#[derive(JsonSchema)]
struct User {
    user_id: i32,
    first_name: String,
    last_name: String,
}

From there, you can print out the schema with the help of the schema_for! macro & serde_json:

fn main() {
    let schema = schema_for!(User);
    println!("{}", serde_json::to_string_pretty(&schema).unwrap());
}

With this schema in hand, you can use a tool like json-schema-to-typescript to output it as typescript types:

$ cargo run | npx json-schema-to-typescript
/* tslint:disable */
/**
 * This file was automatically generated by json-schema-to-typescript.
 * DO NOT MODIFY IT BY HAND. Instead, modify the source JSONSchema file,
 * and run json-schema-to-typescript to regenerate this file.
 */

export interface User {
  first_name: string;
  last_name: string;
  user_id: number;
  [k: string]: unknown;
}

OpenAPI Generators

While straight derive macros are great at defining types, you sometimes also want some type safety around the methods used. If your backend is a web server, it's great to auto-generate the appropriate typescript code for making HTTP API calls. Luckily, if you can generate an OpenAPI spec, you can auto-generate clients for use in the frontend, all typed and ready to go!

Your choice of backend web framework will dictate what support there is for this method. There is a push for the most common web frameworks to have OpenAPI support, with varying states of maturity.

Paperclip and Actix Web

This method is one I am most familiar with, using paperclip with an actix-web backend. The paperclip crate allows you to annotate structs & request methods to have it auto-generate an OpenAPI spec for you:

use actix_web::{App, Error, HttpServer};
use paperclip::actix::{
    api_v2_operation,
    web::{self, Json},
    Apiv2Schema, OpenApiExt,
};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Apiv2Schema)]
struct User {
    user_id: i32,
    first_name: String,
    last_name: String,
}

#[api_v2_operation]
async fn echo_user(body: Json<User>) -> Result<Json<User>, Error> {
    Ok(body)
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| {
        App::new()
            .wrap_api()
            .service(web::resource("/echo").route(web::post().to(echo_user)))
            .with_json_spec_v3_at("/api/spec/v3")
            .build()
    })
    .bind("127.0.0.1:8080")?
    .run()
    .await
}

Now, you can curl to retrieve the JSON spec:

curl http://127.0.0.1:8080/api/spec/v3
{
  "openapi": "3.0.0",
  "info": { "title": "", "version": "" },
  "paths": {
    "/echo": {
      "post": {
        "requestBody": {
          "content": {
            "application/json": {
              "schema": { "$ref": "#/components/schemas/User" }
            }
          },
          "required": true
        },
        "responses": {
          "200": {
            "description": "OK",
            "content": {
              "application/json": {
                "schema": { "$ref": "#/components/schemas/User" }
              }
            }
          }
        }
      }
    }
  },
  "components": {
    "schemas": {
      "User": {
        "type": "object",
        "properties": {
          "first_name": { "type": "string" },
          "last_name": { "type": "string" },
          "user_id": { "type": "integer", "format": "int32" }
        },
        "required": ["first_name", "last_name", "user_id"]
      }
    }
  }
}

However, if you're like me and would like it written out to a file rather than rely on a backend to be running, you can do so with a little bit of fiddling around.

Firstly, we want to create a function to configure our services:

fn config(cfg: &mut web::ServiceConfig) {
    cfg.service(web::resource("/echo").route(web::post().to(echo_user)));
}

Then we can adjust our main app to use this to configure actix web:

App::new()
    .wrap_api()
    .configure(config)
    .with_json_spec_v3_at("/api/spec/v3")
    .build()

And we can create another function that will write out the api spec to a file:

pub fn dump_schema<P: AsRef<Path>>(path: P) {
    App::new()
        .wrap_api()
        .configure(config)
        .with_raw_json_spec_v3(|app, spec| {
            let mut file = File::create(&path).expect("Could not create a file");
            file.write_all(
                serde_json::to_string_pretty(&spec)
                    .expect("Could not serialize spec")
                    .as_bytes(),
            )
            .expect("Could not write schema");

            app
        });
}

The reason we keep it separate from the main App builder inside the HttpServer closure is that the builder closure is called once per worker, so it would write out the file multiple times on server startup.

Now we can call it manually:

dump_schema("openapi.json");
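If you'd rather not maintain a separate binary for this, one pattern is to gate the dump behind a command-line flag so the same executable can either serve or emit the spec. A minimal sketch, assuming a hypothetical --dump-schema flag (not part of paperclip):

```rust
// Decide from argv whether this invocation should dump the spec.
// The `--dump-schema` flag name is an assumption, not paperclip's API.
fn should_dump_schema(args: &[String]) -> bool {
    args.iter().any(|arg| arg == "--dump-schema")
}

// In main, before starting the HttpServer:
//
//     let args: Vec<String> = std::env::args().collect();
//     if should_dump_schema(&args) {
//         dump_schema("openapi.json");
//         return Ok(());
//     }
```

This keeps the spec generation in CI as simple as running the server binary with one extra flag.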

Utoipa

More general purpose and targeting a number of different web frameworks, the utoipa library is similar in function to paperclip, albeit with a little bit more boilerplate to get going.

As before, a derive macro ToSchema handles the straight type magic:

#[derive(Serialize, Deserialize, ToSchema)]
struct User {
    user_id: i32,
    first_name: String,
    last_name: String,
}

Decorating services is a little more cumbersome, as we need to specify the request & response types alongside a status.

Here's an actix web example:

#[utoipa::path(post, request_body = User, responses((status = 200, body = User)))]
#[post("/echo")]
async fn echo_user(body: Json<User>) -> Result<Json<User>, Error> {
    Ok(body)
}

You can see that the User type has to be repeated in both the request_body and body attributes. This could potentially mean the generated OpenAPI specification does not match reality, so extra care is needed to keep them in sync.

It would be nice if the proc macro could work this out by itself rather than having to redefine them in attributes, but it would require tighter integration, something that might be added in the future.

Generating an OpenAPI JSON file is a little bit easier than in paperclip.

Firstly, you combine all your components on a struct:

#[derive(OpenApi)]
#[openapi(components(schemas(User)), paths(echo_user))]
struct ApiDoc;

Then you can use some standard methods to write it out:

std::fs::write(
    "openapi.json",
    &ApiDoc::openapi()
        .to_pretty_json()
        .expect("Should serialize"),
)?;

TypeScript from OpenAPI

With an openapi.json file you can use a tool such as openapi-generator-cli to generate the requisite definitions:

npx openapi-generator-cli generate -i openapi.json -g typescript

This won't create purely typescript definitions: there still needs to be a library to make the requests, and you may want to generate types closer to your frontend framework, e.g. using typescript-rxjs.
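To give a flavour, the generated client ends up looking roughly like the sketch below. The class and method names are illustrative only; the real ones depend on the generator template used:

```typescript
// Illustrative sketch of generator output, not actual generated code.
interface User {
  user_id: number;
  first_name: string;
  last_name: string;
}

class DefaultApi {
  constructor(private basePath: string = "http://127.0.0.1:8080") {}

  // One method per operation, typed end to end.
  async echoPost(user: User): Promise<User> {
    const res = await fetch(`${this.basePath}/echo`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(user),
    });
    return (await res.json()) as User;
  }
}
```

The key point is that both the request body and response type are derived from the spec, so they track the backend automatically on regeneration.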

Normally this isn't run individually, but rather it is included inside package.json of an existing frontend project.
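For instance, a package.json scripts entry might look something like this (the script name and paths are assumptions):

```json
{
  "scripts": {
    "generate-api": "openapi-generator-cli generate -i ../backend/openapi.json -g typescript -o src/api"
  }
}
```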

GraphQL

GraphQL can be used to publish types, albeit indirectly, and it does require you to modify your backend to support it. The types that can be represented in GraphQL are also quite restricted: no recursive types, no shared types between input/output, and incomplete support for sum-types/unions, to name a few issues I've personally hit using it. If you're happy with the high level of opinionation GraphQL forces upon your code base, you are rewarded with some great tooling & infrastructure.

The main two libraries for GraphQL on the backend are async-graphql and juniper.

On the frontend, there is similar tooling to OpenAPI for generating types & requests. You will, however, also need to write out some specific GraphQL queries to utilise them.

async-graphql

As with the Derive Macro & OpenAPI generators you start by decorating your structs. We can use the SimpleObject and InputObject derive macros (in GraphQL you need separate types for input...):

#[derive(Serialize, Deserialize, SimpleObject)]
struct User {
    user_id: i32,
    first_name: String,
    last_name: String,
}

#[derive(Serialize, Deserialize, InputObject)]
struct UserInput {
    user_id: i32,
    first_name: String,
    last_name: String,
}

Next, we want to define a Query object which will dictate the methods available via GraphQL:

pub struct Query;

#[Object]
impl Query {
    async fn echo_user(&self, user: UserInput) -> User {
        User {
            user_id: user.user_id,
            first_name: user.first_name,
            last_name: user.last_name,
        }
    }
}

Then we can set up a handler for GraphQL queries (leaving subscriptions and mutations empty for now):

async fn graphql_handler(
    schema: web::Data<Schema<Query, EmptyMutation, EmptySubscription>>,
    request: GraphQLRequest,
) -> GraphQLResponse {
    schema.execute(request.into_inner()).await.into()
}

And a GraphiQL endpoint, so you can use a browser to craft GraphQL queries:

async fn graphql_playground() -> HttpResponse {
    HttpResponse::Ok()
        .content_type("text/html; charset=utf-8")
        .body(GraphiQLSource::build().endpoint("/graphql").finish())
}

We wrap the schema in web::Data and add it as app_data, then provide our two new services:

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    let schema = Schema::build(Query, EmptyMutation, EmptySubscription).finish();
    let schema_data = web::Data::new(schema);

    HttpServer::new(move || {
        App::new()
            .app_data(schema_data.clone())
            .service(web::resource("/graphql").route(web::post().to(graphql_handler)))
            .service(web::resource("/playground").route(web::get().to(graphql_playground)))
    })
    .bind("127.0.0.1:8080")?
    .run()
    .await
}

You can also write out the schema via the sdl() method on Schema:

std::fs::write("example.graphql", schema.sdl())?;

GraphiQL Playground

Navigating to http://localhost:8080/playground in a browser will get you the GraphiQL playground.

juniper

The juniper crate in this context is very similar to async-graphql.

juniper as a crate has been around a fair bit longer than async-graphql & did not start off as an async library. Both are very capable libraries, but I have found async-graphql to be better documented and more "modern".

In terms of coding for our example, juniper is very similar to async-graphql, with derive macros on the types you want to define:

#[derive(Serialize, Deserialize, GraphQLObject)]
struct User {
    user_id: i32,
    first_name: String,
    last_name: String,
}

#[derive(Serialize, Deserialize, GraphQLInputObject)]
struct UserInput {
    user_id: i32,
    first_name: String,
    last_name: String,
}

The query object also slots straight in:

pub struct Query;

#[juniper::graphql_object]
impl Query {
    async fn echo_user(&self, user: UserInput) -> User {
        User {
            user_id: user.user_id,
            first_name: user.first_name,
            last_name: user.last_name,
        }
    }
}

Creating the schema is also similar:

let schema = Schema::new(Query, EmptyMutation::new(), EmptySubscription::new());

The handler method deviates slightly, but is mostly the same as above:

async fn graphql(
    req: actix_web::HttpRequest,
    payload: actix_web::web::Payload,
    schema: web::Data<Schema>,
) -> Result<HttpResponse, Error> {
    graphql_handler(&schema, &(), req, payload).await
}

The juniper_actix integration also provides a graphiql_handler which you can use to mount the playground (GraphiQL v1 rather than v2 at the time of writing).

Putting this together, we get almost an identical layout:

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    let schema = Schema::new(Query, EmptyMutation::new(), EmptySubscription::new());
    let schema_data = web::Data::new(schema);

    HttpServer::new(move || {
        App::new()
            .app_data(schema_data.clone())
            .service(web::resource("/graphql").route(web::post().to(graphql)))
            .service(
                web::resource("/playground")
                    .route(web::get().to(|| graphiql_handler("/graphql", None))),
            )
    })
    .bind("127.0.0.1:8080")?
    .run()
    .await
}

As with async-graphql, the GraphQL schema can be written out to disk, this time using the as_schema_language() method:

std::fs::write("example.graphql", schema.as_schema_language())?;

GraphQL to TypeScript

Now, the next question is: we have our GraphQL schema, how do we get to typescript definitions? The easiest way I've found, from a maintenance perspective, is a little too involved to cover fully in this blog. The gist is that using GraphQL Code Generator you can define your queries and then have frontend methods/types generated from them.

This does require a bit of scaffolding in your frontend project to get working, but is rather painless when it is set up, besides having to write queries/mutations for all of your types.
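As a rough sketch, the setup centres on a codegen config pointing at your schema and query documents, with the typescript and typescript-operations plugins emitting the types. The paths here are assumptions for illustration:

```yaml
# codegen.yml -- paths are assumptions for illustration
schema: ../backend/example.graphql
documents: "src/**/*.graphql"
generates:
  src/generated/graphql.ts:
    plugins:
      - typescript
      - typescript-operations
```

With that in place, each query document you write, e.g. `query EchoUser($user: UserInput!) { echoUser(user: $user) { userId firstName lastName } }`, gets matching TypeScript types generated alongside it (note that both async-graphql and juniper expose fields in camelCase by default).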

Conclusions

As you can see, there are a variety of ways of publishing Rust types for use within TypeScript. Having this process as automated as possible is a great way to increase cohesion between the frontend & backend. Whether these methods are right for you depends entirely on your use-case & what tradeoffs you're willing to make:

Types Only: You can use direct translation tools such as ts-rs or schemars

OpenAPI: You can go the OpenAPI route using paperclip or utoipa

GraphQL: You can use GraphQL with async-graphql or juniper

When building DiveDB, I opted to use GraphQL due to its great frontend tooling & standardised introspection. In future blogs I hope to dive into some of the details of this decision.