15th Apr 2023
Using supabase with `fastn` means we have to install both `fastn` and supabase. `fastn` is an easy dependency: a single binary that works well on Windows, Mac and Linux (and more tomorrow; we intend to ensure people can use fastn from tablets etc. as well).
But `supabase` is a rather heavy dependency. One has to install Docker to use it, which is quite painful, and it will definitely never run on mobile.

Why do we want to install supabase at all? Can't we just add postgresql bindings to `fastn`, pass a DB URL to `fastn`, and let it do its thing? Why do we need supabase on the local machine?
One question is how we manage database tables. One way is to use the [supabase migration](https://supabase.com/docs/guides/cli/managing-environments) features. Supabase gives you a "studio" where you can edit the tables in your DB using their UI, then run `supabase db diff -f <migration-name>` to diff your local db against the prod db, and `supabase db push` to apply unapplied migrations to the remote. These migration files are stored in the `supabase/migrations` folder.
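That workflow, as a quick sketch (the migration name is just an example):

```sh
# edit tables in the local studio UI, then capture the change as a migration
supabase db diff -f add-repos-table

# apply any unapplied migrations to the remote (prod) database
supabase db push
```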
We want to store migrations in each `fastn` package. And when I install a package that contains migrations, I want some `fastn` command to apply those migrations to the prod db (after appropriate warnings, inspections, confirmations etc.).
My earlier thought was to use Django. I love Django. And using Django means our models can further be used for scripting. But installing Python across all platforms is a pain. I tried checking if we can use [pyodide](/pyiodide/) with `fastn` without installing Python on the system at all, but `pyodide` does not yet work with `psycopg` etc., so it's a no-go.
I also came across [migra](/migra/), a Python package that can diff two databases; supabase has experimental support for `migra`.
So what do we do? We have two roles. First there are fastn developers: people who create fastn packages that depend on supabase tables. Such people can install Python and migra, which is a simpler dependency than supabase. Then there are the people who will use such packages, website creators: they just install fastn, and fastn will apply the migrations.
How could migrations be stored? For each db change, we run `migra postgresql:///old postgresql:///new` to create `up.sql`, and `migra postgresql:///new postgresql:///old` to create `down.sql`. For each named migration we create a folder containing the up and down sql files. We also create a database table called `fastn_migrations` that records which migrations have already been applied, and `fastn migrate <optional-destination>` will apply all the migrations needed to go from the current state recorded in `fastn_migrations` to the `<optional-destination>`.
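A minimal sketch of what the `fastn_migrations` table could look like (the exact columns are an assumption, not a finalised schema):

```sql
-- hypothetical bookkeeping table; fastn would consult it to decide
-- which up.sql / down.sql files still need to run
CREATE TABLE fastn_migrations (
    id SERIAL PRIMARY KEY,
    package TEXT NOT NULL,       -- fastn package the migration belongs to
    migration TEXT NOT NULL,     -- folder name of the named migration
    applied_at TIMESTAMPTZ NOT NULL DEFAULT now(),
    UNIQUE (package, migration)
);
```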
We will have to collect migrations from all the packages (each package will store all its migrations in a folder named `FASTN/migrations`). Each package will get its own db schema so that table names do not conflict. We will also have to verify that each migration only modifies the schema specific to its own package.
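The per-package layout could look something like this (migration names made up for illustration):

```txt
FASTN/migrations/
├── 0001-create-repos/
│   ├── up.sql
│   └── down.sql
└── 0002-add-license-column/
    ├── up.sql
    └── down.sql
```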
## `fastn` Functions
We can use QuickJS compiled to WASM.
## What About Supabase Functions
Some of the projects we are going to depend on will require supabase functions as well. These allow arbitrary computation, which we can use either from the backend using a processor, or from the frontend using an endpoint and ajax. In such cases, if a fastn package contains a supabase function, we should install the function along with creating the db tables needed for that application. Further, newer versions of that fastn package may require us to update the code behind the supabase function, along with running db migrations.
We can run migrations, since they are just a bunch of sql files, but how do we create supabase functions?
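Presumably we would shell out to the supabase CLI here too; a rough sketch, assuming a function named `hello` (the name is made up):

```sh
# scaffold a function locally (creates supabase/functions/hello/)
supabase functions new hello

# deploy it to the linked supabase project
supabase functions deploy hello
```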
## Old Thought
`fastn` is a good choice for writing the frontend. Supabase is a great choice for the backend. Together they can be a great combination.
## Semi Static Sites
`fastn` can be used for creating static sites. Static sites are great because they do not require a server, and thus avoid all the maintenance, cost etc. associated with one. There are a lot of really cheap and free options to deploy your site on static servers, like GitHub Pages.
Some sites can be "semi static": instead of coming from text files, page content can come from postgresql. In fact, content can come from a lot of places: APIs, JSON files and so on.
APIs require a backend, so if you are lucky and someone is already hosting the APIs you need, great: use them from your fastn pages. If not, or if the APIs are not exactly what you need, then you have two options: store JSON files, or put things in a SQL table.
JSON files are problematic, even though they initially sound simpler. Say you are trying to show a list of GitHub repositories in your organisation on your website for some reason. You can write a GitHub Action that fetches the repo data and stores it in a JSON file, commits that JSON to your git repo, and then instrument your FTD file to pick the data from the JSON file:
```ftd
-- import: fastn/processors as pr

-- record repo-data:
caption name:
string url:
integer stars:
boolean is-private:
optional string license:
optional body description:

-- record org-wise-repos-data:
repo-data list fifthtry:
repo-data list fastn-stack:

-- org-wise-repos-data org-wise-repos:
$processor$: pr.get-data
file: github-data.json
```
This setup is simple: it only requires a GitHub Action and no separate data store, and since the JSON file is committed to your Git repo you even get its history. Here is the Python script (`scripts/fetch-repo-data.py`) that fetches the data using the `gh` CLI:
```python
import subprocess
import json

org_wise_repos = {}

for org in ["fifthtry", "fastn-stack"]:
    cmd = (
        "gh repo list --json name,url,description,stargazerCount,isPrivate,"
        + "licenseInfo --limit 300 %s" % org
    )
    output = subprocess.check_output(cmd, shell=True, text=True)
    repo_list = json.loads(output)
    org_wise_repos[org] = [
        {
            "name": repo["name"],
            "url": repo["url"],
            "stars": repo["stargazerCount"],
            "is-private": repo["isPrivate"],
            "license": repo["licenseInfo"]["name"] if repo["licenseInfo"] else None,
            "description": repo.get("description"),
        }
        for repo in repo_list
    ]
    assert len(org_wise_repos[org]) < 300, "limit reached"

with open("github-data.json", "w") as f:  # <hl>
    json.dump(org_wise_repos, f)
```
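And the GitHub Action workflow that runs it: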
```yaml
name: fetch the data

on:
  workflow_dispatch:

jobs:
  fetch-github-repo-data:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: python ./scripts/fetch-repo-data.py # <hl>
        env:
          GH_TOKEN: ${{ github.token }}  # the gh cli needs a token
      - uses: stefanzweifel/git-auto-commit-action@v4
        with:
          commit_message: "updated github-repo.json"
  call-deploy-workflow:
    needs: [ fetch-github-repo-data ]
    uses: ./.github/workflows/deploy.yml
```
The GitHub Action can be triggered manually, or run periodically. It runs the Python script and commits the updated JSON back. It then triggers the `deploy.yml` workflow, which updates your website.
## SQL instead of JSON
But JSON files are hard to query. If you really want the git history you can keep storing the JSON file as well, but it may be better to spin up a free supabase project, create your table, and store the repo data there.
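A sketch of what that table could look like, mirroring the `repo-data` record above (the table name and schema are illustrative assumptions):

```sql
-- hypothetical table mirroring the repo-data record
CREATE TABLE repos (
    org TEXT NOT NULL,           -- e.g. 'fifthtry' or 'fastn-stack'
    name TEXT NOT NULL,
    url TEXT NOT NULL,
    stars INTEGER NOT NULL,
    is_private BOOLEAN NOT NULL,
    license TEXT,
    description TEXT,
    PRIMARY KEY (org, name)
);

-- unlike a JSON file, the data is now queryable,
-- e.g. the ten most starred public repos:
SELECT name, stars FROM repos
WHERE NOT is_private
ORDER BY stars DESC
LIMIT 10;
```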
## Recommendations For Using Supabase With `fastn`
Supabase expects a folder named `supabase`, where it stores all its supabase specific stuff. You can also override this location by passing `--workdir` to all `supabase` cli invocations.
You can store the `supabase` folder in your `fastn` package repo.
## Separate `fastn` package and `supabase` projects
You may not want to store your supabase stuff in the `fastn` package, and that is perfectly fine. In that case you would store the appropriate API keys somewhere in your `fastn` package, and you would not have a `supabase` folder.

You may want to do this for privacy reasons: say you do not want the supabase code to be publicly viewable, but you are okay with the `fastn` package source being public. You may also want this if you are using your supabase project elsewhere.
## One `fastn` package, multiple Supabase projects
If your fastn package is the "parent" of the supabase project, you may want to have one or more supabase child projects. By having more than one you can have different policies, security settings etc.

If you have more than one project, consider storing your supabase stuff in folders named `supabase-<project-name>`. You will have to use `--workdir` appropriately for it to work properly.
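For example (the project names here are made up):

```sh
# run supabase commands against a specific child project's folder
supabase db push --workdir supabase-analytics
supabase db diff -f add-sessions-table --workdir supabase-auth
```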
## Normal Scenario: One fastn package, and one child supabase project
This is what a lot of people will end up doing, say for your blog, or for your company website. In this case the folder would be named just `supabase` and you will not have to bother with the `--workdir` flag.
## Django For Migrations
Now, supabase has great migration support. You can use their "studio" product to do table schema updates via the UI, and then use the `supabase db diff` etc. commands to create migration files in the `supabase/migrations` folder. This is quite decent, but I am not fond of it.
I am not fond of tool-generated text files that no one reads or edits by hand. The code generated by `supabase migration` is pretty decent, but I still want to own this code. I can of course manually modify the sql files generated by `supabase migration`, but I prefer Django's migration system for doing this.
Why Django? Django has rock solid migration support. And using Django for migrations means we have to write Python models compatible with what we have in the database, and since those Python model files are the master source of truth about the database schema, you not only get migrations but also proper Python model definitions.
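For example, the `repo-data` record from earlier could be described by a Django model like this (a sketch; the field choices are assumptions):

```python
from django.db import models


class Repo(models.Model):
    # master source of truth for the repos table; migrations are
    # generated from this model via `python manage.py makemigrations`
    org = models.CharField(max_length=100)
    name = models.CharField(max_length=200)
    url = models.URLField()
    stars = models.IntegerField()
    is_private = models.BooleanField()
    license = models.CharField(max_length=100, null=True, blank=True)
    description = models.TextField(null=True, blank=True)

    class Meta:
        unique_together = [("org", "name")]
```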
If you have Python model definitions, scripting becomes a lot easier. If you do not, and you want to write Python scripts that interact with the database, you are going to have an inferior experience.
## Django Folder: `dj`
Our recommendation is to store your Django stuff in a folder named `dj`.
## `poetry` for managing python dependencies
The [`poetry`](/poetry/) files would be kept in `dj` itself. To do any Python related thing one has to cd into the `dj` folder:
```sh
python3 -m venv .venv  # one time only
source .venv/bin/activate
cat requirements.in
brew install postgresql
pip install -r requirements.in
doit pip  # updates requirements.txt file
```
## `dj/scripts` folder
We are going to manage both supabase and django, and there are going to be many commands that everyone who works on the fastn package should run regularly. It's a good idea to put them in a well known location, and `dj/scripts` is a good such location.
## Use `zsh`
One decision you have to make early on is whether your scripts will be written for `bash` or for `zsh`. `zsh` is more modern, and maybe more powerful, but `bash` is more widely supported.

We recommend you use `zsh` for your daily scripting needs. The scripts we distribute are all written for `zsh`.
## `scripts/auto.sh`
We recommend using `scripts/auto.sh` for defining all shell aliases, shell functions, commonly used environment variables etc.

Anytime someone `cd`s into your `fastn` package, they should `source scripts/auto.sh`. This can happen automatically if you are using the `direnv` plugin of `oh-my-zsh`; otherwise you have to do it manually before running any other daily use commands.
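A minimal sketch of what `scripts/auto.sh` could contain (the aliases and variables are examples, not a prescribed set):

```zsh
# scripts/auto.sh -- meant to be sourced from the package root

source scripts/common.sh   # shared helpers, see the next section

# commonly used environment variables (example values)
export SUPABASE_PROJECT="my-project"

# daily use aliases and functions
alias activate-dj="source dj/.venv/bin/activate"

serve() {
    fastn serve "$@"
}
```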
## `scripts/common.sh`
Other scripts in the `scripts` folder should not assume that `auto.sh` has been sourced. They should also not explicitly source `auto.sh`, as `auto.sh` is meant only for interactive shell use. If some functionality is needed by both `scripts/auto.sh` and the other scripts, it should go in `scripts/common.sh`, which is then sourced from `scripts/auto.sh` and from each of those scripts.
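For instance, a standalone script would look something like this (the script name and its body are made up):

```zsh
#!/usr/bin/env zsh
# scripts/reset-db.sh -- a standalone daily use script;
# it sources common.sh for shared bits, never auto.sh
source scripts/common.sh

# assumes common.sh exported PGDATABASE
dropdb "$PGDATABASE" && createdb "$PGDATABASE"
```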
## Security Sensitive Environment Variables
## `fastn-ignore`
Now that you are going to have `supabase` and `dj` folders that do not contain `ftd` files, it's a good idea to tell `fastn` to ignore those folders. Add this to your `FASTN.ftd`:
```ftd
-- fastn.ignore: supabase

-- fastn.ignore: dj
```