griffin's blog

don't use services

factorio is a game about automation

you should find services, not the other way around

if it needs marketing, why are you using it?

if you needed to search "best xyz site:reddit.com" to find it, why are you using it?

database

memory | saving/loading json files | mongodb | postgres
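
the json file stage is literally just reading and writing one file - rough sketch, db.json and the shapes here are made up:

import { readFile, writeFile } from 'node:fs/promises'

type DB = { users: { name: string; posts: { title: string }[] }[] }

// the whole database is one json file
const load = async (): Promise<DB> => JSON.parse(await readFile('db.json', 'utf8'))
const save = (db: DB) => writeFile('db.json', JSON.stringify(db, null, 2))

// read, change, write the whole thing back
const db = await load()
db.users.push({ name: 'griffin', posts: [] })
await save(db)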

postgres makes sense when you start to have something like a users table where each user has posts. instead of storing it like this:

users: [
  {
    name: "griffin",
    posts: [
      { title: "my first post" }
    ]
  }
]

you can have a users table and a posts table

users:
id | Name
1  | Griffin

posts:
id | Title         | Author
1  | My First Post | Griffin

this is where json starts to get unwieldy because what if you want to query all posts?? do you go through all users and put all the posts together? naw you just SELECT * FROM posts
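
with postgres that question is one query - rough sketch assuming node-postgres (pg) and the two tables above:

import { Pool } from 'pg'

const pool = new Pool() // connection details come from the usual PG* env vars

// json file version: walk every user and glue their posts together
// const posts = db.users.flatMap((u) => u.posts)

// postgres version: just ask for the posts
const { rows: posts } = await pool.query('SELECT * FROM posts')
console.log(posts)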

web proxy

it's easier to write a reverse proxy in hono than it is to figure out how to run and configure caddy - so RESIST SERVICES until they are EASIER to use
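
to be concrete, here's roughly what that hono version looks like - the hostnames and ports are made up, and it skips websockets and other real proxy concerns:

import { Hono } from 'hono'

// which local app each hostname should go to (made up)
const upstreams: Record<string, string> = {
  'blog.example.com': 'http://localhost:3001',
  'app.example.com': 'http://localhost:3002',
}

const app = new Hono()

app.all('*', async (c) => {
  const incoming = new URL(c.req.url)
  const target = upstreams[incoming.hostname]
  if (!target) return c.text('unknown host', 404)

  // forward the request to the matching app and hand back whatever it says
  return fetch(target + incoming.pathname + incoming.search, {
    method: c.req.method,
    headers: c.req.raw.headers,
    body: ['GET', 'HEAD'].includes(c.req.method) ? undefined : await c.req.raw.arrayBuffer(),
  })
})

export default app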

like eventually it should get to a point where it's like bruh I can't keep doing this manually I need to use caddy

and when you use caddy it should be so easy and simple to integrate because you already know how it works because you have been doing it yourself this whole time

instead of: ahhh caddy is the best reverse proxy, lemme read docs for 5 hours until I figure out how to install it, this sucks, but I know it's considered the best reverse proxy, so why would I try to do that in 10 lines of code in my web app when I could use the best reverse proxy??

my cool web app | caddy/nginx/a reverse proxy | cloudflare/a cloud service

auth

just have 1 user | dummy auth where you pick your user | some simple hardcoded username/password whatever in 10 lines of code | sign in with github in 20 lines of code | user/pass login/social log in/mfa in 100 lines of code | oh no we need password resets/mfa/social log in - time for an auth service ( zitadel :) )
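
the hardcoded stage really is about 10 lines - rough sketch in hono, the user/password/cookie names are made up and this is obviously not serious security:

import { Hono } from 'hono'
import { getCookie, setCookie } from 'hono/cookie'

const USER = 'griffin'
const PASS = 'hunter2' // read this from an env var in real life

const app = new Hono()

app.post('/login', async (c) => {
  const { username, password } = await c.req.json()
  if (username !== USER || password !== PASS) return c.text('nope', 401)
  setCookie(c, 'session', 'logged-in')
  return c.text('ok')
})

app.get('/secret', (c) => {
  if (getCookie(c, 'session') !== 'logged-in') return c.text('log in first', 401)
  return c.text('the secret stuff')
})

export default app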

llms

import OpenAI from 'openai'

const openai = new OpenAI()

const res = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    {
      role: 'user',
      content: 'give a number 1-5'
    }
  ]
})

console.log(res.choices[0].message.content)

but now we want the llm to do stuff with tool calls
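
which with the raw api is just a tools array on the same call - rough sketch, roll_die is a made-up tool:

const completion = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'roll a 6 sided die for me' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'roll_die',
        description: 'roll a die and return the result',
        parameters: {
          type: 'object',
          properties: { sides: { type: 'number' } },
          required: ['sides'],
        },
      },
    },
  ],
})

// instead of text the model can come back asking us to run the tool
const call = completion.choices[0].message.tool_calls?.[0]
if (call?.type === 'function') {
  const { sides } = JSON.parse(call.function.arguments)
  console.log(`model wants roll_die with ${sides} sides`)
}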

but now we want to be able to have a conversation instead of a request and response
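
and a conversation with the raw api is just keeping the messages array around and resending all of it every call - rough sketch with the same openai client as above, type name assumed from the openai sdk:

const history: OpenAI.Chat.ChatCompletionMessageParam[] = []

async function chat(text: string) {
  history.push({ role: 'user', content: text })
  const res = await openai.chat.completions.create({ model: 'gpt-4o', messages: history })
  const reply = res.choices[0].message
  history.push(reply) // keep the assistant turn so the next call has the context
  return reply.content
}

console.log(await chat('give a number 1-5'))
console.log(await chat('now double it')) // works because the model sees the earlier turns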

but now we want multiple llms in a group chat

??? there are a million libraries idk how to do this