# AI Package

RAG-powered conversational AI via `@skam/ai`.
## Overview
The `@skam/ai` package provides a simple TypeScript wrapper around the `rag-chat` Supabase Edge Function. It abstracts the complexity of vector search and Gemini AI invocation behind a single function call.
## API Reference
`packages/ai/src/index.ts`

```typescript
import { SupabaseClient } from '@supabase/supabase-js'

export async function askRagChat(
  supabase: SupabaseClient,
  question: string
): Promise<{
  answer?: string
  error?: string
  cached?: boolean
}>
```

### Parameters
| Parameter | Type | Description |
| --- | --- | --- |
| `supabase` | `SupabaseClient` | An authenticated Supabase client instance |
| `question` | `string` | The user's prompt or question to ask the RAG AI |
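The wrapper's behavior can be sketched as follows. This is an illustrative reconstruction, not the actual package source: the `RagClient` and `InvokeResult` shapes are assumptions that model only the `functions.invoke` surface of the real `SupabaseClient`, and the error message text is made up.

```typescript
// Structural stand-ins for the Supabase client surface used here; these
// shapes are assumptions, not the real @supabase/supabase-js types.
interface InvokeResult {
  data: { answer?: string; cached?: boolean } | null
  error: { message: string } | null
}

interface RagClient {
  functions: {
    invoke(
      name: string,
      options: { body: Record<string, unknown> }
    ): Promise<InvokeResult>
  }
}

// Hypothetical sketch of what askRagChat could look like internally.
export async function askRagChat(
  supabase: RagClient,
  question: string
): Promise<{ answer?: string; error?: string; cached?: boolean }> {
  // Reject empty input before making a network call.
  if (!question.trim()) {
    return { error: 'Question must not be empty' }
  }
  // Delegate to the rag-chat edge function.
  const { data, error } = await supabase.functions.invoke('rag-chat', {
    body: { question },
  })
  if (error) {
    return { error: error.message }
  }
  // Surface the answer and cache flag to the caller.
  return { answer: data?.answer, cached: data?.cached }
}
```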
## Usage Example

**Server Component / Server Action**
```typescript
import { askRagChat } from '@skam/ai'
import { createClient } from '@/lib/supabase/server'

const supabase = await createClient()

const { answer, error, cached } = await askRagChat(
  supabase,
  'What are the shipping options?'
)

if (error) {
  console.error(error)
} else {
  console.log(answer) // AI-generated response
  console.log(cached) // true if served from cache
}
```

## How It Works
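Because `askRagChat` reports failures through the `error` field rather than throwing, a caller can layer a retry policy on top with an ordinary loop. The helper below and its default attempt count are illustrative assumptions, not part of `@skam/ai`:

```typescript
// Result shape matching askRagChat's return type.
type RagResult = { answer?: string; error?: string; cached?: boolean }

// Retry a rag-chat call a few times on error-valued results (sketch;
// the retry policy here is a made-up example, not library behavior).
async function askWithRetry(
  ask: () => Promise<RagResult>,
  attempts = 3
): Promise<RagResult> {
  let last: RagResult = { error: 'no attempts made' }
  for (let i = 0; i < attempts; i++) {
    last = await ask()
    if (!last.error) return last // success: stop retrying
  }
  return last // attempts exhausted: return the final error
}
```

Usage would look like `askWithRetry(() => askRagChat(supabase, question))`.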
1. **Validate**: The question string is checked for non-empty input.
2. **Invoke**: Calls `supabase.functions.invoke('rag-chat')`.
3. **Search**: A semantic similarity search runs over the embedded document chunks.
4. **Generate**: The retrieved context is fed to Google Gemini to produce a grounded answer.
5. **Return**: Resolves to `{ answer, cached }` on success or `{ error }` on failure.
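The Search step can be illustrated with a toy in-memory version of similarity ranking. The real search runs inside the `rag-chat` edge function over stored embeddings; the two-dimensional vectors and chunk contents below are stand-ins for illustration only.

```typescript
// A stored chunk of source text alongside its embedding vector.
type Chunk = { content: string; embedding: number[] }

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0
  let na = 0
  let nb = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    na += a[i] * a[i]
    nb += b[i] * b[i]
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb))
}

// Return the k chunks most similar to the query embedding.
function topKChunks(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding)
    )
    .slice(0, k)
}
```

The winning chunks' `content` is what gets concatenated into the prompt context for the Generate step.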