HYVE Docs
Dump

Prerequisites

Dump is part of the HYVE monorepo. You need the full monorepo set up before running Dump.

  • Node.js 22+
  • pnpm 9+
  • Supabase project (for database + auth + storage)
  • Google AI API key (Gemini, for categorization and embeddings)

Environment Variables

Never commit your .env.local file. It contains secrets that should stay local.

Create apps/dump/.env.local with:

.env.local
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your-anon-key
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
GOOGLE_AI_API_KEY=your-gemini-api-key

Installation

Clone the monorepo

git clone https://github.com/THE-HYVE-COMPANY/hyve-os.git
cd hyve-os

Install dependencies

pnpm install

Start Dump

pnpm --filter dump dev

Dump runs on http://localhost:3106.

Your First Ingest

Once Dump is running, you can ingest content via the UI or the API.

Via API

Ingest a URL
curl -X POST http://localhost:3106/api/ingest \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com/article"}'

The ingest endpoint accepts three input types:

Ingest payload
{
  url?: string    // URL to extract content from
  text?: string   // Plain text to store directly
  image?: string  // Image URL or base64
  force?: boolean // Re-ingest even if URL exists
}

At least one of url, text, or image must be provided. Source type is auto-detected from the URL pattern.
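
The "at least one of" rule can be sketched as a small validation helper. This is illustrative only (the interface and function names are not Dump's actual code), but it mirrors the payload shape documented above:

```typescript
// Illustrative sketch of the ingest payload and its validation rule.
// Field names match the documented payload; everything else is hypothetical.
interface IngestPayload {
  url?: string;   // URL to extract content from
  text?: string;  // Plain text to store directly
  image?: string; // Image URL or base64
  force?: boolean; // Re-ingest even if URL exists
}

// Valid only when at least one input field (url, text, or image) is present.
// `force` alone is not enough -- it modifies an ingest, it doesn't start one.
function isValidIngestPayload(p: IngestPayload): boolean {
  return Boolean(p.url || p.text || p.image);
}
```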

Via UI

  1. Open Dump in your browser
  2. Paste a URL into the input field
  3. Click Ingest — Dump auto-detects the source type
  4. Watch real-time progress via SSE streaming
  5. Once complete, the item appears in your collection
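
As a rough sketch of what step 4 involves: SSE responses arrive as text frames where each event's payload sits on `data:` lines. The exact event names and JSON shape Dump streams are not documented here, so this minimal parser is an assumption about the wire format, not Dump's actual client code:

```typescript
// Hypothetical sketch: extract the payloads from a chunk of SSE text.
// SSE frames look like "event: progress\ndata: {...}\n\n"; only the
// "data:" lines carry the payload. Real event names are assumptions.
function parseSseData(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length));
}
```

In the browser, an `EventSource` (or a streamed `fetch`) would deliver these chunks as the ingest pipeline progresses.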

What Happens During Ingestion

When you ingest content, Dump runs through this pipeline:

  1. Detection — URL pattern determines source type (twitter, youtube, article, etc.)
  2. Extraction — Source-specific extractor pulls title, content, author, and media
  3. Categorization — Gemini assigns a category, subcategories, and tags
  4. Embedding — Content is vectorized for semantic search
  5. Storage — Item is saved to Supabase with full-text and vector indexes
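
Step 1 (detection) can be sketched as hostname matching. The source types below come from the docs; the actual patterns Dump matches against are not shown here, so this function is an illustrative assumption:

```typescript
// Illustrative sketch of URL-based source detection (step 1 of the pipeline).
// The type names come from the docs; the matching rules are assumptions.
type SourceType = "twitter" | "youtube" | "article";

function detectSourceType(url: string): SourceType {
  // Strip a leading "www." so "www.youtube.com" matches "youtube.com".
  const host = new URL(url).hostname.replace(/^www\./, "");
  if (host === "twitter.com" || host === "x.com") return "twitter";
  if (host === "youtube.com" || host === "youtu.be") return "youtube";
  return "article"; // fallback for everything else
}
```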

Next Steps
