custom-embed

Custom embedding node for n8n to work with Supabase Vector Store

Package Information

Released: 3/22/2025
Downloads: 55 weekly / 170 monthly
Latest Version: 0.1.1
Author: kieng

Documentation

n8n-nodes-custom-embed

This is a custom n8n node that allows you to generate embeddings using your own embedding server and use them with the Supabase Vector Store in n8n. It's designed to work seamlessly with the existing LangChain ecosystem in n8n.

Features

  • Generate text embeddings using your custom embedding server
  • Fully compatible with Supabase Vector Store node as a sub-node
  • Configurable embedding model selection
  • Support for API authentication
  • Error handling and input validation

Prerequisites

  • n8n instance (version 0.214.0 or later recommended)
  • A custom embedding server with an API that returns embeddings
  • Supabase instance configured with pgvector extension for vector storage

Installation

Local Installation

  1. Clone this repository:
git clone https://github.com/yourusername/n8n-nodes-custom-embed.git
  2. Install dependencies:
cd n8n-nodes-custom-embed
pnpm install
  3. Build the code:
pnpm build
  4. Link to your n8n installation:
# First, make sure your PNPM_HOME is set correctly
pnpm setup
# Then link
pnpm link -g
cd ~/.n8n/custom
npm link n8n-nodes-custom-embed

Global Installation (npm)

npm install -g n8n-nodes-custom-embed

Setup

  1. Start your n8n instance
  2. Go to Settings > Community Nodes
  3. Make sure n8n-nodes-custom-embed is listed and enabled

Configuration

Custom Embeddings API Credentials

  1. Go to Settings > Credentials
  2. Click "Add Credential"
  3. Select "Custom Embeddings API"
  4. Fill in:
    • API URL: Your embedding server's base URL
    • API Key: Your API authentication key (if required)
  5. Save the credential

Usage

Connecting with Supabase Vector Store

  1. Create a new workflow
  2. Add a trigger node (e.g., a Webhook)
  3. Add a "Supabase Vector Store" node
  4. Click on the Supabase Vector Store node and look for the "Embedding" connection
  5. In the connection dropdown, you should now see "Custom Embeddings" as an option
  6. Select it and configure:
    • Connect it to your Custom Embeddings API credential
    • Specify the text to embed, or the input field that contains it
    • Select your embedding model (if applicable)

Example Workflow

Here's a typical workflow structure:

[Trigger] → [Text Input] → [Supabase Vector Store] ↔ [Custom Embeddings]

Where:

  • The Supabase Vector Store node uses the Custom Embeddings sub-node through the "Embedding" connection
  • The Custom Embeddings node generates vector embeddings
  • The Supabase Vector Store node saves the embeddings in your vector database

API Requirements

Your custom embedding server should provide an API with the following (a minimal server sketch follows this list):

  1. Endpoint: POST /embeddings (configurable in the node)
  2. Request format:
    {
      "text": "Your text to embed",
      "model": "optional-model-name"
    }
    
  3. Response format:
    {
      "embeddings": [0.1, 0.2, 0.3, ...],
      "dim": 1536,  // Optional - dimension of embeddings
      "model": "model-name"  // Optional
    }
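
If you are building the embedding server yourself, the sketch below shows one minimal way to satisfy this contract. It assumes Express with TypeScript; computeEmbedding is a placeholder for whatever model call you actually use, and the port is arbitrary. The node only depends on the request and response shapes described above.

// Minimal sketch of a server that satisfies the contract above.
// Assumes Express; computeEmbedding is a placeholder for your own model call.
import express from "express";

const app = express();
app.use(express.json());

// Placeholder: replace with a call to your actual embedding model.
async function computeEmbedding(text: string, model?: string): Promise<number[]> {
  throw new Error("plug your embedding model in here");
}

app.post("/embeddings", async (req, res) => {
  const { text, model } = req.body ?? {};
  if (typeof text !== "string" || text.length === 0) {
    res.status(400).json({ error: "missing 'text' field" });
    return;
  }
  try {
    const embeddings = await computeEmbedding(text, model);
    res.json({
      embeddings,              // [0.1, 0.2, 0.3, ...]
      dim: embeddings.length,  // optional
      model,                   // optional, echoes the requested model
    });
  } catch (err) {
    res.status(500).json({ error: (err as Error).message });
  }
});

app.listen(3000);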
    

Troubleshooting

Node not appearing as a sub-node

If the Custom Embeddings node is not appearing as a sub-node for the Supabase Vector Store:

  1. Make sure you've rebuilt the project with pnpm build
  2. Check that you've properly linked the node to your n8n installation
  3. Restart your n8n instance
  4. Try clearing your browser cache or opening in an incognito window

Vector Store compatibility

Make sure the embedding dimension matches what your Supabase Vector Store expects. If your embeddings have a different dimension than the one your pgvector table was created with, you'll need to update your database schema.
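
A quick way to verify this is to request a single embedding from your server and compare its length with the dimension your pgvector column was created with. A rough sketch (the server URL and EXPECTED_DIM are placeholders for your own setup; the built-in fetch requires Node 18+):

// Sanity check: does the server's embedding dimension match the vector table?
// EXPECTED_DIM and the server URL are placeholders for your own configuration.
const EXPECTED_DIM = 1536; // dimension your pgvector column was created with

async function checkDimension(): Promise<void> {
  const res = await fetch("https://your-embedding-server.example/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text: "dimension check" }),
  });
  const { embeddings } = (await res.json()) as { embeddings: number[] };
  if (embeddings.length !== EXPECTED_DIM) {
    console.warn(
      `Server returns ${embeddings.length}-dimensional vectors, but the table expects ${EXPECTED_DIM}.`,
    );
  }
}

checkDimension();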

Development

To modify this node:

  1. Clone the repository
  2. Install dependencies with pnpm install
  3. Make your changes to the node implementation (a rough sketch of the embeddings wrapper is shown after this list)
  4. Build with pnpm build
  5. Test with your n8n instance
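
For orientation, an embeddings sub-node in n8n ultimately supplies a LangChain Embeddings instance to the vector store it is connected to. The sketch below shows roughly what a wrapper around the HTTP API from the "API Requirements" section could look like; the class name, option names, and Bearer-token authentication are illustrative assumptions, not necessarily what this package implements.

// Rough sketch of a LangChain-style embeddings wrapper around the HTTP API
// described in "API Requirements". Names here are illustrative only.
import { Embeddings, type EmbeddingsParams } from "@langchain/core/embeddings";

interface CustomServerEmbeddingsParams extends EmbeddingsParams {
  apiUrl: string;   // base URL of your embedding server
  apiKey?: string;  // optional API key
  model?: string;   // optional model name
}

class CustomServerEmbeddings extends Embeddings {
  private apiUrl: string;
  private apiKey?: string;
  private model?: string;

  constructor(params: CustomServerEmbeddingsParams) {
    super(params);
    this.apiUrl = params.apiUrl;
    this.apiKey = params.apiKey;
    this.model = params.model;
  }

  // Embed a single string via POST /embeddings.
  async embedQuery(text: string): Promise<number[]> {
    const headers: Record<string, string> = { "Content-Type": "application/json" };
    if (this.apiKey) {
      // Assumption: the key is sent as a Bearer token; adjust to match your server.
      headers.Authorization = `Bearer ${this.apiKey}`;
    }
    const res = await fetch(`${this.apiUrl}/embeddings`, {
      method: "POST",
      headers,
      body: JSON.stringify({ text, model: this.model }),
    });
    if (!res.ok) {
      throw new Error(`Embedding server returned ${res.status}`);
    }
    const data = (await res.json()) as { embeddings: number[] };
    return data.embeddings;
  }

  // Embed multiple documents by embedding each one in turn.
  async embedDocuments(texts: string[]): Promise<number[][]> {
    return Promise.all(texts.map((t) => this.embedQuery(t)));
  }
}

An instance of a class along these lines is what the Supabase Vector Store node ends up calling through the "Embedding" connection.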

License

MIT
