module Memo
Overview
Memo - Semantic search and vector storage library
A focused library for chunking, embedding, and searching text using vector similarity. Designed to be embedded in applications that need semantic search capabilities.
Quick Start
```crystal
require "memo"
require "db"
require "sqlite3" # SQLite driver shard, needed for DB.open with a sqlite3 URI

# Initialize the database and load Memo's schema
db = DB.open("sqlite3://app.db")
Memo::Database.load_schema(db)

# Create the service (handles embeddings internally)
memo = Memo::Service.new(
  db: db,
  provider: "openai",
  api_key: ENV["OPENAI_API_KEY"]
)

# Index a document
memo.index(
  source_type: "document",
  source_id: 42,
  text: "Your document text..."
)

# Search
results = memo.search(query: "search query", limit: 10)
```
API
The primary API is Memo::Service which provides:
- index() - Index documents with automatic chunking and embedding
- search() - Search with automatic query embedding
- mark_as_read() - Track chunk usage
Internal modules (Storage, Search, Chunking, RRF) remain accessible for advanced use cases but Service is the recommended entry point.
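As a hedged sketch of how these methods might combine (the shape of the search results and the exact mark_as_read signature are assumptions, not documented above):

```crystal
# Search, then record that the returned chunks were consumed.
# NOTE: `result.chunk_id` and the mark_as_read argument are
# illustrative assumptions about the result type.
results = memo.search(query: "vector similarity", limit: 5)
results.each do |result|
  memo.mark_as_read(result.chunk_id) # track chunk usage
end
```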
Defined in:
memo.cr
memo/chunking.cr
memo/config.cr
memo/database.cr
memo/projection.cr
memo/providers/base.cr
memo/providers/mock.cr
memo/providers/openai.cr
memo/rrf.cr
memo/search.cr
memo/service.cr
memo/storage.cr
Constant Summary
- VERSION = "0.1.0"
Class Method Summary
- .configure(&) - Configure Memo (optional; sensible defaults are provided)
- .table_prefix : String - Returns the global table-name prefix
- .table_prefix=(table_prefix : String) - Sets the global table-name prefix
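Using only the documented class methods above, global configuration can be set directly; the `"memo_"` prefix value here is an illustrative assumption:

```crystal
# Optional: prefix all of Memo's tables, e.g. to share a database
# with other application tables. Defaults are sensible if omitted.
Memo.table_prefix = "memo_"
Memo.table_prefix # => "memo_"
```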