Creating an HNSW index over a table can
significantly improve query times.
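As a minimal sketch, assuming pgvector's hnsw access method and the mock_items table used in the queries below, an HNSW index is created per operator class, and the class must match the distance operator used at query time:

-- HNSW index for L2 distance (<->)
CREATE INDEX ON mock_items USING hnsw (embedding vector_l2_ops);

-- HNSW index for cosine distance (<=>)
CREATE INDEX ON mock_items USING hnsw (embedding vector_cosine_ops);

-- HNSW index for inner product (<#>)
CREATE INDEX ON mock_items USING hnsw (embedding vector_ip_ops);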
Vectors can be searched using L2 distance, cosine distance, or inner product.
-- L2 distance
SELECT * FROM mock_items ORDER BY embedding <-> '[1,2,3]'::vector;

-- Cosine distance
SELECT * FROM mock_items ORDER BY embedding <=> '[1,2,3]'::vector;

-- Inner product
SELECT * FROM mock_items ORDER BY embedding <#> '[1,2,3]'::vector;
The following code block demonstrates the equivalent queries for sparse vector search.
-- L2 distance
SELECT * FROM items ORDER BY embedding <-> '{1:3,3:1,5:2}/5' LIMIT 5;

-- Cosine distance
SELECT * FROM items ORDER BY embedding <=> '{1:3,3:1,5:2}/5' LIMIT 5;

-- Inner product
SELECT * FROM items ORDER BY embedding <#> '{1:3,3:1,5:2}/5' LIMIT 5;
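For reference, here is a sketch of the setup these sparse queries assume, using pgvector's sparsevec type; the items table, the 5-dimensional embedding column, and the sample values are assumptions for illustration:

-- Hypothetical table with a 5-dimensional sparse vector column
CREATE TABLE items (id bigserial PRIMARY KEY, embedding sparsevec(5));

-- Sparse vectors are written as {index:value,...}/dimensions (indexes are 1-based)
INSERT INTO items (embedding) VALUES ('{1:1,3:2,5:3}/5');

-- HNSW indexes over sparse vectors use the sparsevec operator classes;
-- choose the one matching the query operator
CREATE INDEX ON items USING hnsw (embedding sparsevec_l2_ops);
CREATE INDEX ON items USING hnsw (embedding sparsevec_cosine_ops);
CREATE INDEX ON items USING hnsw (embedding sparsevec_ip_ops);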