Translation memory

The cost-saving feature most people can't see.

Source units get embedded the moment they enter the graph. Similar future units reuse the prior translation. The bigger your catalogue, the more it saves.

How it works

Three steps, one cost line you don't pay.

  1. Embed. Every source unit is embedded on insert using OpenAI text-embedding-3-large at 3072 dimensions, and stored in pgvector alongside the row.
  2. Query. On the next translation request, a cosine-similarity lookup runs against approved translations in the same project + locale + unit_type.
  3. Decide. Similarity ≥ 0.95 → reuse the prior translation directly. 0.85–0.95 → use it as a hint to the model. < 0.85 → translate fresh.

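The decision step above can be sketched as a small function. The thresholds come from the text; the function name and return values are illustrative, not the product's API.

```python
# Minimal sketch of the TM decision cascade. Thresholds are from the docs;
# everything else (names, return values) is hypothetical.

REUSE_THRESHOLD = 0.95  # at or above: reuse the prior translation verbatim
HINT_THRESHOLD = 0.85   # in between: inject the match as a few-shot hint

def decide(similarity: float) -> str:
    """Map a cosine-similarity score to a TM action."""
    if similarity >= REUSE_THRESHOLD:
        return "reuse"   # copy the stored translation, no model call
    if similarity >= HINT_THRESHOLD:
        return "hint"    # translate fresh, but prime the model with the match
    return "fresh"       # no usable match: full translation

print(decide(0.97))  # reuse
print(decide(0.90))  # hint
print(decide(0.80))  # fresh
```

Note the boundaries: a score of exactly 0.95 reuses, and exactly 0.85 still qualifies as a hint.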
What it saves you

A practical example.

"Add to cart" appears on 500 product pages. The graph stores one fingerprint and one translation per locale. TM then short-circuits re-translation for each variation ("Add this to cart", "Add to my cart", "+ Cart"): close matches are reused outright, looser ones feed the model as hints.

Across a typical 2,000-product WooCommerce catalogue with English source, TM hit rates of 60–80% are normal after the first month. Your translation cost compounds down, not up.

Quality controls

No memory pollution.

  • Only translations with quality_score ≥ 0.85 enter TM. Bad outputs never poison the well.
  • Rejected translations are explicitly never indexed, even if a human reviewer later approves a different version.
  • One-click TM purge per project, per locale, per unit type. Useful when you change brand voice.
  • TM is scoped to your project. Your translations never leak across organisations.
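
The admission rules above amount to a simple gate. This is a sketch under assumed field names (`status`, `was_rejected`, `quality_score`); only the 0.85 quality floor is from the text.

```python
# Hypothetical sketch of the TM admission gate described above.
# Field names are assumptions; the 0.85 floor is documented.

MIN_QUALITY = 0.85

def should_index(translation: dict) -> bool:
    """Only approved, never-rejected, high-quality translations enter TM."""
    return (
        translation.get("status") == "approved"
        and not translation.get("was_rejected", False)
        and translation.get("quality_score", 0.0) >= MIN_QUALITY
    )

print(should_index({"status": "approved", "quality_score": 0.92}))  # True
print(should_index({"status": "approved", "quality_score": 0.70}))  # False
```

Scoping to project, locale, and unit type would then be a filter on the lookup query itself, so one organisation's memory is never searchable by another's.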
Fuzzy matching

The middle band.

Hits between 0.85 and 0.95 are too good to ignore and not safe to reuse blindly. The cascade injects the matching translation into the prompt as a few-shot hint. The model produces something close to your prior translation; you save 30–40% of the cost of a fresh translation.
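
One way the few-shot injection could look, sketched as a prompt builder. The function and prompt wording are illustrative assumptions, not the cascade's actual template.

```python
# Hypothetical sketch: turning a fuzzy TM hit (0.85-0.95) into a few-shot
# hint for the model. Prompt wording and names are illustrative only.

def build_hint_prompt(source: str, match_source: str,
                      match_target: str, locale: str) -> str:
    """Prepend the fuzzy match as a worked example before the new request."""
    return (
        f"Translate into {locale}. Match the style of this prior translation:\n"
        f"Source: {match_source}\n"
        f"Translation: {match_target}\n\n"
        f"Source: {source}\n"
        f"Translation:"
    )

prompt = build_hint_prompt(
    source="Add to my cart",
    match_source="Add to cart",
    match_target="Ajouter au panier",
    locale="fr",
)
print(prompt)
```

Because the model sees a near-identical prior translation, it needs fewer output tokens and less correction, which is where the quoted 30–40% saving comes from.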

Start free in 30 seconds.

No credit card. 100,000 words on the free tier. Self-serve onboarding.