I would like to share my experience organizing the translation of content into different languages. I work on a multilingual website built on Next.js (with i18n routing) and MongoDB (Mongoose). The site contains a significant amount of text that is updated occasionally.
Google Translate is used to translate the project's texts. Initially, a simple translation function was created.
// @/lib/translate.ts
import "server-only";
import { z } from "zod";
import { v2 } from "@google-cloud/translate";
const { Translate } = v2;
const ApiKey = z.string().parse(process.env.GOOGLE_TRANSLATION_API_KEY);
const googleTranslate = new Translate({ key: ApiKey });
export async function translateFromEn(text: string, locale: string): Promise<string | null> {
try {
if (locale === 'en') return text.trim();
const [translation] = await googleTranslate.translate(text, locale);
return translation;
} catch (error) {
// The caller treats null as "translation unavailable".
return null;
}
}
It was then used everywhere when preparing pages. To avoid translating the same texts repeatedly, I wrapped the calls in unstable_cache from next/cache (despite its name, I have never had any problems with it).
// .../page.tsx
type Params = { locale: string };
export default async function Page({ params }: { params: Params }) {
const { locale } = params;
...
const translatedTitle = await unstable_cache(
async () => await translateFromEn(post.title, locale),
[`post:title:${post.slug}:${locale}`],
{ revalidate: false, tags: [`post:${post.slug}`] },
)();
const translatedShortDescription = await unstable_cache(
async () => await translateFromEn(post.metaCustom.shortdescription, locale),
[`post:shortDescription:${post.slug}:${locale}`],
{ revalidate: false, tags: [`post:${post.slug}`] },
)();
return {
...
}
}
Accordingly, whenever a blog post was updated, revalidateTag() was called with the post's tag and the translations were regenerated.
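The invalidation flow can be sketched like this. The tag format matches the tags option above; onPostUpdated and the injected invalidate callback are illustrative stand-ins, and in the real app invalidate would be Next.js's revalidateTag from "next/cache" (stubbed here so the sketch is self-contained):

```typescript
// `onPostUpdated` is a hypothetical update handler; `invalidate` stands
// in for Next.js's revalidateTag so the sketch runs standalone.
type Invalidate = (tag: string) => void;

function onPostUpdated(slug: string, invalidate: Invalidate): void {
  // ...save the updated post to MongoDB here...
  // One tag covers every cached field of the post (title,
  // shortDescription, ...), so a single call drops all of them.
  invalidate(`post:${slug}`);
}

const invalidated: string[] = [];
onPostUpdated("my-first-post", (tag) => invalidated.push(tag));
console.log(invalidated);
```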
While this was a working solution, it had a flaw: the cache was wiped every time the site was redeployed, so each release triggered a full round of re-translation and risked a large bill from Google.
Therefore, I added a feature that stores the translation results in the Mongo database. The idea is as follows:
- Create a unique ID for each translation entity.
- Before accessing the Google API, we check the database to see if there is a record with this ID and if it matches the text to be translated.
- If so, we return the record from the database.
- If not, we request a translation via the API, return the result, and save it in the database.
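The steps above can be sketched with an in-memory Map standing in for the Mongo collection and a stub in place of the Google API (both stand-ins are assumptions for illustration; the real lookup is asynchronous):

```typescript
import crypto from "node:crypto";

// In-memory stand-in for the Mongo collection: key -> { hash, translation }.
const db = new Map<string, { hash: string; translation: string }>();
let apiCalls = 0;

// Stub translator; the real code calls googleTranslate.translate() here.
function fakeTranslate(text: string, locale: string): string {
  apiCalls++;
  return `[${locale}] ${text}`;
}

function translateCached(text: string, key: string, locale: string): string {
  const hash = crypto.createHash("sha256").update(text).digest("hex");
  const old = db.get(key);
  // Same key, same text hash: cache hit, no API call.
  if (old && old.hash === hash) return old.translation;
  // New or changed text: translate and upsert the record.
  const translation = fakeTranslate(text, locale);
  db.set(key, { hash, translation });
  return translation;
}

translateCached("Hello", "ko-post-title-1", "ko"); // miss: first API call
translateCached("Hello", "ko-post-title-1", "ko"); // hit: no new call
translateCached("Hello!", "ko-post-title-1", "ko"); // changed text: second call
console.log(apiCalls); // 2
```

The same text with the same key never reaches the API twice; any edit to the text changes its hash and forces exactly one re-translation.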
I use hashing for simple text matching. With the new logic, the translateFromEn function looks like this:
import "server-only";
import crypto from "crypto";
import { z } from "zod";
import { TranslateResult, zTranslateResult } from "../mongoModels";
import { v2 } from "@google-cloud/translate";
const { Translate } = v2;
const ApiKey = z.string().parse(process.env.GOOGLE_TRANSLATION_API_KEY);
const googleTranslate = new Translate({ key: ApiKey });
// -------------------------------------------------------------------
type GetTranslationKeyProps = {
locale: string; // en, ko, zh...
entity: string; // user, post, comment
entityType: string; // shortDescription, metaTitle
entityId: string; // slug, _id ...
};
function getTranslationKey(props: GetTranslationKeyProps) {
const { locale, entity, entityType, entityId } = props;
return `${locale}-${entity}-${entityType}-${entityId}`;
}
// --------------------------------------------------------------------
// 1. If locale === 'en' (or the text is empty), just return the text
// 2. Generate the key and look up an existing translation in the DB
// 3. Generate a hash of the text
// 4. If a DB record exists and its hash matches, return the stored translation
// 5. If the hash differs or there is no record, request a new translation and upsert it
// 6. Return the translation
// --------------------------------------------------------------------
export async function translateFromEn(
text: string,
idData: GetTranslationKeyProps,
): Promise<string | null> {
try {
// 1. If locale === 'en' (or the text is empty), just return the text
if (idData.locale === "en" || text.trim().length === 0) {
return text.trim();
}
// 2. Generate Key and Check if translation exists in DB
const key = getTranslationKey(idData);
const oldTranslateDoc = await TranslateResult.findOne({
key,
}).lean();
// 3. Generate hash from text
const textHash = crypto.createHash("sha256").update(text).digest("hex");
// 4. If old Db Translation exist and hash is the same - return translation
if (oldTranslateDoc && oldTranslateDoc.hash === textHash) {
return oldTranslateDoc.translation.trim();
}
// 5. If the hash differs or there is no record, request a new translation
const translations = await googleTranslate.translate(text.trim(), idData.locale);
const translateResult = translations[0];
const newTranslationResult = zTranslateResult.parse({
key,
hash: textHash,
translation: translateResult,
});
await TranslateResult.findOneAndUpdate({ key }, newTranslationResult, {
upsert: true,
});
// 6. Return translation
return translateResult;
// ---
} catch (error) {
return null;
}
}
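For reference, here is the key that getTranslationKey produces for, say, a post's metaTitle in Korean (the values are illustrative):

```typescript
// Same key-builder as in the listing above, shown with sample values.
type GetTranslationKeyProps = {
  locale: string;
  entity: string;
  entityType: string;
  entityId: string;
};

function getTranslationKey(props: GetTranslationKeyProps): string {
  const { locale, entity, entityType, entityId } = props;
  return `${locale}-${entity}-${entityType}-${entityId}`;
}

const key = getTranslationKey({
  locale: "ko",
  entity: "post",
  entityType: "metaTitle",
  entityId: "my-first-post",
});
console.log(key); // ko-post-metaTitle-my-first-post
```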
This significantly reduced the number of calls to the Google Translate API and kept translation costs within budget.