Search results
More freely available translation models come from the Tatoeba translation challenge (CC-BY 4.0 license). 543 live demo APIs of language variants are available at Tiyaro.ai, for example an English-to-German fine-tuned translator. This repository includes two setups:
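A minimal sketch of loading one of these checkpoints with Hugging Face transformers. The model name `Helsinki-NLP/opus-mt-en-de` corresponds to the English-to-German variant mentioned above; the weights are downloaded on first use.

```python
# Sketch: English-to-German translation with an OPUS-MT / Tatoeba-challenge
# checkpoint via the transformers translation pipeline.
from transformers import pipeline

def make_en_de_translator(model_name: str = "Helsinki-NLP/opus-mt-en-de"):
    # Returns a callable translation pipeline; the checkpoint is fetched
    # from the Hugging Face Hub the first time this runs.
    return pipeline("translation", model=model_name)

if __name__ == "__main__":
    translator = make_en_de_translator()
    print(translator("The weather is nice today.")[0]["translation_text"])
```

Other language pairs follow the same `Helsinki-NLP/opus-mt-<src>-<tgt>` naming scheme on the Hub.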
ALMA is a many-to-many translation model that fine-tunes large language models with monolingual and parallel data. It has three generations: ALMA, ALMA-R, and X-ALMA, which support 6, 10, and 50 languages respectively.
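Because ALMA fine-tunes decoder-only LLMs, translation is driven by a prompt rather than by encoder-decoder language codes. The sketch below uses the prompt format published in the ALMA repository; treat the exact wording (and the `haoranxu/ALMA-7B` checkpoint name) as assumptions to verify against the generation of the model you actually load.

```python
# Sketch of the ALMA-style translation prompt. The template wording is an
# assumption based on the ALMA repository, not taken from this page.
def alma_prompt(src_lang: str, tgt_lang: str, text: str) -> str:
    return (
        f"Translate this from {src_lang} to {tgt_lang}:\n"
        f"{src_lang}: {text}\n"
        f"{tgt_lang}:"
    )

# The prompt is then passed to an ordinary causal-LM generate() call, e.g.
# with AutoTokenizer / AutoModelForCausalLM on "haoranxu/ALMA-7B".
prompt = alma_prompt("German", "English", "Das Leben ist schön.")
```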
19 Oct 2020 · Facebook AI is introducing M2M-100, the first multilingual machine translation model that can translate between any pair of 100 languages without relying on English data.
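A minimal sketch of running M2M-100 with transformers. The source language is set on the tokenizer and the target language is forced as the first generated token; `facebook/m2m100_418M` is the smallest public checkpoint (the 12B variant that SMaLL-100 distills follows the same API).

```python
# Sketch: any-to-any translation with M2M-100. The target language is
# selected by forcing its language token as the first decoder token.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

MODEL_NAME = "facebook/m2m100_418M"  # smallest public M2M-100 checkpoint

def translate(text: str, src: str, tgt: str) -> str:
    tokenizer = M2M100Tokenizer.from_pretrained(MODEL_NAME)
    model = M2M100ForConditionalGeneration.from_pretrained(MODEL_NAME)
    tokenizer.src_lang = src  # e.g. "fr"
    encoded = tokenizer(text, return_tensors="pt")
    generated = model.generate(
        **encoded,
        forced_bos_token_id=tokenizer.get_lang_id(tgt),  # e.g. "en"
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

if __name__ == "__main__":
    print(translate("La vie est belle.", "fr", "en"))
```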
We introduce SMaLL-100, a distilled version of the M2M-100 (12B) model, a massively multilingual machine translation model covering 100 languages. We train SMaLL-100 with uniform sampling across all language pairs and therefore focus on preserving the performance of low-resource languages.
5 Jun 2024 · The translation models cover 200 languages; the NLLB models come in multiple sizes (54.5B MoE, 3.3B and 1.3B Dense, and 1.3B and 600M distilled). The language identification models contain...
We have designed a set of 28 multilingual translation prompts that encompass various application scenarios for multilingual translation. We randomly select a prompt from the set for instruction tuning for each parallel sentence.
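The setup above can be sketched as follows: each parallel sentence pair is wrapped in a translation prompt drawn uniformly at random from a template pool. The two templates below are illustrative placeholders, not the actual 28 prompts from the set.

```python
# Sketch: uniform random prompt selection for instruction tuning on
# parallel data. PROMPT_POOL stands in for the set of 28 templates.
import random

PROMPT_POOL = [
    "Translate the following {src} sentence into {tgt}: {text}",
    "What is the {tgt} translation of this {src} text? {text}",
]

def build_example(src: str, tgt: str, source_text: str, target_text: str,
                  rng: random.Random) -> dict:
    # One prompt is sampled uniformly per parallel sentence pair.
    template = rng.choice(PROMPT_POOL)
    return {
        "instruction": template.format(src=src, tgt=tgt, text=source_text),
        "output": target_text,
    }

example = build_example("English", "German", "Good morning.", "Guten Morgen.",
                        random.Random(0))
```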
Meta AI has built a single AI model, NLLB-200, that is the first to translate across 200 different languages with state-of-the-art quality that has been validated through extensive evaluations for each of them. We’ve also created a new evaluation dataset, FLORES-200, and measured NLLB-200’s performance in each language to confirm that the ...
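A minimal sketch of running NLLB-200 with transformers, using the distilled 600M checkpoint. NLLB identifies languages with FLORES-200 codes such as `eng_Latn` and `deu_Latn`; the source code is set on the tokenizer and the target code is forced as the first generated token.

```python
# Sketch: NLLB-200 translation with FLORES-200 language codes.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "facebook/nllb-200-distilled-600M"  # smallest NLLB checkpoint

def translate(text: str, src_code: str, tgt_code: str) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME, src_lang=src_code)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)
    inputs = tokenizer(text, return_tensors="pt")
    generated = model.generate(
        **inputs,
        # Force the target-language token as the first decoder token.
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(tgt_code),
        max_length=64,
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

if __name__ == "__main__":
    print(translate("Life is beautiful.", "eng_Latn", "deu_Latn"))
```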