News·January 31, 2025
Building Multi-Language Support in the Age of LLMs
By Jack Deng
It's wild to look back at how AI has evolved. Before LLMs took over everything, we had Seq2Seq (shoutout to Ilya Sutskever's 2014 paper) and Neural Machine Translation, which made headlines when Google Translate switched over in 2016. Then came Attention Is All You Need (2017), GPT (2018), ChatGPT (2022), o1 (2024), and now an open-weight equivalent (R1) in 2025. What a ride.
By early 2023, it was clear to us that chatting would become a core interface for applications—ChatGPT was training an entire generation to interact this way. Multi-language support has always been a challenge in mortgage systems, where traditional i18n is heavy lifting with low ROI. But with LLMs, a new approach emerged: language support comes built-in with LLMs, so we can shift a meaningful portion of functionality to a chat interface.
That said, we didn’t start building until late 2024—because, well, "priorities" (and in an AI boom, waiting can be a more cost-effective strategy). The top models today are incredible at instruction following, translation, and transcription—but, as expected, hallucinations are still common. That reinforced what we already believed: a chat-only UI is not the answer, especially in financial systems where compliance and accuracy are non-negotiable. Users love to chat, but they also need clarity and confidence that they’re getting the right results.
So while enabling multi-language support in chat was relatively straightforward with prompting, the real work was elsewhere—i18n for our React Native web and app, i18n for our Go backend, and multi-language support for our templating system. Users want both determinism and configurability, and doing it right requires a solid foundation.
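To make the "determinism" side concrete, here is a minimal sketch of the kind of message-catalog lookup a Go backend i18n layer rests on. The locales, message keys, and the `T` helper are all hypothetical, and a real system would load the catalog from generated artifacts rather than hard-code it:

```go
package main

import "fmt"

// catalog maps locale -> message key -> translated string.
// In a real system these entries come from generated translation
// artifacts, not source code.
var catalog = map[string]map[string]string{
	"en": {"loan.approved": "Your loan has been approved."},
	"es": {"loan.approved": "Su préstamo ha sido aprobado."},
}

// T looks up key in the given locale, falling back to English so a
// missing translation never surfaces to the user as a blank string.
func T(locale, key string) string {
	if msg, ok := catalog[locale][key]; ok {
		return msg
	}
	return catalog["en"][key]
}

func main() {
	fmt.Println(T("es", "loan.approved"))
	fmt.Println(T("fr", "loan.approved")) // unknown locale: falls back to English
}
```

The deterministic fallback is the point: unlike an LLM translation at request time, a catalog lookup always returns the same reviewed string, which matters for compliance-sensitive copy.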
The good news? What used to be a tedious, months-long effort was dramatically accelerated by better tooling—code parsers, GitHub Actions, and, of course, LLMs. Our engineers automated most of the process, from identifying translation points to generating translation artifacts, with minimal hand-written code. Within weeks, we had a release-ready version, all without disrupting core feature development.
This is the new era of software development—AI and LLMs aren't just part of the product; they're revolutionizing the development process itself. 🚀