Google AI Edge provides the tools to run AI features on-device, and its new LiteRT-LM runtime is a significant step forward for generative AI. LiteRT-LM offers an open-source C++ API, cross-platform support, and hardware acceleration, and is designed to run large language models such as Gemma and Gemini Nano efficiently across a wide range of hardware. Its key innovation is a flexible, modular architecture that can scale up to power complex, multi-task features in Chrome and on Chromebook Plus, while remaining lean enough for resource-constrained devices like the Pixel Watch. This versatility is already enabling a new wave of on-device generative AI, bringing capabilities like WebAI and smart replies to users.
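To make the engine/session idea concrete, here is a rough C++ sketch of what loading a Gemma model and generating text through such a runtime could look like. Everything in it is an assumption made for illustration: the header path, namespace, and class and method names (ModelAssets, Engine, Session, GenerateContent) as well as the model path are hypothetical and not taken from the actual LiteRT-LM headers, so treat it as a shape of the workflow rather than the library's real interface.

```cpp
// Illustrative sketch only: headers, namespaces, and signatures below are
// assumptions about what a LiteRT-LM style C++ API could look like; consult
// the LiteRT-LM repository for the real interface.
#include <iostream>
#include <string>

#include "litert_lm/engine.h"  // assumed header

int main() {
  using namespace litert::lm;  // assumed namespace

  // Load an on-device model file (path and file name are illustrative).
  auto assets = ModelAssets::Create("/data/local/tmp/gemma.litertlm");

  // The engine owns the model weights and the hardware backend, so several
  // features can share a single copy of the model in memory.
  auto engine = Engine::CreateEngine(
      EngineSettings::CreateDefault(assets, Backend::CPU));

  // Each feature (smart reply, summarization, ...) gets its own lightweight
  // session with independent conversation state on top of the shared engine.
  auto session = engine->CreateSession(SessionConfig::CreateDefault());

  // Run a single prompt and print the generated text.
  auto response = session->GenerateContent(
      {InputText("Write a short reply accepting the meeting invite.")});
  std::cout << response << "\n";
  return 0;
}
```

The engine/session split sketched above is one plausible reading of the "modular architecture" the post describes: a heavyweight engine holds the model once, while cheap per-feature sessions scale from a single watch-sized task up to many concurrent tasks in Chrome.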