Opera integrates 150 local AI language models in developer stream
Opera, the Nordic browser developer, is adding experimental support for 150 local Large Language Model (LLM) variants to the Opera One developer stream. Models such as Llama, Vicuna, Gemma, and Mixtral are part of the lineup. The key benefit of a local LLM is privacy: prompts are processed entirely on the user's machine, with no data sent to a server, which also enables offline use.
Opera says it is the first major browser to offer built-in access to local AI models, letting users download and manage local LLMs directly through an in-browser feature that complements Opera's online Aria AI service. The move reflects Opera's push to bring AI models closer to users and enhance the browsing experience while preserving data privacy: everything these models process is stored and handled locally on the user's device.
Opera is introducing support for local AI models as part of its AI Feature Drops Program. Krystian Kolondra, EVP of Browsers and Gaming at Opera, emphasised the importance of the decision: "Introducing Local LLMs in this way allows Opera to start exploring ways of building experiences and know-how within the fast-emerging local AI space." The program targets early adopters who are keen to test the browser's AI feature set at an experimental stage.
Opera One Developer users can now choose which model processes their data. To experiment with these models, users must update to the latest developer build and follow several steps to enable the new feature. Selecting a local LLM downloads it to the device, and it then replaces Aria, Opera's native browser AI, until the user starts a new session with the AI or switches back to Aria.
Each local LLM variant requires 2-10 GB of local storage and can be enabled according to the user's preference, serving as an alternative to Opera's online Aria AI service. The launch marks a significant step towards a user-centric AI experience in which users retain full control over their data.
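To give a rough sense of what that storage requirement means in practice, the following Python sketch checks whether a machine has room for a chosen variant before downloading it. The model names and per-model sizes here are illustrative assumptions within the article's 2-10 GB range, not Opera's actual catalogue:

```python
import shutil

# Illustrative sizes only; the 2-10 GB range comes from the article,
# the specific per-model figures are assumptions.
MODEL_SIZES_GB = {
    "gemma-2b": 2,
    "llama-7b": 5,
    "mixtral-8x7b": 10,
}

def can_download(model: str, path: str = "/") -> bool:
    """Return True if the disk at `path` can hold the chosen variant,
    keeping a 1 GB safety margin free."""
    free_gb = shutil.disk_usage(path).free / 1e9
    return free_gb >= MODEL_SIZES_GB[model] + 1

if __name__ == "__main__":
    for name in MODEL_SIZES_GB:
        print(f"{name}: {'ok' if can_download(name) else 'not enough space'}")
```

A browser performing such a check before fetching multi-gigabyte model weights avoids failed downloads on storage-constrained devices.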
Opera has a history of pushing AI integration into browser functionality. In 2023, the company introduced Opera One, an AI-centric flagship browser built on a new modular architecture. The browser used a multithreaded compositor for smoother rendering of UI elements and featured the Aria browser AI, accessible from the browser sidebar and command line. Aria was also made available in the gamer-centric Opera GX and in the Opera browser on iOS and Android.
Opera's integration of local LLMs into its browser is a milestone in its mission to deliver user-centric, innovative browsing experiences across devices while safeguarding user privacy.