

Photo by Editor | ChatGPT
In the current era of large language model (LLM) products like ChatGPT and Gemini, we rely on these tools to boost our productivity by helping with different tasks, such as answering questions, summarizing documents, planning activities, and more. These tools have become part of our daily lives.
However, many of these products are hosted in the cloud, and we need to access them through their platforms. In addition, each platform is usually limited to its proprietary model and does not allow the use of other LLMs. This is why the Ollama application was designed: to help users who want to run various LLMs in their local environment.
Ollama has been around for some time and has already helped many users run language models locally. The latest update has made it even more powerful.
Let's explore why Ollama's new application is becoming an essential tool for many users.
Ollama's New App
As mentioned, Ollama is an open-source tool that runs LLMs in a local environment without relying on cloud hosting. We can browse the available models on Ollama's website, download them, and run them directly through the application interface. The application works as a local model manager that hosts the various models for us. Running models locally like this gives the user freedom while preserving data privacy and reducing latency.
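As a minimal sketch of that workflow, the script below sends a one-off prompt to Ollama's local REST API, which listens on port 11434 by default; the model name `llama3.2` is only an illustration, and the model must already be available locally:

```python
import json
import urllib.request

# Ollama's local REST API listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434"

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a one-off prompt to a locally running Ollama server."""
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server with the model pulled):
# print(generate("llama3.2", "Why is the sky blue?"))
```

This is the same request the GUI makes on your behalf, which is why everything stays on your machine.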
The biggest development is that Ollama now ships as a standalone GUI app, rather than being driven entirely through the command line. Gone are the days of having to search for, install, and configure a third-party UI app (or write your own) to make Ollama a more pleasant experience. You can certainly still do all of that, but it is no longer necessary.
With the new update, the Ollama application has become more useful than ever, and we will look at each of its features individually.
The Ollama application allows users to download models to their local environment and use LLMs locally. You can interact by selecting a model and providing prompts to get a result.
Ollama maintains your conversation history, allowing you to ask follow-up questions.
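A scripted equivalent of that follow-up behavior keeps the running message history and resends it with each request to the `/api/chat` endpoint; this is a sketch of the pattern, not the app's internal implementation:

```python
import json
import urllib.request

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # default local endpoint

def add_turn(history: list, role: str, content: str) -> list:
    """Append one turn to the running conversation history."""
    history.append({"role": role, "content": content})
    return history

def chat(model: str, messages: list) -> dict:
    """Send the full message history so the model can follow the conversation."""
    body = json.dumps(
        {"model": model, "messages": messages, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_CHAT_URL, data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]

# Usage sketch (requires a running Ollama server):
# history = add_turn([], "user", "Who wrote Moby-Dick?")
# reply = chat("llama3.2", history)
# add_turn(history, reply["role"], reply["content"])
# add_turn(history, "user", "When was it published?")  # follow-up relies on history
```

Because the whole history is resent each time, the model can resolve references like "it" in the follow-up question.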
If a model is not yet available locally, the new Ollama application will automatically download it before processing your prompt. This simplifies the user experience, as models no longer need to be downloaded separately before use.
Another new feature is the ability to chat with your files. By dragging and dropping a file onto the Ollama app, we can ask questions about its contents.
The result is shown below, where the model accesses the file and processes the prompt based on the provided document.
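For plain-text files, the drag-and-drop behavior can be approximated by inlining the file's contents into the prompt yourself; this is a rough sketch of the idea, not necessarily how the app handles documents internally:

```python
from pathlib import Path

def build_file_prompt(path: str, question: str) -> str:
    """Inline a text file's contents into the prompt, approximating the
    app's drag-and-drop 'chat with your files' feature for plain text."""
    contents = Path(path).read_text(encoding="utf-8")
    return f"Here is a document:\n\n{contents}\n\nQuestion: {question}"

# The resulting string can then be sent as the prompt to a local model,
# e.g. via Ollama's /api/generate endpoint.
```

Binary formats like PDF and Word would first need their text extracted, which the app handles for you.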
If you need to work with many large documents, you can increase Ollama's context length through its settings. However, increasing the context length requires more memory to ensure stable performance.
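When driving Ollama through its API instead of the GUI settings, the same trade-off is exposed through the `num_ctx` option; the values below are illustrative only:

```python
def build_options_payload(model: str, prompt: str, num_ctx: int = 8192) -> dict:
    """Build a /api/generate request body that asks for a larger context
    window via Ollama's `num_ctx` option. Larger values consume more memory."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"num_ctx": num_ctx},  # context length in tokens
    }

# Example: request a 16K-token context for a long document
# payload = build_options_payload("llama3.2", long_document_prompt, num_ctx=16384)
```

Doubling `num_ctx` roughly doubles the memory the KV cache needs, which is why the settings warn about stability.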
Ollama's new features are not limited to documents like PDF and Word files. It also offers multimodal support, provided the selected model can process different data types. For example, we can use a model like Llama to work on an image, as shown below.
The result is shown below.
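Through the API, image input for a vision-capable model is passed as base64 strings in an `images` field; the model name `llava` here is an assumption and can be swapped for any multimodal model you have pulled:

```python
import base64
from pathlib import Path

def build_image_payload(model: str, prompt: str, image_path: str) -> dict:
    """Build a /api/generate request body with an attached image for a
    vision-capable (multimodal) model."""
    image_b64 = base64.b64encode(Path(image_path).read_bytes()).decode()
    return {
        "model": model,          # must be a vision-capable model
        "prompt": prompt,
        "images": [image_b64],   # Ollama expects base64-encoded image data
        "stream": False,
    }

# Example (requires a running Ollama server and a vision model):
# payload = build_image_payload("llava", "Describe this image", "photo.png")
```

The GUI does this encoding for you when you drop an image into the chat.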
Finally, the new Ollama application can process code files to produce documentation. This feature is especially beneficial for developers.
The result is shown below.
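A scripted version of this could wrap a source file in a documentation-writing prompt before sending it to a local model; the instruction text below is just an example, not the app's actual prompt:

```python
from pathlib import Path

# Hypothetical instruction; the app's real prompt may differ.
DOC_INSTRUCTION = (
    "Write documentation for the following code: explain what it does, "
    "its inputs and outputs, and give a short usage example.\n\n"
)

def build_doc_prompt(code_path: str) -> str:
    """Wrap a source file's contents in a documentation-writing prompt."""
    return DOC_INSTRUCTION + Path(code_path).read_text(encoding="utf-8")

# The resulting prompt can be sent to any local model via Ollama's API.
```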
These are the key new features Ollama provides to improve your productivity. We encourage you to try them in your own work.
Wrapping Up
Ollama is an application that allows users to run LLMs in a local environment. It is a valuable tool for anyone who wants to test different models while keeping their data private. Its new release brings more powerful features, including a usable chat UI, easy model downloads, the ability to chat with your files, multimodal support, and code processing.
I hope this has helped!
Cornellius Yudha Wijaya is a data science assistant manager and data writer. While working full-time at Allianz Indonesia, he loves to share Python and data tips via social media and writing media. Cornellius writes on a variety of AI and machine learning topics.