Ollama, a tool for running large language models such as Llama 2 locally, now supports AMD graphics cards



'Ollama' is a tool that makes it relatively easy to run large language models (LLMs) such as 'Llama 2', 'Mistral', 'Vicuna', and 'LLaVA' locally. Ollama now supports AMD graphics cards.

Ollama now supports AMD graphics cards · Ollama Blog
https://ollama.com/blog/amd-preview

Ollama is published for Windows, macOS, and Linux, and official Docker images are also distributed. The following article explains in detail how to install Docker on Debian, start the official Ollama Docker image, and run an LLM.

Official Docker image of 'Ollama', an application that allows you to easily run various chat AIs in the local environment, is now available - GIGAZINE

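As a rough sketch of the Docker route described above, the commands below follow the usage shown on Ollama's Docker Hub page (the volume name `ollama` and the model choice `llama2` are just examples):

```shell
# Start the official Ollama image, persisting downloaded models in a named volume
# and exposing the API on port 11434 (CPU mode; no GPU flags)
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Run a model inside the container (downloads the model on first use)
docker exec -it ollama ollama run llama2
```

For AMD GPUs, the announcement points to a ROCm variant of the image, which additionally needs the host's GPU device nodes passed through, e.g. `--device /dev/kfd --device /dev/dri` with the `ollama/ollama:rocm` tag.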


Ollama can run in CPU mode or GPU mode. Until now, GPU mode supported only NVIDIA graphics cards, but on March 14, 2024, support for AMD graphics cards was announced. The graphics cards supported at the time of writing are as follows.

◆AMD Radeon RX series
・AMD Radeon RX 7900 XTX
・AMD Radeon RX 7900 XT
・AMD Radeon RX 7900 GRE
・AMD Radeon RX 7800 XT
・AMD Radeon RX 7700 XT
・AMD Radeon RX 7600 XT
・AMD Radeon RX 7600
・AMD Radeon RX 6950 XT
・AMD Radeon RX 6900 XTX
・AMD Radeon RX 6900 XT
・AMD Radeon RX 6800 XT
・AMD Radeon RX 6800
・AMD Radeon RX Vega 64
・AMD Radeon RX Vega 56

◆AMD Radeon PRO series
・AMD Radeon PRO W7900
・AMD Radeon PRO W7800
・AMD Radeon PRO W7700
・AMD Radeon PRO W7600
・AMD Radeon PRO W7500
・AMD Radeon PRO W6900X
・AMD Radeon PRO W6800X Duo
・AMD Radeon PRO W6800X
・AMD Radeon PRO W6800
・AMD Radeon PRO V620
・AMD Radeon PRO V420
・AMD Radeon PRO V340
・AMD Radeon PRO V320
・AMD Radeon PRO Vega II Duo
・AMD Radeon PRO Vega II
・AMD Radeon PRO VII
・AMD Radeon PRO SSG

◆AMD Instinct series
・AMD Instinct MI300X
・AMD Instinct MI300A
・AMD Instinct MI300
・AMD Instinct MI250X
・AMD Instinct MI250
・AMD Instinct MI210
・AMD Instinct MI200
・AMD Instinct MI100
・AMD Instinct MI60
・AMD Instinct MI50

The list of supported graphics cards is expected to grow in the future.
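On a machine with a supported graphics card, usage is the same as before: the `ollama` CLI and the HTTP API on port 11434 work identically whether inference runs on CPU, an NVIDIA GPU, or an AMD GPU. A minimal sketch (the model name `llama2` and the prompt are just examples):

```shell
# Pull and run a model from the command line (the model downloads on first run)
ollama run llama2 "Why is the sky blue?"

# The same request through Ollama's local HTTP API
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```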

The Ollama source code is available at the link below.

ollama/ollama: Get up and running with Llama 2, Mistral, Gemma, and other large language models
https://github.com/ollama/ollama



◆Forum now open
A forum related to this article has been set up on the GIGAZINE official Discord server. Anyone is free to post, so please feel free to comment! If you do not have a Discord account, see the article explaining how to create one.

• Discord | 'When news of an AI model release comes out, what do you focus on? What matters to you: "Will it run on my PC?", "How good is its performance?", "Who is the developer?"' | GIGAZINE
https://discord.com/channels/1037961069903216680/1219572135463354399

in Software, Hardware, Posted by log1o_hf