Local LLMs

Working with Local LLMs (On Your Own Computer!) — Ollama and Llama 3

This tutorial demonstrates how to run LLMs locally on your own computer using Ollama and Meta’s Llama 3 model. Running models locally means your data never leaves your machine, which is especially important for researchers working with sensitive or copyrighted materials.

The tutorial covers:

  • Setting up Ollama and downloading models

  • Creating structured data from unstructured text

  • Chatting with a local LLM

  • Generating document embeddings

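The setup step in the list above boils down to a few terminal commands. The install script below is Ollama's documented Linux installer; on macOS and Windows the app is downloaded from ollama.com instead. Note that `llama3` is a multi-gigabyte download.

```shell
# Install Ollama (Linux; macOS and Windows users download the app from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the Llama 3 model weights (several GB; a one-time download)
ollama pull llama3

# Start an interactive chat session with the model in the terminal
ollama run llama3
```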
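For the structured-data step, one common pattern is to ask the model to answer in JSON and then parse its reply. The sketch below (the prompt wording, function names, and canned reply are illustrative assumptions, not from the tutorial) shows the prompt-building and parsing halves; the actual model call would go through Ollama.

```python
import json

def build_extraction_prompt(text):
    """Wrap a passage in an instruction asking the model for JSON-only output."""
    return (
        "Extract the people and places mentioned in the passage below. "
        "Respond with only a JSON object of the form "
        '{"people": [...], "places": [...]}.\n\n' + text
    )

def parse_entities(raw_reply):
    """Parse the model's reply, tolerating stray text around the JSON object."""
    start, end = raw_reply.find("{"), raw_reply.rfind("}") + 1
    return json.loads(raw_reply[start:end])

# A reply like a local Llama 3 model might produce (canned here for illustration):
reply = 'Sure! {"people": ["Ada Lovelace"], "places": ["London"]}'
entities = parse_entities(reply)
print(entities["people"])  # → ['Ada Lovelace']
```

Local models do not always return perfectly clean JSON, which is why the parser trims any chatty text around the braces rather than feeding the raw reply to `json.loads`.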
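Chatting with a local model works by sending a list of role-tagged messages to the Ollama server, which listens on port 11434 by default. This sketch builds the request body for Ollama's `/api/chat` endpoint; the network call is shown commented out so the example stands alone, and the helper name is my own.

```python
import json

def make_chat_payload(history, user_message, model="llama3"):
    """Append the user's turn and build the JSON body Ollama's /api/chat expects."""
    messages = history + [{"role": "user", "content": user_message}]
    return {"model": model, "messages": messages, "stream": False}

payload = make_chat_payload([], "Summarize this paragraph in one sentence: ...")
print(json.dumps(payload, indent=2))

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request("http://localhost:11434/api/chat",
#                              data=json.dumps(payload).encode(), method="POST")
# reply = json.loads(urllib.request.urlopen(req).read())["message"]["content"]
```

Keeping the full `history` list and re-sending it with each turn is what gives the chat its memory; nothing is stored on the server side.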
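Finally, document embeddings are vectors of numbers, and the usual way to compare them is cosine similarity. The vectors below are made up for illustration (real embeddings from Ollama's embeddings endpoint have hundreds or thousands of dimensions), but the comparison logic is the standard one.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

doc_a = [0.1, 0.8, 0.3]   # hypothetical embedding of document A
doc_b = [0.2, 0.7, 0.4]   # hypothetical embedding of a similar document
doc_c = [0.9, 0.1, 0.0]   # hypothetical embedding of an unrelated document

# Similar documents should score higher than unrelated ones:
print(cosine_similarity(doc_a, doc_b) > cosine_similarity(doc_a, doc_c))  # → True
```

Because everything runs locally, the embeddings of sensitive texts can be computed and compared without any document leaving the machine.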
From AI for Humanists