2025 Barcamp Session Proposal: Using the OpenRefine LLM Extension

… with Local and Remote Models

Description

LLMs are powerful tools for cleaning and enriching data, extracting entities, and generating translations. Thanks to @Sunil_Natraj, there is an excellent LLM extension for OpenRefine that enables the use of local and remote models with apps and services such as Ollama, llama.cpp, and OpenRouter, as well as most other AI services based on the OpenAI API.

In this session, I will demonstrate how to install and set up the extension in OpenRefine. Following the demonstration, I would like to discuss use cases and applications for AI in the context of data wrangling.
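Most of the providers mentioned above speak the OpenAI chat-completions API, so it can help to know roughly what the extension sends on your behalf. Below is a minimal sketch of such a request for a typical cleaning task; the endpoint URL, model name, and prompt are illustrative assumptions, not values taken from the extension itself:

```python
import json

# Hypothetical local endpoint: Ollama exposes an OpenAI-compatible API here.
# (URL and model name are assumptions for a default local install.)
ENDPOINT = "http://localhost:11434/v1/chat/completions"

def build_cleanup_request(cell_value: str, model: str = "llama3") -> str:
    """Build an OpenAI-style chat-completions payload asking the model
    to normalize a messy cell value, as an OpenAI-API-compatible
    provider would receive it."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Normalize the date to ISO 8601. Reply with the date only."},
            {"role": "user", "content": cell_value},
        ],
        "temperature": 0,  # deterministic output is preferable for data cleaning
    }
    return json.dumps(payload)

print(build_cleanup_request("March 3rd, 1998"))
```

POSTing that JSON to the endpoint (with any required API key header for remote services) returns the model's answer in `choices[0].message.content`, which is essentially what ends up back in your OpenRefine cells.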

Format

Presentation and Discussion
Duration: 30 minutes


This session was cancelled as @Michael_Markert was not available to present. Instead, @Martin gave a small demo of how the extension works, and the group discussed the use of LLMs with OpenRefine. The notes from the shared Etherpad are available here.

Interesting. I missed the demo. I managed to install the extension, but is good documentation available on the next step (setting up an LLM provider in OpenRefine)?

You can find some documentation directly in the GitHub repo. I suggest starting with these pages: