Using the Model Context Protocol (MCP) With a Local LLM

5 min read · Mar 27, 2025

Any large language model that supports function calling (or tool use) can make use of the Model Context Protocol. In this tutorial, we’ll start by creating an MCP server that offers a few tools, and then get a small Llama 3.2 model to use those tools.

If you are not familiar with MCP, please consider reading the following article first:

The MCP server we’re going to create today will offer three tools: ls to list the contents of a directory, cat to read the contents of a file, and echo to write something to a file.
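To make the scope concrete, here is roughly what each of those three operations looks like as an ordinary Python function, using nothing but the standard library. This is only a sketch of the behaviour our tools will wrap, and the exact signatures (for example, echo taking both the text and a target path) are assumptions for illustration; the actual MCP wiring comes after we set up the server below.

import os

def ls(path: str) -> list[str]:
    # List the names of the entries in a directory.
    return os.listdir(path)

def cat(path: str) -> str:
    # Return the text contents of a file.
    with open(path, "r") as f:
        return f.read()

def echo(text: str, path: str) -> str:
    # Write the given text to a file and report what was written.
    with open(path, "w") as f:
        f.write(text)
    return f"Wrote {len(text)} characters to {path}"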

We start by installing the mcp library using pip.

pip install mcp
pip list | grep mcp

# Output:
# mcp 1.5.0

I’m going to call my MCP server “Local Agent Helper”. It’s a fitting name, because our local LLM will behave like a simple agent once it can communicate with this server.
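As a rough sketch of what that server can look like, here is a minimal version built with the FastMCP helper that ships with the mcp package. The tool bodies mirror the plain functions above; the file name, docstrings, and exact signatures are my assumptions for illustration, not the final code.

# local_agent_helper.py (hypothetical file name)
import os
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Local Agent Helper")

@mcp.tool()
def ls(path: str) -> list[str]:
    """List the contents of a directory."""
    return os.listdir(path)

@mcp.tool()
def cat(path: str) -> str:
    """Read the contents of a file."""
    with open(path, "r") as f:
        return f.read()

@mcp.tool()
def echo(text: str, path: str) -> str:
    """Write text to a file."""
    with open(path, "w") as f:
        f.write(text)
    return f"Wrote {len(text)} characters to {path}"

if __name__ == "__main__":
    # Serve over stdio so a local client process can launch and talk to this server.
    mcp.run(transport="stdio")

Registering each function with the @mcp.tool() decorator is what exposes it to clients: the SDK derives the tool’s name, input schema, and description from the function signature and docstring, which is exactly the metadata a function-calling model needs in order to decide when to call it.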

Written by Ashraff Hathibelagal

Computer Scientist | Writer | Effective Accelerationist
