May 25, 2024

Create Local AI Agents with LangGraph Easily with Ollama

Fahd Mirza

This video is a step-by-step tutorial on creating AI agents locally with LangGraph and Ollama.



Code:


conda create -n langgraph python=3.11
conda activate langgraph

export OPENAI_API_KEY=""
export TAVILY_API_KEY=""

pip install -U langchain-nomic langchain_community tiktoken langchainhub chromadb langchain langgraph tavily-python
pip install langchain-openai

ollama pull mistral

from langchain_community.tools.tavily_search import TavilySearchResults
from langchain import hub
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

# Ollama exposes an OpenAI-compatible API at /v1; any non-empty api_key works.
llm = ChatOpenAI(api_key="ollama", model="mistral", base_url="http://localhost:11434/v1")

tools = [TavilySearchResults(max_results=1)]

prompt = hub.pull("wfh/react-agent-executor")
prompt.pretty_print()

# create_react_agent binds the tools itself, so pass the base model directly
# rather than a model that has already had bind_tools applied.
agent_executor = create_react_agent(llm, tools, messages_modifier=prompt)

response = agent_executor.invoke({"messages": [("user", "What is Oracle database")]})

for message in response['messages']:
    print(message.content)
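The final loop works because the agent returns the full graph state: a dict whose "messages" list accumulates every turn of the ReAct run (the user input, the model's tool-calling turn, the Tavily tool result, and the final answer). A minimal sketch of that shape, using a hypothetical Msg class and made-up sample contents purely for illustration:

```python
# Hypothetical stand-in for the message objects LangGraph returns;
# the real objects are LangChain message classes, but they also expose .content.
class Msg:
    def __init__(self, role, content):
        self.role = role
        self.content = content

# Illustrative state after invoke(): messages accumulate in order.
response = {"messages": [
    Msg("user", "What is Oracle database"),
    Msg("ai", ""),                       # tool-calling turn; content is often empty
    Msg("tool", "search results ..."),   # Tavily tool output
    Msg("ai", "Oracle Database is a relational database ..."),
]}

# Same loop as in the tutorial: print the content of every accumulated message.
for message in response["messages"]:
    print(message.content)
```

This is why the tutorial iterates over response['messages'] rather than printing a single string: the useful answer is the content of the last AI message, but the intermediate turns show the agent's tool use.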
