A Code Implementation for Building a Context-Aware AI Assistant in Google Colab Using LangChain, LangGraph, Gemini, and Model Context Protocol (MCP) Principles with Tool Integration Support
In this hands-on tutorial, we bring the core principles of the Model Context Protocol (MCP) to life by implementing a lightweight, context-aware AI assistant using LangChain, LangGraph, and Google's Gemini language model. While full MCP integration typically involves dedicated servers and communication protocols, this simplified version demonstrates how the same ideas (context retrieval, tool invocation, and dynamic interaction) can be recreated in a single notebook using a modular agent architecture. The assistant can respond to natural language queries and selectively route them to external tools (like a custom knowledge base), mimicking how MCP clients interact with context providers in real-world setups.
!pip install langchain langchain-google-genai langgraph python-dotenv
!pip install google-generativeai
First, we install essential libraries. The first command installs LangChain, LangGraph, the Google Generative AI LangChain wrapper, and environment variable support via python-dotenv. The second command installs Google's official generative AI client, which enables interaction with Gemini models.
import os
os.environ["GEMINI_API_KEY"] = "Your API Key"
Here, we set your Gemini API key as an environment variable so the model can securely access it without hardcoding it into your codebase. Replace "Your API Key" with your actual key from Google AI Studio.
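Since python-dotenv is already installed, you can alternatively keep the key out of the notebook entirely. Here is a minimal sketch, assuming a .env file in the working directory that contains a GEMINI_API_KEY entry:

import os
# Alternative: load the key from a .env file instead of hardcoding it.
# Assumes .env contains a line like: GEMINI_API_KEY=your-actual-key
from dotenv import load_dotenv

load_dotenv()  # reads .env and populates os.environ
assert os.getenv("GEMINI_API_KEY"), "GEMINI_API_KEY not found in environment"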
from langchain_core.tools import BaseTool
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.messages import HumanMessage, AIMessage
from langgraph.prebuilt import create_react_agent
import os

# Initialize the Gemini chat model, reading the API key from the environment.
model = ChatGoogleGenerativeAI(
    model="gemini-2.0-flash-lite",
    temperature=0.7,
    google_api_key=os.getenv("GEMINI_API_KEY")
)

# A toy context provider: returns canned answers for a few AI concepts.
class SimpleKnowledgeBaseTool(BaseTool):
    name: str = "simple_knowledge_base"
    description: str = "Retrieves basic information about AI concepts."

    def _run(self, query: str):
        knowledge = {
            "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic designed to connect AI assistants with external data sources, enabling real-time, context-rich interactions.",
            "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses by dynamically retrieving relevant external documents."
        }
        return knowledge.get(query, "I don't have information on that topic.")

    async def _arun(self, query: str):
        return self._run(query)

kb_tool = SimpleKnowledgeBaseTool()
tools = [kb_tool]
graph = create_react_agent(model, tools)
In this block, we initialize the Gemini language model (gemini-2.0-flash-lite) using LangChain's ChatGoogleGenerativeAI, with the API key securely loaded from environment variables. We then define a custom tool named SimpleKnowledgeBaseTool that simulates an external knowledge source by returning predefined answers to queries about AI concepts like "MCP" and "RAG". This tool acts as a basic context provider, similar to how an MCP server would operate. Finally, we use LangGraph's create_react_agent to build a ReAct-style agent that can reason through prompts and dynamically decide when to call tools, mimicking MCP's principle of tool-aware, context-rich interactions.
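Before wiring up the full chat loop, a quick single-turn sanity check confirms the agent can route a question to the knowledge base tool. This is a minimal sketch; the query string is just an example:

from langchain_core.messages import HumanMessage  # already imported above

# Single-turn test: the ReAct agent should call simple_knowledge_base for this query.
result = graph.invoke({"messages": [HumanMessage(content="What is MCP?")]})
print(result["messages"][-1].content)  # final AIMessage produced by the agent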
import nest_asyncio
import asyncio

# Allow nested event loops so async code can run inside the notebook.
nest_asyncio.apply()

async def chat_with_agent():
    inputs = {"messages": []}
    print("MCP-Like Assistant ready! Type 'exit' to quit.")

    while True:
        user_input = input("\nYou: ")
        if user_input.lower() == "exit":
            print("Ending chat.")
            break

        inputs["messages"].append(HumanMessage(content=user_input))

        # Stream full state snapshots from the agent and print the latest AI reply.
        async for state in graph.astream(inputs, stream_mode="values"):
            last_message = state["messages"][-1]
            if isinstance(last_message, AIMessage):
                print("\nAgent:", last_message.content)
            inputs["messages"] = state["messages"]

await chat_with_agent()
Finally, we set up an asynchronous chat loop to interact with the MCP-inspired assistant. Using nest_asyncio, we enable support for running asynchronous code inside the notebook's existing event loop. The chat_with_agent() function captures user input, feeds it to the ReAct agent, and streams the model's responses in real time. With each turn, the assistant uses tool-aware reasoning to decide whether to answer directly or invoke the custom knowledge base tool, emulating how an MCP client interacts with context providers to deliver dynamic, context-rich responses.
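If you prefer to avoid async entirely, compiled LangGraph graphs also expose a synchronous stream method with the same stream_mode="values" semantics. A minimal sketch of the same loop in blocking form might look like this:

def chat_sync():
    # Synchronous variant of the loop above, using graph.stream instead of astream.
    inputs = {"messages": []}
    while True:
        user_input = input("\nYou: ")
        if user_input.lower() == "exit":
            break
        inputs["messages"].append(HumanMessage(content=user_input))
        for state in graph.stream(inputs, stream_mode="values"):
            last_message = state["messages"][-1]
            if isinstance(last_message, AIMessage):
                print("\nAgent:", last_message.content)
            inputs["messages"] = state["messages"]

chat_sync()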
In conclusion, this tutorial offers a practical foundation for building context-aware AI agents inspired by the MCP standard. We've created a functional prototype demonstrating on-demand tool use and external knowledge retrieval by combining LangChain's tool interface, LangGraph's agent framework, and Gemini's powerful language generation. Although the setup is simplified, it captures the essence of MCP's architecture: modularity, interoperability, and intelligent context injection. From here, you can extend the assistant to integrate real APIs, local documents, or dynamic search tools, evolving it into a production-ready AI system aligned with the principles of the Model Context Protocol.
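As one illustration of that extension path, LangChain's @tool decorator makes it easy to register additional context providers alongside the knowledge base. The tool name and lookup logic below are hypothetical placeholders, not part of the tutorial's code:

from langchain_core.tools import tool

@tool
def doc_lookup(query: str) -> str:
    """Look up a query in a local document store (placeholder logic)."""
    # In a real system this would query an API, vector store, or file index.
    docs = {"langgraph": "LangGraph is a library for building stateful, multi-actor LLM applications."}
    return docs.get(query.lower(), "No matching document found.")

# Rebuild the agent with both tools registered.
graph = create_react_agent(model, [kb_tool, doc_lookup])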