Build a Powerful Multi-Tool AI Agent Using Nebius with Llama 3 and Real-Time Reasoning Tools
This tutorial introduces an advanced AI agent built using Nebius’ robust ecosystem, particularly the ChatNebius, NebiusEmbeddings, and NebiusRetriever components. The agent utilizes the Llama-3.3-70B-Instruct-fast model to generate high-quality responses, incorporating external functionalities such as Wikipedia search, contextual document retrieval, and safe mathematical computation. By combining structured prompt design with LangChain’s modular framework, this tutorial demonstrates how to build a multi-functional, reasoning-capable AI assistant that is both interactive and extensible. Whether for scientific queries, technological insights, or basic numerical tasks, this agent showcases the potential of Nebius as a platform for building sophisticated AI systems.
Target Audience Analysis
The target audience for this tutorial primarily consists of:
- AI developers and engineers looking to enhance their skills in building interactive AI agents.
- Business managers interested in leveraging AI for operational efficiency and decision-making.
- Researchers and academics focused on AI applications in various domains.
Common pain points include:
- Difficulty in integrating multiple AI functionalities into a single tool.
- Need for real-time data processing and contextual responses.
- Concerns about the safety and accuracy of AI computations.
Goals include:
- Building a versatile AI agent capable of performing complex tasks.
- Improving user interaction and satisfaction through contextual understanding.
- Streamlining workflows by automating repetitive tasks.
Interests often revolve around:
- Latest advancements in AI technologies and frameworks.
- Real-world applications of AI in business scenarios.
- Networking with like-minded professionals in the AI community.
Preferred communication methods include detailed tutorials, webinars, and interactive coding sessions.
Implementation Overview
We begin by installing the essential libraries, including langchain-nebius, langchain-core, langchain-community, and wikipedia, which are needed to build a feature-rich AI assistant. We then import standard modules such as os, getpass, datetime, and typing utilities, along with the wikipedia package for external data access.
!pip install -q langchain-nebius langchain-core langchain-community wikipedia
Next, we import core components from LangChain and Nebius to enable document handling, prompt templating, output parsing, and tool integration. The key classes are ChatNebius for language modeling, NebiusEmbeddings for vector representation, and NebiusRetriever for semantic search. The Nebius API key is read securely with getpass to authenticate subsequent API calls.
import os
import getpass
from datetime import datetime
from typing import List

import wikipedia

from langchain_core.documents import Document
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough
from langchain_core.tools import tool
from langchain_nebius import ChatNebius, NebiusEmbeddings, NebiusRetriever

if "NEBIUS_API_KEY" not in os.environ:
    os.environ["NEBIUS_API_KEY"] = getpass.getpass("Enter your Nebius API key: ")
Advanced AI Agent Class
The core of the implementation is encapsulated in the AdvancedNebiusAgent class, which orchestrates reasoning, retrieval, and tool integration. It initializes a high-performance LLM from Nebius (meta-llama/Llama-3.3-70B-Instruct-fast) and sets up a semantic retriever over embedded documents, forming a mini knowledge base that covers topics such as AI, quantum computing, blockchain, and more. A dynamic prompt template guides the agent’s responses by including retrieved context, external tool outputs, and the current date.
class AdvancedNebiusAgent:
    """Advanced AI Agent with retrieval, reasoning, and external tool capabilities"""

    def __init__(self):
        self.llm = ChatNebius(model="meta-llama/Llama-3.3-70B-Instruct-fast")
        self.embeddings = NebiusEmbeddings()
        self.knowledge_base = self._create_knowledge_base()
        self.retriever = NebiusRetriever(
            embeddings=self.embeddings,
            docs=self.knowledge_base,
            k=3
        )

        self.agent_prompt = ChatPromptTemplate.from_template("""
You are an advanced AI assistant with access to:
1. A knowledge base about technology and science
2. Wikipedia search capabilities
3. Mathematical calculation tools
4. Current date/time information

Context from knowledge base:
{context}

External tool results:
{tool_results}

Current date: {current_date}

User Query: {query}

Instructions:
- Use the knowledge base context when relevant
- If you need additional information, mention what external sources would help
- Be comprehensive but concise
- Show your reasoning process
- If calculations are needed, break them down step by step

Response:
""")
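To make the template’s mechanics concrete, here is a toy sketch of what the agent effectively sends to the LLM once the four slots ({context}, {tool_results}, {current_date}, {query}) are filled. Plain str.format stands in for ChatPromptTemplate here, and the slot values are invented examples, not output from the real agent.

```python
# Conceptual sketch: the prompt fills four slots before the LLM sees it.
# str.format stands in for ChatPromptTemplate; the values are made up.
template = (
    "Context from knowledge base:\n{context}\n\n"
    "External tool results:\n{tool_results}\n\n"
    "Current date: {current_date}\n\n"
    "User Query: {query}\n"
)

filled = template.format(
    context="Quantum computing uses superposition and entanglement.",
    tool_results="No external tools used",
    current_date="2024-01-01",
    query="What is quantum computing?",
)
print(filled)
```

Because every slot is a plain string, any upstream step (retrieval, tools, clock) can be swapped out without touching the template itself.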
Knowledge Base Creation
The agent’s knowledge base is created to include essential documents covering various topics:
    def _create_knowledge_base(self) -> List[Document]:
        """Create a comprehensive knowledge base"""
        return [
            Document(
                page_content="Artificial Intelligence (AI) is transforming industries through ML, NLP, and computer vision. Key applications include autonomous vehicles, medical diagnosis, and financial trading.",
                metadata={"topic": "AI", "category": "technology"}
            ),
            Document(
                page_content="Quantum computing uses quantum mechanical phenomena like superposition and entanglement to process information. Companies like IBM, Google, and Microsoft are leading quantum research.",
                metadata={"topic": "quantum_computing", "category": "technology"}
            ),
            Document(
                page_content="Climate change is caused by greenhouse gas emissions, primarily CO2 from fossil fuels. Renewable energy sources are crucial for mitigation.",
                metadata={"topic": "climate", "category": "environment"}
            ),
            Document(
                page_content="CRISPR-Cas9 is a gene editing technology that allows precise DNA modifications. It has applications in treating genetic diseases and improving crops.",
                metadata={"topic": "biotechnology", "category": "science"}
            ),
            Document(
                page_content="Blockchain technology enables decentralized, secure transactions without intermediaries. It has applications in supply chain, healthcare, and voting systems.",
                metadata={"topic": "blockchain", "category": "technology"}
            ),
            Document(
                page_content="Space exploration has advanced with reusable rockets, Mars rovers, and commercial space travel. SpaceX, Blue Origin, and NASA are pioneering new missions.",
                metadata={"topic": "space", "category": "science"}
            ),
            Document(
                page_content="Renewable energy costs have dropped dramatically. Solar & wind power are now cheaper than fossil fuels in many regions, driving global energy transition.",
                metadata={"topic": "renewable_energy", "category": "environment"}
            ),
            Document(
                page_content="5G networks provide ultra-fast internet speeds and low latency, enabling IoT devices, autonomous vehicles, and augmented reality applications.",
                metadata={"topic": "5G", "category": "technology"}
            )
        ]
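NebiusRetriever handles the embedding and similarity search for us. Conceptually, it behaves like the toy retriever below: score every document against the query and keep the top k. The word-overlap score here is a deliberately crude stand-in for real embedding similarity, used only to illustrate the ranking step.

```python
# Toy stand-in for NebiusRetriever: rank documents by a similarity score
# and keep the k best. Real retrieval uses NebiusEmbeddings vectors;
# word overlap is used here only so the sketch runs without an API key.
def toy_score(query: str, doc: str) -> int:
    q_words = set(query.lower().split())
    d_words = set(doc.lower().split())
    return len(q_words & d_words)

docs = [
    "Quantum computing uses superposition and entanglement",
    "Blockchain enables decentralized secure transactions",
    "Renewable energy costs have dropped dramatically",
]

def retrieve(query: str, docs: list, k: int = 2) -> list:
    # Highest-scoring documents first, truncated to k results.
    return sorted(docs, key=lambda d: toy_score(query, d), reverse=True)[:k]

top = retrieve("how does quantum computing use entanglement", docs, k=2)
print(top[0])  # the quantum computing document ranks first
```

With k=3 in the real agent, up to three knowledge-base documents are concatenated into the {context} slot of the prompt on every query.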
Integration of External Tools
Two built-in tools enhance the agent’s functionality:
- wikipedia_search: Accesses Wikipedia for additional information.
- calculate: Performs mathematical calculations safely.
    @tool
    def wikipedia_search(query: str) -> str:
        """Search Wikipedia for additional information"""
        try:
            search_results = wikipedia.search(query, results=3)
            if not search_results:
                return f"No Wikipedia results found for '{query}'"
            page = wikipedia.page(search_results[0])
            summary = wikipedia.summary(search_results[0], sentences=3)
            return f"Wikipedia: {page.title}\n{summary}\nURL: {page.url}"
        except Exception as e:
            return f"Wikipedia search error: {str(e)}"

    @tool
    def calculate(expression: str) -> str:
        """Perform mathematical calculations safely"""
        try:
            allowed_chars = set('0123456789+-*/.() ')
            if not all(c in allowed_chars for c in expression):
                return "Error: Only basic mathematical operations allowed"
            result = eval(expression)
            return f"Calculation: {expression} = {result}"
        except Exception as e:
            return f"Calculation error: {str(e)}"
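The character whitelist is what keeps the eval call above from executing arbitrary code: any expression containing letters, underscores, or quotes is rejected before evaluation ever happens. A standalone sketch of the same check, without the LangChain tool wrapper:

```python
# Standalone version of the calculate tool's safety check: a character
# whitelist rejects anything beyond basic arithmetic before eval runs.
def safe_calculate(expression: str) -> str:
    allowed_chars = set('0123456789+-*/.() ')
    if not all(c in allowed_chars for c in expression):
        return "Error: Only basic mathematical operations allowed"
    try:
        # eval is acceptable here only because the whitelist above has
        # already excluded names, attributes, and string literals.
        result = eval(expression)
        return f"Calculation: {expression} = {result}"
    except Exception as e:
        return f"Calculation error: {e}"

print(safe_calculate("(3 + 4) * 2"))       # arithmetic passes the whitelist
print(safe_calculate("__import__('os')"))  # rejected: letters and quotes
```

The whitelist blocks names and attribute access entirely, though eval can still be stalled by pathological inputs such as huge exponents; a dedicated expression parser would be stricter if that matters in your deployment.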
Query Processing
The process_query method brings it all together, dynamically invoking the prompt chain with context, tool results, and the current date to generate informative, multi-source answers.
    def process_query(self, query: str, use_wikipedia: bool = False,
                      calculate_expr: str = None) -> str:
        """Process a user query with optional external tools"""
        relevant_docs = self.retriever.invoke(query)
        context = self._format_docs(relevant_docs)

        tool_results = []
        if use_wikipedia:
            wiki_keywords = self._extract_keywords(query)
            if wiki_keywords:
                # @tool-decorated functions are Tool objects; call them via .invoke
                wiki_result = self.wikipedia_search.invoke(wiki_keywords)
                tool_results.append(f"Wikipedia Search: {wiki_result}")
        if calculate_expr:
            calc_result = self.calculate.invoke(calculate_expr)
            tool_results.append(f"Calculation: {calc_result}")
        tool_results_str = "\n".join(tool_results) if tool_results else "No external tools used"

        chain = (
            {
                "context": lambda x: context,
                "tool_results": lambda x: tool_results_str,
                "current_date": lambda x: self._get_current_date(),
                "query": RunnablePassthrough()
            }
            | self.agent_prompt
            | self.llm
            | StrOutputParser()
        )
        return chain.invoke(query)
Conclusion
This Nebius-powered agent exemplifies how to effectively combine LLM-driven reasoning with structured retrieval and external tool usage to build a capable, context-aware assistant. By integrating LangChain with Nebius APIs, the agent accesses a curated knowledge base, fetches live data from Wikipedia, and handles arithmetic operations with safety checks. The tutorial’s modular architecture, featuring prompt templates, dynamic chaining, and customizable inputs, provides a powerful blueprint for developers seeking to create intelligent systems that surpass static large language model (LLM) responses.