Agentic AI Starter Kit
This repository is a learning kit for building Agentic AI applications. It currently includes code demonstrating two key communication patterns: Agent-to-Agent (A2A) communication and the Model Context Protocol (MCP), with more concepts planned for the future.
Project Structure
- a2a/: Contains a hands-on demonstration of the Agent-to-Agent (A2A) communication pattern. This includes:
  - A detailed README.md for setup and running the A2A demos.
- mcp/: Contains a demonstration of the Model Context Protocol (MCP) communication pattern. This includes:
  - clients/: Client demo application (mcp_client_demo.py).
  - servers/: Example MCP servers (e.g., math_server.py).
  - A detailed README.md for setup and running the MCP demos.
- .env: Environment variables for configuring LLM access and other settings.
Prerequisites
- Python 3.12.7 or higher
- Basic familiarity with Python and command-line operations.
Setup Instructions
- Clone the Repository

      git clone https://github.com/NihalJain/agentic-ai-learning-kit.git
      cd agentic-ai-learning-kit

- Set Up Virtual Environment

      python3 -m venv .venv
      # On Windows: .venv\Scripts\Activate.ps1
      # On macOS/Linux: source .venv/bin/activate
- Configure Environment Variables

  Create a .env file in the root directory of the project (agentic-ai-learning-kit) and populate it with the necessary LLM API keys and other configuration:

  - LLM_MODEL_PROVIDER: (Required) Specifies the model provider (google or openai). Defaults to google if not explicitly set.
  - LLM_API_KEY: (Required) The API key for authenticating with the LLM provider.
  - LLM_BASE_URL: (Optional) The base URL of the LLM API. Defaults to https://generativelanguage.googleapis.com/v1beta for the google provider, or https://api.openai.com/v1/ for the openai provider.
  - LLM_MODEL_NAME: (Optional) The name of the LLM model to use. Defaults to gemini-2.5-pro for the google provider, or gpt-4o-mini for the openai provider.
  - LLM_TEMPERATURE: (Optional) The temperature setting for the LLM. Defaults to 0.
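For reference, a minimal .env using the google provider might look like the following (the key value below is a placeholder; substitute your own):

```
LLM_MODEL_PROVIDER=google
LLM_API_KEY=your-api-key-here
LLM_MODEL_NAME=gemini-2.5-pro
LLM_TEMPERATURE=0
```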
Refer to the a2a/README.md and mcp/README.md for detailed information on required environment variables for each
respective demo.
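To make the precedence of these settings concrete, here is an illustrative sketch (not code from this repository) of how the documented defaults could be resolved from environment variables; the function name `load_llm_config` is hypothetical:

```python
import os

# Documented per-provider defaults from the variable descriptions above.
PROVIDER_DEFAULTS = {
    "google": {
        "base_url": "https://generativelanguage.googleapis.com/v1beta",
        "model_name": "gemini-2.5-pro",
    },
    "openai": {
        "base_url": "https://api.openai.com/v1/",
        "model_name": "gpt-4o-mini",
    },
}

def load_llm_config() -> dict:
    """Read LLM_* variables, falling back to the documented defaults."""
    provider = os.getenv("LLM_MODEL_PROVIDER", "google")
    defaults = PROVIDER_DEFAULTS[provider]
    return {
        "provider": provider,
        "api_key": os.environ["LLM_API_KEY"],  # required; raises KeyError if unset
        "base_url": os.getenv("LLM_BASE_URL", defaults["base_url"]),
        "model_name": os.getenv("LLM_MODEL_NAME", defaults["model_name"]),
        "temperature": float(os.getenv("LLM_TEMPERATURE", "0")),
    }
```

Note that only LLM_API_KEY has no fallback, matching its (Required) status above.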
Running the Demos
To run the demos, navigate into the respective subdirectories and follow their README.md instructions:
- A2A Demos: See a2a/README.md for detailed instructions.
- MCP Demos: See mcp/README.md for detailed instructions.
Author Information
For any issues or queries, please contact the author:
- Name: Nihal Jain
- Email: nihaljain.cs@gmail.com