
Use AI Apps to build custom AI Agents with Nvidia NIM

This guide will help you set up, build, and run a custom AI Agent that has access to connectors.

Prerequisites

  1. Create Connectors:
    • Log in to your Xpander account.
    • Go to the Connectors section.
    • Create connectors for Notion and Slack. Follow the respective instructions to authorize access.
  2. Define Your AI App:
    • In Xpander, create a new AI App of type Nvidia NIM, named, for example, "Notion AI Agent".
    • Set up the AI App to have access to your Notion and Slack connectors.
    • Obtain the AI App API key from the Xpander dashboard.
  3. API Keys: Obtain your API keys from your Xpander AI App and from your LLM provider (NVIDIA_NIM_API_KEY). The example below reads both keys from environment variables; see the sketch after this list.
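
The example script reads both credentials from environment variables. The snippet below is a minimal pre-flight check you can place at the top of a script to fail fast when a key is missing; the variable names match the example in this guide, and everything else is an illustrative sketch.

import os
import sys

# Environment variable names used by single_agent_example.py below.
REQUIRED_KEYS = ["XPANDER_API_KEY", "NVIDIA_NIM_API_KEY"]

# Fail fast with a clear message if any key is missing or empty.
missing = [key for key in REQUIRED_KEYS if not os.environ.get(key)]
if missing:
    sys.exit(f"Missing required environment variables: {', '.join(missing)}")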

Single Agent Example

Step 1: Create a Python virtual environment and install the xpander-sdk library

python3 -m venv .venv
source .venv/bin/activate
pip install https://assets.xpanderai.io/xpander-sdk.tar.gz
pip freeze > requirements.txt #Optional
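
To confirm the SDK installed into the active virtual environment, a quick import check is enough; this is only a sanity check and not part of the example script.

# Run inside the activated .venv; an ImportError here means the install step failed.
import xpander_sdk
print("xpander-sdk is available")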

Step 2: Create a Python script (e.g., single_agent_example.py) and add the following code:

from xpander_sdk import XpanderClient, LLMProvider, NvidiaNIMSupportedModels
from openai import OpenAI as NvidiaChat
import os

# Retrieve the XPANDER API key from environment variables.
xpanderAPIKey = os.environ.get("XPANDER_API_KEY", "")

# Initialize the XpanderClient with the agent key, agent URL, and specifying the LLM provider as NVIDIA NIM.
xpander_client = XpanderClient(
    agent_key=xpanderAPIKey,
    agent_url="https://inbound.xpander.ai/agent/<agent-id>",
    llm_provider=LLMProvider.NVIDIA_NIM
)

# Retrieve the available tools from the XpanderClient.
tools = xpander_client.tools()

# Retrieve the NVIDIA API key from environment variables.
NvidiaAPIKey = os.environ.get("NVIDIA_NIM_API_KEY", "")

# Initialize the NvidiaChat client with the Nvidia API key and base URL.
NvidiaChat_Client = NvidiaChat(api_key=NvidiaAPIKey, base_url="https://integrate.api.nvidia.com/v1")

# Define the message that will be sent to the LLM.
messages = [
    {
        "role": "user",
        "content": "Find news on Nvidia NIM function calling capability"
    }
]

# Use the NvidiaChat client to create a chat completion using the specified model, messages, tools, and tool choice.
llm_response = NvidiaChat_Client.chat.completions.create(
    model=NvidiaNIMSupportedModels.LLAMA_3_1_70_B_INSTRUCT,
    messages=messages,
    tools=tools,
    max_tokens=1024,
    tool_choice="required"
)

# Call the Xpander tool with the response from the LLM and print the response message.
tool_responses = xpander_client.xpander_tool_call(tool_selector_response=llm_response.model_dump())
for tool_response in tool_responses:
    print(tool_response.response_message)

Running the Single Agent Example

  1. Run the Python script:

    python single_agent_example.py
    
  2. View the response:
    The script sends the prompt to the LLM, which selects a matching tool from your configured connectors; Xpander then invokes that tool and prints each tool's response message. To have the model turn the tool output into a final natural-language answer, see the sketch below.
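
The pattern below shows one way to feed the tool output back to the LLM for a final summary. It reuses the messages, NvidiaChat_Client, NvidiaNIMSupportedModels, and tool_responses objects from the script above; the follow-up prompt wording and the simplified "pass results as a user message" approach are illustrative assumptions, not part of the official example.

# Collect the tool output; response_message is the same attribute printed in the
# example above and is assumed here to be a string.
tool_output = "\n".join(t.response_message for t in tool_responses)

# Simplified follow-up: pass the tool results back as a plain user message
# (instead of the formal tool-role message format) and ask for a short answer.
messages.append({
    "role": "user",
    "content": f"Tool results:\n{tool_output}\n\nSummarize these results in a short answer."
})

# Second call without tools: the model only needs to compose the final answer.
final_response = NvidiaChat_Client.chat.completions.create(
    model=NvidiaNIMSupportedModels.LLAMA_3_1_70_B_INSTRUCT,
    messages=messages,
    max_tokens=1024
)
print(final_response.choices[0].message.content)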