Provider

This chapter describes the concept of a provider.

Idea

A Provider in the Maia Framework acts as a bridge between an Agent and the underlying model. As mentioned in previous chapters, an Agent is just an abstraction, so tests can use agents without caring whether the model behind them comes from OpenAI, Anthropic, or Ollama. However, the agent still needs to know how to communicate with the model - this is the job of the Provider.
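
To make the idea concrete, here is a minimal sketch of a hand-written provider. The method name generate and its signature are assumptions for illustration, mirroring the input_mapper signatures used later in this chapter; the actual interface MAIA requires may differ:

from typing import List, Optional
from maia_test_framework.core.message import Message

class UppercaseEchoProvider:
    """Hypothetical provider: instead of calling a real model,
    it answers by echoing the last user turn in uppercase."""

    def __init__(self, config: Optional[dict] = None):
        self.config = config or {}

    async def generate(self, history: List[Message], system_message: str) -> str:
        # A real provider would forward the history and system message to a model API
        last_user_message = history[-1].content if history else ""
        return last_user_message.upper()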

Built-in providers

MAIA comes with built-in providers so you can start writing tests quickly. This set will grow over time, developed either by the MAIA core team or by the community.

Generic LiteLLM

This provider wraps the LiteLLM library, a popular abstraction over many model APIs. You can create a new agent by providing just the model name and the API endpoint.

Here is an example:

  self.create_agent(
      name="Bob",
      provider=GenericLiteLLMProvider(config={
          "model": "ollama/mistral",
          "api_base": "http://localhost:11434"
      }),
      system_message="You are an assistant who only suggests clothing."
  )
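
The model string follows LiteLLM's provider/model naming convention: ollama/mistral routes requests to a local Ollama server, and any other model supported by LiteLLM can be plugged in the same way.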

Note:

A provider can also be loaded using a configuration file (see: Config), but this is an advanced concept, so we encourage you to get familiar with the direct approach first.

Ollama

We also have a dedicated OllamaProvider:

  self.create_agent(
      name="Bob",
      provider=OllamaProvider(config={
          "model": "mistral"
      }),
      system_message="You are an assistant who only suggests clothing."
  )
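
No endpoint is given here; we assume OllamaProvider falls back to Ollama's standard local address, http://localhost:11434.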

CrewAI

You can also easily integrate CrewAI into our test framework:

from typing import List

import pytest
from crewai import Agent, Task, Crew, LLM

from maia_test_framework.core.message import Message
from maia_test_framework.providers.crewai import CrewAIProvider
from maia_test_framework.testing.base import MaiaTest

# Input mapper (maps history to Crew input)
def crew_input_mapper(history: List[Message], system_message: str):
    user_prompt = history[-1].content if history else ""
    return {"query": user_prompt, "system": system_message}

class TestCrewAIIntegration(MaiaTest):
    def setup_agents(self):

        llm = LLM(model="ollama/mistral", base_url="http://localhost:11434")

        # Define simple agents
        researcher = Agent(
            role="Researcher",
            goal="Find information about countries",
            backstory="Expert in geography",
            llm=llm
        )
        responder = Agent(
            role="Responder",
            goal="Answer user queries concisely",
            backstory="Skilled in communication",
            llm=llm
        )

        # Define a task
        task = Task(
            description="Answer the query: {query}",
            expected_output="Short factual answer",
            agent=responder
        )

        # Create crew
        crew = Crew(agents=[researcher, responder], tasks=[task])

        # Provider
        crewai_provider = CrewAIProvider(config={
            "crew": crew,
            "input_mapper": crew_input_mapper
        })

        # Register agent
        self.create_agent(
            name="CrewAgent",
            provider=crewai_provider,
            system_message="You are a helpful multi-agent assistant."
        )

    @pytest.mark.asyncio
    async def test_crewai_agent(self):
        session = self.create_session(["CrewAgent"])
        await session.user_says("What is the capital of France?")
        response = await session.agent_responds("CrewAgent")

        assert "Paris" in response.content

LangChain

We also support the LangChain framework:

from typing import Any, Dict, List

import pytest
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from langchain_ollama import OllamaLLM

from maia_test_framework.core.message import Message
from maia_test_framework.providers.langchain import LangChainProvider
from maia_test_framework.testing.base import MaiaTest

def simple_input_mapper(history: List[Message], system_message: str) -> Dict[str, Any]:
    user_prompt = history[-1].content if history else ""
    # Naively pull the place name out of questions like "What is the capital of X?"
    place = user_prompt.split("of ")[-1].replace("?", "").strip()
    return {"place": place, "system": system_message}

def simple_output_parser(raw: Dict[str, Any]) -> str:
    # Accept whichever output key the chain uses, falling back to str()
    return raw.get("output") or raw.get("answer") or str(raw)

class TestLangChainIntegration(MaiaTest):
    def setup_agents(self):
        llm = OllamaLLM(model="mistral")
        prompt = PromptTemplate.from_template("What is the capital of {place}?")
        chain = LLMChain(llm=llm, prompt=prompt)

        langchain_provider = LangChainProvider(config={
            "chain": chain,
            "input_mapper": simple_input_mapper,
            "output_parser": simple_output_parser
        })

        self.create_agent(
            name="LangChainAgent",
            provider=langchain_provider,
            system_message="You are a helpful assistant.",
        )

    @pytest.mark.asyncio
    async def test_langchain_agent(self):
        session = self.create_session(["LangChainAgent"])
        await session.user_says("What is the capital of Poland?")
        response = await session.agent_responds("LangChainAgent")
        
        assert "Warsaw" in response.content

Mock

Sometimes you may want to mock the AI agent entirely - for instance, to drive a conversation with hand-picked corner-case responses. You can do this by creating an agent with MockProvider:

  # Alice replies from a hardcoded list of responses
  self.create_agent(
      name="Alice",
      provider=MockProvider(config={
          "responses": [
              "Hello, I am a mock agent.",
              "I am doing well, thank you for asking."
          ]
      })
  )

  # Bob computes each reply from the incoming prompt
  def echo_bot(user_prompt):
      return f"You said: {user_prompt}"

  self.create_agent(
      name="Bob",
      provider=MockProvider(config={
          "response_function": echo_bot
      })
  )

MockProvider accepts either a hardcoded list of responses or a response_function that reacts to the incoming prompt.
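
A test can then drive the conversation and assert on the mocked output. This sketch assumes MockProvider returns the listed responses in order, starting with the first:

  @pytest.mark.asyncio
  async def test_mock_agent(self):
      session = self.create_session(["Alice"])
      await session.user_says("How are you?")
      response = await session.agent_responds("Alice")

      assert response.content == "Hello, I am a mock agent."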

Existing

Note:

This is an advanced concept for specific scenarios.

To support even highly custom setups, MAIA lets you pass in an existing agent implementation. This means you can keep your own private model or agent solution and still use it in MAIA.

Here is an example of such an agent:

class ComplexResponseAgent:
    """An agent with a custom query method and a complex response format."""
    def query(self, prompt: str) -> dict:
        return {"response": f"Complex: {prompt}", "data": [1, 2, 3]}

...

    # Agent with a custom method name and a response extractor
    self.create_agent(
        name="ComplexAgent",
        provider=ExistingAgentProvider(config={
            "agent_instance": ComplexResponseAgent(),
            "call_method": "query",
            "response_extractor": lambda r: r["response"]
        })
    )
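
Such an agent can then be tested like any other. The exact prompt that reaches query() depends on how MAIA assembles it, so this sketch only asserts on the prefix added by ComplexResponseAgent:

    @pytest.mark.asyncio
    async def test_complex_agent(self):
        session = self.create_session(["ComplexAgent"])
        await session.user_says("Hello")
        response = await session.agent_responds("ComplexAgent")

        assert response.content.startswith("Complex:")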

Note:

We will keep building new providers, but if you would like to contribute, we would be more than happy! Instructions on how to write a new provider will be shared soon. Stay tuned!