Connecting CrewAI to Google's Gemini
-
CrewAI defaults to OpenAI's GPT models as its LLM. Although the official documentation describes how to connect other models such as Llama 2, it does not detail how to connect Google's Gemini. We can, however, follow the official approach to work out how to link Gemini.
ref: https://docs.crewai.com/how-to/LLM-Connections/
First, note that CrewAI's LLM parameter is compatible with LangChain's LLMs. Therefore, we can use a LangChain instance connected to Gemini as the LLM for the Agent. To connect LangChain to Gemini, we use langchain-google-genai:
pip install langchain-google-genai
In the code, declare it as follows:
import os

from langchain_google_genai import ChatGoogleGenerativeAI

api_key = os.getenv("GOOGLE_API_KEY")
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash-latest", google_api_key=api_key)
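Before handing the model to CrewAI, it can help to confirm the Gemini connection works on its own. A minimal sanity check, assuming the GOOGLE_API_KEY environment variable is set (the prompt text is just a placeholder):

# Quick direct call to the Gemini-backed LangChain model to verify the API key
# and model name are valid before wiring it into an Agent.
response = llm.invoke("Say hello in one short sentence.")
print(response.content)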
Then, pass this model to the Agent's llm parameter, as shown in the following example:
from crewai import Agent

agent = Agent(
    role="a role",
    goal="a goal",
    backstory="a backstory",
    llm=llm
)
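To see the Gemini-backed agent actually do something, here is a minimal end-to-end sketch. The role, goal, backstory, and task texts are placeholders of my own; the Agent, Task, and Crew wiring follows the standard CrewAI pattern.

from crewai import Agent, Task, Crew

# Agent backed by the Gemini LLM defined above.
researcher = Agent(
    role="Research assistant",                    # placeholder role
    goal="Summarize a topic in a few sentences",  # placeholder goal
    backstory="You are concise and factual.",     # placeholder backstory
    llm=llm,
)

# A simple task assigned to that agent.
task = Task(
    description="Write a three-sentence summary of what CrewAI is.",
    expected_output="A three-sentence summary.",
    agent=researcher,
)

# Run the crew; the task is executed by the Gemini-backed agent.
crew = Crew(agents=[researcher], tasks=[task])
result = crew.kickoff()
print(result)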
In essence, it is straightforward: as long as LangChain supports an LLM, CrewAI can in principle use it as well.