Google Cloud Vertex AI
Note: This is separate from the Google Generative AI integration; it exposes the Vertex AI Generative API on Google Cloud.
VertexAI exposes all foundational models available in Google Cloud:

- Gemini (gemini-pro and gemini-pro-vision)
- PaLM 2 for Text (text-bison)
- Codey for Code Generation (code-bison)

For a full and updated list of available models, visit the VertexAI documentation.
Setup
By default, Google Cloud does not use customer data to train its foundation models as part of Google Cloud’s AI/ML Privacy Commitment. More details about how Google processes data can also be found in Google’s Customer Data Processing Addendum (CDPA).
To use Vertex AI Generative AI, you must have the langchain-google-vertexai Python package installed and either:

- Have credentials configured for your environment (gcloud, workload identity, etc…)
- Store the path to a service account JSON file as the GOOGLE_APPLICATION_CREDENTIALS environment variable

This codebase uses the google.auth library, which first looks for the application credentials variable mentioned above, and then looks for system-level auth.

For more information, see:

- https://cloud.google.com/docs/authentication/application-default-credentials#GAC
- https://googleapis.dev/python/google-auth/latest/reference/google.auth.html#module-google.auth
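As a minimal sketch of how credential resolution works (the service account path below is a hypothetical placeholder; skip setting the variable if you are already authenticated via gcloud or workload identity):

```python
import os

import google.auth

# Hypothetical placeholder: only needed if you are not already
# authenticated via gcloud, workload identity, etc.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account.json"

# google.auth.default() resolves Application Default Credentials: it checks
# GOOGLE_APPLICATION_CREDENTIALS first, then falls back to system-level auth.
credentials, project_id = google.auth.default()
print(project_id)
```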
%pip install --upgrade --quiet langchain-core langchain-google-vertexai
Usage
VertexAI supports all LLM functionality.
from langchain_google_vertexai import VertexAI
model = VertexAI(model_name="gemini-pro")
message = "What are some of the pros and cons of Python as a programming language?"
model.invoke(message)
'**Pros:**\n\n* **Easy to learn and use:** Python is known for its simple syntax and readability, making it a great choice for beginners and experienced programmers alike.\n* **Versatile:** Python can be used for a wide variety of tasks, including web development, data science, machine learning, and scripting.\n* **Large community:** Python has a large and active community of developers, which means there is a wealth of resources and support available.\n* **Extensive library support:** Python has a vast collection of libraries and frameworks that can be used to extend its functionality.\n* **Cross-platform:** Python is available for a'
await model.ainvoke(message)
'**Pros:**\n\n* **Easy to learn and use:** Python is known for its simple syntax and readability, making it a great choice for beginners and experienced programmers alike.\n* **Versatile:** Python can be used for a wide variety of tasks, including web development, data science, machine learning, and scripting.\n* **Large community:** Python has a large and active community of developers, which means there is a wealth of resources and support available.\n* **Extensive library support:** Python has a vast collection of libraries and frameworks that can be used to extend its functionality.\n* **Cross-platform:** Python is available for a'
for chunk in model.stream(message):
print(chunk, end="", flush=True)
**Pros:**
* **Easy to learn and use:** Python is known for its simple syntax and readability, making it a great choice for beginners and experienced programmers alike.
* **Versatile:** Python can be used for a wide variety of tasks, including web development, data science, machine learning, and scripting.
* **Large community:** Python has a large and active community of developers, which means there is a wealth of resources and support available.
* **Extensive library support:** Python has a vast collection of libraries and frameworks that can be used to extend its functionality.
* **Cross-platform:** Python is available for a
model.batch([message])
['**Pros:**\n\n* **Easy to learn and use:** Python is known for its simple syntax and readability, making it a great choice for beginners and experienced programmers alike.\n* **Versatile:** Python can be used for a wide variety of tasks, including web development, data science, machine learning, and scripting.\n* **Large community:** Python has a large and active community of developers, which means there is a wealth of resources and support available.\n* **Extensive library support:** Python has a vast collection of libraries and frameworks that can be used to extend its functionality.\n* **Cross-platform:** Python is available for a']
We can use the generate method to get back extra metadata like safety attributes, not just text completions.
result = model.generate([message])
result.generations
[[GenerationChunk(text='**Pros:**\n\n* **Easy to learn and use:** Python is known for its simple syntax and readability, making it a great choice for beginners and experienced programmers alike.\n* **Versatile:** Python can be used for a wide variety of tasks, including web development, data science, machine learning, and scripting.\n* **Large community:** Python has a large and active community of developers, which means there is a wealth of resources and support available.\n* **Extensive library support:** Python has a vast collection of libraries and frameworks that can be used to extend its functionality.\n* **Cross-platform:** Python is available for a')]]
result = await model.agenerate([message])
result.generations
[[GenerationChunk(text='**Pros:**\n\n* **Easy to learn and use:** Python is known for its simple syntax and readability, making it a great choice for beginners and experienced programmers alike.\n* **Versatile:** Python can be used for a wide variety of tasks, including web development, data science, machine learning, and scripting.\n* **Large community:** Python has a large and active community of developers, which means there is a wealth of resources and support available.\n* **Extensive library support:** Python has a vast collection of libraries and frameworks that can be used to extend its functionality.\n* **Cross-platform:** Python is available for a')]]
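A minimal sketch for inspecting that metadata, assuming the provider populates generation_info (the exact keys vary by model and library version):

```python
# generation_info, when populated, carries provider metadata such as safety
# attributes alongside the generated text.
generation = result.generations[0][0]
print(generation.generation_info)
```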
You can also combine the model with a prompt template to structure user input. We can do this using LCEL:
from langchain_core.prompts import PromptTemplate
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate.from_template(template)
chain = prompt | model
question = """
I have five apples. I throw two away. I eat one. How many apples do I have left?
"""
print(chain.invoke({"question": question}))
1. You start with 5 apples.
2. You throw away 2 apples, so you have 5 - 2 = 3 apples left.
3. You eat 1 apple, so you have 3 - 1 = 2 apples left.
Therefore, you have 2 apples left.
You can use different foundational models specialized for different tasks. For an updated list of available models, visit the VertexAI documentation.
llm = VertexAI(model_name="code-bison", max_output_tokens=1000, temperature=0.3)
question = "Write a python function that checks if a string is a valid email address"
print(llm.invoke(question))
```python
import re
def is_valid_email(email):
"""
Checks if a string is a valid email address.
Args:
email: The string to check.
Returns:
True if the string is a valid email address, False otherwise.
"""
# Compile the regular expression for an email address.
regex = re.compile(r"[^@]+@[^@]+\.[^@]+")
# Check if the string matches the regular expression.
return regex.match(email) is not None
```
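A quick sanity check of the generated helper (assuming the definition above has been executed):

```python
print(is_valid_email("user@example.com"))  # True
print(is_valid_email("not-an-email"))  # False
```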
Multimodality
With Gemini, you can use the LLM in multimodal mode:
from langchain_core.messages import HumanMessage
from langchain_google_vertexai import ChatVertexAI
llm = ChatVertexAI(model_name="gemini-ultra-vision")
image_message = {
"type": "image_url",
"image_url": {"url": "image_example.jpg"},
}
text_message = {
"type": "text",
"text": "What is shown in this image?",
}
message = HumanMessage(content=[text_message, image_message])
output = llm.invoke([message])
print(output.content)
This is a Yorkshire Terrier.
Let’s double-check it’s a cat :)
from vertexai.preview.generative_models import Image
i = Image.load_from_file("image_example.jpg")
i
You can also pass images as bytes:
import base64
with open("image_example.jpg", "rb") as image_file:
image_bytes = image_file.read()
image_message = {
"type": "image_url",
"image_url": {
"url": f"data:image/jpeg;base64,{base64.b64encode(image_bytes).decode('utf-8')}"
},
}
text_message = {
"type": "text",
"text": "What is shown in this image?",
}
message = HumanMessage(content=[text_message, image_message])
output = llm.invoke([message])
print(output.content)
This is a Yorkshire Terrier.
Please note that you can also use an image stored in GCS: just point the url to the full GCS path, starting with gs://, instead of a local one.
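For example (the bucket and object names below are hypothetical):

```python
# Reference an image stored in Google Cloud Storage instead of a local file.
gcs_image_message = {
    "type": "image_url",
    "image_url": {"url": "gs://your-bucket/image_example.jpg"},
}
```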
You can also pass the history of a previous chat to the LLM:
message2 = HumanMessage(content="And where was this image taken?")
output2 = llm.invoke([message, output, message2])
print(output2.content)
You can also use a public image URL:
image_message = {
"type": "image_url",
"image_url": {
"url": "https://python.langchain.com/assets/images/cell-18-output-1-0c7fb8b94ff032d51bfe1880d8370104.png",
},
}
text_message = {
"type": "text",
"text": "What is shown in this image?",
}
message = HumanMessage(content=[text_message, image_message])
output = llm.invoke([message])
print(output.content)
Vertex Model Garden
Vertex Model Garden exposes open-source models that can be deployed and served on Vertex AI. If you have successfully deployed a model from Vertex Model Garden, you can find a corresponding Vertex AI endpoint in the console or via API.
from langchain_google_vertexai import VertexAIModelGarden
llm = VertexAIModelGarden(project="YOUR PROJECT", endpoint_id="YOUR ENDPOINT_ID")
llm.invoke("What is the meaning of life?")
As with all LLMs, we can then compose it with other components:
prompt = PromptTemplate.from_template("What is the meaning of {thing}?")
chain = prompt | llm
print(chain.invoke({"thing": "life"}))