To access the Gemini API and the Gemini family of models directly from your generative AI-powered Android apps, it's recommended to use the Vertex AI in Firebase SDK for Android. This SDK is part of the larger Firebase platform that helps you build and run full-stack apps.
As per the official Google documentation, the Google AI client SDK for Android should be used for prototyping only: if you embed a Gemini API key directly in your Android app, or fetch it remotely at runtime, you risk exposing it to malicious actors.
The Vertex AI in Firebase SDK is similar to the Google AI client SDK, but it offers critical security options and other features for production use cases. For example, with Vertex AI in Firebase you can also use the following:
- Firebase App Check to protect the Gemini API from abuse by unauthorized clients.
- Firebase Remote Config to dynamically set and change values for your app in the cloud (for example, model names) without the need to release a new version of your app.
- Cloud Storage for Firebase to include large media files in your request to the Gemini API.
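As an illustration of the Remote Config point, the app-side fallback logic can be sketched in plain Kotlin. Note that `resolveModelName` and the `"model_name"` key are hypothetical names for this sketch, not part of the SDK; in a real app the map would come from `Firebase.remoteConfig`.

```kotlin
// Hypothetical helper: pick the model name from remotely fetched config
// values, falling back to a default when the key is missing or blank.
fun resolveModelName(
    remoteValues: Map<String, String>,
    default: String = "gemini-2.0-flash"
): String = remoteValues["model_name"]?.takeIf { it.isNotBlank() } ?: default

fun main() {
    // Remote value present: the cloud-set model name wins.
    println(resolveModelName(mapOf("model_name" to "gemini-2.0-pro")))
    // Nothing fetched yet: fall back to the shipped default.
    println(resolveModelName(emptyMap()))
}
```

This way you can switch models from the Firebase console without shipping a new app version.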
Now, let’s get started with Vertex AI.
First, set up the project in Firebase as mentioned here.
Second, add the following dependencies to your app module’s build.gradle file:
implementation(platform("com.google.firebase:firebase-bom:33.9.0"))
implementation("com.google.firebase:firebase-vertexai")
Then, initialize the Vertex AI service and create the model with the configuration you want to use:
val config = generationConfig {
    temperature = 0.7f
}
val generativeModel = Firebase.vertexAI.generativeModel(
    modelName = "gemini-2.0-flash",
    generationConfig = config
)
Here, we are using the generationConfig builder and setting just the temperature; you can configure other generation parameters the same way, and you can also pass safety settings to the model, for example if you want to block harassment or hate-speech content.
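As a sketch, safety settings are passed to generativeModel alongside the generation config. The enum names below follow the firebase-vertexai Kotlin API; verify them against the SDK version you use.

```kotlin
// Sketch: block harassment and hate-speech content rated medium
// probability and above.
val safetySettings = listOf(
    SafetySetting(HarmCategory.HARASSMENT, HarmBlockThreshold.MEDIUM_AND_ABOVE),
    SafetySetting(HarmCategory.HATE_SPEECH, HarmBlockThreshold.MEDIUM_AND_ABOVE)
)

// The settings are supplied when the model is created, next to the
// generation config.
val model = Firebase.vertexAI.generativeModel(
    modelName = "gemini-2.0-flash",
    generationConfig = config,
    safetySettings = safetySettings
)
```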
Then, to create a continuous conversation between the user and the bot (the LLM in this case), we’ll use the multi-turn chat API of the Vertex AI SDK. We initialize the chat with the startChat() function, to which we can also provide the conversation history.
private val chat = generativeModel.startChat(
    history = listOf(
        content(role = "user") { text("Hello, I need some help.") },
        content(role = "model") { text("Great to meet you. How can I help you?") }
    )
)
Next, we can send a message to the LLM and get the response using sendMessage() (a suspend function, so call it from a coroutine), as shown below:
val response = chat.sendMessage(userMessage)
response.text?.let { modelResponse ->
    _uiState.value.addMessage(
        ChatMessage(
            text = modelResponse,
            participant = Participant.MODEL,
            isPending = false
        )
    )
}
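The snippet above assumes a few app-side types that are not part of the SDK: ChatMessage, Participant, and the addMessage helper on the UI state. A minimal sketch of what they might look like:

```kotlin
// Hypothetical app-side chat model; none of these types come from the SDK.
enum class Participant { USER, MODEL, ERROR }

data class ChatMessage(
    val text: String,
    val participant: Participant,
    val isPending: Boolean = false  // true while waiting for the model's reply
)

// Simple mutable UI state holding the conversation so far.
class ChatUiState(initial: List<ChatMessage> = emptyList()) {
    private val _messages = initial.toMutableList()
    val messages: List<ChatMessage> get() = _messages

    fun addMessage(message: ChatMessage) {
        _messages.add(message)
    }
}
```

In a real app you would typically expose this state from a ViewModel (for example via a StateFlow) so the chat UI recomposes as messages arrive.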
Make sure your app declares the INTERNET permission, as the Vertex AI in Firebase SDK for Android needs it to reach the Gemini API. Below is how the Gemini Coach application looks and works:
Conclusion
The Vertex AI in Firebase SDK has multiple capabilities; the multi-turn chat covered here is one of them.
Hope this blog helps you get started with building chat-based Android apps using Vertex AI in Firebase. Stay tuned for more such content.
Building Generative AI-powered Chat Android apps with Vertex AI was originally published in Google Developer Experts on Medium, where people are continuing the conversation by highlighting and responding to this story.