Firebase platform expert with Vertex AI Gemini integration for Authentication, Firestore, Storage, Functions, Hosting, and AI-powered features. Use when asked to "setup firebase", "deploy to firebase", or "integrate vertex ai with firebase".
Installation
After installing, this skill will be available to your AI coding assistant.
Verify installation:
skills list
Skill Instructions
name: firebase-vertex-ai
description: |
  Execute firebase platform expert with Vertex AI Gemini integration for Authentication, Firestore, Storage, Functions, Hosting, and AI-powered features. Use when asked to "setup firebase", "deploy to firebase", or "integrate vertex ai with firebase". Trigger with relevant phrases based on skill purpose.
allowed-tools: Read, Write, Edit, Grep, Glob, Bash(cmd:*)
version: 1.0.0
author: Jeremy Longshore jeremy@intentsolutions.io
license: MIT
Firebase Vertex AI
Operate Firebase projects end-to-end (Auth, Firestore, Functions, Hosting) and integrate Gemini/Vertex AI safely for AI-powered features.
Overview
Use this skill to design, implement, and deploy Firebase applications that call Vertex AI/Gemini from Cloud Functions (or other GCP services) with secure secrets handling, least-privilege IAM, and production-ready observability.
Prerequisites
- Node.js runtime and Firebase CLI access for the target project
- A Firebase project (billing enabled for Functions/Vertex AI as needed)
- Vertex AI API enabled and permissions to call Gemini/Vertex AI from your backend
- Secrets managed via env vars or Secret Manager (never in client code)
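For the secrets prerequisite, a minimal sketch of binding a Secret Manager value to a 2nd-gen Cloud Function via the firebase-functions params API is shown below; THIRD_PARTY_API_KEY is a placeholder name (calls to Vertex AI itself normally use the Function's service-account credentials and need no key).

```ts
// functions/src/secrets-example.ts
// Sketch: bind a Secret Manager secret to a 2nd-gen HTTPS function.
// "THIRD_PARTY_API_KEY" is a placeholder; create it with
// `firebase functions:secrets:set THIRD_PARTY_API_KEY` before deploying.
import { onRequest } from "firebase-functions/v2/https";
import { defineSecret } from "firebase-functions/params";

const thirdPartyKey = defineSecret("THIRD_PARTY_API_KEY");

export const secureEndpoint = onRequest({ secrets: [thirdPartyKey] }, (req, res) => {
  // The secret value is resolved at runtime, server-side only.
  const key = thirdPartyKey.value();
  res.json({ configured: key.length > 0 });
});
```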
Instructions
- Initialize Firebase (or validate an existing repo): Hosting/Functions/Firestore as required.
- Implement backend integration (see the sketch after this list):
- add a Cloud Function/HTTP endpoint that calls Gemini/Vertex AI
- validate inputs and return structured responses
- Configure data and security:
- Firestore rules + indexes
- Storage rules (if applicable)
- Auth providers and authorization checks
- Deploy and verify:
- deploy Functions/Hosting
- run smoke tests against deployed endpoints
- Add ops guardrails:
- logging/metrics
- alerting for error spikes
- basic cost controls (budgets/quotas) where appropriate
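As a concrete illustration of the backend-integration step, the sketch below shows an HTTPS Function that validates its input and calls Gemini through the @google-cloud/vertexai Node SDK. The project ID, region, and model name are placeholder assumptions; production code would also add auth checks and the retry logic described under Error Handling.

```ts
// functions/src/chat.ts
// Sketch of a Gemini-backed chat endpoint (assumes the Vertex AI API is
// enabled and the Function's service account has the Vertex AI User role).
import { onRequest } from "firebase-functions/v2/https";
import { VertexAI } from "@google-cloud/vertexai";

// Placeholder project/region/model values; adjust for your environment.
const vertex = new VertexAI({ project: "your-project-id", location: "us-central1" });
const model = vertex.getGenerativeModel({ model: "gemini-1.5-flash" });

export const chat = onRequest({ region: "us-central1" }, async (req, res) => {
  // Validate input before spending model tokens.
  const prompt = typeof req.body?.prompt === "string" ? req.body.prompt.trim() : "";
  if (!prompt || prompt.length > 4000) {
    res.status(400).json({ error: "prompt must be a non-empty string under 4000 chars" });
    return;
  }
  try {
    const result = await model.generateContent(prompt);
    const text = result.response.candidates?.[0]?.content?.parts?.[0]?.text ?? "";
    res.json({ reply: text });                 // structured response
  } catch (err) {
    console.error("gemini_call_failed", err);  // surfaces in Cloud Logging
    res.status(502).json({ error: "Upstream model call failed" });
  }
});
```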
Output
- A deployable Firebase project structure (configs + Functions/Hosting as needed)
- Secure backend code that calls Gemini/Vertex AI (with secrets handled correctly)
- Firestore/Storage rules and index guidance
- A verification checklist (local + deployed) and CI-ready commands
Error Handling
- Auth failures: identify the principal and missing permission/role; fix with least privilege.
- Billing/API issues: detect which API or quota is blocking and provide remediation steps.
- Firestore rule/index problems: provide minimal repro queries and rule fixes.
- Vertex AI call failures: surface model/region mismatches and add retries/backoff for transient errors.
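For the transient-error case above, a generic retry wrapper is usually sufficient. The TypeScript sketch below is illustrative only; the status-code checks are assumptions about which failures are worth retrying, since error shapes vary across SDK versions.

```ts
// Retry-with-exponential-backoff helper for transient upstream errors
// (e.g. 429 rate limits or 5xx responses from a Vertex AI call).
async function withBackoff<T>(
  fn: () => Promise<T>,
  { retries = 3, baseDelayMs = 500 }: { retries?: number; baseDelayMs?: number } = {}
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err: any) {
      lastError = err;
      const status = err?.code ?? err?.status;            // shape varies by SDK
      const transient = status === 429 || (status >= 500 && status < 600);
      if (!transient || attempt === retries) break;
      const delay = baseDelayMs * 2 ** attempt + Math.random() * 100; // jitter
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}

// Usage (hypothetical): await withBackoff(() => model.generateContent(prompt));
```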
Examples
Example: Gemini-backed chat API on Firebase
- Request: “Deploy Hosting + a Function that powers a Gemini chat endpoint.”
- Result: an /api/chat Function, Secret Manager wiring, and smoke tests.
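A smoke test against that endpoint can be a short script like the one below; the URL is a placeholder for your deployed Hosting rewrite or Function URL, and Node 18+ is assumed for the built-in fetch.

```ts
// smoke-test.ts — run after deploying (e.g. with `npx tsx smoke-test.ts`).
// ENDPOINT is a placeholder; substitute your Hosting rewrite or Function URL.
const ENDPOINT = process.env.CHAT_ENDPOINT ?? "https://your-project.web.app/api/chat";

async function main() {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt: "Reply with the single word: pong" }),
  });
  if (!res.ok) throw new Error(`Unexpected status ${res.status}`);
  const body = await res.json();
  if (typeof body.reply !== "string" || body.reply.length === 0) {
    throw new Error("Missing reply field in response");
  }
  console.log("Smoke test passed:", body.reply);
}

main().catch((err) => {
  console.error("Smoke test failed:", err);
  process.exit(1);
});
```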
Example: Firestore-powered RAG
- Request: “Build a RAG flow that embeds docs and answers with citations.”
- Result: ingestion plan, embedding + index strategy, and evaluation prompts.
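For the retrieval half of that flow, one possible shape is sketched below using Firestore vector search (available in recent firebase-admin releases and requiring a vector index created via the Firebase CLI). The embedText helper is hypothetical, standing in for whichever embedding model you choose.

```ts
// functions/src/rag-retrieve.ts — retrieval sketch for a Firestore-backed RAG flow.
// Assumes documents were ingested with an "embedding" vector field and that a
// Firestore vector index exists for it. embedText() is a hypothetical helper
// wrapping your chosen embedding model; it is not defined here.
import { initializeApp } from "firebase-admin/app";
import { getFirestore, FieldValue } from "firebase-admin/firestore";

initializeApp();
const db = getFirestore();

declare function embedText(text: string): Promise<number[]>; // hypothetical

export async function retrieveContext(question: string, limit = 5) {
  const queryVector = await embedText(question);
  const snapshot = await db
    .collection("docs")
    .findNearest("embedding", FieldValue.vector(queryVector), {
      limit,
      distanceMeasure: "COSINE",
    })
    .get();
  // Return text chunks plus ids so answers can cite their sources.
  return snapshot.docs.map((doc) => ({ id: doc.id, text: doc.get("text") }));
}
```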
Resources
- Full detailed guide (kept for reference):
{baseDir}/references/SKILL.full.md
- Firebase docs: https://firebase.google.com/docs
- Cloud Functions for Firebase: https://firebase.google.com/docs/functions
- Vertex AI docs: https://cloud.google.com/vertex-ai/docs