
Using Gemini in Apps Script


This has been a long time coming, and it is finally here. Apps Script now has a new Vertex AI advanced service!

⚠️ Important: In my testing, Gemini 3 preview models like gemini-3-pro-preview only work through the global endpoint; in regional endpoints the calls fail (see Troubleshooting below). The gemini-2.5-pro model works without this caveat.

Previously, if you wanted to call Gemini or other Vertex AI models, you had to manually construct UrlFetchApp requests, handle bearer tokens, and manage headers. It was doable, but verbose and annoying.

The Vertex AI Advanced Service

The new service, VertexAI, allows you to interact with the Vertex AI API directly. This means you can generate text, images, and more with significantly less boilerplate code. You can check out the full Vertex AI REST reference docs for more details on available methods and parameters.

Before: The Old Way

In my previous post on Using Vertex AI in Apps Script, the code looked like this:

function predict(prompt) {
  const URL = `${BASE}/v1/projects/${PROJECT_ID}/locations/us-central1/publishers/google/models/${MODEL}:predict`;
  const options = {
    method: "post",
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
    muteHttpExceptions: true,
    contentType: "application/json",
    payload: JSON.stringify(payload),
  };
  const response = UrlFetchApp.fetch(URL, options);
  // ... parsing logic ...
}

After: The New Way

Now, with the built-in service, it’s just:

const response = VertexAI.Endpoints.generateContent(payload, model);
// ... parsing logic ...
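The second argument is the full model resource name, in the form `projects/<project>/locations/<region>/publishers/google/models/<model>`. If you call the service from several functions, a small helper (my own convention, not part of the advanced service) keeps that path in one place:

```javascript
// Hypothetical helper (mine, not part of the VertexAI service):
// builds the model resource name that generateContent expects.
function buildModelPath(projectId, region, modelName) {
  return `projects/${projectId}/locations/${region}/publishers/google/models/${modelName}`;
}

// Example:
// buildModelPath("my-project", "us-central1", "gemini-2.5-pro")
//   → "projects/my-project/locations/us-central1/publishers/google/models/gemini-2.5-pro"
```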

Prerequisites

To use the Vertex AI advanced service, you need the following setup, which you may already have in place if you were calling Vertex AI via UrlFetchApp:

  1. Google Cloud Project: You need a Standard GCP Project (not the default Apps Script managed one).

  2. Billing Enabled: Vertex AI requires a billing account attached to the project.

    Enable Billing for Vertex AI

    Failed to Leverage Core Competencies Enable Billing

    Failed to leverage core competencies: API call to aiplatform.endpoints.generateContent failed with error: This API method requires billing to be enabled. Please enable billing on project … then retry.

  3. API Enabled: Enable the Vertex AI API in your Cloud Console.

  4. Apps Script Configuration: Add your Cloud Project number in Project Settings.

  5. Add Service: Enable the Vertex AI advanced service in the “Services” section of the editor.

    Apps Script Cloud Project Number

    Add Vertex AI Service

    Enable Vertex AI Service

    Or manually enable it in appsscript.json:

    JSON
    {
      "timeZone": "America/Denver",
      "dependencies": {
        "enabledAdvancedServices": [
          {
            "userSymbol": "VertexAI",
            "version": "v1",
            "serviceId": "aiplatform"
          }
        ]
      },
      "exceptionLogging": "STACKDRIVER",
      "runtimeVersion": "V8"
    }

Code Snippet: The Gen Z Translator

To demonstrate the power of this service for educators, let’s build the “Gen Z” Translator. This tool takes student emails filled with slang and translates them into proper Victorian-era English, ensuring clear communication.

JavaScript
/**
 * Translates an email from "Gen Z" slang into proper Victorian English.
 * @param {string} emailBody - The student's email text (e.g., "no cap this exam was mid").
 * @return {string} The translated text suitable for a 19th-century gentleman or scholar.
 */
function translateGenZtoVictorian_(emailBody) {
  // TODO: Replace with your actual project ID
  const projectId = "jpoehnelt-blog";
  const region = "us-central1";
  const modelName = "gemini-2.5-pro"; 

  const model = `projects/${projectId}/locations/${region}/publishers/google/models/${modelName}`;

  const prompt = `
    You are a distinguished Victorian-era scholar and translator. 
    Your task is to translate the following email, written in modern "Gen Z" slang, 
    into formal, elegant Victorian English. 
    
    Maintain the core meaning but completely change the tone to be exceedingly polite, 
    verbose, and aristocratic.

    Student's Email: "${emailBody}"
    
    Victorian Translation:
  `;

  const payload = {
    contents: [
      {
        role: "user",
        parts: [{ text: prompt }],
      },
    ],
    generationConfig: {
      temperature: 0.7,
    },
  };

  try {
    const response = VertexAI.Endpoints.generateContent(payload, model);
    const translation = response?.candidates?.[0]?.content?.parts?.[0]?.text;

    if (translation) {
      console.log(`Student said: ${emailBody}`);
      console.log(`Professor heard:\n\n${translation}`);
      return translation;
    }
    return "Error: The telegram was lost in transit.";
  } catch (e) {
    console.error("Translation failed:", e.message);
    return "Error: An unfathomable calamity has occurred.";
  }
}

function runTranslatorDemo() {
  const studentEmail =
    "Hey prof, that lecture today was straight fire. The vibes were immaculate and I'm lowkey obsessed with this topic. Slay.";
  translateGenZtoVictorian_(studentEmail);
}

The result:

3:35:40 PM	Notice	Execution started
3:36:02 PM	Info	Student said: Hey prof, that lecture today was straight fire. The vibes were immaculate and I'm lowkey obsessed with this topic. Slay.
3:36:02 PM	Info	Professor heard

My Dearest Professor,

Permit me to express, with the utmost sincerity, my profound admiration for your discourse this day. It was a truly masterful and illuminating exposition, delivered with a passion that can only be described as incandescent.

The intellectual atmosphere you so deftly cultivated within the hall was of the most superlative quality; a veritable feast for the mind. Indeed, I confess that you have awakened within me a most fervent and, I daresay, burgeoning obsession with the subject matter, a fascination I had not previously known myself to possess.

It was, in all respects, a triumph of scholarly erudition.

I have the honour to remain, Sir,
Your most humble and devoted student.
3:36:02 PM	Notice	Execution completed

Code Snippet: Corporate Jargon Generator

And if you need to translate in the other direction—from simple human emotion to soul-crushing business speak—we have you covered too. This snippet does the exact opposite, turning honest phrases into “synergistic deliverables.”

JavaScript
/**
 * Generates corporate jargon from a simple phrase using Gemini.
 * @param {string} simplePhrase - The simple phrase to translate (e.g., "I'm going to lunch").
 * @return {string} The corporate jargon version.
 */
function prioritizeSynergy_(simplePhrase) {
  // TODO: Replace with your actual project ID
  const projectId = "jpoehnelt-blog";
  const region = "global";
  const modelName = "gemini-3-pro-preview";

  const model = `projects/${projectId}/locations/${region}/publishers/google/models/${modelName}`;

  const prompt = `
    Rewrite the following simple phrase into overly complex, cringeworthy corporate jargon. 
    Make it sound like a LinkedIn thought leader who just discovered a thesaurus.
    
    Simple phrase: "${simplePhrase}"
  `;

  const payload = {
    contents: [
      {
        role: "user",
        parts: [{ text: prompt }],
      },
    ],
    generationConfig: {
      temperature: 0.9, // Max creativity for max cringe
    },
  };

  try {
    const response = VertexAI.Endpoints.generateContent(payload, model);

    // Safety check just in case the AI refuses to be that annoying
    const jargon = response?.candidates?.[0]?.content?.parts?.[0]?.text;

    if (jargon) {
      console.log(`Original: ${simplePhrase}`);
      console.log(`Corporate:\n\n${jargon}`);
      return jargon;
    } else {
      return "Error: Synergy levels critical. Please circle back.";
    }
  } catch (e) {
    console.error("Failed to leverage core competencies:", e.message);
    return "Error: Blocker identified.";
  }
}

function runDemo() {
  prioritizeSynergy_("I made a mistake.");
  prioritizeSynergy_("Can we meet later?");
  prioritizeSynergy_("I need a raise.");
}

The result:

3:29:25 PM	Notice	Execution started
3:29:32 PM	Info	Original: I made a mistake.
3:29:32 PM	Info	Corporate:

It has come to my attention, through rigorous self-assessment and a steadfast commitment to continuous improvement, that a momentary lapse in strategic foresight led to a suboptimal outcome, which I am now diligently leveraging as a foundational catalyst for enhanced future performance metrics.
3:29:41 PM	Info	Original: Can we meet later?
3:29:41 PM	Info	Corporate:

Considering the dynamic parameters of our current operational cadence, might we strategically align our respective bandwidths for a high-impact ideation interface at a mutually agreeable, post-meridian temporal increment?
3:29:52 PM	Info	Original: I need a raise.
3:29:52 PM	Info	Corporate:

In order to strategically galvanize optimal human capital resource allocation and ensure the continued, robust realization of enterprise-wide objectives, it is incumbent upon us to engage in a proactive, granular analysis of my present remuneration scaffolding, thereby effectuating an equitable recalibration commensurate with my demonstrably amplified value proposition and pivotal synergistic contributions.
3:29:53 PM	Notice	Execution completed

Multimodal Magic

Text is great, but Gemini is multimodal. You can pass images directly to the model to have it analyze charts, describe photos, or even read handwriting.

JavaScript
/**
 * Analyzes an image using Gemini's multimodal capabilities.
 * @param {string} data - The base64 encoded image string.
 * @param {string} mimeType - The mime type of the image (e.g., "image/jpeg").
 * @return {string} The model's description of the image.
 */
function analyzeImage_(data, mimeType) {
  // TODO: Replace with your actual project ID
  const projectId = "jpoehnelt-blog";
  const region = "us-central1";
  const modelName = "gemini-2.5-pro";

  const model = `projects/${projectId}/locations/${region}/publishers/google/models/${modelName}`;
  const payload = {
    contents: [
      {
        role: "user",
        parts: [
          { text: "Succinctly describe what is happening in this image." },
          {
            inlineData: {
              mimeType,
              data,
            },
          },
        ],
      },
    ],
    generationConfig: {
      maxOutputTokens: 4096,
    },
  };

  try {
    const response = VertexAI.Endpoints.generateContent(payload, model);
    const description = response?.candidates?.[0]?.content?.parts?.[0]?.text;
    console.log(description);
    return description;
  } catch (e) {
    console.error("Analysis failed:", e.message);
    return "Error: Could not see the image.";
  }
}

function runMultimodalDemo() {
  // Fetch an image from the web (or Drive)
  const imageUrl =
    "https://media.githubusercontent.com/media/jpoehnelt/blog/refs/heads/main/apps/site/src/lib/images/mogollon-monster-100/justin-poehnelt-during-ultramarathon.jpeg";
  const imageBlob = UrlFetchApp.fetch(imageUrl).getBlob();
  const base64Image = Utilities.base64Encode(imageBlob.getBytes());
  const mimeType = imageBlob.getContentType() || "image/jpeg";
  analyzeImage_(base64Image, mimeType);
}

The result:

3:26:28 PM	Notice	Execution started
3:26:40 PM	Info	A male trail runner, competing in a mountain ultramarathon with bib number 49, uses trekking poles to ascend a steep and rocky path.
3:26:40 PM	Notice	Execution completed
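The `inlineData` part shape in the snippet above comes straight from the Vertex AI REST API. If you send images from more than one place, a small payload builder (my own helper, not part of the service) keeps that shape in one spot:

```javascript
// Hypothetical helper (mine): build a generateContent payload that pairs
// a text prompt with one base64-encoded inline image.
function buildImagePayload(promptText, base64Data, mimeType) {
  return {
    contents: [
      {
        role: "user",
        parts: [
          { text: promptText },
          { inlineData: { mimeType: mimeType, data: base64Data } },
        ],
      },
    ],
  };
}
```

The payload it returns can be passed to `VertexAI.Endpoints.generateContent` exactly like the hand-built one above.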

Why this rocks

  • No more UrlFetchApp: The service handles the underlying network requests.
  • Built-in Auth: ScriptApp.getOAuthToken() is handled more seamlessly, though you still need standard scopes.
  • Cleaner Syntax: VertexAI.Endpoints.generateContent(payload, model) is much easier to read than a massive fetch call.
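One pattern worth factoring out: every snippet above digs the text out of the response with the same optional-chaining expression. A tiny helper (mine, not part of the service) shortens the call sites and handles responses with no candidates, e.g. safety blocks:

```javascript
// Hypothetical helper: return the first candidate's text from a
// generateContent response, or null if the shape is unexpected
// (e.g. the request was blocked and no candidates came back).
function extractText(response) {
  return response?.candidates?.[0]?.content?.parts?.[0]?.text ?? null;
}
```

Then `const translation = extractText(response);` replaces the chain in each snippet.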

Troubleshooting

“Unexpected error while getting the method or property generateContent…”

Unexpected error while getting the method or property generateContent on object Apiary.aiplatform.endpoints.

If you see this error, it is usually because the model isn’t available in your selected region, or because you are using a preview model without the GLOBAL endpoint. This appears to be a bug; switching to the gemini-2.5-pro model is a reliable workaround.

Disclaimer: I am a member of the Google Workspace Developer Relations team. The opinions expressed here are my own and do not necessarily represent those of Google.

© 2026 by Justin Poehnelt is licensed under CC BY-SA 4.0