Google Apps Script Fetch API: How to Call External APIs With UrlFetchApp (Real Examples)
Why Connect Google Apps Script to External APIs
Most Google Apps Script tutorials stop at Google-to-Google automation. Sheets to Gmail. Forms to Docs. That's useful, but it's also a ceiling. Once you learn to use the google apps script fetch api capabilities through UrlFetchApp, everything changes.
The moment you learn to call external APIs from Apps Script, you break through that ceiling entirely. Suddenly your spreadsheet can pull live weather data, your form submission can trigger a Slack message, and your reporting sheet can sync CRM records from a REST endpoint you built yourself.
If you've been searching for how to use google apps script fetch api calls, this is the guide that actually walks you through it. Not just the syntax. Real examples, real authentication patterns, real error handling.
UrlFetchApp is the built-in service that makes all of this possible. And it's surprisingly capable once you understand how it works.
Google Apps Script Fetch API Basics: UrlFetchApp Is Your Gateway
UrlFetchApp is the service class that lets Apps Script make HTTP requests to any URL. GET, POST, PUT, DELETE, PATCH. If a server accepts HTTP, UrlFetchApp can talk to it.
Before you use it, your project needs the https://www.googleapis.com/auth/script.external_request scope. Apps Script usually adds this automatically when it detects a UrlFetchApp.fetch() call, but if you're managing scopes manually in appsscript.json, make sure it's there.
Three methods matter:
- `UrlFetchApp.fetch(url, params)` sends a single request
- `UrlFetchApp.fetchAll(requests)` sends up to 50 requests in parallel
- `UrlFetchApp.getRequest(url, params)` returns the request object without actually sending it (great for debugging)
Most people only ever use fetch(), but fetchAll() is the secret weapon when you need to call multiple endpoints in one script run. It sends all requests concurrently, which can cut execution time dramatically compared to calling fetch() in a loop.
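To use fetchAll(), you pass an array of request objects, each with its own url plus any options. Here's a small sketch; the buildRequests helper is hypothetical, not part of the API:

```javascript
// Hypothetical helper: turn a list of URLs into the request objects
// UrlFetchApp.fetchAll() expects (each one needs a `url` property).
function buildRequests(urls, commonOptions) {
  return urls.map(function (url) {
    return Object.assign({ url: url, muteHttpExceptions: true }, commonOptions || {});
  });
}

// In Apps Script you would then send them all concurrently:
// const responses = UrlFetchApp.fetchAll(buildRequests(urls, { headers: { /* ... */ } }));
// responses[i] lines up with urls[i], in order.
```

The responses come back in the same order as the requests, so you can zip them back together with whatever metadata you used to build the URLs.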
Making a Simple GET Request
The simplest possible API call looks like this:
function fetchDadJoke() {
const url = 'https://icanhazdadjoke.com/';
const options = {
headers: {
'Accept': 'application/json'
}
};
const response = UrlFetchApp.fetch(url, options);
Logger.log(response.getContentText());
}

That's it. One function call, one response. The fetch() method returns an HTTPResponse object with a few key methods:
- `getContentText()` returns the body as a string
- `getResponseCode()` returns the HTTP status code (200, 404, 500, etc.)
- `getHeaders()` returns response headers as an object
- `getBlob()` returns the response as a Blob (useful for files)
Parsing JSON Responses
Almost every modern API returns JSON. Parsing it is straightforward:
function getPublicData() {
const url = 'https://jsonplaceholder.typicode.com/posts/1';
const response = UrlFetchApp.fetch(url);
const json = JSON.parse(response.getContentText());
Logger.log(json.title); // "sunt aut facere..."
Logger.log(json.userId); // 1
}

The pattern is always the same: fetch, get the content text, parse it. You'll write this three-line sequence hundreds of times.
POST Requests With Headers and Body
Sending data to an API requires a few more options:
function createPost() {
const url = 'https://jsonplaceholder.typicode.com/posts';
const payload = {
title: 'New Post from Apps Script',
body: 'This was sent via UrlFetchApp',
userId: 1
};
const options = {
method: 'post',
contentType: 'application/json',
payload: JSON.stringify(payload),
muteHttpExceptions: true
};
const response = UrlFetchApp.fetch(url, options);
Logger.log(response.getResponseCode()); // 201
Logger.log(response.getContentText());
}

A few things to note here. The method property defaults to 'get' if you don't set it. The contentType tells the server you're sending JSON. And payload must be a string, not an object, so you need JSON.stringify().
That muteHttpExceptions: true flag? Important. Without it, any non-2xx response throws an exception and kills your script. With it, you get the error response back and can handle it yourself. Always set this when you want to inspect error responses rather than crashing on the first 400 or 500.
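Once muteHttpExceptions is on, branching on the status code is your job. One way to keep that decision in a single place is a small classifier; the helper below is a hypothetical sketch, not part of UrlFetchApp:

```javascript
// Hypothetical helper: decide what to do with a status code once
// muteHttpExceptions lets non-2xx responses through instead of throwing.
function classifyStatus(code) {
  if (code >= 200 && code < 300) return 'ok';      // use the body
  if (code === 429 || code >= 500) return 'retry'; // transient, worth retrying
  return 'fail';                                   // other 4xx: fix the request
}

// Typical use after a fetch:
// const response = UrlFetchApp.fetch(url, options);
// if (classifyStatus(response.getResponseCode()) !== 'ok') { /* log and bail */ }
```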
Authentication: API Keys, OAuth, and Bearer Tokens
Most useful APIs require some form of authentication. Here are the three patterns you'll encounter constantly. Getting this right matters because a misconfigured auth header is the number one reason API calls fail silently or return confusing 401 errors.
API Key in Headers or Query Params
The simplest auth method. Some APIs want the key in a header, others in the URL:
// API key in header
function fetchWithApiKeyHeader() {
const options = {
headers: {
'X-Api-Key': 'your-api-key-here'
}
};
const response = UrlFetchApp.fetch('https://api.example.com/data', options);
return JSON.parse(response.getContentText());
}
// API key in query parameter
function fetchWithApiKeyParam() {
const url = 'https://api.example.com/data?api_key=your-api-key-here';
const response = UrlFetchApp.fetch(url);
return JSON.parse(response.getContentText());
}

Never hardcode the key in your script. Store it in Script Properties and read it with PropertiesService.getScriptProperties().getProperty('API_KEY').

Which approach should you use? Check the API's documentation. Some providers require header-based auth for security reasons, since query parameters can show up in server logs and browser history. When both options are available, headers are the safer choice.
Bearer Token Authentication
When an API gives you a token (JWT, access token, whatever), it goes in the Authorization header:
function fetchWithBearerToken() {
const token = PropertiesService.getScriptProperties().getProperty('API_TOKEN');
const options = {
headers: {
'Authorization': 'Bearer ' + token
}
};
const response = UrlFetchApp.fetch('https://api.example.com/protected', options);
return JSON.parse(response.getContentText());
}

OAuth 2.0 With ScriptApp.getOAuthToken()
For calling Google's own APIs (or any API that accepts Google OAuth tokens), Apps Script has a built-in shortcut:
function fetchGoogleDriveFiles() {
const url = 'https://www.googleapis.com/drive/v3/files?pageSize=10';
const options = {
headers: {
'Authorization': 'Bearer ' + ScriptApp.getOAuthToken()
}
};
const response = UrlFetchApp.fetch(url, options);
const files = JSON.parse(response.getContentText());
files.files.forEach(file => Logger.log(file.name));
}

For third-party OAuth 2.0 flows (where you need to redirect users to authorize), use the apps-script-oauth2 library. It handles the token exchange, refresh, and storage. But that's a whole separate article.
Recipe 1: Pull Weather Data Into Google Sheets
This is a good one to start with because Open-Meteo is a free API that requires no authentication. Zero setup friction. You can have live weather data flowing into a spreadsheet in under two minutes.
Weather data is one of the most common first API integrations people build in Apps Script, and for good reason. The data is publicly available, the response format is predictable, and the results are immediately visual in a spreadsheet. This recipe also demonstrates the core pattern you'll reuse for any API that returns structured data: fetch the response, parse the JSON, and write specific fields into sheet rows.
Here's a practical example using the Open-Meteo API (free, no key required):
function getWeatherData() {
const lat = 40.7128;
const lon = -74.0060;
const url = 'https://api.open-meteo.com/v1/forecast'
+ '?latitude=' + lat
+ '&longitude=' + lon
+ '&current_weather=true'
+ '&temperature_unit=fahrenheit';
const response = UrlFetchApp.fetch(url);
const weather = JSON.parse(response.getContentText());
const current = weather.current_weather;
const sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
sheet.appendRow([
new Date(),
current.temperature + '\u00b0F',
current.windspeed + ' km/h',
current.weathercode
]);
}

Set this on a time-driven trigger and you've got a weather log updating itself every hour. No Zapier subscription needed. Swap the coordinates for your city and you've got a local weather tracker built entirely inside Google Sheets. You can extend this further by adding Open-Meteo's hourly or daily forecast parameters to build a multi-day outlook directly in your spreadsheet.
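Concatenating query strings by hand works, but it's easy to forget encoding. A small helper (hypothetical, not part of UrlFetchApp) builds the URL from a parameter object instead:

```javascript
// Hypothetical helper: build a URL with properly encoded query parameters.
function buildUrl(base, params) {
  const query = Object.keys(params)
    .map(function (key) {
      return encodeURIComponent(key) + '=' + encodeURIComponent(params[key]);
    })
    .join('&');
  return base + '?' + query;
}

// const url = buildUrl('https://api.open-meteo.com/v1/forecast', {
//   latitude: 40.7128,
//   longitude: -74.0060,
//   current_weather: true,
//   temperature_unit: 'fahrenheit'
// });
```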
Recipe 2: Send Slack Messages From a Sheet
Slack incoming webhooks accept a simple POST with a JSON body. This is probably the most common google apps script http request pattern people search for, and it's dead simple once you have the webhook URL.
To set up the webhook, go to your Slack workspace settings, create a new incoming webhook under the Apps section, and copy the URL it generates. Store that URL in Script Properties rather than pasting it directly into your code. Each webhook is tied to a specific channel, so you'll need separate webhooks if you want to post to multiple channels.
function sendSlackAlert() {
const webhookUrl = PropertiesService.getScriptProperties()
.getProperty('SLACK_WEBHOOK_URL');
const message = {
text: 'New form submission received!'
};
const options = {
method: 'post',
contentType: 'application/json',
payload: JSON.stringify(message)
};
const response = UrlFetchApp.fetch(webhookUrl, options);
Logger.log(response.getResponseCode()); // 200 if successful
}

Wire this to an onFormSubmit trigger and every Google Form response pings your Slack channel. Five minutes of setup. You can customize the message format with Slack's Block Kit to include rich formatting, buttons, and structured data from the form submission. A common extension is to pull specific form fields into the message body so your team sees the actual submission content without opening the spreadsheet.
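For richer formatting, Slack's Block Kit accepts a blocks array in the same webhook payload. Here's a minimal sketch; the formData fields are placeholders for your own form's columns:

```javascript
// Sketch: build a Block Kit payload from a form submission.
// The formData field names here are placeholders, not a real schema.
function buildSlackMessage(formData) {
  return {
    blocks: [
      {
        type: 'section',
        text: { type: 'mrkdwn', text: '*New form submission received!*' }
      },
      {
        type: 'section',
        text: { type: 'mrkdwn', text: 'From: ' + formData.name + '\n' + formData.message }
      }
    ]
  };
}

// Then in the webhook options:
// payload: JSON.stringify(buildSlackMessage({ name: 'Ada', message: 'Hello team' }))
```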
Recipe 3: Sync CRM Data From a REST API
This one pulls contact data from an external CRM API and writes it to a sheet. It's the pattern you'll use any time you need to import data from a third-party system on a schedule. The key detail here is using setValues() for bulk writes instead of looping through appendRow() calls.
Why does the write method matter so much? Every appendRow() call is a separate round-trip to the Sheets service. When you're writing 100 rows, that's 100 individual operations, each taking roughly a second. With setValues(), you write the entire block in a single call. For large datasets, this difference alone can determine whether your script finishes within the six-minute execution limit or gets killed halfway through.
function syncCrmContacts() {
const apiKey = PropertiesService.getScriptProperties().getProperty('CRM_API_KEY');
const url = 'https://api.example-crm.com/v1/contacts?limit=100';
const options = {
headers: {
'Authorization': 'Bearer ' + apiKey
},
muteHttpExceptions: true
};
const response = UrlFetchApp.fetch(url, options);
if (response.getResponseCode() !== 200) {
Logger.log('API error: ' + response.getContentText());
return;
}
const contacts = JSON.parse(response.getContentText()).data;
const sheet = SpreadsheetApp.getActiveSpreadsheet()
.getSheetByName('CRM Contacts');
// Clear old data, keep header
if (sheet.getLastRow() > 1) {
sheet.getRange(2, 1, sheet.getLastRow() - 1, 4).clearContent();
}
const rows = contacts.map(c => [c.name, c.email, c.company, c.lastContact]);
if (rows.length > 0) {
sheet.getRange(2, 1, rows.length, 4).setValues(rows);
}
Logger.log('Synced ' + rows.length + ' contacts');
}

The setValues() call is way faster than appendRow() in a loop. For 100 rows, the difference is seconds vs. minutes. This matters a lot when you're running on a trigger, since Apps Script kills any execution that exceeds six minutes on consumer accounts.
Recipe 4: Post Google Sheets Data to a Webhook
Sometimes you need to push data out, not pull it in. This pattern shows up when you're using Sheets as a lightweight data entry tool and need to sync records into another system.
Webhooks are the inverse of API calls. Instead of your script requesting data from a server, your script sends data to a server that's listening for incoming payloads. This makes webhooks ideal for event-driven workflows where Sheets acts as the data source and an external system handles the processing. Tools like n8n, Make (formerly Integromat), and custom backend services all support webhook receivers out of the box.
Here's how to send sheet data to any webhook endpoint:
function postSheetDataToWebhook() {
const sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
const data = sheet.getDataRange().getValues();
const headers = data[0];
const rows = data.slice(1);
const records = rows.map(row => {
const obj = {};
headers.forEach((header, i) => obj[header] = row[i]);
return obj;
});
const options = {
method: 'post',
contentType: 'application/json',
payload: JSON.stringify({ records: records }),
muteHttpExceptions: true
};
const webhookUrl = PropertiesService.getScriptProperties()
.getProperty('WEBHOOK_URL');
const response = UrlFetchApp.fetch(webhookUrl, options);
Logger.log('Webhook response: ' + response.getResponseCode());
}

This pattern works with any system that accepts webhooks: n8n, Make, custom APIs, data pipelines. You name it. One thing to watch out for: if your sheet has thousands of rows, the JSON payload can get large. Split it into batches of 500 or so to avoid hitting the 50MB request limit. You can also add a last-modified column to your sheet and filter rows so you only send records that changed since the last sync, which keeps payload sizes small and avoids unnecessary processing on the receiving end.
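Batching is just array slicing. A sketch of the chunking step (the batch size of 500 is illustrative):

```javascript
// Split records into fixed-size batches so each POST stays well
// under the request payload limit.
function chunkRecords(records, batchSize) {
  const batches = [];
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}

// In Apps Script, send one POST per batch:
// chunkRecords(records, 500).forEach(function (batch) {
//   UrlFetchApp.fetch(webhookUrl, {
//     method: 'post',
//     contentType: 'application/json',
//     payload: JSON.stringify({ records: batch })
//   });
// });
```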
Error Handling and Retry Logic
APIs fail. Networks time out. Rate limits get hit. Your scripts need to handle this gracefully, especially when they run unattended on time-driven triggers. Nobody's going to be watching the execution log at 3am when your sync job runs into a 503.
Here's a reusable wrapper with exponential backoff:
function fetchWithRetry(url, options, maxRetries) {
maxRetries = maxRetries || 3;
options = options || {};
options.muteHttpExceptions = true;
for (var attempt = 0; attempt < maxRetries; attempt++) {
try {
var response = UrlFetchApp.fetch(url, options);
var code = response.getResponseCode();
if (code >= 200 && code < 300) {
return response;
}
if (code === 429 || code >= 500) {
var waitTime = Math.pow(2, attempt) * 1000;
Logger.log('Attempt ' + (attempt + 1) + ' got ' + code
+ ', waiting ' + waitTime + 'ms');
Utilities.sleep(waitTime);
continue;
}
// 4xx errors (except 429) are not retryable
Logger.log('Request failed with ' + code + ': '
+ response.getContentText());
return response;
} catch (e) {
Logger.log('Network error on attempt ' + (attempt + 1) + ': ' + e);
if (attempt < maxRetries - 1) {
Utilities.sleep(Math.pow(2, attempt) * 1000);
}
}
}
throw new Error('All ' + maxRetries + ' attempts failed for ' + url);
}

The key insight: only retry on 429 (rate limited) and 5xx (server error). A 400 or 403 won't magically fix itself on retry. Those are client-side problems that need code changes, not more attempts. And the exponential backoff (1s, 2s, 4s) prevents you from hammering a struggling server with rapid-fire retries.
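If several copies of your script retry in lockstep, they can still collide on the same instant. Adding a little random jitter to the backoff is a common extension; this variant is a suggestion, not something the wrapper above does:

```javascript
// Exponential backoff with up to 10% random jitter, in milliseconds.
// Jitter spreads concurrent retries out so they don't all hit at once.
function backoffMs(attempt, baseMs) {
  const base = (baseMs || 1000) * Math.pow(2, attempt); // 1s, 2s, 4s, ...
  const jitter = Math.random() * base * 0.1;
  return Math.floor(base + jitter);
}

// In the retry loop: Utilities.sleep(backoffMs(attempt));
```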
Rate Limits and Quotas to Watch
Google imposes its own limits on UrlFetchApp, separate from whatever the external API enforces:
- Consumer Gmail accounts: 20,000 URL fetch calls per day
- Google Workspace accounts: 100,000 URL fetch calls per day
- Response size: 50MB per call
- Request payload: 50MB per call
- Execution time: 6 minutes per script run (30 minutes for Workspace)
Those daily quotas reset at midnight Pacific Time, not midnight in your local timezone. And the limits are per-user, not per-project. So if you have three scripts making API calls under the same Google account, they share the same 20,000 (or 100,000) call pool.
If you're hitting these limits, a few strategies help:
- Use `fetchAll()` to batch requests. It runs up to 50 requests in parallel, which is much faster, though each request still counts individually against your quota.
- Add `Utilities.sleep()` between calls when hitting external rate limits. Even 100-200ms between requests prevents most "too many requests" errors.
- Cache responses with `CacheService` when the data doesn't change frequently. This is especially effective for reference data like currency exchange rates or product catalogs that only update a few times per day:
function fetchWithCache(url, cacheDurationSeconds) {
var cache = CacheService.getScriptCache();
var cached = cache.get(url);
if (cached) {
return JSON.parse(cached);
}
var response = UrlFetchApp.fetch(url);
var data = response.getContentText();
cache.put(url, data, cacheDurationSeconds || 600);
return JSON.parse(data);
}

One gotcha with CacheService: the maximum value size is 100KB. If your API response is larger than that, you'll need to either cache only the fields you need or skip caching for that endpoint entirely.
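One way to stay under the 100KB value limit is to cache only the fields you actually read later. A sketch, with illustrative field names:

```javascript
// Keep only the listed fields from each item before caching,
// shrinking the JSON payload that goes into CacheService.
function trimForCache(items, fields) {
  return items.map(function (item) {
    const slim = {};
    fields.forEach(function (field) { slim[field] = item[field]; });
    return slim;
  });
}

// cache.put(url, JSON.stringify(trimForCache(data.items, ['id', 'name'])), 600);
```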
Debugging API Calls in Apps Script
When something goes wrong (and it will), here's your debugging toolkit. Most of the time the problem is either a malformed URL, a missing auth header, or the API returning a different JSON structure than you expected. These four techniques cover 95% of debugging scenarios.
1. Use `getRequest()` to inspect what you're actually sending:
function debugRequest() {
var url = 'https://api.example.com/data';
var options = {
method: 'post',
contentType: 'application/json',
payload: JSON.stringify({ key: 'value' }),
headers: { 'Authorization': 'Bearer token123' }
};
// See what will be sent without sending it
var request = UrlFetchApp.getRequest(url, options);
Logger.log(JSON.stringify(request, null, 2));
}

This is underrated. If your API call returns unexpected results, check what you're actually sending before blaming the API. Half the time the payload is wrong or a header is missing.
2. Always log the full response on errors:
function debugResponse(response) {
Logger.log('Status: ' + response.getResponseCode());
Logger.log('Headers: ' + JSON.stringify(response.getHeaders()));
Logger.log('Body: ' + response.getContentText());
}

3. Check the Execution Log (View > Execution log in the script editor). Every Logger.log() and console.log() call shows up here. The log persists after the script finishes, so you can check what happened even for trigger-based runs.
4. Use try/catch blocks to capture the full exception:
try {
var response = UrlFetchApp.fetch(url, options);
} catch (e) {
Logger.log('Error message: ' + e.message);
Logger.log('Stack trace: ' + e.stack);
}

One more tip that saves a lot of headaches: if you're building a new integration, test your API calls with a tool like Postman or curl first, outside of Apps Script. Confirm the request works, then translate it to UrlFetchApp syntax. That way you know the API side is correct and can focus on getting the Apps Script code right.
Complete Starter Template: Fetch, Parse, and Write to Sheet
Here's a copy-paste template that covers 90% of google apps script http request use cases. Swap the URL, adjust the parsing, and you're done:
/**
* Fetches data from an external API and writes results to a Google Sheet.
* Customize the URL, headers, and data mapping for your use case.
*/
function fetchApiAndWriteToSheet() {
// Configuration
var API_URL = 'https://jsonplaceholder.typicode.com/users';
var SHEET_NAME = 'API Data';
var HEADERS_ROW = ['ID', 'Name', 'Email', 'Company'];
// Fetch
var options = {
muteHttpExceptions: true,
headers: {
// 'Authorization': 'Bearer ' + PropertiesService
// .getScriptProperties().getProperty('API_TOKEN')
}
};
var response = UrlFetchApp.fetch(API_URL, options);
if (response.getResponseCode() !== 200) {
Logger.log('Error ' + response.getResponseCode() + ': '
+ response.getContentText());
return;
}
// Parse
var data = JSON.parse(response.getContentText());
// Map to rows
var rows = data.map(function(item) {
return [item.id, item.name, item.email, item.company.name];
});
// Write to sheet
var ss = SpreadsheetApp.getActiveSpreadsheet();
var sheet = ss.getSheetByName(SHEET_NAME);
if (!sheet) {
sheet = ss.insertSheet(SHEET_NAME);
sheet.appendRow(HEADERS_ROW);
}
// Clear old data (keep header)
if (sheet.getLastRow() > 1) {
sheet.getRange(2, 1, sheet.getLastRow() - 1, HEADERS_ROW.length)
.clearContent();
}
// Write new data
if (rows.length > 0) {
sheet.getRange(2, 1, rows.length, HEADERS_ROW.length).setValues(rows);
}
Logger.log('Wrote ' + rows.length + ' rows to ' + SHEET_NAME);
}

Run it once manually to confirm it works. Then set up a time-driven trigger to run it on whatever schedule you need. Daily syncs, hourly updates, every five minutes if the data changes fast. The trigger setup takes about 30 seconds in the Apps Script editor under Triggers in the left sidebar.
So that's the full picture of google apps script external api integration. UrlFetchApp handles GET, POST, auth, retries, and batch calls. The four recipes above cover the most common patterns. And the starter template gives you a foundation for anything else you need to build.
The only limit is Google's daily quota. And if you're hitting 100,000 API calls a day from a spreadsheet, you might want to rethink your architecture.