
Solving Google Sheets Timeout Issues with Build-Time Static JSON Generation

5 min read
Node.js · Google Sheets API

The Problem

I was building an application that needed to display data from Google Sheets. The initial approach was straightforward: fetch the data from Google Sheets API whenever a user visits the page. However, this created several issues:

  1. Timeout errors: The Google Sheets API would sometimes timeout, especially when fetching large datasets
  2. Rate limiting: Frequent API calls would hit Google's rate limits
  3. Slow performance: Every page load required waiting for the API response
  4. Unnecessary API calls: The data doesn't change frequently, yet we were fetching it on every request

The most frustrating part was seeing timeout errors during deployment builds and in production when users tried to view the data:

⨯ node_modules/ky/distribution/utils/timeout.js (9:19) @ Timeout.eval
⨯ Internal error: TimeoutError: Request timed out: POST https://sheets.googleapis.com/v4/spreadsheets/[ID]/:getByDataFilter
    at Timeout.eval (./node_modules/ky/distribution/utils/timeout.js:14:20)
    at listOnTimeout (node:internal/timers:581:17)
    at process.processTimers (node:internal/timers:519:7)
digest: "664503010"

The Solution: Build-Time Static JSON Generation

Instead of fetching data at runtime, I decided to generate a static JSON file during the build process. This approach offers several benefits:

  1. No runtime API calls, so no timeout errors
  2. No risk of hitting Google's rate limits in production
  3. Instant page loads, since the data is served as a static asset
  4. The API is called once per deployment instead of on every request

Implementation

Step 1: Create the Data Fetching Script

I created a script at scripts/generate-data.ts that fetches all the data from Google Sheets and saves it as a JSON file:

import * as fs from 'fs'
import * as path from 'path'

import { fetchAllData } from '../src/libs/googleSheets'

async function generateDataJson() {
  try {
    const data = await fetchAllData()
    const outputPath = path.join(process.cwd(), 'public', 'data.json')

    // Make sure the public directory exists before writing to it
    const publicDir = path.join(process.cwd(), 'public')
    if (!fs.existsSync(publicDir)) {
      fs.mkdirSync(publicDir, { recursive: true })
    }

    fs.writeFileSync(outputPath, JSON.stringify(data, null, 2))
    console.log(`Data written to ${outputPath}`)
  } catch (error) {
    console.error('Failed to generate data.json:', error)
    process.exit(1)
  }
}

generateDataJson()

The script:

  1. Fetches all the data via the existing fetchAllData helper
  2. Ensures the public directory exists, creating it if necessary
  3. Writes the data to public/data.json as pretty-printed JSON
  4. Exits with a non-zero code on failure, so a broken fetch fails the build instead of deploying missing data
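
For context, fetchAllData lives in src/libs/googleSheets. A minimal sketch of what such a helper might look like, assuming the google-spreadsheet package, a service-account key in GOOGLE_SHEETS_CREDENTIALS, and a SPREADSHEET_ID environment variable (these names are illustrative; the real implementation uses the batched loadCells approach shown later):

import { GoogleSpreadsheet } from 'google-spreadsheet'
import { JWT } from 'google-auth-library'

export async function fetchAllData() {
  // Credentials are provided as a JSON service-account key in an env variable
  const creds = JSON.parse(process.env.GOOGLE_SHEETS_CREDENTIALS ?? '{}')
  const auth = new JWT({
    email: creds.client_email,
    key: creds.private_key,
    scopes: ['https://www.googleapis.com/auth/spreadsheets.readonly'],
  })

  const doc = new GoogleSpreadsheet(process.env.SPREADSHEET_ID!, auth)
  await doc.loadInfo() // loads document properties and worksheet metadata
  const sheet = doc.sheetsByIndex[0]

  // Read every row and return plain objects keyed by the header row
  const rows = await sheet.getRows()
  return rows.map((row) => row.toObject())
}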

Step 2: Configure package.json Scripts

I added scripts to run the generator and integrated it into the build process:

{
  "scripts": {
    "generate-data": "tsx scripts/generate-data.ts",
    "generate-data:local": "NEXT_PUBLIC_APP_ENV=local tsx scripts/generate-data.ts",
    "prebuild": "npm run generate-data",
    "build": "next build"
  }
}

Because npm runs a script's pre hook first, the prebuild script executes automatically before every build without being invoked explicitly in the build command, ensuring the data is always fresh when deploying.

Step 3: Update Components to Use Static JSON

Instead of calling the Google Sheets API from components, I now fetch the static JSON file:

// Before: Runtime API call
const data = await fetchAllData() // Could timeout

// After: Static JSON file
const data = await fetch('/data.json').then((res) => res.json())

Since the file is in the public directory, it's served as a static asset and loads instantly.
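
Note that a relative fetch('/data.json') only works in the browser; on the server (for example in a Next.js App Router server component), the generated file can be read straight from disk instead. A minimal sketch, where the Data type is illustrative:

import { promises as fs } from 'fs'
import * as path from 'path'

// Illustrative shape; replace with the real row type
type Data = Record<string, string>

// Reads the generated file directly from the public directory at render time
async function getStaticData(): Promise<Data[]> {
  const filePath = path.join(process.cwd(), 'public', 'data.json')
  return JSON.parse(await fs.readFile(filePath, 'utf-8'))
}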

Step 4: Handle CI/CD Pipeline

In the GitHub Actions workflow, I made sure to:

  1. Set the Google Sheets credentials as environment variables
  2. Run npm run build (which triggers the prebuild script automatically)
env:
  GOOGLE_SHEETS_CREDENTIALS: ${{ secrets.GOOGLE_SHEETS_CREDENTIALS }}

steps:
  - name: Build
    run: npm run build

The build process now:

  1. Runs the prebuild script
  2. Fetches data from Google Sheets
  3. Generates public/data.json
  4. Builds the Next.js application with the static data

Optimizing the Google Sheets Fetching

While implementing this solution, I also optimized how I fetch data from Google Sheets to handle large datasets efficiently:

async getAllData(): Promise<Data[]> {
  const sheet = await this.getSheet()
  const totalRows = sheet.rowCount

  const BATCH_SIZE = 200
  const dataArray: Data[] = []

  // Load cells in batches to avoid memory issues
  // (DATA_START_ROW, minCol, and maxCol are defined elsewhere in the class)
  for (
    let batchStart = DATA_START_ROW;
    batchStart < totalRows;
    batchStart += BATCH_SIZE
  ) {
    const batchEnd = Math.min(batchStart + BATCH_SIZE, totalRows)

    await sheet.loadCells({
      startRowIndex: batchStart,
      endRowIndex: batchEnd,
      startColumnIndex: minCol,
      endColumnIndex: maxCol,
    })

    // Process batch...
  }

  return dataArray
}
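
For completeness, the elided "Process batch..." step might look roughly like this, assuming a hypothetical rowToData helper that maps a row's cell values to a Data record:

// Inside the batch loop: once loadCells has run, getCell reads from the
// local cache synchronously, so this inner loop makes no further API calls
for (let row = batchStart; row < batchEnd; row++) {
  const values: unknown[] = []
  for (let col = minCol; col < maxCol; col++) {
    values.push(sheet.getCell(row, col).value)
  }
  dataArray.push(rowToData(values)) // rowToData: hypothetical mapping helper
}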

Key optimizations:

  1. Cells are loaded in batches of 200 rows, so the whole sheet never sits in memory at once
  2. Only the needed column range (minCol to maxCol) is requested, not entire rows
  3. The sheet metadata is loaded once, and its row count drives the loop bounds

Results

After implementing this solution:

  1. The timeout errors disappeared from both builds and production
  2. Pages render immediately, since the data is served as a local static file
  3. Google Sheets API usage dropped to a single fetch per deployment

Bonus: Local Development

For local development, I added a separate script that uses local credentials:

npm run generate-data:local

This allows developers to generate fresh data locally without needing to set up environment variables for every development session.
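
How the script distinguishes local from CI runs isn't shown above; one way it might branch on NEXT_PUBLIC_APP_ENV, assuming a git-ignored credentials.local.json file (the file name is illustrative):

import * as fs from 'fs'

// Sketch: pick credentials based on the environment flag set by the script.
// 'credentials.local.json' is a hypothetical git-ignored file for local runs.
const creds =
  process.env.NEXT_PUBLIC_APP_ENV === 'local'
    ? JSON.parse(fs.readFileSync('credentials.local.json', 'utf-8'))
    : JSON.parse(process.env.GOOGLE_SHEETS_CREDENTIALS ?? '{}')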

When to Use This Approach

This pattern works well when:

  1. The data changes infrequently, or only alongside deployments
  2. Slightly stale data is acceptable until the next build
  3. The dataset is small enough to ship as a single JSON file

This pattern might not be ideal when:

  1. Users need real-time or near real-time data
  2. The data changes many times a day and redeploying for each change is impractical
  3. The data is user-specific or requires per-request authorization

Conclusion

By moving the Google Sheets data fetching from runtime to build time, I eliminated timeout issues, improved performance, and reduced API costs. The trade-off is that data is only as fresh as the last deployment, but for my use case, this is perfectly acceptable.

The key takeaway: not all data needs to be fetched at runtime. If your data is relatively static, consider generating it at build time and serving it as a static asset.
