The Problem
I was building an application that needed to display data from Google Sheets. The initial approach was straightforward: fetch the data from the Google Sheets API whenever a user visited the page. However, this created several issues:
- Timeout errors: The Google Sheets API would sometimes timeout, especially when fetching large datasets
- Rate limiting: Frequent API calls would hit Google's rate limits
- Slow performance: Every page load required waiting for the API response
- Unnecessary API calls: The data doesn't change frequently, yet it was being fetched on every request
The most frustrating part was seeing timeout errors during deployment builds and in production when users tried to view the data:
```
⨯ node_modules/ky/distribution/utils/timeout.js (9:19) @ Timeout.eval
⨯ Internal error: TimeoutError: Request timed out: POST https://sheets.googleapis.com/v4/spreadsheets/[ID]/:getByDataFilter
    at Timeout.eval (./node_modules/ky/distribution/utils/timeout.js:14:20)
    at listOnTimeout (node:internal/timers:581:17)
    at process.processTimers (node:internal/timers:519:7)
digest: "664503010"
```
The Solution: Build-Time Static JSON Generation
Instead of fetching data at runtime, I decided to generate a static JSON file during the build process. This approach offers several benefits:
- No runtime API calls: The data is pre-fetched and bundled with the application
- Instant page loads: No waiting for external API responses
- No timeout issues: A build step can wait far longer for a response than a user-facing request, so slow API calls no longer cause failures
- Lower costs: Significantly fewer API calls to Google Sheets
Implementation
Step 1: Create the Data Fetching Script
I created a script at `scripts/generate-data.ts` that fetches all the data from Google Sheets and saves it as a JSON file:
```ts
import * as fs from 'fs'
import * as path from 'path'

import { fetchAllData } from '../src/libs/googleSheets'

async function generateDataJson() {
  try {
    const data = await fetchAllData()
    const outputPath = path.join(process.cwd(), 'public', 'data.json')

    // Ensure the public directory exists before writing
    const publicDir = path.join(process.cwd(), 'public')
    if (!fs.existsSync(publicDir)) {
      fs.mkdirSync(publicDir, { recursive: true })
    }

    fs.writeFileSync(outputPath, JSON.stringify(data, null, 2))
  } catch (error) {
    // Surface the failure and exit non-zero so the build fails
    console.error('Failed to generate data.json:', error)
    process.exit(1)
  }
}

generateDataJson()
```
The script:
- Fetches all data from Google Sheets
- Creates the `public` directory if it doesn't exist
- Writes the data to `public/data.json`
- Exits with an error code if something goes wrong (failing the build)
Step 2: Configure package.json Scripts
I added scripts to run the generator and integrated it into the build process:
1{2 "scripts": {3 "generate-data": "tsx scripts/generate-data.ts",4 "generate-data:local": "NEXT_PUBLIC_APP_ENV=local tsx scripts/generate-data.ts",5 "prebuild": "npm run generate-data",6 "build": "npm run prebuild && next build"7 }8}
The `prebuild` script runs automatically before every `npm run build`, ensuring the data is always fresh when deploying.
Step 3: Update Components to Use Static JSON
Instead of calling the Google Sheets API from components, I now fetch the static JSON file:
```ts
// Before: Runtime API call
const data = await fetchAllData() // Could timeout

// After: Static JSON file
const data = await fetch('/data.json').then((res) => res.json())
```
Since the file is in the `public` directory, it's served as a static asset and loads instantly.
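As a concrete illustration, here's a minimal sketch of a client component consuming the file; the `Data` shape and field names are assumptions for the example, not the actual types from my project:

```tsx
'use client'

import { useEffect, useState } from 'react'

// Assumed shape for illustration; the real project defines its own type
type Data = { name: string; value: string }

export function DataList() {
  const [rows, setRows] = useState<Data[]>([])

  useEffect(() => {
    // /data.json is served straight from public/ as a static asset
    fetch('/data.json')
      .then((res) => res.json())
      .then(setRows)
      .catch(console.error)
  }, [])

  return (
    <ul>
      {rows.map((row) => (
        <li key={row.name}>
          {row.name}: {row.value}
        </li>
      ))}
    </ul>
  )
}
```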
Step 4: Handle CI/CD Pipeline
In the GitHub Actions workflow, I made sure to:
- Set the Google Sheets credentials as environment variables
- Run `npm run build` (which triggers the `prebuild` script automatically)
```yaml
env:
  GOOGLE_SHEETS_CREDENTIALS: ${{secrets.GOOGLE_SHEETS_CREDENTIALS}}

steps:
  - name: Build
    run: npm run build
```
The build process now:
- Runs the `prebuild` script
- Fetches data from Google Sheets
- Generates `public/data.json`
- Builds the Next.js application with the static data
Optimizing the Google Sheets Fetching
While implementing this solution, I also optimized how I fetch data from Google Sheets to handle large datasets efficiently:
```ts
async getAllData(): Promise<Data[]> {
  const sheet = await this.getSheet()
  const totalRows = sheet.rowCount

  const BATCH_SIZE = 200
  const dataArray: Data[] = []

  // Load data in batches to avoid memory issues
  for (
    let batchStart = DATA_START_ROW;
    batchStart < totalRows;
    batchStart += BATCH_SIZE
  ) {
    const batchEnd = Math.min(batchStart + BATCH_SIZE, totalRows)

    await sheet.loadCells({
      startRowIndex: batchStart,
      endRowIndex: batchEnd,
      startColumnIndex: minCol,
      endColumnIndex: maxCol,
    })

    // Process batch...
  }

  return dataArray
}
```
Key optimizations:
- Batch processing: Load rows in batches instead of all at once
- Column filtering: Only load the columns we need
- Memory management: Process data incrementally to prevent memory issues
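The `// Process batch...` step is elided in the excerpt above. Purely as an illustration, a hypothetical version could look like the following, assuming column-index constants (`NAME_COL`, `VALUE_COL`) and a two-field `Data` shape that don't appear in the original:

```ts
// Hypothetical batch-processing step: read each cell that loadCells
// brought into memory and push a typed record onto dataArray
for (let row = batchStart; row < batchEnd; row++) {
  dataArray.push({
    name: String(sheet.getCell(row, NAME_COL).value ?? ''),
    value: String(sheet.getCell(row, VALUE_COL).value ?? ''),
  })
}
```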
Results
After implementing this solution:
- Zero timeout errors: Build-time fetching has never timed out
- Faster page loads: Data loads instantly from the static JSON file
- Reduced API costs: Google Sheets API is only called during builds
- Better developer experience: Local development can use cached data without hitting API limits
- Predictable builds: Builds fail fast if data fetching fails, rather than surfacing errors to users in production
Bonus: Local Development
For local development, I added a separate script that uses local credentials:
```bash
npm run generate-data:local
```
This allows developers to generate fresh data locally without needing to set up environment variables for every development session.
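Inside the script, the credential selection might look something like this. It's a sketch under my own assumptions (the local credentials path in particular is hypothetical); only `NEXT_PUBLIC_APP_ENV` and `GOOGLE_SHEETS_CREDENTIALS` come from the setup above:

```ts
import * as fs from 'fs'

// Sketch: choose credentials based on NEXT_PUBLIC_APP_ENV.
// The .credentials/google-sheets.json path is a hypothetical example.
const isLocal = process.env.NEXT_PUBLIC_APP_ENV === 'local'

const credentials = isLocal
  ? JSON.parse(fs.readFileSync('.credentials/google-sheets.json', 'utf8'))
  : JSON.parse(process.env.GOOGLE_SHEETS_CREDENTIALS ?? '{}')
```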
When to Use This Approach
This pattern works well when:
- Data doesn't change frequently
- You're experiencing timeout or rate limit issues
- You want faster page loads
- The data size is manageable as a static file
This pattern might not be ideal when:
- Data needs to be real-time
- Different users need different data
- Data changes very frequently
- Data is too large to bundle in the application
Conclusion
By moving the Google Sheets data fetching from runtime to build time, I eliminated timeout issues, improved performance, and reduced API costs. The trade-off is that data is only as fresh as the last deployment, but for my use case, this is perfectly acceptable.
The key takeaway: not all data needs to be fetched at runtime. If your data is relatively static, consider generating it at build time and serving it as a static asset.
