Optimizing Node.js cold starts in Cloud Functions (500ms → 50ms)? #1425
Our serverless functions have 500ms cold starts. We've tried connection pooling and bundle optimization. What's your strategy for getting under 100ms?
Replies: 1 comment
✅ Production Strategy: 500ms → 60ms (88% improvement)

1. Keep warm instances

// index.js
const functions = require('firebase-functions');

exports.sendEmail = functions
  .runWith({ minInstances: 3 }) // Keep 3 warm instances
  .https.onRequest(async (req, res) => {
    // Your code
  });

Cost: ~$5/month for 3 instances kept warm 24/7 [web:195]

2. Lazy-load heavy dependencies

❌ Slow: global imports

const admin = require('firebase-admin'); // Loads at cold start

✅ Fast: lazy load inside the handler

exports.handler = async (req, res) => {
  const admin = await import('firebase-admin'); // Loads only when first called
  // ...
};

Impact: defers loading the Firebase SDK (~120MB) until the first request [web:196]

3. Bundle with esbuild

Use esbuild instead of tsc (roughly 10x faster builds); bundle size drops from 5MB to 800KB (-84%) [web:197]. Add the build step to functions/package.json and run it before deploying.

4. Reuse connections and tighten timeouts

Initialize connection pools in global scope so they are created once per instance and reused across warm invocations, and set a shorter timeout so slow requests fail fast:

exports.sendEmail = functions
  .runWith({ minInstances: 3, timeoutSeconds: 30 }) // Faster timeout
  .https.onRequest(async (req, res) => {
    // Your code, using the globally initialized pool
  });

Results from production: cold starts dropped from ~500ms to ~60ms.

Tradeoff: ~$5/month extra for roughly 8x faster cold starts. Worth it for user-facing APIs. Does this hit your sub-100ms target?
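The lazy-load-and-cache pattern described above can be sketched with only Node built-ins, so it runs anywhere without the Firebase SDK installed. This is a minimal sketch under stated assumptions: `node:crypto` stands in for a heavy dependency like firebase-admin, and `getClient`/`handler` are illustrative names, not from the original post.

```javascript
// Cache lives in global scope: it survives across warm invocations
// of the same instance, so the expensive import happens at most once.
let client = null;

async function getClient() {
  if (client === null) {
    // Dynamic import runs only on the first request this instance serves;
    // every later call reuses the cached client. ('node:crypto' is a
    // stand-in for a heavy SDK such as firebase-admin.)
    const { createHash } = await import('node:crypto');
    client = { digest: (s) => createHash('sha256').update(s).digest('hex') };
  }
  return client;
}

async function handler(payload) {
  const c = await getClient(); // cheap after the first call
  return c.digest(payload);
}

module.exports = { handler, getClient };
```

The key detail is that `client` is declared outside the handler: cold-start cost is paid once per instance rather than once per request, which is the same reason the answer recommends global connection pools.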