Next.js works best on Vercel, where deployment is seamless and features like ISR, middleware, and image optimization just work. But many teams need to self-host: compliance requirements, existing AWS infrastructure, cost concerns, or avoiding vendor lock-in. Open Next bridges this gap by converting Next.js output into packages that run on AWS, Docker, or other platforms.
## What Open Next does

Open Next takes the output of `next build` and transforms it into deployment artifacts for non-Vercel platforms.
| Component | What Open Next produces |
|---|---|
| Server | Lambda function or Docker container |
| Static assets | S3 bucket or CDN-ready files |
| Image optimization | Lambda handler for image resizing |
| ISR | Lambda + S3 for incremental regeneration |
| Middleware | CloudFront Function or Lambda@Edge |
The result is a deployment package that replicates Vercel's behavior on your infrastructure.
## When to use Open Next
Open Next makes sense when:
- Compliance: You must host on specific clouds or regions
- Cost control: Vercel pricing doesn't fit your traffic patterns
- Existing infrastructure: You already have AWS, and adding Vercel creates complexity
- Vendor lock-in: You want the option to move between platforms
- Edge control: You need custom CloudFront or CDN configurations
For new projects without these constraints, Vercel is simpler. Open Next adds operational overhead.
## Installation and setup

Install Open Next as a dev dependency:

```bash
npm install -D open-next
```
Add a build script to `package.json`:

```json
{
  "scripts": {
    "build": "next build",
    "open-next": "open-next build"
  }
}
```
Run both builds:

```bash
npm run build
npm run open-next
```
The output appears in `.open-next/`:

```
.open-next/
├── server-function/              # Lambda handler for server routes
├── image-optimization-function/  # Image resizing Lambda
├── revalidation-function/        # ISR background revalidation
├── warmer-function/              # Optional cold start warmer
└── assets/                       # Static files for S3/CDN
```
## Configuring Open Next

Create `open-next.config.ts` for customization:

```typescript
import type { OpenNextConfig } from 'open-next/types/open-next';

const config: OpenNextConfig = {
  default: {
    override: {
      wrapper: 'aws-lambda-streaming',
    },
  },
  imageOptimization: {
    arch: 'arm64',
  },
  revalidate: {
    wrapper: 'sqs-revalidate',
  },
};

export default config;
```
Options control Lambda architecture, wrapper types, and how revalidation works.
## Deploying with SST

SST (Serverless Stack) has first-class Open Next support:

```typescript
// sst.config.ts
import { SSTConfig } from 'sst';
import { NextjsSite } from 'sst/constructs';

export default {
  config() {
    return {
      name: 'my-nextjs-app',
      region: 'us-east-1',
    };
  },
  stacks(app) {
    app.stack(function Site({ stack }) {
      const site = new NextjsSite(stack, 'site', {
        path: '.',
        environment: {
          DATABASE_URL: process.env.DATABASE_URL,
        },
      });
      stack.addOutputs({
        URL: site.url,
      });
    });
  },
} satisfies SSTConfig;
```

Deploy with:

```bash
npx sst deploy
```

SST handles CloudFront, Lambda, S3, and all the wiring.
## Deploying with CDK

For AWS CDK users:

```typescript
import * as cdk from 'aws-cdk-lib';
import { Nextjs } from 'cdk-nextjs-standalone';

export class NextjsStack extends cdk.Stack {
  constructor(scope: cdk.App, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    new Nextjs(this, 'NextjsSite', {
      nextjsPath: './',
      environment: {
        DATABASE_URL: process.env.DATABASE_URL!,
      },
    });
  }
}
```
The `cdk-nextjs-standalone` package uses Open Next under the hood.
## ISR on AWS
Incremental Static Regeneration requires coordination between Lambda, S3, and a revalidation queue:
| Component | Purpose |
|---|---|
| S3 bucket | Stores prerendered HTML and data |
| Lambda | Handles requests and regeneration |
| SQS queue | Queues background revalidation |
| CloudFront | Caches responses at the edge |
Open Next sets up this infrastructure. When a page needs revalidation:

1. CloudFront serves the stale content
2. Lambda triggers background regeneration
3. The new content is uploaded to S3
4. The CloudFront cache is invalidated
This matches Vercel's ISR behavior.
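The flow above is essentially stale-while-revalidate. A minimal sketch of the caching logic in plain TypeScript (names like `swrGet` are illustrative, not Open Next's actual implementation, and an in-memory map stands in for S3):

```typescript
// Stale-while-revalidate sketch. In Open Next, the cache lives in S3
// and the background step is queued through SQS; here a Map and an
// async call stand in for both.
type CacheEntry = {
  body: string;
  storedAt: number;      // ms since epoch
  revalidateMs: number;  // freshness window
};

const cache = new Map<string, CacheEntry>();
const inFlight = new Set<string>();

async function swrGet(
  key: string,
  render: () => Promise<string>,
  revalidateMs: number,
  now: number = Date.now(),
): Promise<string> {
  const entry = cache.get(key);
  if (!entry) {
    // Cache miss: render synchronously, like a cold ISR request.
    const body = await render();
    cache.set(key, { body, storedAt: now, revalidateMs });
    return body;
  }
  const stale = now - entry.storedAt > entry.revalidateMs;
  if (stale && !inFlight.has(key)) {
    // Serve the stale body immediately; regenerate in the background.
    inFlight.add(key);
    render()
      .then((body) =>
        cache.set(key, { body, storedAt: Date.now(), revalidateMs }),
      )
      .finally(() => inFlight.delete(key));
  }
  return entry.body;
}
```

The key behavior to notice: a stale request never waits for regeneration, so the visitor who triggers a revalidation still gets the old page, and the next visitor gets the new one.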
## Middleware on AWS
Next.js middleware runs on CloudFront Functions or Lambda@Edge:
| Option | Latency | Limitations |
|---|---|---|
| CloudFront Functions | ~1ms | 10KB code limit, no network calls |
| Lambda@Edge | ~50ms | 1MB code, can make network calls |
Which option fits depends on your middleware's complexity: simple rewrites stay within CloudFront Functions' limits, while middleware that makes network calls (database lookups, auth checks against an API) requires Lambda@Edge.
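For reference, here is the kind of middleware that fits the lighter profile: a pure URL rewrite with no network calls. This is generic Next.js middleware, not Open Next-specific, and the `/docs` paths are illustrative:

```typescript
// middleware.ts — a pure rewrite with no network calls, the kind of
// logic that stays within CloudFront Functions' size and runtime limits.
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';

export function middleware(request: NextRequest) {
  // Rewrite legacy /docs/* URLs to /documentation/*.
  if (request.nextUrl.pathname.startsWith('/docs/')) {
    const url = request.nextUrl.clone();
    url.pathname = url.pathname.replace('/docs/', '/documentation/');
    return NextResponse.rewrite(url);
  }
  return NextResponse.next();
}

export const config = {
  matcher: '/docs/:path*',
};
```

Add a `fetch` or database call inside `middleware` and the same file can no longer run as a CloudFront Function; it needs Lambda@Edge.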
## Image optimization

Open Next creates a Lambda function for image optimization:

```typescript
// open-next.config.ts
{
  imageOptimization: {
    arch: 'arm64', // Better price-performance
    memory: 1024,  // MB
  },
}
```
The Lambda resizes images on-demand, similar to Vercel's image optimization. Serve through CloudFront for caching.
## Environment variables

Pass environment variables through your deployment tool:

```typescript
// SST
new NextjsSite(stack, 'site', {
  environment: {
    DATABASE_URL: process.env.DATABASE_URL,
    NEXT_PUBLIC_API_URL: 'https://api.example.com',
  },
});
```
Build-time variables (`NEXT_PUBLIC_*`) are embedded into the client bundle during `next build`. Runtime variables are injected into the Lambda environment and read per request.
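The distinction shows up in when values are read. A sketch (the helper name and the specific variables are illustrative):

```typescript
// Read at build time: Next.js inlines NEXT_PUBLIC_* values into the
// client bundle during `next build`, so changing one later requires
// a rebuild and redeploy.
const apiUrl = process.env.NEXT_PUBLIC_API_URL;

// Read at request time inside the Lambda: the deployment tool injects
// DATABASE_URL into the function's environment, so it can change
// between deployments without rebuilding the app.
export function getDatabaseUrl(): string {
  const url = process.env.DATABASE_URL;
  if (!url) throw new Error('DATABASE_URL is not set');
  return url;
}
```

A common failure mode follows directly from this: rotating a `NEXT_PUBLIC_*` value in the Lambda environment does nothing, because the old value is already baked into the static bundle.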
## Monitoring and debugging

Add observability to your Lambda functions:

```typescript
// SST example with tracing
new NextjsSite(stack, 'site', {
  bind: [table],
  environment: { /* ... */ },
  nodejs: {
    install: ['@opentelemetry/api'],
  },
});
```
Use CloudWatch Logs for Lambda output, X-Ray for tracing, and CloudWatch metrics for monitoring. Set up alarms for error rates and latency.
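If you manage the infrastructure yourself, an error-rate alarm on the server function can be sketched in CDK (the function reference, threshold, and construct names are illustrative):

```typescript
import * as cdk from 'aws-cdk-lib';
import * as cloudwatch from 'aws-cdk-lib/aws-cloudwatch';
import * as lambda from 'aws-cdk-lib/aws-lambda';

// Alarm when the server function records more than 5 errors in a
// 5-minute window. metricErrors() is the built-in Lambda Errors metric.
export function addErrorAlarm(scope: cdk.Stack, serverFn: lambda.IFunction) {
  new cloudwatch.Alarm(scope, 'ServerFunctionErrors', {
    metric: serverFn.metricErrors({ period: cdk.Duration.minutes(5) }),
    threshold: 5,
    evaluationPeriods: 1,
  });
}
```

A matching alarm on `metricDuration` catches latency regressions the same way.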
## Cold starts
Lambda cold starts affect TTFB. Mitigate with:
| Strategy | Implementation |
|---|---|
| Provisioned concurrency | Keep Lambdas warm |
| Warmer function | Periodic pings to prevent cold starts |
| Smaller bundles | Reduce initialization time |
| arm64 architecture | Better price-performance than x86_64 |
Open Next can generate a warmer function that pings your Lambdas periodically.
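If your deployment tool doesn't wire the warmer for you, the scheduling side is a standard EventBridge rule. A CDK sketch (the function reference and the 5-minute interval are illustrative):

```typescript
import * as cdk from 'aws-cdk-lib';
import * as events from 'aws-cdk-lib/aws-events';
import * as targets from 'aws-cdk-lib/aws-events-targets';
import * as lambda from 'aws-cdk-lib/aws-lambda';

// Invoke the warmer Lambda every 5 minutes so it can ping the server
// function and keep execution environments initialized.
export function addWarmerSchedule(scope: cdk.Stack, warmerFn: lambda.IFunction) {
  new events.Rule(scope, 'WarmerSchedule', {
    schedule: events.Schedule.rate(cdk.Duration.minutes(5)),
    targets: [new targets.LambdaFunction(warmerFn)],
  });
}
```

Note the trade-off: a warmer keeps some instances warm but cannot guarantee zero cold starts under bursty traffic; provisioned concurrency is the stronger (and pricier) guarantee.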
## Common issues
| Issue | Solution |
|---|---|
| 502 errors | Check Lambda timeout and memory settings |
| ISR not working | Verify SQS queue and revalidation Lambda |
| Slow images | Check image Lambda memory allocation |
| Middleware errors | Review Lambda@Edge logs in us-east-1 |
| Environment variables missing | Ensure they're passed in deployment config |
## Trade-offs compared to Vercel
| Aspect | Vercel | Open Next on AWS |
|---|---|---|
| Setup | Zero config | Infrastructure as code |
| Maintenance | Vercel handles | You handle |
| Cost | Per-request pricing | Lambda + CloudFront pricing |
| Edge network | Vercel Edge | CloudFront (or your CDN) |
| Support | Vercel support | Community + your team |
| Feature parity | 100% | ~95%, some edge cases differ |
## Summary
Open Next enables self-hosted Next.js with features that normally require Vercel: ISR, middleware, image optimization, and streaming. Use it when you need AWS, cost control, or compliance. Deploy with SST, CDK, or Terraform. Accept the operational overhead in exchange for infrastructure control. For most projects, Vercel is simpler; for specific requirements, Open Next delivers.
