How to create 700+ SEO-optimised pages in 1 hour with Next.js, OpenAI and Postgres
Programmatic SEO gets a bad reputation because most of it is just content spam. Done right, it's closer to what a good encyclopedia does: take a structured dataset and render a useful page per row.
The recipe
- A dataset worth indexing — a list of tools, cities, comparisons, whatever. Each row must differ in substance, not just swap one keyword.
- A Postgres table — one row = one page. Slug, title, JSON blob of structured fields.
- OpenAI for the long-form sections — short prompts, one per field, not "write me a whole page."
- Next.js dynamic routes + generateStaticParams — every row becomes a statically generated page at build time.
- A sitemap — generated from the same table.
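The routing and sitemap pieces both reduce to a mapping over the same table. A minimal sketch in TypeScript — the row shape and `baseUrl` are assumptions about your schema, not a prescribed one:

```typescript
// One row = one page. Column names (slug, title, JSON blob of fields)
// mirror the table described above, but are assumptions about your schema.
export type PageRow = {
  slug: string;
  title: string;
  fields: Record<string, string>;
};

// Feed this into Next.js's generateStaticParams in app/[slug]/page.tsx:
// every row becomes one statically generated path at build time.
export function rowsToParams(rows: PageRow[]): { slug: string }[] {
  return rows.map(({ slug }) => ({ slug }));
}

// The sitemap is derived from the same table, so it can never drift out
// of sync with the generated pages. `baseUrl` is a placeholder.
export function rowsToSitemapUrls(rows: PageRow[], baseUrl: string): string[] {
  return rows.map(({ slug }) => `${baseUrl}/${slug}`);
}
```

In practice `rowsToParams` is called from `generateStaticParams` after a `SELECT slug FROM pages` query, and `rowsToSitemapUrls` from a sitemap route handler.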
The gotchas
- Rate limits. Batch your OpenAI calls; don't fire 700 at once.
- Hallucinations on factual fields. Anything that has to be correct should come from the dataset, not the model.
- Build times explode if you try to SSG all of it on Vercel's free plan.
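The batching gotcha can be handled with a few lines of plain TypeScript — no queue library needed. This is a sketch: `worker` stands in for whatever OpenAI client call you make per field, and the batch size is something you tune against your rate limit:

```typescript
// Split a list of work items into fixed-size batches.
export function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Run the worker over all items: each batch runs concurrently,
// batches run sequentially, so at most `size` requests are in
// flight at once. `worker` is a placeholder for your OpenAI call.
export async function runBatched<T, R>(
  items: T[],
  size: number,
  worker: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = [];
  for (const batch of chunk(items, size)) {
    results.push(...(await Promise.all(batch.map(worker))));
  }
  return results;
}
```

For 700 pages at a batch size of 10 that's 70 sequential rounds; add a short delay between rounds if you still hit the limit.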
Boring but effective. The pages still rank.