How to create 700+ SEO-optimised pages in 1 hour with Next.js, OpenAI and Postgres

Programmatic SEO gets a bad reputation because most of it is just content spam. Done right, it's closer to what a good encyclopedia does: take a structured dataset and render a useful page per row.

The recipe

  1. A dataset worth indexing — a list of tools, cities, comparisons, whatever. Each row has to differ in substance, not just in a find-and-replaced keyword.
  2. A Postgres table — one row = one page. Slug, title, JSON blob of structured fields.
  3. OpenAI for the long-form sections — short prompts, one per field, not "write me a whole page."
  4. Next.js dynamic routes + generateStaticParams — every row becomes a statically generated page at build time.
  5. A sitemap — generated from the same table.
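Steps 2, 4 and 5 boil down to one row type plus two pure mappings over it. A minimal sketch in TypeScript — the row shape and field names (`slug`, `title`, `fields`) are assumptions, not the post's actual schema:

```typescript
// Sketch only: column names are assumed, one row = one page.
type PageRow = {
  slug: string;                    // e.g. "best-crm-tools-berlin"
  title: string;
  fields: Record<string, string>;  // structured JSON blob, one key per section
};

// Step 4: feed Next.js generateStaticParams — one param object per row,
// so every row becomes a statically generated page at build time.
export function rowsToParams(rows: PageRow[]): { slug: string }[] {
  return rows.map((row) => ({ slug: row.slug }));
}

// Step 5: build the sitemap from the same table, so it can never drift
// out of sync with the pages that actually exist.
export function rowsToSitemap(rows: PageRow[], baseUrl: string): string {
  const urls = rows
    .map((row) => `  <url><loc>${baseUrl}/${row.slug}</loc></url>`)
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${urls}\n</urlset>`
  );
}
```

In a real app the rows would come from a `SELECT` over the Postgres table; keeping the two mappings pure makes them trivial to test without a database.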

The gotchas

  • Rate limits. Batch your OpenAI calls; don't fire 700 at once.
  • Hallucinations on factual fields. Anything that has to be correct should come from the dataset, not the model.
  • Build times explode if you try to SSG all 700+ pages on Vercel's free plan — render the long tail on demand instead of statically generating everything up front.
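The rate-limit gotcha amounts to a small concurrency gate instead of a bare `Promise.all` over 700 rows. A generic sketch (the `limit` value and the task signature are assumptions — plug your own OpenAI call in as the task):

```typescript
// Run async tasks with at most `limit` in flight at any moment.
// Each worker pulls the next index synchronously, so no item is
// processed twice even though several workers run concurrently.
export async function mapWithLimit<T, R>(
  items: T[],
  limit: number,
  task: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++; // claimed synchronously before the await
      results[i] = await task(items[i]);
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    () => worker(),
  );
  await Promise.all(workers);
  return results;
}
```

With a limit around 5–10 you stay under typical per-minute caps while still churning through 700 rows in reasonable time; results come back in input order, so they can be written straight back to the matching rows.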

Boring but effective. The pages still rank.