Moving to the Edge using Next.js API Routes

Introduction

On Valentine’s Day, I launched Flirt like SRK - an extension to AI Pickup Lines. See the launch thread.

The app is quite simple. You enter a message and get back a response in the style of Shah Rukh Khan, the King of Romance. That’s it! Perfect for the occasion!

It is built using Next.js & Tailwind CSS. On the backend, it makes a call to OpenAI’s Text Completion API. The app is hosted on Vercel.

The most challenging part of building this was constructing the prompt. After a lot of trial and error, I came up with one that produced good enough results. A quite looooong prompt, indeed!

Have a look:

###
Contexts:
"Dil toh har kisi ke paas hota hai, lekin sab Dilwale nahi hote."
##
"Sachi mohabbat zindagi main sirf ek baar hoti hai… aur jab hoti hai, toh koi bhagwan ya khuda usse nakamayab nahi hone deta."
##
"Hum ek baar jeete hai, ek baar marte hai, shaadi bhi ek baar hoti hai… aur pyaar bhi ek baar hota hai."
##
"Kuch kuch hota hai, Anjali, tum nahi samjhogi."
##
"Teri aankhon ki namkeen mastiyaan, teri hansi ki beparwah gustakhiyaan, teri zulfon ki lehraati angdaaiyaan, nahi bhoolunga main, jab tak hai jaan, jab tak hai jaan."
##
"Mohabbat bhi zindagi ki tarah hoti hai. Har mod aasan nahi hota, har mod par khushi nahi hoti. Par jab hum zindagi ka saath nahi chodte, toh mohabbat ka saath kyun chode?"
##
"Rishtey sirf khoon se nahi hote… mohabbat se bhi bante hai."
##
"Main jab bhi aap ko dekhta hoon mujhe Rab dikhta hai. Rab ke samne matha tekta hoon toh dil ko sukoon milta hai. Aap ko hanste hue dekhta hoon toh dil ko aur bhi sukoon milta hai. Toh main toh aapko Rab se bhi zyada pyar karta hoon."
##
"Mohabbat ke zamaane guzar gaye janaab… Ab chote mote pyaar se hi kaam chala lijiye aap."
##
"Aisa toh nahi tha ki isse zyada khoobsurat ladki maine dekhi nahi thi… par pata nahi kyun uske chehre se meri nazar hatti nahi thi. Uski aankhein jhuki hui thi aur uski saansein tez… bohot darri hui thi woh. Uska ek baal uski daayin aankh ko pareshaan kar raha thha, woh use jhatakne ki koshish kar rahi thi par hawa tez thhi… baal wahin ka wahin. Maine uske baal hataane ke liye usse apna haath hataya aur usne ghabra ke meri taraf dekha. Hum dono ne pehli baar ek doosre ko dekha. Wo mujhe darr ke maare ghoorti rahi. Fir usne aahista apni nazar jhukaai par main use ghoorta raha."
##
"Mujhe darr toh bahut si cheezon se lagta hai… par sabse zyada darr tumhe kho dene ke khayal se lagta hai."
##
"Pyaar toh bahut log karte hai … lekin mere jaisa pyar koi nahi kar sakta kyun ki kisi ke paas tum joh nahi ho."
##
"Nadi, nadi nahin jismein pani na ho… hawa, hawa nahi jismein ravani na ho… woh shaadi, shaadi nahi jismein prem kahani na ho."
##
"Main aaj bhi usse utni hi mohabbat karta hoon… aur isliye nahi ki koi aur nahi mili… par isliye ki usse mohabbat karne se fursat hi nahi milti."
##
"Sachchi mohabbat ko pechaanne ke liye aankhon ki nahi… dil ki zaroorat hoti hai."
###
Imagine you are Shah Rukh Khan. The above contexts are some of the dialogues from Shah Rukh Khan. 
Reply like him by picking the most relevant context from above and repurposing the user's message in the response below. Make them romantic, witty, flirty and funny. Reply in a single sentence only. The user may write in English, but you have to reply in grammatically correct Hindi, transliterated to English. 
User: ${message}
Shah Rukh Khan:
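In code, the final prompt is assembled by interpolating the user's message into this template. Here's a minimal sketch; the `CONTEXTS` constant and `buildPrompt` helper are hypothetical names, and the context block is abbreviated:

```typescript
// Hypothetical helper: CONTEXTS stands in for the long dialogue block above.
const CONTEXTS = `###
Contexts:
"Kuch kuch hota hai, Anjali, tum nahi samjhogi."
###`; // abbreviated here; the real block holds all the dialogues

function buildPrompt(message: string): string {
  // Interpolate the user's message and end with the completion cue,
  // so the model continues as "Shah Rukh Khan:".
  return `${CONTEXTS}
Imagine you are Shah Rukh Khan. The above contexts are some of the dialogues from Shah Rukh Khan.
Reply like him by picking the most relevant context from above and repurposing the user's message in the response below.
User: ${message}
Shah Rukh Khan:`;
}
```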

Problem

All was not well, though! On production, the following error kept popping up:

The serverless function timed out before the result was fetched from OpenAI’s API.

On Vercel’s Hobby plan, the Serverless Function Execution Timeout is 10 seconds. In some cases, that isn’t enough time to generate the result.
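One way to surface this failure mode deliberately is to race the upstream call against an explicit timeout, so the function fails fast with a clear error instead of being killed silently at the platform's 10-second limit. A sketch; the `withTimeout` helper is a hypothetical illustration, not something the app actually shipped:

```typescript
// Race a promise against a timer; whichever settles first wins.
async function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout>;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Timed out after ${ms} ms`)), ms);
  });
  try {
    return await Promise.race([promise, timeout]);
  } finally {
    clearTimeout(timer!); // always clean up the timer
  }
}
```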

So there were two options:

  • either move to the Pro plan, which would increase the timeout to 60 seconds

  • or switch to edge functions, which I believe offer up to 30 seconds, and even more with streaming.

I bet you can guess which one I chose 😉

Solution

So, I decided to migrate from Next.js API Route to Next.js Edge API Route. Or simply, to the edge! 🚀

However, this improvement comes with constraints, such as not having access to native Node.js APIs. Instead, Edge API Routes are built on standard Web APIs.

NOTE: I’m not streaming the data. If that’s what you’re after, check out Hassan El Mghari’s twitterbio GitHub Repository.

Let's take a look at the before version of the API Route file:

import type { NextApiRequest, NextApiResponse } from "next";
import { Configuration, OpenAIApi } from "openai";

const configuration = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
});

const openai = new OpenAIApi(configuration);

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse
) {
  try {
    const message = req.body.message as string;
    console.log(message);

    const prompt = ""; // The looooong prompt!

    const completion = await openai.createCompletion({
      model: "text-davinci-003",
      prompt,
      temperature: 0.7,
      top_p: 1,
      frequency_penalty: 0,
      presence_penalty: 0,
      best_of: 1,
      max_tokens: 256,
    });

    const result = completion.data.choices[0].text?.trim();
    if (!result) throw new Error(`No result found!`);

    console.log(result);

    res.status(200).json({ result });
  } catch (error: any) {
    console.error(error.message);
    res.status(500).end();
  }
}

In this version, we

  • initialize the openai library

  • extract the message from the request’s body

  • make a call to OpenAI’s Text Completion API and get back the result

  • send the result as a response back to the client

  • handle common errors using a try-catch block

Now, let’s move to the edge incrementally:

  • It just takes this bit of code to turn your regular API Route into an Edge API Route. Really!
export const config = {
  runtime: "edge",
};
  • However, this also means the req and res objects, which were based on the Node.js http module, are no longer available to us. Instead, we’ll have to use the standard Web APIs, in this case the Request and Response interfaces of the Fetch API.

  • So let’s import the NextRequest object from next/server, which is an extension of the native Request interface (https://developer.mozilla.org/en-US/docs/Web/API/Request).

import { NextRequest } from "next/server";

export default async function handler(req: NextRequest): Promise<Response> {
  // ...
}
  • You’ll notice that req.body is no longer the parsed body object; on the Edge runtime it’s a raw ReadableStream, and body-parsing middleware isn’t available. So, we’ll extract the message using req.json() like so:
import { NextRequest } from "next/server";

export const config = {
  runtime: "edge",
};

export default async function handler(req: NextRequest): Promise<Response> {
  const { message } = (await req.json()) as {
    message?: string;
  };

  if (!message) {
    return new Response("No message in the request", { status: 400 });
  }

  console.log(message);

  // ...
}
  • We cannot use the openai library anymore, because it uses axios under the hood, which relies on Node.js APIs that aren’t available on the Edge runtime. So, we’ll use the native fetch API instead to make a call to OpenAI’s Text Completion API endpoint. We need to pass the OPENAI_API_KEY in the Authorization header, like so:
import { NextRequest } from "next/server";

export const config = {
  runtime: "edge",
};

export default async function handler(req: NextRequest): Promise<Response> {
  // ...

  const prompt = ""; // The looooong prompt!

  const completion = await fetch("https://api.openai.com/v1/completions", {
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY ?? ""}`,
    },
    method: "POST",
    body: JSON.stringify({
      model: "text-davinci-003",
      prompt,
      temperature: 0.7,
      top_p: 1,
      frequency_penalty: 0,
      presence_penalty: 0,
      best_of: 1,
      max_tokens: 256,
    }),
  }).then((res) => res.json());

  const result = completion.choices[0].text?.trim();
  if (!result) throw new Error(`No result found!`);

  console.log(result);
}
  • Next, instead of res.status(200).json(...), we return a standard Response object with the result serialized as JSON:
import { NextRequest } from "next/server";

export const config = {
  runtime: "edge",
};

export default async function handler(req: NextRequest): Promise<Response> {
  // ...

  return new Response(
    JSON.stringify({
      result,
    }),
    {
      status: 200,
      headers: {
        "content-type": "application/json",
      },
    }
  );
}
  • Lastly, let’s wrap the code inside a try-catch block. The final version should look like this:
import { NextRequest } from "next/server";

export const config = {
  runtime: "edge",
};

export default async function handler(req: NextRequest): Promise<Response> {
  try {
    const { message } = (await req.json()) as {
      message?: string;
    };

    if (!message) {
      return new Response("No message in the request", { status: 400 });
    }

    console.log(message);

    const prompt = "";  // The looooong prompt!

    const completion = await fetch("https://api.openai.com/v1/completions", {
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${process.env.OPENAI_API_KEY ?? ""}`,
      },
      method: "POST",
      body: JSON.stringify({
        model: "text-davinci-003",
        prompt,
        temperature: 0.7,
        top_p: 1,
        frequency_penalty: 0,
        presence_penalty: 0,
        best_of: 1,
        max_tokens: 256,
      }),
    }).then((res) => res.json());

    const result = completion.choices[0].text?.trim();
    if (!result) throw new Error(`No result found!`);

    console.log(result);

    return new Response(
      JSON.stringify({
        result,
      }),
      {
        status: 200,
        headers: {
          "content-type": "application/json",
        },
      }
    );
  } catch (error: any) {
    console.error(error.message);
    return new Response(error.message, { status: 500 });
  }
}
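On the client side, nothing changes: the route is called exactly as before. A sketch of the call, assuming the route file lives at pages/api/generate.ts (the path and helper names here are assumptions, not from the source):

```typescript
// Hypothetical client-side helpers; adjust "/api/generate" to your route path.
function buildRequestInit(message: string): RequestInit {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message }),
  };
}

async function getFlirtyReply(message: string): Promise<string> {
  const res = await fetch("/api/generate", buildRequestInit(message));
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  const { result } = (await res.json()) as { result: string };
  return result;
}
```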

Conclusion

That’s it… In just a few steps, you can move from a regular API Route to the Edge! 🚀

Again, it’s important to remember that the Edge runtime comes with its advantages (a longer timeout and streaming capability) and its constraints (no access to native Node.js APIs). So do weigh the tradeoffs, and make the choice that fits your needs.

Thanks for reading!