I’m going to start this post by saying that I’ve been out of front-end frameworks for a long time, so I’m quite rusty with current best practices. If there are better ways to do this I’m all ears, but for now this seemed like a worthwhile thing to document given how much I had to piece together.
This is everything I have so far on getting a Next.js server set up as a gRPC client so that it can call out to a gRPC server (which is written in Go in this instance, but that doesn’t matter too much). Is gRPC the best choice for this? Possibly not. Was it worth doing to find out how to connect the dots? Very much so. Also note this is very much focused on a Next.js application being the gRPC client and doesn’t go into much server detail.
### Dependencies

So start by installing the following npm packages and saving them as dependencies:

- @grpc/grpc-js
- @grpc/proto-loader
These two packages are crucial for loading our gRPC services and creating a client with which we can call the server. The proto-loader package also provides a utility we can use to generate TypeScript type definitions for our services and messages.
### File Generation
Assuming that you have already created your proto files, the next step is to generate type definitions. The @grpc/proto-loader package comes with a generator built in that has fairly simple usage. It can be found in the node_modules\.bin\ folder and called like so:

```shell
node_modules\.bin\proto-loader-gen-types --longs=String --enums=String --defaults --oneofs --grpcLib=@grpc/grpc-js --outDir=$destinationPath $sourceFile
```
You might notice that I’m specifying a source file: if you have multiple services spread over multiple .proto files you will need to specify each file individually. The .proto files themselves are also required at runtime. So if you’re looking for an easy life, a script that loops through the files, copies each .proto file, and calls the generator on each one makes things easier.
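As a rough sketch of such a script (the directory names here are my own assumptions, so adjust to your layout), the command-building part might look like this:

```typescript
import * as fs from "fs";
import * as path from "path";

// Build one proto-loader-gen-types command per .proto file found in sourceDir.
// Actually running each command (e.g. via child_process.execSync) and copying
// the .proto files alongside the generated types is left to the caller.
export function buildGenCommands(sourceDir: string, outDir: string): string[] {
  const bin = path.join("node_modules", ".bin", "proto-loader-gen-types");
  return fs
    .readdirSync(sourceDir)
    .filter((file) => file.endsWith(".proto"))
    .map((file) =>
      [
        bin,
        "--longs=String",
        "--enums=String",
        "--defaults",
        "--oneofs",
        "--grpcLib=@grpc/grpc-js",
        `--outDir=${outDir}`,
        path.join(sourceDir, file),
      ].join(" ")
    );
}
```

From there you can feed each command string to `child_process.execSync` and copy the .proto files into the output directory in the same loop.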
### Usage
This is an area that caused me a lot of head scratching, which is why I wanted to write it down. There’s also a small caveat to file loading in Next.js, especially if, like me, you’re using Vercel hosting.
In the simplest form possible this is what we need to do:
```typescript
import * as protoLoader from "@grpc/proto-loader";
import * as grpc from "@grpc/grpc-js";
import { ProtoGrpcType } from "@/app/protos/charactersvc";
import { GetCharacterRequest } from "@/app/protos/charactersvc/GetCharacterRequest";
```
Import what we need. `protoLoader` and `grpc` are the modules that we installed earlier. `ProtoGrpcType` and `GetCharacterRequest` come from our type generation. In this example `ProtoGrpcType` describes the loaded package containing the gRPC service that we’re going to be calling, and `GetCharacterRequest` is the type of the request object that we’re going to be sending to the service.
```typescript
import path from "path";

const protoPath = path.join(process.cwd(), "app/protos/charactersvc.proto");
const packageDefinition = protoLoader.loadSync(protoPath);
const CharacterService = (grpc.loadPackageDefinition(packageDefinition) as unknown as ProtoGrpcType)
    .charactersvc.CharacterService;

const client = process.env.BACKEND_SECURE === "true"
    ? new CharacterService(process.env.BACKEND_API_URL as string, grpc.credentials.createSsl())
    : new CharacterService(process.env.BACKEND_API_URL as string, grpc.credentials.createInsecure());
```
Next we’re going to define a few things before we call our service. Using `process.cwd()` we get the current working directory and join it to the proto file that contains the service we’re going to be calling. I’m not sure why grpc-js needs the proto files at runtime, but it does. Using the resulting path we then load the package definition. We then use the package definition to create a service constructor that we can finally use to create our client. Note that here I’m also using an environment flag to check whether I’m running the backend over HTTPS or not, which is useful for local development.
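One aside that isn’t from the grpc-js docs but is worth considering: constructing a client opens a channel, so in a server environment you probably don’t want to rebuild it on every request. A tiny memoizing factory (my own sketch, not part of any library here) is enough:

```typescript
// Cache a lazily-created client so repeated calls reuse the same instance.
// `createClient` stands in for the CharacterService constructor call above.
export function memoize<T>(createClient: () => T): () => T {
  let instance: T | undefined;
  return () => {
    if (instance === undefined) {
      instance = createClient();
    }
    return instance;
  };
}
```

Usage would then look like `const getClient = memoize(() => new CharacterService(...))`, with each call site asking for `getClient()` instead of constructing its own client.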
```typescript
const request: GetCharacterRequest = {
    characterId: id,
};

return new Promise((resolve, reject) => {
    client.GetCharacter(request, function (err, response) {
        if (err !== null) {
            console.log("Error getting character: ", err.message);
            reject(err);
        } else {
            const character = <CharacterField>{
                id: id,
                userId: response?.userId,
                name: response?.name,
                credits: response?.credits,
            };
            resolve(character);
        }
    });
});
```
In this code snippet we create a request, call the client, and then handle the response. Because grpc-js relies solely on callbacks, if we want to handle the result immediately (rather than the fire, forget and handle later approach) the client call needs to be wrapped in a promise, using resolve/reject to handle the response appropriately. Depending on what you need the service’s response for (and when), this might not always be required, but it is worth noting.
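If you end up wrapping several unary calls like this, a small generic helper keeps the promise plumbing in one place. This helper is my own sketch, not part of grpc-js:

```typescript
// Wrap a callback-style unary call in a Promise. `call` is expected to take
// (request, callback), like the unary methods grpc-js generates on a client.
export function promisifyUnary<TReq, TRes>(
  call: (req: TReq, cb: (err: Error | null, res?: TRes) => void) => void
): (req: TReq) => Promise<TRes> {
  return (req: TReq) =>
    new Promise<TRes>((resolve, reject) => {
      call(req, (err, res) => {
        if (err !== null) {
          reject(err);
        } else {
          resolve(res as TRes);
        }
      });
    });
}
```

With this in place the earlier call becomes `await promisifyUnary(client.GetCharacter.bind(client))(request)` — note the `bind`, since the client methods rely on `this`.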
### Next.js/Vercel Caveat
As of writing there’s a small caveat if you’re using Vercel hosting and trying to load in proto files. As already said, the proto files are required by grpc-js, so leaving them behind isn’t an option. While in the example it looks like we’ve loaded the proto file in fine by joining `process.cwd()` to the file path, this isn’t actually the case if you have your gRPC calls in a server component.

When deploying our application to Vercel we need a way to tell the build that our files need to be included in the server build and available during runtime. There are a couple of ways that I’ve read about doing this, but I settled on the following:
```typescript
import path from "path";

export default async function Page() {
    path.join(process.cwd(), "app/protos/charactersvc.proto");
    return "";
}
```
I created a page component with the above logic in it. It renders nothing, but reaching out to the files is enough to let Vercel know that any files touched here should be included in the build output and therefore be available later on. It’s a bit rubbish, but it works.
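Another option I’ve seen mentioned (though I haven’t battle-tested it myself) is Next.js’s file-tracing configuration, which lets you declare extra files to bundle into the server output. Treat this as a sketch: the exact option name and whether it sits under `experimental` varies by Next.js version.

```javascript
// next.config.js — a sketch, not verified against every Next.js version.
module.exports = {
  experimental: {
    // Keys are route globs; values are globs of extra files to include
    // in the traced server output for those routes.
    outputFileTracingIncludes: {
      "/**": ["./app/protos/**/*.proto"],
    },
  },
};
```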
### Conclusion
Hopefully someone else will find this useful, but if not, at least it’s here to remind me what I did should I need to use gRPC in Next.js (or a similar Node.js-based application) in the future. I can’t really say it’s something I would recommend doing, and depending on what you’re trying to achieve a basic API over HTTP might be sufficient.