In this article we’ll discuss how to build a simple, single-endpoint gRPC API service with Protocol Buffers, and prepare its client-side and server-side code with the gRPC tools. We’ll start from scratch, take a simple function, and expose it via a gRPC interface on our own machine. We won’t be using gRPC’s advanced modes (request streaming, response streaming, bidirectional streaming) or additional features such as error handling and authentication. This tutorial will help you create APIs using gRPC in Python.

What is gRPC?

gRPC, or Google Remote Procedure Call, is a modern, open-source, high-performance RPC framework designed to be fast and efficient. Google created it as an RPC framework that can run in any environment: in data centers as well as on the last mile of distributed computing, connecting devices, mobile applications, and browsers to backend services.

What sets it apart from traditional REST APIs is that it relies on Protocol Buffers to define both the data structures and the serialization format. gRPC works across multiple programming languages and has support for load balancing, tracing, health checking, and authentication, along with pluggable support for cross-language service discovery.

Creating a function to expose

First, let’s create the function (procedure) that we will expose and call remotely. We can place this file in a project folder, let’s say one called grpc-tutorial-python.

#calculator.py
import math

def square_root(x):
    """Return the square root of x."""
    return math.sqrt(x)

Here we have just created a basic function that uses the math library to output the square root of x. It is only for testing purposes; you can put whatever you want here, even a basic print function will suffice.
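
Before exposing it over gRPC, it can help to confirm that the function behaves as expected on its own. A minimal local check, run from the project folder, might look like this (the file name check.py is just an example):

#check.py -- optional local check, not part of the gRPC service
import calculator

print(calculator.square_root(16))   #expected output: 4.0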

Generating the gRPC classes

Here we will define the gRPC service using the Protocol Buffers interface definition language. Protocol Buffers are used to define the messages, their data types, and the service model in a proto file: a text file with the .proto extension, written in Protocol Buffers syntax. We can put it in a separate folder, cal, to keep things tidy.

//calculator.proto
syntax = "proto3";

//the request and response type: a single floating-point value
message Number {
    float value = 1;
}

//the service exposing one remote procedure
service Calculator {
    rpc SquareRoot(Number) returns (Number) {}
}

The service model is essentially the definition of how our services will exchange data and interact with each other. (Note that the = 1 after value in the Number message is a field number used in the binary encoding, not a default value.) Next, we’ll generate code for the message and server classes using the protoc compiler.

After the service model is defined in the .proto file, we can generate the service files by running the protoc command on it. The protoc command generates the new files (and classes) automatically. To do that, we’ll need the gRPC libraries and tools, which can be installed with the following commands:

pip install grpcio
pip install grpcio-tools
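
If you want to confirm the tools installed correctly, you can (optionally) ask the bundled compiler for its version; it should print a libprotoc version number:

python -m grpc_tools.protoc --version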

After that, we can invoke protoc on the .proto file with the following command, run from the project root:

python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. cal/calculator.proto

If everything worked properly, the following two files will be created:

cal/calculator_pb2.py -- which contains the message classes
cal/calculator_pb2_grpc.py -- which contains the server and client classes
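
As a quick, optional sanity check that the generated message classes import correctly, you can construct a Number message from the project root, for example:

python3 -c "import cal.calculator_pb2 as pb2; print(pb2.Number(value=2.0))"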

Making a gRPC server

Now that we have generated all the files we need to create the server, we can finally start writing the server file, server.py:

import grpc
from concurrent import futures
import time

#import the generated files
import cal.calculator_pb2 as calculator_pb2
import cal.calculator_pb2_grpc as calculator_pb2_grpc

#import the function files
import calculator

#create a class to define the server functions derived from calculator_pb2_grpc (or the server and client classes)
class CalculatorServicer(calculator_pb2_grpc.CalculatorServicer):

    #calculator.square_root is exposed here
    def SquareRoot(self, request, context):
        response = calculator_pb2.Number()
        response.value = calculator.square_root(request.value)
        return response


#create a gRPC server
server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))

#use the function "add_CalculatorServicer_to_server" to add the defined class to the created server
calculator_pb2_grpc.add_CalculatorServicer_to_server(
        CalculatorServicer(), server)

#listen on port 8000
print('Starting server. Listening on port 8000.')
server.add_insecure_port('[::]:8000')
server.start()

# server.start() does not block, so a sleep loop is added to keep the process alive
try:
    while True:
        time.sleep(86400)
except KeyboardInterrupt:
    server.stop(0)

gRPC will automatically invoke the overridden SquareRoot method in the CalculatorServicer class whenever a client calls the SquareRoot endpoint. We can now start the server to check that everything works, which can be done with:

python server.py

If you see the startup message (Starting server. Listening on port 8000.) in your terminal, your code worked.

Now we have a live gRPC server, running and listening for calls on port 8000.
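
As an aside, reasonably recent versions of grpcio also provide server.wait_for_termination(), which can replace the sleep loop at the end of server.py. A minimal sketch of that variant, assuming your installed grpcio has it:

#alternative ending for server.py
server.add_insecure_port('[::]:8000')
server.start()
print('Starting server. Listening on port 8000.')

#block until the server is stopped (for example with Ctrl+C)
server.wait_for_termination()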

Next, we use the CalculatorStub class to create a client for the gRPC API service in a new file, client.py, as shown below:

import grpc

#import the generated files
import cal.calculator_pb2 as calculator_pb2
import cal.calculator_pb2_grpc as calculator_pb2_grpc

#open and connect to the gRPC channel we just created
channel = grpc.insecure_channel('localhost:8000')

#create a stub (client)
stub = calculator_pb2_grpc.CalculatorStub(channel)

#create a valid request message or just any number in our case
number = calculator_pb2.Number(value=15)

#make the call
response = stub.SquareRoot(number)

#print the results
print(response.value)

The client connects to the API that your server offers over gRPC. It opens a channel to the server, obtains a stub from it, builds a Number Protocol Buffers message, and calls the stub’s SquareRoot method. gRPC serializes the Number message, carries it to the server over the connection, and parses the server’s reply, another Number message, back into a Python object whose value is then printed out.
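
If you prefer, the channel can also be used as a context manager so that it is closed cleanly, and the call can take an optional deadline. Here is a small sketch of that variant; the 5-second timeout is just an example value:

#client variant: scoped channel and a call deadline
import grpc

import cal.calculator_pb2 as calculator_pb2
import cal.calculator_pb2_grpc as calculator_pb2_grpc

with grpc.insecure_channel('localhost:8000') as channel:
    stub = calculator_pb2_grpc.CalculatorStub(channel)
    #raises grpc.RpcError if the server does not answer within 5 seconds
    response = stub.SquareRoot(calculator_pb2.Number(value=15), timeout=5)
    print(response.value)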

Think of the .proto file as a contract between the server and its clients. gRPC is the courier and Protocol Buffers is the translator that enforces the contract: it converts what clients and servers speak into and out of a common wire format, while gRPC carries the messages over HTTP/2. When you work with Protocol Buffers and gRPC, you are able to focus more on your application’s actual functionality.

Results

We can run this file similarly to how we did for the other one:

python3 client.py

Note that we have to keep the server running when we run the client, because that is where our API lives. You may open another terminal for this.

After running the above file you should see the square root of 15 printed, a value of approximately 3.873.

This means it worked!

Conclusion

Microservice architecture is all the rage right now. Microservices let you break a monolithic application apart into many smaller services, each with its own codebase, data store, and deployable instance, that communicate with each other over a network. There are other intricacies to gRPC, such as response streaming and bidirectional streaming, but I hope this post serves as a good reference for those just getting started.

There is a lot of room for improvement in this project, so try experimenting for yourself with the help of Google’s documentation on gRPC. And that’s how you create APIs using gRPC in Python.
