Overview:
In this tutorial, let's develop both gRPC- and REST-based microservices and compare their performance.
If you are new to gRPC, please take a look at these articles first.
TL;DR
Check out this YouTube video where I show the results.
Sample Application:
Our main goal here is to come up with one application with two different implementations (REST and gRPC) of the exact same functionality. As we have already discussed, gRPC suits client/server application development and inter-microservice communication well when there is a lot of chattiness (IO-bound tasks) involved. So let's come up with a requirement that increases the number of network calls between the services, so that we can easily compare the performance difference.
To keep things simple, let's consider two services: an aggregator-service and a square-service. The square-service is basically a square calculator for a given number; if you send 2, it responds with the result 4.
The aggregator-service, however, receives a request for N and wants all the squares from 1 to N. The aggregator does not know how to calculate squares; it relies on the back-end square-service. The only way for the aggregator to get the results for all the numbers up to N is to send N requests to the server, one each for 1, 2, 3, … N.
When N is 5, the aggregator will send 5 requests to the square-service, aggregate all the server responses, and respond to its own client as shown below.
[
  { "1": 1 },
  { "2": 4 },
  { "3": 9 },
  { "4": 16 },
  { "5": 25 }
]
When N is 1000, the aggregator will send 1000 requests to the square-service for a single aggregator request. We intentionally design it this way to generate more network calls!
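The aggregator's fan-out-and-collect logic can be sketched in plain Java. In this sketch a local `square` method stands in for the remote square-service call, and the class and method names are illustrative, not taken from the project:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.stream.IntStream;

public class AggregatorSketch {

    // Stand-in for the remote square-service call
    static int square(int n) {
        return n * n;
    }

    // Fire one "request" per number 1..n and collect the results in order,
    // mirroring the [{"1":1},{"2":4},...] response shape
    static List<Map<String, Integer>> aggregate(int n) {
        List<CompletableFuture<Map<String, Integer>>> futures = IntStream.rangeClosed(1, n)
                .mapToObj(i -> CompletableFuture.supplyAsync(
                        () -> Map.of(String.valueOf(i), square(i))))
                .toList();
        // join() blocks until each response arrives; order is preserved
        return futures.stream().map(CompletableFuture::join).toList();
    }

    public static void main(String[] args) {
        System.out.println(aggregate(5));
    }
}
```

The real aggregator does the same thing, except each `square(i)` is a network call to either the REST or the gRPC backend.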
We are going to have two different implementations for the server-side logic as shown here. Based on a flag or parameter, the aggregator will call either the REST service or the gRPC service and give us the results.
gRPC Course:
I learnt gRPC + Protobuf the hard way. But you can learn them quickly on Udemy: I have created a separate step-by-step course on Protobuf + gRPC along with Spring Boot integration for next-generation microservice development. Click here for the special link.
Proto Service Definition:
- First, I create a multi-module Maven project as shown here.
- The proto file defines the models & the service for gRPC.
syntax = "proto3";

package vinsmath;

option java_package = "com.vinsguru.model";
option java_multiple_files = true;

message Input {
  int32 number = 1;
}

message Output {
  int32 number = 1;
  int32 result = 2;
}

service SquareRpc {
  rpc findSquareUnary(Input) returns (Output) {};
  rpc findSquareBiStream(stream Input) returns (stream Output) {};
}
REST – Server Setup:
- This will be a Spring Boot project.
- Controller
@RestController
public class RestSquareController {

    @Autowired
    private RestSquareService squareService;

    @GetMapping("/rest/square/unary/{number}")
    public int getSquareUnary(@PathVariable int number){
        return number * number;
    }

}
- This application will listen on port 7575.
server.port=7575
gRPC – Server Setup:
- Service
- We have two different implementations: one for unary and one for the gRPC bi-directional stream.
@GrpcService
public class MyGrpcService extends SquareRpcGrpc.SquareRpcImplBase {

    @Override
    public void findSquareUnary(Input request, StreamObserver<Output> responseObserver) {
        int number = request.getNumber();
        responseObserver.onNext(
                Output.newBuilder().setNumber(number).setResult(number * number).build()
        );
        responseObserver.onCompleted();
    }

    @Override
    public StreamObserver<Input> findSquareBiStream(StreamObserver<Output> responseObserver) {
        return new StreamObserver<>() {

            @Override
            public void onNext(Input input) {
                var number = input.getNumber();
                Output output = Output.newBuilder()
                        .setNumber(number)
                        .setResult(number * number)
                        .build();
                responseObserver.onNext(output);
            }

            @Override
            public void onError(Throwable throwable) {
                responseObserver.onCompleted();
            }

            @Override
            public void onCompleted() {
                responseObserver.onCompleted();
            }
        };
    }

}
- This gRPC application will listen on port 6565.
grpc.server.port=6565
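On the client side, responses from the bi-directional stream arrive asynchronously through the same `StreamObserver` contract (`onNext`/`onError`/`onCompleted`). Here is a stdlib-only sketch of the usual collect-until-completed pattern, with a minimal hand-written stand-in for gRPC's `io.grpc.stub.StreamObserver` interface (the class names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;

public class StreamCollectSketch {

    // Minimal stand-in for io.grpc.stub.StreamObserver
    interface StreamObserver<T> {
        void onNext(T value);
        void onError(Throwable t);
        void onCompleted();
    }

    // Collects streamed results and releases the latch when the stream ends
    static class CollectingObserver implements StreamObserver<Integer> {
        final List<Integer> results = new ArrayList<>();
        final CountDownLatch done = new CountDownLatch(1);

        public void onNext(Integer value) { results.add(value); }
        public void onError(Throwable t) { done.countDown(); }
        public void onCompleted() { done.countDown(); }
    }

    public static void main(String[] args) throws InterruptedException {
        CollectingObserver observer = new CollectingObserver();
        // Simulate the server pushing squares for 1..5, then completing
        for (int i = 1; i <= 5; i++) observer.onNext(i * i);
        observer.onCompleted();
        observer.done.await(); // in real code, this waits for the async stream
        System.out.println(observer.results);
    }
}
```

With the real gRPC async stub, the aggregator registers an observer like this, pushes N `Input` messages onto the request stream, and awaits the latch before building its JSON response.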
Aggregator Service:
This will be the main entry point through which we access both services. It acts as the client for both the gRPC- and REST-based services we created above. Take a look at the GitHub repository for the complete source code of this project; the link is shared at the end of this article.
- Controller for gRPC
@RestController
@RequestMapping("grpc")
public class GrpcAPIController {

    @Autowired
    private GrpcService service;

    @GetMapping("/unary/{number}")
    public Object getResponseUnary(@PathVariable int number){
        return this.service.getSquareResponseUnary(number);
    }

    @GetMapping("/stream/{number}")
    public Object getResponseStream(@PathVariable int number){
        return this.service.getSquareResponseStream(number);
    }

}
- Controller for REST
@RestController
@RequestMapping("rest")
public class RestAPIController {

    @Autowired
    private RestService service;

    @GetMapping("/unary/{number}")
    public Object getResponseUnary(@PathVariable int number){
        return this.service.getUnaryResponse(number);
    }

}
- application.properties
server.port=8080
grpc.client.square.address=static://localhost:6565
grpc.client.square.negotiationType=plaintext
rest.square.service.endpoint=http://localhost:7575
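The REST client inside the aggregator (the full version is in the GitHub repo) boils down to firing one HTTP GET per number against `rest.square.service.endpoint`. A stdlib sketch with `java.net.http.HttpClient` follows; the class and method names here are my own illustration, not the project's code, and `fetchSquares` obviously needs the square-service running to succeed:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.IntStream;

public class RestFanOutSketch {

    // Mirrors rest.square.service.endpoint from application.properties
    static final String ENDPOINT = "http://localhost:7575";

    // Build the GET request for one number
    static HttpRequest buildRequest(int n) {
        return HttpRequest.newBuilder(URI.create(ENDPOINT + "/rest/square/unary/" + n))
                .GET()
                .build();
    }

    // Fan out N async calls and join the response bodies in order
    static List<String> fetchSquares(HttpClient client, int n) {
        List<CompletableFuture<String>> futures = IntStream.rangeClosed(1, n)
                .mapToObj(i -> client.sendAsync(buildRequest(i), HttpResponse.BodyHandlers.ofString())
                        .thenApply(HttpResponse::body))
                .toList();
        return futures.stream().map(CompletableFuture::join).toList();
    }

    public static void main(String[] args) {
        // Just show the request shape; no server is needed for this
        System.out.println(buildRequest(10).uri());
    }
}
```

Firing the N calls asynchronously and joining afterwards matters here; sending them one by one would serialize the 1000 network round trips per aggregator request.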
Once we build and run the application, if I send the below request for 10, it will internally send 10 requests to the gRPC service and aggregate the responses as shown below.
# request
http://localhost:8080/grpc/unary/10
# response
[{"1":1},{"2":4},{"3":9},{"4":16},{"5":25},{"6":36},{"7":49},{"8":64},{"9":81},{"10":100}]
gRPC vs REST Performance – Unary:
Let's run the performance test by sending 1000 requests to the aggregator service with 100 concurrent requests at a time, simulating a load of 100 concurrent users. I use the ApacheBench tool for the test. I ran it multiple times (to warm up the servers) and took the best results for comparison.
- REST request
ab -n 1000 -c 100 http://localhost:8080/rest/unary/1000
- Result
Server Software:
Server Hostname: localhost
Server Port: 8080
Document Path: /rest/unary/1000
Document Length: 14450 bytes
Concurrency Level: 100
Time taken for tests: 65.533 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 14548000 bytes
HTML transferred: 14450000 bytes
Requests per second: 15.26 [#/sec] (mean)
Time per request: 6553.313 [ms] (mean)
Time per request: 65.533 [ms] (mean, across all concurrent requests)
Transfer rate: 216.79 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.8 0 4
Processing: 5804 6492 224.7 6451 7215
Waiting: 5787 6485 225.7 6440 7214
Total: 5807 6493 224.6 6451 7216
Percentage of the requests served within a certain time (ms)
50% 6451
66% 6536
75% 6628
80% 6705
90% 6823
95% 6889
98% 6960
99% 7163
100% 7216 (longest request)
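The percentile table ApacheBench prints is essentially the sorted latency samples read at fractional positions. A tiny sketch of how such a percentile can be computed with the nearest-rank method (ApacheBench's exact interpolation may differ):

```java
import java.util.Arrays;

public class PercentileSketch {

    // Nearest-rank percentile over latency samples in ms
    static long percentile(long[] samplesMs, double p) {
        long[] sorted = samplesMs.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.length) - 1;
        return sorted[Math.max(rank, 0)];
    }

    public static void main(String[] args) {
        long[] samples = {6451, 6536, 6628, 6705, 6823, 5807, 7216, 6440, 6490, 6500};
        System.out.println(percentile(samples, 50));
    }
}
```

Reading the 50th and 90th percentiles (rather than only the mean) is what makes the comparison tables below meaningful, since a few slow outliers can distort a mean badly.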
- gRPC request
ab -n 1000 -c 100 http://localhost:8080/grpc/unary/1000
- Result
Server Software:
Server Hostname: localhost
Server Port: 8080
Document Path: /grpc/unary/1000
Document Length: 14450 bytes
Concurrency Level: 100
Time taken for tests: 26.818 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 14548000 bytes
HTML transferred: 14450000 bytes
Requests per second: 37.29 [#/sec] (mean)
Time per request: 2681.798 [ms] (mean)
Time per request: 26.818 [ms] (mean, across all concurrent requests)
Transfer rate: 529.76 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.5 0 2
Processing: 930 2584 682.2 2442 4702
Waiting: 930 2583 682.2 2441 4702
Total: 930 2584 682.0 2442 4702
Percentage of the requests served within a certain time (ms)
50% 2442
66% 3049
75% 3131
80% 3179
90% 3381
95% 3510
98% 3812
99% 4152
100% 4702 (longest request)
- Results Summary:

| | CPU Utilization | Throughput (Requests/Second) | 50th Percentile Response Time | 90th Percentile Response Time |
|---|---|---|---|---|
| REST | ~85% | 15.26 | 6.451 seconds | 6.823 seconds |
| gRPC | ~52% | 37.29 | 2.442 seconds | 3.381 seconds |
- Note:
  - All 3 services were running on the same machine.
  - Results may vary depending on the CPU/memory available.
  - For the REST-based service, I tried modifying the Netty server config; I did not see any difference in performance.
gRPC vs REST Performance – Bi-Directional Stream:
gRPC already seems to perform much better than REST for the example we took. As REST is unary by default, it is fair to compare its performance with gRPC's unary/blocking stub. But what would the performance difference have been if we had gone with the bi-directional stream?
Now, if I run the same test using the bi-directional stream approach, the throughput goes up to ~95 requests/second, which is terrific!
Server Software:
Server Hostname: localhost
Server Port: 8080
Document Path: /grpc/stream/1000
Document Length: 14450 bytes
Concurrency Level: 100
Time taken for tests: 10.528 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 14548000 bytes
HTML transferred: 14450000 bytes
Requests per second: 94.98 [#/sec] (mean)
Time per request: 1052.836 [ms] (mean)
Time per request: 10.528 [ms] (mean, across all concurrent requests)
Transfer rate: 1349.41 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 1.0 0 6
Processing: 911 1048 56.1 1041 1210
Waiting: 911 1046 56.6 1040 1209
Total: 911 1048 56.1 1042 1210
Percentage of the requests served within a certain time (ms)
50% 1042
66% 1066
75% 1079
80% 1085
90% 1145
95% 1165
98% 1174
99% 1184
100% 1210 (longest request)
| | CPU Utilization | Throughput (Requests/Second) | 50th Percentile Response Time | 90th Percentile Response Time |
|---|---|---|---|---|
| REST | ~85% | 15.26 | 6.451 seconds | 6.823 seconds |
| gRPC-Unary | ~52% | 37.29 | 2.442 seconds | 3.381 seconds |
| gRPC Bi-Directional Stream | ~42% | 94.98 | 1.042 seconds | 1.148 seconds |
gRPC vs REST – Response Time:
- Lower is better
gRPC vs REST – Throughput:
- Higher is better
The source code is available here.
Happy learning 🙂