Zero: A simple, fast, high-performance and low-latency Python framework (RPC + PubSub) for building microservices or distributed servers


Zero is a simple, RPC-like Python framework for building fast, high-performance microservices or distributed servers.



Features:

  • Zero provides fast communication between microservices (see benchmarks) using ZeroMQ under the hood.
  • Zero uses messages for communication; the traditional client-server (request-reply) pattern is supported.
  • Support for both async and sync.
  • The base server (ZeroServer) utilizes all CPU cores.
  • Code generation! See the example 👇

Philosophy behind Zero:

  • Zero learning curve: The learning curve tends to zero. Just add functions and spin up a server; literally, that's it! The framework hides the complexity of the messaging patterns that enable fast communication.
  • ZeroMQ: An awesome messaging library that provides the power behind Zero.

Let's get started!

Getting started 🚀

Ensure Python 3.8+

pip install zeroapi

For Windows, tornado needs to be installed separately (for async operations). It's not included with zeroapi because Linux and macOS don't need tornado; they have their own event loops.
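
So on Windows, for example, you would additionally run:

pip install tornado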

  • Create a server.py

    from zero import ZeroServer
    
    app = ZeroServer(port=5559)
    
    @app.register_rpc
    def echo(msg: str) -> str:
        return msg
    
    @app.register_rpc
    async def hello_world() -> str:
        return "hello world"
    
    
    if __name__ == "__main__":
        app.run()
    
  • The RPC functions support only one argument (msg) for now. To pass multiple values, bundle them into a single object (see Serialization below).

  • Also note that the server's RPC functions are type hinted. Type hints are a must in a Zero server. Supported types can be found here.

  • Run the server

    python -m server
    
  • Call the RPC methods

    from zero import ZeroClient
    
    zero_client = ZeroClient("localhost", 5559)
    
    def echo():
        resp = zero_client.call("echo", "Hi there!")
        print(resp)
    
    def hello():
        resp = zero_client.call("hello_world", None)
        print(resp)
    
    
    if __name__ == "__main__":
        echo()
        hello()
    
  • Or use the async client:

    import asyncio
    
    from zero import AsyncZeroClient
    
    zero_client = AsyncZeroClient("localhost", 5559)
    
    async def echo():
        resp = await zero_client.call("echo", "Hi there!")
        print(resp)
    
    async def hello():
        resp = await zero_client.call("hello_world", None)
        print(resp)
    
    
    async def main():
        await echo()
        await hello()


    if __name__ == "__main__":
        asyncio.run(main())
    

Serialization 📦

Default serializer

Msgspec is the default serializer, so msgspec.Struct (for high performance), dataclasses, or any other supported type can be used to pass complex arguments, e.g.:

from dataclasses import dataclass
from datetime import datetime

from msgspec import Struct
from zero import ZeroServer

app = ZeroServer()

class Person(Struct):
    name: str
    age: int
    dob: datetime

@dataclass
class Order:
    id: int
    amount: float
    created_at: datetime

@app.register_rpc
def save_person(person: Person) -> bool:
    # save person to db
    ...

@app.register_rpc
def save_order(order: Order) -> bool:
    # save order to db
    ...
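
Since RPC functions take a single argument, bundling related fields into one Struct or dataclass like this is also how multiple values are passed. On the client side an instance is passed directly as the message; a small sketch (the field values are made up, and the schemas module is a hypothetical place holding the Person and Order definitions above):

from datetime import datetime

from zero import ZeroClient

# hypothetical shared module containing the Person and Order definitions above
from schemas import Order, Person

zero_client = ZeroClient("localhost", 5559)

person = Person(name="Jane", age=30, dob=datetime(1993, 1, 15))
print(zero_client.call("save_person", person))

order = Order(id=1, amount=100.0, created_at=datetime.now())
print(zero_client.call("save_order", order))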

Return type on client

The return type of the RPC function can be any of the supported types. If return_type is set in the client's call method, the response will be converted to that type.

from dataclasses import dataclass
from datetime import datetime

from zero import ZeroClient

zero_client = ZeroClient("localhost", 5559)

@dataclass
class Order:
    id: int
    amount: float
    created_at: datetime

def get_order(id: str) -> Order:
    return zero_client.call("get_order", id, return_type=Order)
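
For completeness, the matching server-side function would simply be a registered RPC returning an Order; a minimal sketch (the hard-coded return value is only for illustration):

from dataclasses import dataclass
from datetime import datetime

from zero import ZeroServer

app = ZeroServer(port=5559)

@dataclass
class Order:
    id: int
    amount: float
    created_at: datetime

@app.register_rpc
def get_order(id: str) -> Order:
    # look up the order in your datastore; hard-coded here for illustration
    return Order(id=1, amount=100.0, created_at=datetime.now())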

Code Generation 🤖

An easy-to-use code generation tool is also provided, with schema support!

  • After running the server as above, the code generation tool calls the server to get the client code.

    This makes it easy to get the latest schemas from live servers, without maintaining a separate file-sharing approach to manage schemas.

    Use zero.generate_client to generate client code, even for remote servers, using the --host and --port options.

    python -m zero.generate_client --host localhost --port 5559 --overwrite-dir ./my_client
    
  • It will generate a client like this:

    from dataclasses import dataclass
    from msgspec import Struct
    from datetime import datetime
    
    from zero import ZeroClient
    
    
    zero_client = ZeroClient("localhost", 5559)
    
    class Person(Struct):
        name: str
        age: int
        dob: datetime
    
    
    @dataclass
    class Order:
        id: int
        amount: float
        created_at: datetime
    
    
    class RpcClient:
        def __init__(self, zero_client: ZeroClient):
            self._zero_client = zero_client
    
        def save_person(self, person: Person) -> bool:
            return self._zero_client.call("save_person", person)
    
        def save_order(self, order: Order) -> bool:
            return self._zero_client.call("save_order", order)
    

    Check that the schemas are copied!

  • Use the client:

    from datetime import datetime

    from my_client import Order, Person, RpcClient, zero_client

    client = RpcClient(zero_client)

    if __name__ == "__main__":
        client.save_person(Person(name="John", age=25, dob=datetime.now()))
        client.save_order(Order(id=1, amount=100.0, created_at=datetime.now()))
    

If you want an async client, just replace ZeroClient with AsyncZeroClient in the generated code and update the methods to be async, as sketched below. (The next version will have async client generation, hopefully 😅)
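
For example, the RpcClient generated above could be adapted by hand roughly like this (a sketch, not actual generated output; the Person and Order definitions stay as generated):

from zero import AsyncZeroClient

zero_client = AsyncZeroClient("localhost", 5559)

# Person and Order class definitions remain unchanged (omitted here).

class RpcClient:
    def __init__(self, zero_client: AsyncZeroClient):
        self._zero_client = zero_client

    async def save_person(self, person: Person) -> bool:
        return await self._zero_client.call("save_person", person)

    async def save_order(self, order: Order) -> bool:
        return await self._zero_client.call("save_order", order)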

Important notes! 📝

For multiprocessing

  • ZeroServer should always be run under if __name__ == "__main__":, as it uses multiprocessing.
  • ZeroServer creates its workers in separate processes, so anything global in your code will be instantiated N times, where N is the number of workers. If you want something initialized only once, put it under if __name__ == "__main__":, though it is recommended not to use global variables. Creating databases, Redis, or other client connections N times in different processes is fine and even preferred. See the sketch below.
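
A minimal sketch of this layout (the module-level object here is purely illustrative):

from zero import ZeroServer

app = ZeroServer(port=5559)

# Module-level objects like this are created once per worker process
# (which is fine, and even preferred, for things like DB or Redis clients).
greeting_prefix = "Hello, "

@app.register_rpc
def greet(name: str) -> str:
    return greeting_prefix + name

if __name__ == "__main__":
    # This block runs only once, in the parent process.
    app.run()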

Let's do some benchmarking! 🏎

Zero is all about inter-service communication. In most real-life scenarios, we need to call another microservice.

So we will be testing a gateway calling another server for some data. Check the benchmark/dockerize folder for details.

There are two endpoints in every test:

  • /hello: just asks for a hello world response 😅
  • /order: saves an Order object in Redis
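
As a purely illustrative sketch (the real code lives in the benchmark/dockerize folder; the Redis usage and key naming here are assumptions), the zero server being called by the gateway might look roughly like this:

import msgspec
import redis
from msgspec import Struct
from zero import ZeroServer

app = ZeroServer(port=5559)
r = redis.Redis(host="localhost", port=6379)

class Order(Struct):
    id: int
    amount: float

@app.register_rpc
def hello_world() -> str:
    return "hello world"

@app.register_rpc
def save_order(order: Order) -> bool:
    # store the serialized order under an arbitrary key
    r.set(f"order:{order.id}", msgspec.json.encode(order))
    return True

if __name__ == "__main__":
    app.run()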

Compare the results! 👇

Benchmarks 🏆

11th Gen Intel® Core™ i7-11800H @ 2.30GHz, 8 cores, 16 threads, 16GB RAM (Docker in Ubuntu 22.04.2 LTS)

(Sorted alphabetically)

| Framework   | "hello world" (req/s) | 99% latency (ms) | redis save (req/s) | 99% latency (ms) |
| ----------- | --------------------- | ---------------- | ------------------ | ---------------- |
| aiohttp     | 14949.57              | 8.91             | 9753.87            | 13.75            |
| aiozmq      | 13844.67              | 9.55             | 5239.14            | 30.92            |
| blacksheep  | 32967.27              | 3.03             | 18010.67           | 6.79             |
| fastApi     | 13154.96              | 9.07             | 8369.87            | 15.91            |
| sanic       | 18793.08              | 5.88             | 12739.37           | 8.78             |
| zero(sync)  | 28471.47              | 4.12             | 18114.84           | 6.69             |
| zero(async) | 29012.03              | 3.43             | 20956.48           | 5.80             |

It seems blacksheep is faster at hello world, but in more complex operations like saving to Redis, zero is the winner! 🏆

Contribution

Contributors are welcome 🙏

Please leave a star ⭐ if you like Zero!

"Buy Me A Coffee"