
Support orjson for JSON #304

Closed
ijl opened this issue Jan 4, 2019 · 3 comments
ijl commented Jan 4, 2019

I'd like to add https://github.com/ijl/orjson as a JSON serializer and deserializer. The motivation is performance.

It's difficult to install, though, because it's a Rust extension: it requires either a Rust nightly compiler or a prebuilt wheel. So if included, it's probably not appropriate for a [full] install (which assumes only a C compiler), but for some other target.

It can also either replace json.loads or only be a renderer.

To validate the performance motivation, take the patch at the end, and run:

from starlette.applications import Starlette
from starlette.responses import JSONResponse, UJSONResponse, OrJSONResponse
import uvicorn
import orjson


app = Starlette()


with open('twitter.json') as fileh:
    data = orjson.loads(fileh.read())


@app.route('/')
async def root(request):
    return OrJSONResponse(data)


if __name__ == "__main__":
    uvicorn.run(app, host='0.0.0.0', port=8000, log_level="error")

With wrk -d 30 http://0.0.0.0:8000/, I see:

JSONRenderer:
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    37.34ms   10.60ms  87.71ms   88.90%
    Req/Sec   134.33      6.86   151.00     90.33%
  8036 requests in 30.03s, 3.50GB read
Requests/sec:    267.61
Transfer/sec:    119.20MB

UJSONRenderer:
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    21.48ms    6.10ms  51.77ms   88.84%
    Req/Sec   233.76     27.50   282.00     51.83%
  13976 requests in 30.03s, 6.16GB read
Requests/sec:    465.41
Transfer/sec:    209.98MB

OrJSONRenderer:
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     5.69ms    1.64ms  18.23ms   88.33%
    Req/Sec     0.88k   101.75     1.01k    51.33%
  52795 requests in 30.02s, 22.96GB read
Requests/sec:   1758.78
Transfer/sec:    783.36MB

Patch to test:

diff --git a/requirements.txt b/requirements.txt
index aae4715..1b8a9c3 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -7,6 +7,7 @@ python-multipart
 pyyaml
 requests
 ujson
+orjson

 # Database backends
 asyncpg
diff --git a/setup.py b/setup.py
index 13189ad..d58ef5d 100644
--- a/setup.py
+++ b/setup.py
@@ -53,6 +53,7 @@ setup(
             'graphene',
             'itsdangerous',
             'jinja2',
+            'orjson',
             'python-multipart',
             'pyyaml',
             'requests',
diff --git a/starlette/requests.py b/starlette/requests.py
index 2132151..7463f0e 100644
--- a/starlette/requests.py
+++ b/starlette/requests.py
@@ -12,6 +12,11 @@ try:
 except ImportError:  # pragma: nocover
     parse_options_header = None  # type: ignore

+try:
+    from orjson import loads as json_loads
+except ImportError:  # pragma: nocover
+    from json import loads as json_loads  # type: ignore
+

 class ClientDisconnect(Exception):
     pass
@@ -161,7 +166,7 @@ class Request(HTTPConnection):
     async def json(self) -> typing.Any:
         if not hasattr(self, "_json"):
             body = await self.body()
-            self._json = json.loads(body)
+            self._json = json_loads(body)
         return self._json

     async def form(self) -> dict:
diff --git a/starlette/responses.py b/starlette/responses.py
index 3b8d275..e363c11 100644
--- a/starlette/responses.py
+++ b/starlette/responses.py
@@ -19,6 +19,11 @@ except ImportError:  # pragma: nocover
     aiofiles = None  # type: ignore
     aio_stat = None  # type: ignore

+try:
+    import orjson
+except ImportError:  # pragma: nocover
+    orjson = None  # type: ignore
+
 try:
     import ujson
 except ImportError:  # pragma: nocover
@@ -184,6 +189,13 @@ class JSONResponse(Response):
         ).encode("utf-8")


+class OrJSONResponse(JSONResponse):
+    media_type = "application/json"
+
+    def render(self, content: typing.Any) -> bytes:
+        return orjson.dumps(content)
+
+
 class UJSONResponse(JSONResponse):
     media_type = "application/json"

diff --git a/tests/test_responses.py b/tests/test_responses.py
index d4bafaf..6a86b56 100644
--- a/tests/test_responses.py
+++ b/tests/test_responses.py
@@ -8,6 +8,7 @@ from starlette.background import BackgroundTask
 from starlette.requests import Request
 from starlette.responses import (
     FileResponse,
+    OrJSONResponse,
     RedirectResponse,
     Response,
     StreamingResponse,
@@ -43,6 +44,19 @@ def test_bytes_response():
     assert response.content == b"xxxxx"


+def test_orjson_response():
+    def app(scope):
+        async def asgi(receive, send):
+            response = OrJSONResponse({"hello": "world"})
+            await response(receive, send)
+
+        return asgi
+
+    client = TestClient(app)
+    response = client.get("/")
+    assert response.json() == {"hello": "world"}
+
+
 def test_ujson_response():
     def app(scope):
         async def asgi(receive, send):

Thoughts?

@tomchristie (Member)

Thoughts?

That'd be grand. Better suited as a third party package, than built into core.

(Possible we might drop UJSON out of the core package at some point too)

Aside: We might also want to link to example snippets of code from the "Third party packages" section, rather than forcing contributors to create a fully fledged python package?
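
As a sketch of what such a third-party snippet might look like: the render function below mirrors the patch above, with a stdlib json fallback (the fallback and the function name render_json are assumptions for illustration, not part of the proposed patch). Note that orjson.dumps returns bytes directly, while json.dumps returns str and must be encoded.

```python
import json
import typing

try:
    import orjson
except ImportError:  # pragma: nocover
    orjson = None  # type: ignore


def render_json(content: typing.Any) -> bytes:
    # Prefer orjson, which serializes straight to bytes;
    # otherwise fall back to compact stdlib json.
    if orjson is not None:
        return orjson.dumps(content)
    return json.dumps(content, separators=(",", ":")).encode("utf-8")
```

A third-party OrJSONResponse package would then amount to a JSONResponse subclass whose render method is this function.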


caniko commented Jul 13, 2022

I stumbled over this idea as well, and realised that I could just make a simple function:

async def json_loads(request: Request) -> dict:
    return orjson.loads(await request.body())

Usage:

await json_loads(request)
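
A variant of this helper that also caches the parsed body on the request, mirroring what Request.json does with _json in the patch above, might look like the following sketch (the attribute name _parsed_json and the stdlib fallback are assumptions for illustration):

```python
try:
    from orjson import loads as _loads
except ImportError:  # pragma: nocover
    from json import loads as _loads  # type: ignore


async def json_loads(request) -> dict:
    # Parse once and cache on the request object, so repeated
    # calls don't re-read and re-parse the body.
    if not hasattr(request, "_parsed_json"):
        request._parsed_json = _loads(await request.body())
    return request._parsed_json
```
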

@racinmat

And how about passing the parsing function as an optional argument?
aiohttp deals with the same problem and accepts the function as an argument (see https://github.com/aio-libs/aiohttp/blob/master/aiohttp/web_request.py#L669). That way you can easily pass orjson.loads without defining your own function, even though it's a one-liner, and the result still gets cached in _json.
Should I make a PR for that?
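
The optional-argument idea could look roughly like this sketch against the Request.json shown in the patch above. The loads parameter name follows aiohttp's convention and is an assumption, not Starlette's actual API; the Request class here is a minimal stand-in, not starlette.requests.Request.

```python
import json
import typing


class Request:
    # Minimal stand-in for starlette.requests.Request.
    def __init__(self, body: bytes) -> None:
        self._body = body

    async def body(self) -> bytes:
        return self._body

    async def json(
        self, *, loads: typing.Callable[[bytes], typing.Any] = json.loads
    ) -> typing.Any:
        # Parse once with the caller-supplied loads, then cache in _json
        # exactly as the existing implementation does.
        if not hasattr(self, "_json"):
            self._json = loads(await self.body())
        return self._json
```

A caller could then write `await request.json(loads=orjson.loads)` with no custom helper, and subsequent calls return the cached value.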
