socketify.py Ideas & Updates! #10
-
Preliminary Performance Comparison

Tests made using `./http_load_test 40 localhost 3000` from the uSockets project.
OS: Debian GNU/Linux bookworm/sid x86_64. CPU: Intel i7-7700HQ (8) @ 3.800GHz. Memory: 32066MiB.

Charts in the original post: Average Requests per Second, Relative Performance vs C++, Average Requests per Second using JSON, and JSON Relative Performance vs C++ Plain Text.

All plaintext tests follow the TechEmpower rules for plaintext, and all JSON tests follow the TechEmpower rules for application/json, with the Date header updated every 1 second in both cases.

Packages:
- Ruby using the Oj package (at least 20% faster than the json package)
- Python3 using the orjson package (about 10% faster than ujson in this test; zzzjson and json are slower than ujson in Python3)
- PyPy3 using json (2 to 3x faster than ujson; orjson is not supported on PyPy)
- Node.js using JSON.stringify
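For context, here is a minimal sketch of the kind of JSON handler measured above, using orjson on CPython. The route name and handler are illustrative and not the exact benchmark code; it also assumes `res.write_header` and `res.end` accept plain strings.

```python
import orjson
from socketify import App

app = App()

def json_handler(res, req):
    # Serialize with orjson (the package used for the CPython JSON numbers above)
    payload = orjson.dumps({"message": "Hello, World!"}).decode("utf8")
    res.write_header("Content-Type", "application/json")
    res.end(payload)

app.get("/json", json_handler)
app.listen(3000, lambda config: print("Listening on port %d" % config.port))
app.run()
```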
-
Added JSON tests with some packages to bring more performance comparisons. Fixed the 4-process average for Node.js. asyncio is really slow, uvloop is not working on PyPy3, and I am working on a CFFI solution for asyncio.
-
First working pip package created + Docker images working ❤️ EDIT: in the future the install experience will be better.
-
@AmirHmZz as I promised, here are some TechEmpower numbers for you to compare. It is not a valid benchmark in my view because the runs are too short for the GC to collect anything, the payload size is too small, and there are other problems, but here are the numbers. I also changed some code to update the Ruby and Python versions and tried to add some PyPy benchmarks. I will write an article about benchmarking and give a deep dive into all these tests and more.
-
Added a lot of examples in https://github.com/cirospaciari/socketify.py/tree/main/tests/examples and updated the Docker images.
PS: python3-alpine is better than python3 (2x faster on average) because it is more stable in the 1% and 5% response times, but the 97.5% percentile is the same.
The fetch examples are using aiohttp, but I will change to a CFFI solution if I find something faster and well maintained.
HTTPS is still not working, but I will add the BoringSSL version of uSockets to fix this with priority.
-
Added a lot of utility features for post/upload; try_end and for_each_header are still missing, but cookies were added as an extra: https://github.com/cirospaciari/socketify.py/blob/main/tests/examples/upload_or_post.py
Next steps are streaming files and related helpers, and after this, fixing the SSL build and adding WS methods :D
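In the spirit of that example, here is a minimal sketch of post/upload handlers; the helper names (`res.get_data`, `res.get_json`, `req.get_cookie`) are assumed from the socketify.py docs and examples, not quoted from this post.

```python
from socketify import App

app = App()

async def upload(res, req):
    print(f"Posted to {req.get_url()}")
    data = await res.get_data()      # assumed helper: the raw request body
    res.end("thanks for the data!")

async def post_json(res, req):
    body = await res.get_json()            # assumed helper: body parsed as JSON
    session = req.get_cookie("session")    # assumed cookie helper ("added as an extra")
    print("json:", body, "session cookie:", session)
    res.end("ok")

app.post("/upload", upload)
app.post("/json", post_json)
app.listen(3000, lambda config: print("Listening on port %d" % config.port))
app.run()
```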
-
Some static files performance data from my personal machine (Debian 12/testing, i7-7700HQ, 32GB RAM, Samsung 970 PRO NVMe) using oha -c 400 -z 5s http://localhost:3000/ (numbers are in the original post).
Conclusions: anyway, we really recommend using NGINX or similar + a CDN for production, like everybody else.
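For reference, a minimal sketch of serving a static file from socketify; the `sendfile` helper is assumed from the socketify.py docs and the file path is a placeholder.

```python
from socketify import App, sendfile

app = App()

async def home(res, req):
    # Assumed helper: sends the file, with support for 304 and byte ranges
    await sendfile(res, req, "./public/index.html")

app.get("/", home)
app.listen(3000, lambda config: print("Listening on port %d" % config.port))
app.run()
```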
-
Without proper asyncio integration (fully integrated like uvloop) we get a performance hit on PyPy3: in my last test I got 770k req/s in plaintext vs 1,293k req/s without the asyncio run_once solution. It is still OK for a first solution, but it will not be the last one; I will fully integrate asyncio with libuv, file IO included, because this performance hit is not acceptable in the long run. #18 I will do some improvements to bring performance up ASAP.
-
Removing the asyncio workarounds we get a performance boost from 770k to 880k req/s, still much lower than 1.3 million req/s. I will do some performance checks against the older version, but at first glance it seems that having bigger Response and Request objects with more functions is impacting performance (and we need all these features). So we will integrate all features first, measure the impact again, integrate our own aiofiles and fetch API with libuv, and after this I will start an "uvloop"-like project to fully integrate asyncio with libuv, removing the workarounds and giving at least 12% to 30% more raw performance. Adding HTTP/3 will improve performance much more than replacing the workaround right now, and it is way less effort for someone like me who is working only a few hours per week on this project (until I get support). Migrating to HPy in the future will bring Python3 performance up a lot.
-
Finally, support for WebSockets and server names is done; examples and docs are now in progress, and the first release is coming soon.
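A minimal sketch of the WebSocket API as it appears in the repo examples; the echo behavior and route shown here are illustrative, not quoted from this post.

```python
from socketify import App, OpCode

app = App()

def ws_open(ws):
    print("a WebSocket got connected!")
    ws.send("Hello World!", OpCode.TEXT)

def ws_message(ws, message, opcode):
    # Echo the message back with the same opcode (TEXT or BINARY)
    ws.send(message, opcode)

app.ws("/*", {
    "open": ws_open,
    "message": ws_message,
    "close": lambda ws, code, message: print("WebSocket closed"),
})
app.any("/", lambda res, req: res.end("Nothing to see here!"))
app.listen(3000, lambda config: print("Listening on port %d" % config.port))
app.run()
```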
-
@AmirHmZz @eirnym Included benchmarks with Falcon, uvicorn and Robyn; did not include Vibora or Japronto because they are not active projects. In Round 22, Robyn, Vibora and socketify.py will be featured. This weekend I will add docs and publish the first true release, v0.1.0, on PyPI.
-
WebSockets are now benchmarked, and the README was improved to better highlight what matters, including Stargazers and Forkers; @vytas7's feedback from the Falcon gitter was really helpful and meaningful. I'm working on the documentation, including the missing examples with template engines like Mako and Jinja; tutorials and the full API docs are in the works too. The first stable release will be published after the docs.
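As an illustration of the template-engine examples mentioned above, here is a minimal sketch of rendering a Jinja2 template from a socketify handler; the template path and the simple render-then-end flow are assumptions for this sketch, not the project's official recipe.

```python
from jinja2 import Environment, FileSystemLoader
from socketify import App

# Load templates from ./templates (the path is an assumption for this sketch)
env = Environment(loader=FileSystemLoader("templates"))
template = env.get_template("home.html")

app = App()

def home(res, req):
    html = template.render(title="socketify.py", user="Pythonista")
    res.write_header("Content-Type", "text/html")
    res.end(html)

app.get("/", home)
app.listen(3000, lambda config: print("Listening on port %d" % config.port))
app.run()
```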
-
I was dumb enough to copy socketify.rb and forgot to change the language name to Python; I opened a PR to TechEmpower to fix that. Socketify is the fastest actively developed web framework for Python in TechEmpower.
-
@kijk2869 from now on I will be using it. Thanks <3
-
I also think we need a Discord server. Discord server: https://discord.socketify.dev/
-
Initial docs: www.socketify.dev or docs.socketify.dev
-
Just want to say again how awesome this project is and thank you for all your work on it. I’ve got a benchmarking question. I see that the requests per second benchmark really shines, but I’m curious if there have been any benchmarks with large numbers of simultaneous connections. For example: https://unetworkingab.medium.com/millions-of-active-websockets-with-node-js-7dc575746a01. Is there anything about Python’s memory management or anything like that that would preclude being able to get in the ballpark of the article above?
-
Preparations for the first PyPI package are done; in the next version we should publish on PyPI too ❤️ 🚀
-
Implemented an option to use an object factory for AppRequest() and AppResponse(), and also for WebSocket objects:

```python
app = App(request_response_factory_max_itens=200_000, websocket_factory_max_itens=1_500_000)
```

TechEmpower plaintext numbers with and without the factory are in the original post.
More performance information with PyPy + object factory: in CPython the performance increase is less pronounced, up to 10%. Benchmark numbers in README.md will be updated soon.
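A minimal runnable sketch using the factory options quoted above; the route and pool sizes are only illustrative, and the keyword names are kept exactly as given in the post.

```python
from socketify import App

# Pre-allocate pools of request/response and WebSocket wrapper objects
# (keyword names exactly as given in the post above)
app = App(
    request_response_factory_max_itens=200_000,
    websocket_factory_max_itens=1_500_000,
)

app.get("/", lambda res, req: res.end("Hello, World!"))
app.listen(3000, lambda config: print("Listening on port %d" % config.port))
app.run()
```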
-
After some tweaks and testing with the ASGI protocol, we can now power Falcon for WebSockets.
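To illustrate, here is a minimal sketch of a Falcon ASGI WebSocket resource that could be served this way; the module name `falcon_asgi` and the serving command are assumptions based on the CLI examples further down this thread, not code from this post.

```python
# falcon_asgi.py (hypothetical module name)
import falcon
import falcon.asgi

class EchoWS:
    async def on_websocket(self, req, ws):
        try:
            await ws.accept()
            while True:
                # Echo every text frame back to the client
                message = await ws.receive_text()
                await ws.send_text(f"echo: {message}")
        except falcon.WebSocketDisconnected:
            pass

app = falcon.asgi.App()
app.add_route("/ws", EchoWS())
```

Served, for example, with the socketify CLI shown later in this thread: `python3 -m socketify falcon_asgi:app --port 8080 --workers 2`.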
-
@seanr3 just want to thank you for your help finding bugs, this means a lot!
-
asyncio study case: on my machine, using more threads pushes performance up for async. The best results I can get on my machine for the cases below are in the original post.
EDIT: before these optimizations socketify.py on PyPy was at 79,314 req/s and is now at 131,598, and ASGI went from 30,157 to 53,731; there is also an 11% performance increase in ./http_load_test. For comparison, some sync numbers are also in the original post.
The socketify advantage is being able to use async and sync where each is needed, so you can do the work async, cache, and respond sync; the goal is to be the fastest async and sync server and web framework for Python. These tests use oha without pipelining; we can push way harder than this with wrk or other tools, but this shows a more realistic number. The true super power currently is socketify's pub/sub over WebSockets.
WSGI compatibility is also here, but it performs really badly. ASGI, SSGI and RSGI will be supported with pub/sub extensions and I will optimize those three; I will not put much effort into WSGI for now, maybe in the future.
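Since pub/sub is called out as the current super power, here is a minimal sketch of WebSocket pub/sub with socketify; the topic name is illustrative, and the `ws.subscribe`/`app.publish` calls follow the uWebSockets-style API described in the docs, so treat the exact names as assumptions rather than code from this post.

```python
from socketify import App, OpCode

app = App()

def ws_open(ws):
    # Every new connection joins the broadcast topic (topic name is illustrative)
    ws.subscribe("broadcast")
    ws.send("subscribed to broadcast", OpCode.TEXT)

def ws_message(ws, message, opcode):
    # Fan the message out to every subscriber of the topic
    app.publish("broadcast", message, opcode)

app.ws("/*", {
    "open": ws_open,
    "message": ws_message,
})
app.listen(3000, lambda config: print("Listening on port %d" % config.port))
app.run()
```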
-
Current overhead of socketify ASGI and WSGI, for reference, using TechEmpower plaintext (numbers are in the original post).
-
A new CLI tool to help people use workers/forks, ASGI, WSGI etc. is in the works:

```bash
python3 -m socketify --help
```

You can easily run a WSGI, ASGI or socketify app:

```bash
python3 -m socketify hello_world_cli:run --port 8080 --workers 2
python3 -m socketify falcon_wsgi:app --port 8080 --workers 2
python3 -m socketify falcon_asgi:app --port 8080 --workers 2
```

Socketify apps require you to pass a run function; the CLI will create the App instance for you:

```python
from socketify import App

# App will be created by the CLI with all the things you want configured
def run(app: App):
    # add your routes here
    app.get("/", lambda res, req: res.end("Hello World!"))
```

WebSockets can be in the same or another module, and you can still use .ws("/*") to serve WebSockets:

```bash
python3 -m socketify hello_world_cli:run --ws hello_world_cli:websocket --port 8080 --workers 2
```

```python
websocket = {
    "open": lambda ws: ws.send("Hello World!", OpCode.TEXT),
    "message": lambda ws, message, opcode: ws.send(message, opcode),
    "close": lambda ws, code, message: print("WebSocket closed"),
}
```

When running ASGI, WebSockets will be served by default, but you can disable them:

```bash
python3 -m socketify falcon_asgi:app --ws none --port 8080 --workers 2
```

When running WSGI or ASGI you can still use socketify.py or ASGI WebSockets in the same server, mixing all available methods:

```bash
python3 -m socketify falcon_wsgi:app --ws falcon:ws --port 8080 --workers 2
```
-
Can socketify be used as an ASGI server? For example, instead of uvicorn for running a FastAPI app?
-
Just a quick update: in the last official TechEmpower plaintext results for socketify we hit 6 million requests per second 🏆, a new record for Python! Go Fiber got 5.9 million, so we are faster than Fiber! 🚀🦾🔥⚡
-
@cirospaciari Why didn't you use orjson instead of the built-in json module of CPython?
-
This has been a little bit quiet because of the holidays, but the first version is on PyPI: https://pypi.org/project/socketify/
Work in progress (list in the original post).
Edit: 50% progress
-
👋 Welcome Pythonists!
We’re using Discussions as a place to connect with other members of our community. We hope that you build together 💪.
Hello people, this is the place to get updates, preliminary performance results and more about the state of this project.