Python 3.12 recently landed some important performance updates that make it faster and improve its support for multithreading.
Source
Below is the relevant excerpt:
The per-interpreter GIL and subinterpreters
What keeps Python from being truly fast? One of the most common answers is "lack of a better way to execute code across multiple cores." Python does have multithreading, but threads run cooperatively, yielding to each other for CPU-bound work. And Python's support for multiprocessing is top-heavy: you have to spin up a separate copy of the Python runtime for each core and distribute your work between them.
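To make the contrast concrete, here is a minimal sketch, using only the standard library, of running the same CPU-bound function with a thread pool and with a process pool. Under the GIL, the thread pool gains little for CPU-bound work; the process pool runs in parallel, but each worker is a full Python runtime.

```python
# Minimal sketch: threads vs. processes for CPU-bound work.
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def cpu_bound(n: int) -> int:
    # Arbitrary CPU-bound work: sum of squares.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    work = [2_000_000] * 8

    # Threads: only one thread executes Python bytecode at a time,
    # so CPU-bound tasks see little or no speedup.
    with ThreadPoolExecutor() as pool:
        thread_results = list(pool.map(cpu_bound, work))

    # Processes: true parallelism, but each worker is a separate
    # copy of the Python runtime, with the startup and IPC cost that implies.
    with ProcessPoolExecutor() as pool:
        process_results = list(pool.map(cpu_bound, work))
```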
One long-dreamed way to solve this problem is to remove Python's GIL, or Global Interpreter Lock. The GIL synchronizes operations between threads to ensure objects are accessed by only one thread at a time. In theory, removing the GIL would allow true multithreading. In practice—and it's been tried many times—it slows down non-threaded use cases, so it's not a net win.
Core Python developer Eric Snow, in his talk, unveiled a possible future solution for all this: subinterpreters, and a per-interpreter GIL. In short: the GIL wouldn't be removed, just sidestepped.
Subinterpreters are a mechanism by which the Python runtime can have multiple interpreters running together inside a single process, as opposed to each interpreter being isolated in its own process (the current multiprocessing mechanism). Each subinterpreter gets its own GIL, but all subinterpreters can share state more readily.
While subinterpreters have been available in the Python runtime for some time now, they haven't had an interface for the end user. Also, the messy state of Python's internals hasn't allowed subinterpreters to be used effectively.
With Python 3.12, Snow and his cohort cleaned up Python's internals enough to make subinterpreters useful, and they are adding a minimal module to the Python standard library called interpreters. This gives programmers a rudimentary way to launch subinterpreters and execute code on them.
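As a rough illustration of the idea, here is a minimal sketch of launching a subinterpreter and running code in it. It uses the low-level, experimental module that ships inside CPython 3.12 (`_xxsubinterpreters`); the eventual stdlib module described by Snow, and its exact API, may well differ.

```python
# Minimal sketch of launching a subinterpreter and executing code in it.
# The module name and functions here are experimental and may change.
import _xxsubinterpreters as interpreters

# Create a new interpreter inside this same process.
interp_id = interpreters.create()

# Code is passed as a string and runs in the subinterpreter's own
# isolated module state and namespace.
interpreters.run_string(interp_id, "print('hello from a subinterpreter')")

# Clean up the subinterpreter when done.
interpreters.destroy(interp_id)
```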
Snow's own initial experiments with subinterpreters significantly outperformed threading and multiprocessing. One example, a simple web service that performed some CPU-bound work, maxed out at 100 requests per second with threads and 600 with multiprocessing. With subinterpreters, it handled 11,500 requests per second, with little to no drop-off when scaled up from one client.
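A hedged sketch of the pattern behind numbers like these: drive one subinterpreter per worker thread, so each thread's CPU-bound work runs under its own GIL instead of contending for the main one. This again assumes the experimental `_xxsubinterpreters` module and that interpreters created this way get a per-interpreter GIL; both details are provisional.

```python
# Sketch: one subinterpreter per worker thread for CPU-bound work.
# Assumes each subinterpreter runs under its own GIL (PEP 684);
# the module name and API are experimental and may change.
import threading
import _xxsubinterpreters as interpreters

CPU_BOUND_SCRIPT = """
total = sum(i * i for i in range(2_000_000))
"""

def worker() -> None:
    # Each thread creates and drives its own subinterpreter.
    interp_id = interpreters.create()
    try:
        interpreters.run_string(interp_id, CPU_BOUND_SCRIPT)
    finally:
        interpreters.destroy(interp_id)

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```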
The interpreters module has very limited functionality right now, and it lacks robust mechanisms for sharing state between subinterpreters. But Snow believes by Python 3.13 a good deal more functionality will appear, and in the interim developers are encouraged to experiment.