Asyncio is just one way to do async in Python, but it is an important one. Getting started with asyncio is difficult because of the profusion of coroutines, tasks and futures, and the need to understand how they differ and how to use them.
Find out more in this extract from my new book Programmer's Python: Async
Programmer's Python: Async Threads, processes, asyncio & more
Is now available as a print book: Amazon
Contents
1) A Lightning Tour of Python
Python's Origins, Basic Python, Data Structures, Control Structures – Loops, Space Matters, Conditionals and Indenting, Pattern Matching, Everything Is An Object – References, Functions , Objects and Classes, Inheritance, Main and Modules, IDEs for Python, Pythonic – The Meta Philosophy, Where Next, Summary.
2) Asynchronous Explained
A Single Thread, Processes, I/O-Bound and CPU-Bound, Threads, Locking, Deadlock, Processes with Multiple Threads, Single-Threaded Async, Events, Events or Threads, Callback Hell, More Than One CPU – Concurrency, Summary.
3) Process-Based Parallelism
Extract 1 Process-Based Parallelism The Process Class, Daemon, Waiting for Processes, Waiting for the First to Complete, Computing Pi, Fork v Spawn, Forkserver, Controlling Start Method, Summary.
4) Threads
Extract 1 -- Threads The Thread Class, Threads and the GIL, Threading Utilities, Daemon Threads, Waiting for a Thread, Local Variables, Thread Local Storage, Computing Pi with Multiple Threads, I/O-Bound Threads, Sleep(0), Timer Object, Summary.
5) Locks and Deadlock
Race Conditions, Hardware Problem or Heisenbug, Locks, Locks and Processes, Deadlock, Context Managed Locks, Recursive Lock, Semaphore, Atomic Operations, Atomic CPython, Lock-Free Code, Computing Pi Using Locks, Summary.
6) Synchronization
Join, First To Finish, Events, Barrier, Condition Object, The Universal Condition Object, Summary.
7) Sharing Data
Extract 1 - Pipes & Queues The Queue, Pipes, Queues for Threads, Shared Memory, Shared ctypes, Raw Shared Memory, Shared Memory Manager, Computing Pi, Summary.
8) The Process Pool
Waiting for Pool Processes, Computing Pi using AsyncResult, Map_async, Starmap_async, Immediate Results – imap, MapReduce, Sharing and Locking, Summary.
9) Process Managers
The SyncManager, How Proxies Work, Locking, Computing Pi with a Manager, Custom Managers, A Custom Data Type, The BaseProxy, A Property Proxy, Remote Managers, A Remote Procedure Call, Final Thoughts, Summary.
10) Subprocesses
Running a program, Input/Output, Popen, Interaction, Non-Blocking Read Pipe, Using subprocess, Summary.
11) Futures
Extract 1 Futures
12) Basic Asyncio
Extract 1 Basic Asyncio Callbacks, Futures and Await, Coroutines, Await, Awaiting Sleep, Tasks, Execution Order, Tasks and Futures, Waiting On Coroutines, Sequential and Concurrent, Canceling Tasks, Dealing With Exceptions, Shared Variables and Locks, Context Variables, Queues, Summary.
13) Using asyncio
Extract 1 Asyncio Web Client Streams, Downloading a Web Page, Server, A Web Server, SSL Server, Using Streams, Converting Blocking To Non-blocking, Running in Threads, Why Not Just Use Threads, CPU-Bound Tasks, Asyncio-Based Modules, Working With Other Event Loops – Tkinter, Subprocesses, Summary.
14) The Low-Level API
Extract 1 - Streams & Web Clients The Event Loop, Using the Loop, Executing Tasks in Processes, Computing Pi With asyncio, Network Functions, Transports and Protocols, A UDP Server, A UDP Client, Broadcast UDP, Sockets, Event Loop Implementation, What Makes a Good Async Operation, Summary.
Appendix I Python in Visual Studio Code
So far we have looked at processes as a way of increasing the speed of CPU-bound programs, and threads as a way of increasing the speed of I/O-bound programs. In the following chapters the emphasis changes to using a single thread to speed up I/O-bound programs, using an event queue or some other form of cooperative scheduling to provide asynchronous programming. The basic idea is that you can use a single thread more efficiently if you simply arrange for it to do something else instead of just waiting for I/O to complete. That is, if you have a set of I/O-bound tasks, a single thread can manage all of them if you allow it to run some tasks while others are waiting for their I/O to complete.
Some are of the opinion that the alternative of allocating n threads, one to each I/O-bound task, is actually slower than sharing a single thread between them all. This is certainly true for Python, with the GIL restricting execution to one thread per Python interpreter. Even if the GIL is removed in the future, it is still likely that one thread handling all of the I/O-bound tasks will be faster than one thread per task. There are examples of Python asyncio programs handling thousands of network connections with few problems, but exactly where the limits lie in any particular case depends on the task and the machine.
The key to keeping the thread busy is the event queue, a queue of tasks waiting to be run from which the scheduler selects a task to run. The selected task uses the thread until it has to wait for something, at which point it releases the thread and another task is selected from the queue to run. The task that had to wait is added back into the event queue and gets another chance to run when its wait is over. In this way the single thread always has a task to keep it occupied. Notice that if the thread empties the queue it simply waits for something to do, and this is the only time the thread waits.
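To make the idea concrete, here is a toy sketch of an event queue. It is not how asyncio is implemented – make_task and step are names invented for this sketch – and each "task" simply does a little work and returns True if it still has more to do and should be requeued:
from collections import deque

# A toy event queue, purely to illustrate cooperative scheduling
def make_task(name, steps):
    state = {"left": steps}
    def step():
        print(name, "working, steps left:", state["left"])
        state["left"] -= 1
        return state["left"] > 0      # True means "requeue me"
    return step

queue = deque([make_task("A", 2), make_task("B", 3)])

# the single flow of control only stops when the queue is empty
while queue:
    task = queue.popleft()
    if task():                        # run until the task has to wait
        queue.append(task)            # put it back to run again later
Running this interleaves the work of A and B on a single flow of control, which is the effect the event loop achieves for real I/O-bound tasks.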
In the rest of this chapter the focus is on using the asyncio module, and this single-threaded, multi-tasking approach is a different mindset from the earlier approaches using multiple threads or processes. Not only does it introduce new techniques, it also introduces new problems. It is also worth realizing that asyncio is focused on network operations rather than being a general-purpose single-threaded asynchronous module. In particular, it isn’t an event-processing system of the sort you would find as part of a typical GUI such as Tkinter or Qt. This doesn’t stop it from being used as a general-purpose approach to async, but the main application its creators had in mind is handling network connections.
In this account of basic asyncio we only use the high-level API. This is the part that programmers using, rather than extending, asyncio should restrict themselves to. The deeper low-level API, which is the subject of Chapter 14, should only be used to create frameworks based on asyncio. Notice that many accounts of asyncio were written before the high-level API was complete and so tend to use low-level functions. Even worse, many examples and tutorials mix the use of high- and low-level functions simply because they haven’t caught up with best practices.
In book but not included in this extract
- Callbacks, Futures and Await
Coroutines
The main idea in sharing a single thread is the event loop, a basic cooperative scheduler. This is simply a queue of tasks that will be run on the thread as and when it can. However, this relies on the idea that a function can be suspended and restarted from where it was forced to wait. In a multi-threaded environment this is nothing special because the thread it is running on can simply be suspended and restarted by the operating system. In a single-threaded environment the current state of the function has to be saved so that the thread can start, or resume, work on another function, and then restored when the thread returns to the original function. A function that can be suspended and restarted in this way is generally called a “coroutine”.
Python originally supported coroutines via generators, using yield and yield from. However, support for this approach was removed in Python 3.10 and trying to understand coroutines via generators is no longer particularly useful. For the rest of this chapter generator-based coroutines are ignored.
A modern Python coroutine is created using the async keyword:
async def myCo():
    print("Hello Coroutine World")
    return 42
If you call myCo it doesn’t execute its code; instead it returns a coroutine object which can execute the code. This is very similar to a generator returning a generator object, but you cannot run a coroutine object directly. You have to run it in association with an event loop. To do this you can use low-level functions to create a loop and then submit the coroutine to it. However, it is much easier to use the asyncio.run method, which creates and manages the event loop without you having to know anything about it:
import asyncio

async def myCo():
    print("Hello Coroutine World")
    return 42

myCoObject = myCo()
result = asyncio.run(myCoObject)
print(result)
This runs the coroutine object and displays:
Hello Coroutine World
42
Instead of passing the coroutine object, the asyncio.run call is usually written as a single action:
result = asyncio.run(myCo())
Also notice that you can pass parameters to the coroutine:
import asyncio

async def myCo(myValue):
    print("Hello Coroutine World")
    return myValue

result = asyncio.run(myCo(42))
print(result)
It is also important to realize that asyncio.run runs myCo at once and the call doesn’t return until myCo has completed. While myCo is running an event loop is active and, whenever the thread is freed, it runs any queued tasks before returning to myCo. In this sense the call to asyncio.run is where the asynchronous part of your program starts and you can think of it as starting the asynchronous main program.
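As a forward-looking sketch of this idea, the following uses asyncio.create_task, which is covered later in the chapter in the section on tasks, to queue two invented worker coroutines. The names worker and main are chosen for the sketch; the point is that asyncio.run only returns once main, and the tasks it waits on, have completed:
import asyncio

async def worker(name):
    # awaiting sleep stands in for waiting on real I/O and frees
    # the thread so the event loop can run other queued tasks
    await asyncio.sleep(0.1)
    print(name, "finished")

async def main():
    # queue two tasks on the event loop started by asyncio.run
    task1 = asyncio.create_task(worker("task1"))
    task2 = asyncio.create_task(worker("task2"))
    # main frees the thread here and the queued tasks get to run
    await task1
    await task2
    return 42

# asyncio.run acts as the asynchronous main program; it doesn't
# return until main, and the tasks it awaited, have completed
result = asyncio.run(main())
print(result)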