How to use asyncio for asynchronous code


What is Asynchronous Programming and Why Do You Need It

Asynchronous programming lets a program make progress on many operations concurrently instead of waiting for each one to finish before starting the next. This is crucial for high-performance applications that must not stall while waiting on slow I/O operations.

Key Benefits of Asynchronicity

Asynchronicity is particularly effective in the following scenarios:

  • Network Requests (HTTP requests, working with REST APIs)
  • File System Operations (reading and writing large files)
  • Database Interactions (SQL queries, NoSQL operations)
  • WebSocket Connections and Real-Time Applications
  • Web Page Parsing and Data Scraping
  • Message Queue Processing

Introduction to the asyncio Library

asyncio is Python's standard library for writing asynchronous code on top of an event loop. It has been part of the standard library since Python 3.4 and was significantly improved in versions 3.7 and later.

Key Features of asyncio

With asyncio, you can:

  • Create and manage coroutines
  • Run multiple asynchronous tasks concurrently
  • Efficiently handle thousands of concurrent connections
  • Integrate with asynchronous libraries and frameworks

Core Concepts of asyncio

Coroutines

A coroutine is a special function whose execution can be suspended and resumed later. In Python, coroutines are defined with async def.

import asyncio

async def greet():
    print("Hello!")
    await asyncio.sleep(1)
    print("1 second has passed.")

# Running a coroutine
asyncio.run(greet())

The await Keyword

await is used to suspend the execution of a coroutine until another asynchronous operation is complete. This is a key mechanism for transferring control back to the event loop.

async def delayed_message():
    print("Starting execution")
    await asyncio.sleep(2)
    print("Message after 2 seconds")

asyncio.run(delayed_message())

Event Loop

The Event Loop is the heart of asyncio, a mechanism that manages the execution of all asynchronous tasks. Starting with Python 3.7, it's recommended to use asyncio.run() to start asynchronous code.

# Modern way (Python 3.7+)
asyncio.run(main_coroutine())

# Deprecated way (before Python 3.7)
loop = asyncio.get_event_loop()
loop.run_until_complete(main_coroutine())

Creating and Managing Asynchronous Tasks

Concurrent Task Execution

Asynchronous code shows its real power when multiple tasks run concurrently:

async def task(name, delay):
    print(f"Task {name} started")
    await asyncio.sleep(delay)
    print(f"Task {name} completed after {delay} seconds")
    return f"Result from {name}"

async def main():
    # Method 1: asyncio.gather()
    results = await asyncio.gather(
        task("A", 2),
        task("B", 1),
        task("C", 3)
    )
    print("All results:", results)

asyncio.run(main())

Creating Tasks with asyncio.create_task()

For better control over execution, use asyncio.create_task():

async def main():
    # create_task schedules each coroutine on the event loop immediately,
    # so all three start running concurrently
    task_a = asyncio.create_task(task("A", 2))
    task_b = asyncio.create_task(task("B", 1))
    task_c = asyncio.create_task(task("C", 3))
    
    # Await each task; total time is about 3 seconds, not 2 + 1 + 3
    await task_a
    await task_b
    await task_c

asyncio.run(main())
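When you want to handle each result as soon as its task finishes, rather than waiting for all of them, asyncio.as_completed yields awaitables in completion order. A minimal sketch (the task coroutine here is a stand-in for any awaitable, with short delays so it runs quickly):

```python
import asyncio

async def task(name, delay):
    # Simulates an I/O-bound operation
    await asyncio.sleep(delay)
    return f"Result from {name}"

async def main():
    coros = [task("A", 0.3), task("B", 0.1), task("C", 0.2)]
    # as_completed yields results in the order tasks finish,
    # so B (shortest delay) comes first
    for finished in asyncio.as_completed(coros):
        print(await finished)

asyncio.run(main())
```

gather preserves the order of its arguments; as_completed trades that guarantee for earlier access to whichever result is ready first.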

Asynchronous HTTP Requests

Installing and Using aiohttp

To work with HTTP requests in asynchronous code, use the aiohttp library:

pip install aiohttp

Simple GET Request

import aiohttp
import asyncio

async def fetch_url(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()

async def main():
    html = await fetch_url('https://httpbin.org/delay/1')
    print(f"Received {len(html)} characters")

asyncio.run(main())

Multiple HTTP Requests

async def fetch_multiple_urls():
    urls = [
        'https://httpbin.org/delay/1',
        'https://httpbin.org/delay/2',
        'https://httpbin.org/json'
    ]
    
    async with aiohttp.ClientSession() as session:
        tasks = [session.get(url) for url in urls]
        responses = await asyncio.gather(*tasks)
        
        for response in responses:
            print(f"Status: {response.status}, URL: {response.url}")
            response.close()  # Release the connection back to the pool

asyncio.run(fetch_multiple_urls())

Error and Exception Handling

Basic Exception Handling

async def risky_task():
    try:
        await asyncio.sleep(1)
        raise ValueError("Something went wrong!")
    except ValueError as e:
        print(f"Handled error: {e}")
        return "Task completed with error"

asyncio.run(risky_task())

Handling Errors in a Group of Tasks

async def task_with_error(name, should_fail=False):
    await asyncio.sleep(1)
    if should_fail:
        raise Exception(f"Error in task {name}")
    return f"Success: {name}"

async def main():
    tasks = [
        asyncio.create_task(task_with_error("A", False)),
        asyncio.create_task(task_with_error("B", True)),
        asyncio.create_task(task_with_error("C", False))
    ]
    
    # return_exceptions=True allows getting exceptions as results
    results = await asyncio.gather(*tasks, return_exceptions=True)
    
    for i, result in enumerate(results):
        if isinstance(result, Exception):
            print(f"Task {i} failed with error: {result}")
        else:
            print(f"Task {i} successful: {result}")

asyncio.run(main())

Working with Files Asynchronously

Using aiofiles

pip install aiofiles

import aiofiles
import asyncio

async def read_file_async(filename):
    async with aiofiles.open(filename, 'r', encoding='utf-8') as file:
        content = await file.read()
        return content

async def write_file_async(filename, data):
    async with aiofiles.open(filename, 'w', encoding='utf-8') as file:
        await file.write(data)

async def main():
    # Writing to a file
    await write_file_async('test.txt', 'Hello, asynchronous world!')
    
    # Reading from a file
    content = await read_file_async('test.txt')
    print(f"File content: {content}")

asyncio.run(main())

Advanced asyncio Techniques

Semaphores for Concurrency Limiting

async def limited_task(semaphore, name):
    async with semaphore:
        print(f"Task {name} started execution")
        await asyncio.sleep(2)
        print(f"Task {name} completed")

async def main():
    # Limit the execution to a maximum of 2 tasks simultaneously
    semaphore = asyncio.Semaphore(2)
    
    tasks = [
        limited_task(semaphore, f"Task-{i}")
        for i in range(5)
    ]
    
    await asyncio.gather(*tasks)

asyncio.run(main())
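Queues for Producer-Consumer Coordination

The message-queue scenario mentioned at the start of the article maps naturally onto asyncio.Queue, which passes items between coroutines without blocking the event loop. A minimal sketch (the item count, queue size, and None sentinel are arbitrary choices for illustration):

```python
import asyncio

async def producer(queue):
    for i in range(5):
        await queue.put(i)    # Suspends if the queue is full
    await queue.put(None)     # Sentinel tells the consumer to stop

async def consumer(queue):
    while True:
        item = await queue.get()   # Suspends until an item is available
        if item is None:
            break
        print(f"Processed item {item}")

async def main():
    queue = asyncio.Queue(maxsize=2)
    await asyncio.gather(producer(queue), consumer(queue))

asyncio.run(main())
```

The maxsize bound applies backpressure: a fast producer suspends at put() until the consumer catches up.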

Timeouts and Task Cancellation

async def long_running_task():
    try:
        await asyncio.sleep(10)
        return "Task completed"
    except asyncio.CancelledError:
        print("Task was cancelled")
        raise

async def main():
    try:
        # Limit execution to 3 seconds
        result = await asyncio.wait_for(long_running_task(), timeout=3.0)
        print(result)
    except asyncio.TimeoutError:
        print("Task exceeded the time limit")

asyncio.run(main())
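wait_for cancels the task for you on timeout; you can also cancel a pending task explicitly with task.cancel(), which raises CancelledError inside the coroutine at its next suspension point. A minimal self-contained sketch:

```python
import asyncio

async def long_task():
    try:
        await asyncio.sleep(10)
        return "Finished"
    except asyncio.CancelledError:
        print("Cancellation received inside the coroutine")
        raise  # Re-raise so the task ends up in the cancelled state

async def main():
    task = asyncio.create_task(long_task())
    await asyncio.sleep(0.1)  # Give the task a chance to start
    task.cancel()             # Request cancellation
    try:
        await task
    except asyncio.CancelledError:
        print("Task confirmed cancelled")

asyncio.run(main())
```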

When to Use asyncio

Ideal Application Scenarios

asyncio is most effective in the following cases:

  • Web Development: Creating asynchronous web servers and APIs
  • Network Applications: Chats, messengers, real-time applications
  • Parsing and Scraping: Processing multiple web pages simultaneously
  • IoT and Microservices: Handling a large number of small requests
  • Telegram Bots: Handling multiple users simultaneously

Performance Metrics

Comparison of synchronous and asynchronous code for 10 HTTP requests:

import time
import requests
import aiohttp
import asyncio

# Synchronous version
def sync_requests():
    start_time = time.time()
    for i in range(10):
        response = requests.get('https://httpbin.org/delay/1')
    end_time = time.time()
    print(f"Synchronously: {end_time - start_time:.2f} seconds")

# Asynchronous version
async def async_requests():
    start_time = time.time()
    async with aiohttp.ClientSession() as session:
        tasks = [
            session.get('https://httpbin.org/delay/1')
            for i in range(10)
        ]
        responses = await asyncio.gather(*tasks)
        for response in responses:
            response.close()
    end_time = time.time()
    print(f"Asynchronously: {end_time - start_time:.2f} seconds")

# sync_requests()  # ~10 seconds
# asyncio.run(async_requests())  # ~1 second

Best Practices and Recommendations

What to Do

  • Use asyncio.run() to start asynchronous code in Python 3.7+
  • Use async with for asynchronous context managers
  • Use await asyncio.sleep() instead of time.sleep()
  • Group tasks with asyncio.gather() or asyncio.create_task()
  • Handle exceptions correctly in asynchronous code

What to Avoid

  • Don't call blocking functions inside asynchronous code without wrapping them (for example, in an executor)
  • Don't use the blocking requests library inside coroutines; use aiohttp instead
  • Don't ignore exception handling in asynchronous tasks
  • Don't create too many concurrent tasks without limiting them (for example, with a semaphore)

Integration with Popular Libraries

Asynchronous ORMs

# Example with SQLAlchemy (async)
from sqlalchemy import text
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession

async def database_example():
    engine = create_async_engine("sqlite+aiosqlite:///example.db")
    
    async with AsyncSession(engine) as session:
        # Raw SQL strings must be wrapped in text() in SQLAlchemy 1.4+
        result = await session.execute(text("SELECT * FROM users"))
        users = result.fetchall()
        return users

Web Frameworks

Popular asynchronous frameworks:

  • FastAPI: Modern, fast web framework
  • Starlette: Lightweight ASGI framework
  • AIOHTTP: Full-featured web framework
  • Sanic: Fast web framework inspired by Flask

Frequently Asked Questions

Can I Use asyncio with Threads and Processes?

Yes, asyncio provides methods for integrating with synchronous code:

import asyncio
import concurrent.futures

def cpu_intensive_task(n):
    # Heavy computational task
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    
    # Execution in a thread pool
    with concurrent.futures.ThreadPoolExecutor() as pool:
        result = await loop.run_in_executor(pool, cpu_intensive_task, 100000)
        print(f"Result: {result}")

asyncio.run(main())

What is the Difference Between asyncio and Multithreading?

asyncio (Single-Threaded):

  • Runs in a single thread
  • Managed by the event loop
  • Effective for I/O operations
  • No GIL contention, since everything runs in one thread
  • Lower overhead

Threading (Multithreaded):

  • Uses multiple OS threads
  • Suitable for CPU-intensive tasks
  • Limited by GIL in Python
  • Higher overhead due to context switching
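The single-threaded claim is easy to verify: every coroutine reports the same thread identifier as the code that started the loop. A quick stdlib-only check:

```python
import asyncio
import threading

async def report(name):
    # All coroutines execute on the event loop's single thread
    print(f"{name} runs in thread {threading.get_ident()}")

async def main():
    await asyncio.gather(report("Task A"), report("Task B"))
    print(f"The loop itself runs in thread {threading.get_ident()}")

asyncio.run(main())
```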

How to Test Asynchronous Code?

import pytest
import asyncio

# Using pytest-asyncio
@pytest.mark.asyncio
async def test_async_function():
    result = await async_function()
    assert result == "expected_value"

# Or with unittest (Python 3.8+)
import unittest

class TestAsyncCode(unittest.IsolatedAsyncioTestCase):
    async def test_async_function(self):
        result = await async_function()
        self.assertEqual(result, "expected_value")

How to Work with asyncio in Jupyter Notebook?

Jupyter already runs its own event loop, so calling asyncio.run() there may raise a RuntimeError. One workaround is the nest_asyncio package (alternatively, recent Jupyter versions let you await coroutines directly at the top level of a cell):

import nest_asyncio
nest_asyncio.apply()

# Now you can use asyncio.run() in Jupyter
async def notebook_example():
    await asyncio.sleep(1)
    return "Works in Jupyter!"

asyncio.run(notebook_example())

Conclusion

Asynchronous programming with asyncio is a powerful tool for building high-performance Python applications. The library lets you handle I/O operations efficiently and build responsive web services and scalable network applications.

Key Benefits of asyncio:

  • High performance for I/O-intensive tasks
  • Efficient use of system resources
  • Ease of application scaling
  • Rich ecosystem of asynchronous libraries

By mastering the basics of asyncio, you can create modern, fast, and efficient applications that can handle thousands of concurrent operations without blocking the main execution thread.
