As modern applications increasingly rely on network I/O—APIs, scraping, microservices—writing efficient asynchronous code is no longer optional. Python’s asyncio ecosystem, combined with aiohttp, enables high-performance, non-blocking applications with minimal overhead.
In this tutorial, you’ll learn:
- Core concepts of asynchronous programming in Python
- How `asyncio` works under the hood
- Building concurrent tasks with coroutines
- Making async HTTP requests using `aiohttp`
- Real-world use cases (API aggregation & web scraping)
- Performance comparison vs synchronous code
Prerequisites
- Python 3.11+ (recommended 3.12+)
- Basic Python knowledge
- Familiarity with HTTP APIs
Install dependencies:
```bash
pip install aiohttp
```
1. What is Asynchronous Programming?
Traditional (synchronous) code runs sequentially:
```python
import time

def task():
    print("Start")
    time.sleep(2)
    print("End")

task()
```
This blocks execution.
Async approach:
```python
import asyncio

async def task():
    print("Start")
    await asyncio.sleep(2)
    print("End")

asyncio.run(task())
```
Key Concepts
- Coroutine: a function defined with `async def`
- `await`: pauses the coroutine without blocking the event loop
- Event Loop: manages and schedules tasks
2. Running Multiple Tasks Concurrently
```python
import asyncio

async def fetch_data(id):
    print(f"Start task {id}")
    await asyncio.sleep(2)
    print(f"End task {id}")
    return id

async def main():
    tasks = [fetch_data(i) for i in range(5)]
    results = await asyncio.gather(*tasks)
    print(results)

asyncio.run(main())
```
Output (about 2 seconds total, not 10):

```
Start task 0
Start task 1
...
End task 4
[0, 1, 2, 3, 4]
```
All tasks run concurrently instead of sequentially.
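To verify that claim, here is a minimal timing sketch using only the standard library (no HTTP involved, just simulated waits):

```python
import asyncio
import time

async def fetch_data(id):
    # Simulate a 2-second I/O wait without blocking the event loop
    await asyncio.sleep(2)
    return id

async def main():
    start = time.perf_counter()
    results = await asyncio.gather(*(fetch_data(i) for i in range(5)))
    elapsed = time.perf_counter() - start
    print(f"Finished {len(results)} tasks in {elapsed:.1f}s")  # ~2s, not 10s
    return elapsed

elapsed = asyncio.run(main())
```

Five 2-second waits complete in roughly 2 seconds because they overlap on the same event loop.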
3. Introducing aiohttp
aiohttp allows asynchronous HTTP requests.
Basic Example
```python
import aiohttp
import asyncio

async def fetch(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()

async def main():
    html = await fetch("https://example.com")
    print(html[:200])

asyncio.run(main())
```
4. Multiple HTTP Requests Concurrently
```python
import aiohttp
import asyncio

urls = [
    "https://jsonplaceholder.typicode.com/posts/1",
    "https://jsonplaceholder.typicode.com/posts/2",
    "https://jsonplaceholder.typicode.com/posts/3",
]

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.json()

async def main():
    async with aiohttp.ClientSession() as session:
        tasks = [fetch(session, url) for url in urls]
        results = await asyncio.gather(*tasks)
        for result in results:
            print(result["title"])

asyncio.run(main())
```
5. Handling Errors Gracefully
```python
async def fetch(session, url):
    try:
        async with session.get(url) as response:
            response.raise_for_status()
            return await response.json()
    except Exception as e:
        return {"error": str(e)}
```
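The same pattern can be demonstrated without a network: here is a sketch where a hypothetical `flaky_get` stands in for `session.get`, so one failure becomes a structured error instead of crashing the whole batch:

```python
import asyncio

async def flaky_get(url):
    # Hypothetical stand-in for an HTTP call: fails for one URL
    await asyncio.sleep(0)
    if "bad" in url:
        raise RuntimeError("503 Service Unavailable")
    return {"url": url, "ok": True}

async def fetch(url):
    try:
        return await flaky_get(url)
    except Exception as e:
        # Return a structured error instead of propagating the exception
        return {"url": url, "error": str(e)}

async def main():
    return await asyncio.gather(fetch("https://good.example"),
                                fetch("https://bad.example"))

results = asyncio.run(main())
print(results)
```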
6. Limiting Concurrency (Semaphore)
Avoid overwhelming APIs:
```python
import asyncio
import aiohttp

semaphore = asyncio.Semaphore(3)

async def fetch(session, url):
    async with semaphore:
        async with session.get(url) as response:
            return await response.text()
```
7. Timeout Handling
```python
timeout = aiohttp.ClientTimeout(total=5)

async with aiohttp.ClientSession(timeout=timeout) as session:
    ...
```
8. Real-World Example: Async API Aggregator
```python
import aiohttp
import asyncio

API_URLS = [
    "https://jsonplaceholder.typicode.com/users",
    "https://jsonplaceholder.typicode.com/posts",
    "https://jsonplaceholder.typicode.com/comments",
]

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.json()

async def main():
    async with aiohttp.ClientSession() as session:
        tasks = [fetch(session, url) for url in API_URLS]
        users, posts, comments = await asyncio.gather(*tasks)
        print(f"Users: {len(users)}")
        print(f"Posts: {len(posts)}")
        print(f"Comments: {len(comments)}")

asyncio.run(main())
```
9. Performance Comparison
Synchronous (requests):
```python
import requests

for url in urls:
    response = requests.get(url)
    print(response.status_code)
```

Async (aiohttp):

```python
# Same as the earlier example using asyncio.gather
```
Result:
| Approach | Time |
|---|---|
| Synchronous | ~3–6 sec |
| Async | ~1 sec |
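The shape of that comparison can be reproduced locally by substituting `sleep` for network latency — a sketch, with `DELAY` and `N` as assumed stand-ins for per-request latency and request count:

```python
import asyncio
import time

DELAY, N = 0.1, 5  # simulated per-request latency and request count

def sync_fetch_all():
    start = time.perf_counter()
    for _ in range(N):
        time.sleep(DELAY)  # blocking "request"
    return time.perf_counter() - start

async def async_fetch_all():
    start = time.perf_counter()
    # All simulated requests overlap on the event loop
    await asyncio.gather(*(asyncio.sleep(DELAY) for _ in range(N)))
    return time.perf_counter() - start

sync_time = sync_fetch_all()
async_time = asyncio.run(async_fetch_all())
print(f"sync: {sync_time:.2f}s, async: {async_time:.2f}s")
```

The synchronous version pays the latency N times; the async version pays it roughly once.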
10. Best Practices
- Reuse `ClientSession` (don't create one per request)
- Limit concurrency using semaphores
- Handle timeouts and retries
- Use `asyncio.gather(return_exceptions=True)` if needed
- Avoid blocking calls inside async functions
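The `return_exceptions=True` flag deserves a quick illustration: instead of the first failure cancelling the whole batch, exceptions come back as ordinary result values you can inspect afterwards.

```python
import asyncio

async def task(i):
    if i == 1:
        raise ValueError(f"task {i} failed")
    await asyncio.sleep(0)
    return i

async def main():
    # Failures are returned as exception objects in their slot,
    # so the other results still come through
    return await asyncio.gather(*(task(i) for i in range(3)),
                                return_exceptions=True)

results = asyncio.run(main())
print(results)
```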
11. Common Pitfalls
❌ Mixing sync and async code improperly
❌ Forgetting await
❌ Creating too many concurrent tasks
❌ Not closing sessions
12. Advanced Tips (2026)
- Use `asyncio.TaskGroup` (Python 3.11+):

```python
async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(fetch_data(1))
        tg.create_task(fetch_data(2))
```

- Use `uvloop` for a faster event loop (optional)
13. When to Use Async
Use async when:
✅ Many I/O-bound operations
❌ CPU-heavy tasks → use multiprocessing instead
Conclusion
With asyncio and aiohttp, Python becomes a powerful tool for:
- High-performance APIs
- Web scraping at scale
- Real-time systems
- Microservices communication
Mastering async programming will significantly improve your application’s efficiency and scalability.
You can get the full source code on our GitHub.
Thanks!
