Background Tasks

Background tasks are functions that run after a handler has returned its response.

They are useful for operations where the user should get a response quickly without waiting for the work to finish, such as sending emails, processing uploaded files, or calling slow external services.

Note

Background tasks in FastHTML are built on Starlette’s background tasks, with added sugar. Starlette’s background task design is an easy-to-use wrapper around Python’s async and threading libraries. Background tasks make apps snappier to the end user and generally improve an app’s speed.

A simple background task example

In this example we attach a task to the FtResponse by including it in the handler's return value, which sets it as the response's background task. When the page is visited, it will display ‘Simple Background Task Example’ almost instantly, while in the terminal it will slowly count upward from 0.

# main.py
from fasthtml.common import *
from starlette.background import BackgroundTask
from time import sleep

app, rt = fast_app()

def counter(loops:int):
    "Slowly print integers to the terminal"
    for i in range(loops):
        print(i)
        sleep(i)

@rt
def index():
    task = BackgroundTask(counter, loops=5)
    return Titled('Simple Background Task Example'), task

serve()
1. counter is our task function. There is nothing special about it, although it is good practice for its arguments to be JSON-serializable.
2. We use starlette.background.BackgroundTask to turn counter() into a background task.
3. To add a background task to a handler, we include it in the return values at the top level of the response.

A more realistic example

Let’s imagine that we are calling a slow-to-process but critical service, and we don’t want our users to have to wait. While we could set up SSE to notify the browser on completion, here we instead poll periodically to see whether the status of the user's record has changed.

Simulated Slow API Service

First, create a very simple slow timestamp API. All it does is stall requests for a few seconds before returning JSON containing timestamps.

# slow_api.py
from fasthtml.common import *
from time import sleep, time

app, rt = fast_app()

@rt('/slow')
def slow(ts: int):
    sleep(3)
    return dict(request_time=ts, response_time=int(time()))

serve(port=8123)
1. This represents slow processing.
2. Returns both the request's original timestamp and the time after completion.

Main FastHTML app

Now let’s create a user-facing app that uses this API to fetch the timestamp from the glacially slow service.

# main.py
from fasthtml.common import *
from starlette.background import BackgroundTask
import time
import httpx

app, rt = fast_app()

db = database(':memory:')

class TStamp: request_time: int; response_time: int

tstamps = db.create(TStamp, pk='request_time')

def task_submit(request_time: int):
    with httpx.Client() as client:
        response = client.post(f'http://127.0.0.1:8123/slow?ts={request_time}')
        tstamps.insert(**response.json())

@rt
def submit():
    "Route that initiates a background task and returns immediately."
    request_time = int(time.time())
    task = BackgroundTask(task_submit, request_time=request_time)
    return P(f'Request submitted at: {request_time}'), task

@rt
def show_tstamps(): return Ul(map(Li, tstamps()))

@rt
def index():
    return Titled('Background Task Dashboard',
        P(Button('Press to call slow service',
            hx_post=submit, hx_target='#res')),
        H2('Responses from Tasks'),
        P('', id='res'),
        Div(Ul(map(Li, tstamps())),
            hx_get=show_tstamps, hx_trigger='every 5s'),
    )

serve()
1. Tracks when requests are sent and responses received.
2. Task function that calls the slow service, to be run in the background of a route handler. It is common, but not required, to prefix task function names with ‘task_’.
3. Call the slow API service (simulating a time-consuming operation). The client is opened as a context manager so it is closed when the task finishes.
4. Store both timestamps in our database.
5. Create a background task by passing the function to a BackgroundTask object, followed by any arguments.
6. Include the task in the handler's return value so the FtResponse runs it after the HTTP response is sent.
7. Endpoint that displays all recorded timestamp pairs.
8. When this button is pressed, the ‘submit’ handler responds instantly. The task_submit function inserts the slow API response into the database later.
9. Every 5 seconds, fetch the tstamps stored in the DB.

Tip

In the example above we use a synchronous background task function set in the FtResponse of a synchronous handler. However, we can also use asynchronous functions and handlers.

Multiple background tasks in a handler

It is possible to add multiple background tasks to an FtResponse.

Warning

Multiple background tasks attached to a response are executed in order. If a task raises an exception, the following tasks will not get the opportunity to run.

from starlette.background import BackgroundTasks

@rt
async def signup(email, username):
    tasks = BackgroundTasks()
    tasks.add_task(send_welcome_email, to_address=email)
    tasks.add_task(send_admin_notification, username=username)
    return Titled('Signup successful!'), tasks

async def send_welcome_email(to_address):
    ...

async def send_admin_notification(username):
    ...

Background tasks at scale

Background tasks enhance application performance for both users and apps by moving blocking work out of the request/response cycle. Even task functions defined as synchronous are run in a thread pool, so they don’t block the event loop.

When FastHTML’s background tasks aren’t enough and your app is CPU-bound, manually offloading work to the multiprocessing library is an option. By doing so you can leverage multiple cores and bypass the GIL, significantly improving throughput at the cost of added complexity.
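As a sketch of that idea, the standard library’s ProcessPoolExecutor (a high-level wrapper over multiprocessing) spreads CPU-bound work across cores; crunch here is a stand-in for your own workload:

```python
# Offload CPU-bound work to separate processes, sidestepping the GIL.
from concurrent.futures import ProcessPoolExecutor

def crunch(n: int) -> int:
    "Stand-in for CPU-bound work: sum of squares below n"
    return sum(i * i for i in range(n))

if __name__ == '__main__':
    with ProcessPoolExecutor() as pool:
        # Each call may run on a different core, in a separate process
        results = list(pool.map(crunch, [10_000, 20_000, 30_000]))
        print(results)
```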

Sometimes a server reaches its processing limits, and this is where distributed task queue systems like Celery and Dramatiq come into play. They are designed to distribute tasks across multiple servers, offering improved observability, retry mechanisms, and persistence, at the cost of substantially increased complexity.

However, most applications work well with built-in background tasks like those in FastHTML, which we recommend trying first. Writing task functions with JSON-serializable arguments ensures straightforward conversion to other concurrency methods if needed.
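One quick way to keep yourself honest on that point is to round-trip a task’s arguments through json before handing them over (a sketch; check_json_args is a hypothetical helper, not part of FastHTML):

```python
# Verify a task's arguments survive a JSON round trip, so the task could
# later be handed to a queue system like Celery without changes.
import json

def check_json_args(**kwargs):
    "Fail if any argument is not JSON-serializable or changes in the round trip"
    assert json.loads(json.dumps(kwargs)) == kwargs
    return kwargs

args = check_json_args(request_time=1700000000)
print(args)   # {'request_time': 1700000000}
```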