When building business applications with Odoo, developers often face the challenge of handling long-running or resource-intensive operations. Odoo’s framework is synchronous by default, meaning tasks such as sending thousands of emails, generating reports, or processing heavy data imports can lock the user interface until they are completed.
This is where Celery, a distributed task queue, comes in handy. By integrating Celery with Odoo, you can offload time-consuming tasks to background workers, keep your system responsive, and monitor task progress in real time.
In this article, we will walk through how to integrate Odoo with Celery, using a simple but effective two-part approach:
- A custom Odoo model (celery.job) that acts as a task tracker.
- A Celery application (celery_app.py) that runs jobs and reports back to Odoo using XML-RPC.
By the end, you’ll understand how to queue tasks in Odoo, execute them asynchronously, and update their status automatically.
Why Use Celery with Odoo?
Odoo already has its own job queueing system (queue_job module and cron jobs), so why introduce Celery?
Here are some solid reasons:
- Scalability: Celery is designed to distribute workloads across multiple workers and even across multiple servers.
- Flexibility: It supports different brokers (RabbitMQ, Redis, etc.) and backends (databases, RPC, Redis).
- Monitoring: With tools like Flower, you can track tasks, retries, and failures visually.
- Language-agnostic: Since Celery communicates via brokers, other applications (not just Odoo) can submit tasks.
- Resilience: Celery provides robust retry policies, delayed execution, and error handling.
For enterprises running Odoo at scale, Celery can be the missing piece to keep operations smooth.
Prerequisites: Celery and RabbitMQ
Before we start wiring Celery with Odoo, make sure you have the required tools installed:
- Celery: The task queue system.
- RabbitMQ: The message broker that Celery will use.
Install RabbitMQ using:
sudo apt update
sudo apt install -y rabbitmq-server
sudo systemctl enable rabbitmq-server
sudo systemctl start rabbitmq-server
Install Celery using:
pip install celery
Project Structure
Here’s how our custom Odoo module looks:
├── celery_app.py
├── __init__.py
├── __manifest__.py
└── models
    ├── celery_job.py
    └── __init__.py
celery_job.py: Defines the Odoo model celery.job which queues jobs into Celery.
celery_app.py: Defines the Celery application and the tasks that will be executed by workers.
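For completeness, a minimal __manifest__.py for a module like this could look as follows; the module name, version, and dependencies shown here are placeholders, not taken from the original module:

```python
# __manifest__.py — minimal sketch; values are illustrative placeholders
{
    "name": "Odoo Celery Integration",
    "summary": "Queue long-running jobs into Celery and track their status.",
    "version": "18.0.1.0.0",
    "depends": ["base"],
    "data": [],
    "installable": True,
}
```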
Let’s start with the Odoo side:
# -*- coding: utf-8 -*-
import json

from odoo import api, fields, models

from ..celery_app import app


class CeleryJob(models.Model):
    _name = "celery.job"
    _description = "Celery Job"
    _order = "create_date desc"

    name = fields.Char(
        required=True,
        copy=False,
        readonly=True,
        default="New",
    )
    state = fields.Selection(
        [
            ("queue", "Queue"),
            ("success", "Success"),
            ("error", "Error"),
        ],
        default="queue",
    )
    args_json = fields.Text("JSON", help="JSON payload passed to the task.")
    task_id = fields.Char("Task", readonly=True, help="Celery task id.")
    error = fields.Text("Error", help="Error message from Celery.")

    @api.constrains("args_json")
    def add_celery_job(self):
        for rec in self:
            res = app.send_task(
                "odoo_celery.run",
                kwargs={
                    "data": json.loads(rec.args_json),
                    "task_id": rec.id,
                },
            )
            rec.write({
                "state": "queue",
                # send_task returns an AsyncResult; store its id, not the object
                "task_id": res.id,
            })
- Each record in celery.job represents a job submitted to Celery.
- args_json holds the parameters for the task, encoded in JSON.
- When the record is created or updated, the method add_celery_job submits the task to Celery.
- The task ID returned by Celery is stored in the task_id field for tracking.
- The job starts in the "queue" state, and later the worker updates it to "success" or "error".
This model essentially acts as a bridge between Odoo and Celery.
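As a concrete illustration, the payload a user would put in args_json can be built like this; the record values are hypothetical:

```python
import json

# Hypothetical payload: ask the worker to create a res.partner via XML-RPC.
# "args" is the list of positional arguments for execute_kw, so the records
# to create are wrapped in an extra list.
payload = json.dumps({
    "model": "res.partner",
    "method": "create",
    "args": [[{"name": "Celery Partner"}]],
    "kwargs": {},
})

# Inside Odoo, this string would go into the args_json field, e.g.:
# env["celery.job"].create({"name": "Create partner", "args_json": payload})
```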
Now let’s look at the Celery worker side:
# -*- coding: utf-8 -*-
import xmlrpc.client

from celery import Celery, Task

URL = 'http://localhost:8028'
DATABASE = '18E1'
USERNAME = 'celery'
API_KEY = 'celery'

app = Celery(
    "odoo_celery",
    broker="amqp://guest:guest@localhost:5672//",
    backend="rpc://",
)


class NotifyOdoo(Task):
    """
    Report the task status back to Odoo.
    """

    def after_return(self, status, retval, task_id, args, kwargs, einfo):
        common = xmlrpc.client.ServerProxy('{}/xmlrpc/2/common'.format(URL))
        models = xmlrpc.client.ServerProxy('{}/xmlrpc/2/object'.format(URL))
        uid = common.authenticate(DATABASE, USERNAME, API_KEY, {})
        if status == 'SUCCESS':
            models.execute_kw(
                DATABASE, uid, API_KEY,
                "celery.job", "write", [
                    [kwargs.get('task_id')], {'state': 'success'}
                ], {}
            )
        else:
            models.execute_kw(
                DATABASE, uid, API_KEY,
                "celery.job", "write", [
                    # on failure retval is the raised exception, so stringify it
                    [kwargs.get('task_id')],
                    {'state': 'error', 'error': str(retval)}
                ], {}
            )


class RunTask(NotifyOdoo):
    """
    Run the task requested by Odoo.
    """

    name = "odoo_celery.run"

    def run(self, **kw):
        common = xmlrpc.client.ServerProxy('{}/xmlrpc/2/common'.format(URL))
        models = xmlrpc.client.ServerProxy('{}/xmlrpc/2/object'.format(URL))
        uid = common.authenticate(DATABASE, USERNAME, API_KEY, {})
        data = kw.get('data')
        models.execute_kw(
            DATABASE, uid, API_KEY,
            data.get('model'), data.get('method'),
            data.get('args'), data.get('kwargs'),
        )


app.register_task(RunTask())
- Celery app initialization:
- The app is named "odoo_celery".
- It connects to RabbitMQ as broker (amqp://...) and uses rpc:// as the backend.
- NotifyOdoo base class:
- Implements after_return, a Celery hook that runs after a task finishes (success or failure).
- It connects back to Odoo via XML-RPC and updates the corresponding celery.job record.
- RunTask:
- Defines the actual task odoo_celery.run.
- Reads the payload sent from Odoo (model, method, args, kwargs).
- Executes the method in Odoo using XML-RPC.
- Task registration:
- The task is registered with Celery so workers can discover it.
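The status-to-record mapping performed in after_return can be isolated as a small pure helper, which makes the logic easy to unit-test without a broker or an Odoo server. This refactor is a sketch, not part of the original module:

```python
def job_values(status, retval):
    """Translate a Celery task status into values for a celery.job record.

    On failure, Celery passes the raised exception as retval, so it is
    stringified before being written to the Text field over XML-RPC.
    """
    if status == "SUCCESS":
        return {"state": "success"}
    return {"state": "error", "error": str(retval)}
```

after_return would then simply call models.execute_kw with job_values(status, retval) as the write payload.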
Workflow Example
Along with Odoo, we need to start a Celery worker from the directory containing celery_app.py by running the command:
celery -A celery_app worker --loglevel=info
Now let’s see this in action.
{
"model": "res.partner",
"method": "create",
"args": [[{"name": "Celery Partner"}]],
"kwargs": {}
}
- A user creates a new celery.job record in Odoo with the JSON above in args_json.
- The add_celery_job method sends a task to Celery with this payload.
- A Celery worker picks up the task and executes res.partner.create() in Odoo via XML-RPC.
- When the task finishes:
- If successful, the job's state is updated to "success".
- If an error occurs, the job's state is updated to "error" and the error message is stored.
Users can now track job progress directly from the Odoo UI.
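Job status can also be checked from outside the UI with the same XML-RPC credentials the worker uses. The helper below is a sketch that reuses the connection settings from celery_app.py and assumes a running Odoo instance:

```python
import xmlrpc.client

# Connection settings reused from celery_app.py
URL = 'http://localhost:8028'
DATABASE = '18E1'
USERNAME = 'celery'
API_KEY = 'celery'


def read_job_state(job_id):
    """Return the state and error text of a celery.job record (sketch)."""
    common = xmlrpc.client.ServerProxy('{}/xmlrpc/2/common'.format(URL))
    models = xmlrpc.client.ServerProxy('{}/xmlrpc/2/object'.format(URL))
    uid = common.authenticate(DATABASE, USERNAME, API_KEY, {})
    [job] = models.execute_kw(
        DATABASE, uid, API_KEY,
        "celery.job", "read", [[job_id], ["state", "error"]],
    )
    return job["state"], job["error"]
```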