Rationale
Some actions may take a long time because they do a lot of work, for example processing a sale or a purchase. We have already separated the validation from the processing and added a cron task to do the processing. But this does not scale well (only one cron task runs at a time) and has a long delay (cron runs every hour).
Solutions like celery_tryton can not be used in a core module because of the complexity of their dependencies and setup.
Proposal
We need a way to post tasks in a queue and have workers process them as soon as possible. The queue must be transactional and scale well. The queue manager should be replaceable (e.g. by Celery), like the Cache is.
For the PostgreSQL back-end, we could use (or be inspired by) pq. For other back-ends like SQLite, we could just run the task directly, as there is no LISTEN/NOTIFY support.
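As an illustration, here is a minimal sketch of a transactional put on PostgreSQL using psycopg2; the queue table and channel name are assumptions for the example, not part of the proposal. The key point is that NOTIFY is delivered only at commit, so a rolled-back transaction posts nothing:

    import json

    def put(connection, task):
        # connection is a psycopg2 connection.
        # INSERT and NOTIFY happen in the same transaction: PostgreSQL
        # delivers the notification only at commit.
        with connection.cursor() as cursor:
            cursor.execute(
                "INSERT INTO queue (data) VALUES (%s)", (json.dumps(task),))
            cursor.execute("NOTIFY queue")
        connection.commit()

    def listen(connection):
        # Workers LISTEN on the channel instead of polling the table.
        connection.set_session(autocommit=True)
        with connection.cursor() as cursor:
            cursor.execute("LISTEN queue")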
The posted task, which should be a method on a Model, should of course take parameters (*args, **kwargs) from the call, but also store the user and the context. The instantiate keyword will indicate which argument should be instantiated using the Model class.
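As an illustration, a worker could dispatch such a task as follows; this is a sketch, not the actual implementation, and the run signature and payload handling are assumptions:

    from trytond.pool import Pool
    from trytond.transaction import Transaction

    def run(database, task):
        # Re-run the method with the user and context stored at post time.
        context = task['kwargs'].pop('context', {})
        with Transaction().start(database, task['user'], context=context):
            Model = Pool().get(task['model'])
            args = list(task['args'])
            if task.get('instantiate') is not None:
                # Replace the ids at this position by Model instances
                # (assuming the argument holds a list of ids).
                index = task['instantiate']
                args[index] = Model.browse(args[index])
            getattr(Model, task['method'])(*args, **task['kwargs'])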
A task will have no return value because it is executed after the transaction (and after the answer to the request).
The worker processes will wait for tasks to be posted in the queue and execute them one at a time. If an operational error occurs, they will retry the task (as the dispatcher does). For any other error, they will create a log entry.
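A sketch of such a worker loop, assuming a queue.get() that blocks until a task is posted (e.g. on NOTIFY) and reusing the run sketch above; the retry count is arbitrary:

    import logging

    from trytond import backend

    logger = logging.getLogger(__name__)
    # The same error class on which the dispatcher already retries.
    DatabaseOperationalError = backend.get('DatabaseOperationalError')

    def worker(database, queue, retries=5):
        while True:
            task = queue.get()
            for _ in range(retries):
                try:
                    run(database, task)
                except DatabaseOperationalError:
                    continue  # e.g. serialization failure: retry the task
                except Exception:
                    logger.exception('Task %s failed', task)
                break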
API
General low-level:
Transaction().queue.put({'model': 'sale.sale', 'method': 'process', 'user': 1, 'args': [1], 'kwargs': {'context': {}}, 'instantiate': 0})
From Model:
Sale.__queue__.process([sale])
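Such a __queue__ attribute could be a small proxy that posts the call instead of executing it. A sketch built on the low-level API above (the proxy itself is illustrative, not the actual implementation):

    from trytond.transaction import Transaction

    class QueueProxy:
        "Turn Model.__queue__.method(records, ...) into a queue put."

        def __init__(self, model):
            self._model = model

        def __getattr__(self, method):
            def post(records, *args, **kwargs):
                transaction = Transaction()
                transaction.queue.put({
                    'model': self._model.__name__,
                    'method': method,
                    'user': transaction.user,
                    'args': [[r.id for r in records]] + list(args),
                    'kwargs': dict(kwargs, context=transaction.context),
                    'instantiate': 0,
                    })
            return post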