
Engines overview

z4j supports six engines, each with its own adapter package (`z4j-<engine>`).

| Engine | Package | Broker | Strengths | Default scheduler |
| --- | --- | --- | --- | --- |
| Celery | z4j-celery | Redis / RabbitMQ / SQS | Massive ecosystem, chords/groups/chains | celery-beat |
| RQ | z4j-rq | Redis | Simple, small surface, Python-only | rq-scheduler |
| Dramatiq | z4j-dramatiq | Redis / RabbitMQ | Reliable, middleware-first | APScheduler |
| Huey | z4j-huey | Redis / SQLite / in-memory | Minimal deps, great for Django | built-in periodic |
| arq | z4j-arq | Redis | Async-native, great for FastAPI | arq cron |
| taskiq | z4j-taskiq | Redis / RabbitMQ / NATS / in-memory | Modern async, pluggable brokers | taskiq-scheduler |

Pick one or more. They can coexist in the same process:

```sh
pip install z4j-celery z4j-rq z4j-dramatiq
```
| Feature | Celery | RQ | Dramatiq | Huey | arq | taskiq |
| --- | --- | --- | --- | --- | --- | --- |
| Retry (native) | ✓ | polyfill | ✓ | ✓ | ✓ | ✓ |
| Cancel (native) | ✓ | partial | - | partial | ✓ | ✓ |
| Bulk retry | polyfill | polyfill | polyfill | polyfill | polyfill | polyfill |
| Queue purge | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
| Chord / group aware | ✓ | - | - | - | - | - |
| Result store visibility | ✓ | ✓ | partial | ✓ | ✓ | ✓ |
| Native scheduler | celery-beat | rq-scheduler | - | built-in | cron jobs | taskiq-scheduler |

“polyfill” = the brain implements the action via re-enqueue + cancel. Users see no difference in the UI.
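The re-enqueue + cancel polyfill can be sketched with a toy queue. All names here (`ToyQueue`, `polyfill_bulk_retry`) are hypothetical, for illustration only; the real adapters work against each engine's own API:

```python
import uuid


class ToyQueue:
    """Minimal stand-in for an engine's queue (hypothetical, for illustration)."""

    def __init__(self):
        self.jobs = {}  # job_id -> {"func": ..., "args": ..., "status": ...}

    def enqueue(self, func_name, args):
        job_id = str(uuid.uuid4())
        self.jobs[job_id] = {"func": func_name, "args": args, "status": "queued"}
        return job_id

    def cancel(self, job_id):
        self.jobs[job_id]["status"] = "canceled"


def polyfill_bulk_retry(queue, failed_ids):
    """Bulk retry for engines without a native one: cancel each failed
    job, then enqueue an identical copy. The UI shows a plain retry."""
    new_ids = []
    for job_id in failed_ids:
        job = queue.jobs[job_id]
        queue.cancel(job_id)  # drop the failed instance
        new_ids.append(queue.enqueue(job["func"], job["args"]))  # fresh copy
    return new_ids
```

The key property is that the polyfill only composes two operations every engine already supports, which is why it can be offered uniformly across all six.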

Each adapter uses the engine’s native signal / middleware / hook system:

| Engine | Capture mechanism |
| --- | --- |
| Celery | `task_sent` / `task_prerun` / `task_postrun` / `task_retry` / `task_failure` signals |
| RQ | `Queue.enqueue_call` monkey-patch + job lifecycle hooks |
| Dramatiq | custom `Middleware` that hooks `before_process_message` / `after_process_message` |
| Huey | signal handlers (`SIGNAL_ENQUEUED` / `SIGNAL_EXECUTING` / `SIGNAL_EXECUTED`) |
| arq | `on_job_start` / `on_job_end` hooks on the worker |
| taskiq | middleware: `TaskiqMiddleware.pre_send` / `post_execute` |
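The middleware-style adapters (Dramatiq, taskiq) all follow the same shape: an object whose methods fire around enqueue and execution. A stdlib-only sketch of that pattern, with class and method names modeled on taskiq's `pre_send` / `post_execute` but not the real adapter API:

```python
class CaptureMiddleware:
    """Records task lifecycle events, mirroring hooks like
    TaskiqMiddleware.pre_send / post_execute (sketch, not the real classes)."""

    def __init__(self):
        self.events = []

    def pre_send(self, task_name):
        self.events.append(("sent", task_name))

    def post_execute(self, task_name, ok):
        self.events.append(("executed" if ok else "failed", task_name))


class ToyWorker:
    """Toy worker that threads every task through the middleware."""

    def __init__(self, middleware):
        self.mw = middleware

    def run(self, task_name, func, *args):
        self.mw.pre_send(task_name)
        try:
            result = func(*args)
            self.mw.post_execute(task_name, ok=True)
            return result
        except Exception:
            self.mw.post_execute(task_name, ok=False)
            raise
```

Because the hooks wrap execution rather than replace it, task results and exceptions propagate exactly as they would without the middleware installed.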

Patches are additive: they don’t replace your existing signal handlers.
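An additive monkey-patch, in the RQ style, boils down to wrapping the original method and always calling through to it. A hedged stdlib sketch (the `Queue` class and `patch_enqueue` helper are stand-ins, not the real z4j-rq code):

```python
import functools

captured = []


class Queue:
    """Stand-in for the engine's queue class (illustrative only)."""

    def enqueue_call(self, func_name, args=()):
        return f"job:{func_name}"


def patch_enqueue(queue_cls, on_enqueue):
    """Additively wrap enqueue_call: record the event, then delegate
    to the original method so existing behavior is untouched."""
    original = queue_cls.enqueue_call

    @functools.wraps(original)
    def wrapper(self, func_name, args=()):
        on_enqueue(func_name)  # capture side channel
        return original(self, func_name, args)  # original still runs

    queue_cls.enqueue_call = wrapper


patch_enqueue(Queue, captured.append)
```

Keeping a reference to `original` and delegating to it is what makes the patch additive: any handlers or return values the engine already provides are preserved.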

See the left sidebar: one page per engine with install, config, events, actions, and caveats.