arq
Package: z4j-arq (supports arq 0.25+).
Install

```sh
pip install z4j-arq
```

What it captures

arq doesn't expose signals. z4j uses the worker's on_job_start / on_job_end hooks and wraps the enqueue_job method.
| Source | z4j event |
|---|---|
| ArqRedis.enqueue_job | task_sent |
| on_job_start hook | task_started |
| on_job_end (success=True) | task_succeeded |
| on_job_end (success=False) | task_failed |
| retries (tried > 1) | task_retry (synthetic) |
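The mapping above can be sketched as plain hook functions. Everything in this sketch (the emit helper, the success flag, the event payloads) is an illustrative assumption, not z4j's actual internals; in particular, arq's real on_job_end hook receives only ctx.

```python
# Illustrative sketch: how worker hook calls could map to the events in
# the table above. emit() and the 'success' parameter are assumptions.
events = []

def emit(name, **fields):
    events.append({"event": name, **fields})

async def on_job_start(ctx):
    # arq passes a per-job ctx dict that includes job_id and job_try
    emit("task_started", job_id=ctx["job_id"])
    if ctx["job_try"] > 1:
        # synthetic task_retry: arq emits no retry signal, so an attempt
        # counter greater than 1 is the only evidence a retry happened
        emit("task_retry", job_id=ctx["job_id"], attempt=ctx["job_try"])

async def on_job_end(ctx, success):
    # 'success' stands in for however z4j determines the job outcome
    emit("task_succeeded" if success else "task_failed", job_id=ctx["job_id"])
```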
Wiring

arq's config is a WorkerSettings class. z4j injects its hooks:

```python
from z4j_arq import z4j_worker_settings

class WorkerSettings(z4j_worker_settings(MyBaseSettings)):
    functions = [my_task, ...]
    # z4j's on_startup / on_shutdown / on_job_start / on_job_end are merged
```

Or manually:

```python
from z4j_arq import on_startup, on_shutdown, on_job_start, on_job_end

class WorkerSettings:
    functions = [...]
    on_startup = on_startup
    on_shutdown = on_shutdown
    on_job_start = on_job_start
    on_job_end = on_job_end
```

Actions
| Verb | How |
|---|---|
| retry | polyfill - arq_redis.enqueue_job(function_name, *args) with the original payload |
| cancel | arq_redis.abort_job(job_id) - works pre-start; mid-run it relies on your function's cancellation cooperation |
| purge_queue | arq_redis.flushdb() (scoped to the arq key prefix) |
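Since abort_job cancels the running asyncio task, mid-run cancellation only takes effect if your coroutine lets CancelledError propagate (after any cleanup). A minimal illustration of that cooperation with plain asyncio, independent of arq:

```python
import asyncio

cleanup_ran = False

async def cooperative_task():
    # work split into awaitable steps, so cancellation can land between them
    global cleanup_ran
    try:
        for _ in range(1000):
            await asyncio.sleep(0.01)
    except asyncio.CancelledError:
        cleanup_ran = True  # release locks, close files, etc.
        raise               # re-raise so the cancellation is observed

async def main():
    task = asyncio.create_task(cooperative_task())
    await asyncio.sleep(0.03)
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        return "cancelled"
    return "finished"

status = asyncio.run(main())
```

Swallowing CancelledError instead of re-raising would make the job look unabortable mid-run.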
Cron jobs

arq's cron jobs are defined in WorkerSettings.cron_jobs. As with Huey, discovery is code-only and entries are read-only in v1.0. See scheduler: arq-cron.
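For reference, cron entries are declared with arq's cron helper and read straight from the class; the task name and schedule below are examples only:

```python
from arq import cron

async def nightly_cleanup(ctx):
    ...  # example task body

class WorkerSettings:
    cron_jobs = [
        # run every day at 03:00
        cron(nightly_cleanup, hour=3, minute=0),
    ]
```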
Caveats

- No chord/group primitives.
- Max job size is limited by the Redis payload; large pickled args may be truncated in events (see redaction).
- Worker pool size appears as metadata.max_jobs in the agent drawer.
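The truncation caveat can be illustrated with a simple size guard; the 32 KiB cap and the helper name below are invented for the example, not z4j's actual redaction settings:

```python
import pickle

MAX_ARG_BYTES = 32 * 1024  # hypothetical cap, not z4j's real default

def safe_arg_repr(arg, limit=MAX_ARG_BYTES):
    """Representation of a job argument for event metadata; anything
    whose pickled form exceeds the limit is replaced by a stub."""
    payload = pickle.dumps(arg)
    if len(payload) <= limit:
        return repr(arg)
    return f"<truncated: {len(payload)} pickled bytes>"
```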
Config

```sh
# env-based
Z4J_ARQ_REDIS="redis://localhost:6379/0"
```

Or pass the ArqRedis instance explicitly via the adapter's redis= kwarg if you need it.
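Passing the pool explicitly might look like the sketch below. create_pool and RedisSettings are arq's real connection helpers, but the z4j_arq entry point name (instrument) is a placeholder; check the package's API for the actual function.

```python
from arq import create_pool
from arq.connections import RedisSettings

# 'instrument' is a placeholder name for the adapter's entry point;
# redis= is the kwarg for passing the ArqRedis instance explicitly.
from z4j_arq import instrument

async def setup():
    redis = await create_pool(RedisSettings(host="localhost", port=6379, database=0))
    instrument(redis=redis)
    return redis
```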