
# Queue & Job Best Practices

## Set `retry_after` Greater Than `timeout`

If `retry_after` is shorter than the job's `timeout`, the queue worker re-dispatches the job while it is still running, causing duplicate execution.

Incorrect (`retry_after` < `timeout`):

```php
class ProcessReport implements ShouldQueue
{
    public $timeout = 120;
}

// config/queue.php — 'retry_after' => 90 ← job retried while still running!
```

Correct (`retry_after` > `timeout`):

```php
class ProcessReport implements ShouldQueue
{
    public $timeout = 120;
}

// config/queue.php — 'retry_after' => 180 ← safely longer than any job timeout
```
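
In `config/queue.php`, `retry_after` is set per connection. A minimal sketch of a Redis connection using the 180-second value from the correct example; the other keys shown follow Laravel's default config shape:

```php
// config/queue.php
'redis' => [
    'driver' => 'redis',
    'connection' => 'default',
    'queue' => env('REDIS_QUEUE', 'default'),
    'retry_after' => 180, // must exceed the largest $timeout of any job on this connection
    'block_for' => null,
],
```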

## Use Exponential Backoff

Use progressively longer delays between retries to avoid hammering failing services.

Incorrect (fixed retry interval):

```php
class SyncWithStripe implements ShouldQueue
{
    public $tries = 3;
    // Default: retries immediately, overwhelming the API
}
```

Correct (exponential backoff):

```php
class SyncWithStripe implements ShouldQueue
{
    public $tries = 3;
    public $backoff = [1, 5, 10]; // seconds to wait before each successive retry
}
```
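
When the delay must be computed at runtime, the `$backoff` property can be replaced with a `backoff()` method on the job; Laravel accepts either form. A sketch with illustrative doubling delays:

```php
class SyncWithStripe implements ShouldQueue
{
    public $tries = 4;

    /**
     * Seconds to wait before each retry: 2, then 4, then 8.
     */
    public function backoff(): array
    {
        return [2, 4, 8];
    }
}
```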

## Implement ShouldBeUnique

Prevent duplicate job processing.

```php
class GenerateInvoice implements ShouldQueue, ShouldBeUnique
{
    public $uniqueFor = 3600; // release the lock after one hour even if the job never runs

    public function uniqueId(): string
    {
        return (string) $this->order->id; // cast: uniqueId() must return a string
    }
}
```
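
The uniqueness lock is taken on the application's default cache store, which must support atomic locks. If a different store should hold the lock, the documented `uniqueVia()` hook on the job can select it; a sketch, where the `redis` store name is an assumption:

```php
use Illuminate\Contracts\Cache\Repository;
use Illuminate\Support\Facades\Cache;

public function uniqueVia(): Repository
{
    // Hold the uniqueness lock in the Redis cache store
    return Cache::driver('redis');
}
```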

## Always Implement failed()

Handle errors explicitly — don't rely on silent failure.

```php
public function failed(?Throwable $exception): void
{
    $this->podcast->update(['status' => 'failed']);
    Log::error('Processing failed', [
        'id' => $this->podcast->id,
        'error' => $exception?->getMessage(), // nullsafe: $exception may be null
    ]);
}
```

## Rate Limit External API Calls in Jobs

Use the `RateLimited` middleware to throttle jobs calling third-party APIs.

```php
use Illuminate\Queue\Middleware\RateLimited;

public function middleware(): array
{
    return [new RateLimited('external-api')];
}
```
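
The named limiter has to be registered, typically in a service provider's `boot()` method. A minimal sketch; the 30-per-minute quota is an assumed value:

```php
use Illuminate\Cache\RateLimiting\Limit;
use Illuminate\Support\Facades\RateLimiter;

// e.g. in AppServiceProvider::boot()
RateLimiter::for('external-api', function (object $job) {
    return Limit::perMinute(30); // at most 30 jobs per minute hit the API
});
```

Jobs that exceed the limit are released back onto the queue rather than failed.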

## Batch Jobs That Should Succeed or Fail Together

Use `Bus::batch()` when jobs should succeed or fail together.

```php
Bus::batch([
    new ImportCsvChunk($chunk1),
    new ImportCsvChunk($chunk2),
])
    ->then(fn (Batch $batch) => Notification::send($user, new ImportComplete))
    ->catch(fn (Batch $batch, Throwable $e) => Log::error('Batch failed'))
    ->dispatch();
```
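
By default the whole batch is cancelled as soon as one job fails. When the remaining jobs should keep running, `allowFailures()` can be chained before dispatch, and the batch can be inspected later by id. A sketch, where `$jobs` and `$batchId` are placeholders:

```php
$batch = Bus::batch($jobs)
    ->allowFailures() // a failed job no longer cancels the rest of the batch
    ->dispatch();

// Later, e.g. in a status endpoint:
$batch = Bus::findBatch($batchId);
// $batch->progress() gives the completion percentage,
// $batch->failedJobs the number of failed jobs
```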

## retryUntil() Needs $tries = 0

When using time-based retry limits, set `$tries = 0` to avoid premature failure.

```php
public $tries = 0;

public function retryUntil(): \DateTimeInterface
{
    return now()->addHours(4);
}
```

## Use ShouldBeUniqueUntilProcessing for Early Lock Release

`ShouldBeUnique` holds the lock until the job completes. `ShouldBeUniqueUntilProcessing` releases it when processing starts, allowing new instances to queue.

```php
class UpdateSearchIndex implements ShouldQueue, ShouldBeUniqueUntilProcessing
{
    // Lock releases when processing begins, not when it finishes
}
```

## Use Horizon for Complex Queue Scenarios

Use Laravel Horizon when you need monitoring, auto-scaling, failure tracking, or multiple queues with different priorities.

```php
// config/horizon.php
'environments' => [
    'production' => [
        'supervisor-1' => [
            'connection' => 'redis',
            'queue' => ['high', 'default', 'low'],
            'balance' => 'auto',
            'minProcesses' => 1,
            'maxProcesses' => 10,
            'tries' => 3,
        ],
    ],
],
```