KMP PHP API Reference

Task
in package
implements TaskInterface; uses LocatorAwareTrait

Abstract: Yes

Queue Task.

Common Queue plugin task properties and methods, to be extended by custom tasks.
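As a sketch of what a custom task extending this base class might look like (the `App\Queue\Task` namespace, the `EmailTask` name, and the payload keys are assumptions for illustration, not part of this reference; the `run()` signature follows the TaskInterface contract described below):

```php
<?php
declare(strict_types=1);

namespace App\Queue\Task;

use Queue\Queue\Task;

// Illustrative custom task; only run() is required by TaskInterface.
class EmailTask extends Task
{
    /**
     * Entry point called by the worker for each queued job.
     *
     * @param array $data Payload stored when the job was queued.
     * @param int $jobId ID of the queued job being processed.
     */
    public function run(array $data, int $jobId): void
    {
        // Throwing an exception here marks the job as failed and,
        // depending on $retries, schedules a restart.
        $this->io->out(sprintf('Processing job %d for %s', $jobId, $data['to'] ?? 'unknown'));
    }
}
```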

Table of Contents

Interfaces

TaskInterface
Any task needs to at least implement run().

Properties

$costs  : int
Activate this if you want cost management per server to avoid server overloading.
$QueuedJobs  : QueuedJobsTable
$queueModelClass  : string
$rate  : int
Rate limiting per worker in seconds.
$retries  : int|null
Number of times a failed instance of this task should be restarted before giving up.
$timeout  : int|null
Timeout in seconds, after which the Task is reassigned to a new worker if not finished successfully.
$unique  : bool
Set to true to ensure this specific task is never run in parallel, neither on the same server nor on any other. No worker will fetch this task while any job of it is already in progress.
$io  : Io
$logger  : LoggerInterface|null

Methods

__construct()  : mixed
taskName()  : string

Properties

$costs

Activate this if you want cost management per server to avoid server overloading.

public int $costs = 0

Expensive tasks (CPU, memory, ...) can be assigned 1...100 points here; higher values prevent a similarly cost-intensive task from being fetched on the same server in parallel. Cheaper tasks can still be processed on the same server while an expensive one is running.
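A sketch of how `$costs` might be set on a cost-intensive task (the class name and point value are illustrative assumptions):

```php
<?php
declare(strict_types=1);

namespace App\Queue\Task;

use Queue\Queue\Task;

// Hypothetical CPU-heavy task: a high $costs value keeps other
// expensive tasks off the same server while this one runs.
class VideoEncodeTask extends Task
{
    // 80 of 100 points: only cheap tasks will be co-scheduled here.
    public int $costs = 80;

    public function run(array $data, int $jobId): void
    {
        // CPU-heavy encoding work ...
    }
}
```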

$queueModelClass

public string $queueModelClass = 'Queue.QueuedJobs'

$rate

Rate limiting per worker in seconds.

public int $rate = 0

Activate this if you want to stretch the processing of a specific task per worker.

$retries

Number of times a failed instance of this task should be restarted before giving up.

public int|null $retries = null

Defaults to Config::defaultworkerretries().

$timeout

Timeout in seconds, after which the Task is reassigned to a new worker if not finished successfully.

public int|null $timeout = null

This should be high enough that the job cannot still be running on a zombie worker (>> 2x the expected run time). Defaults to Config::defaultworkertimeout().

$unique

Set to true to ensure this specific task is never run in parallel, neither on the same server nor on any other. No worker will fetch this task while any job of it is already in progress.

public bool $unique = false
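A hedged sketch combining the tuning properties documented above in one task (the class name and all values are illustrative, not recommendations):

```php
<?php
declare(strict_types=1);

namespace App\Queue\Task;

use Queue\Queue\Task;

// Hypothetical task showing the tuning properties together.
class ReportTask extends Task
{
    // At most one ReportTask job runs across all servers at a time.
    public bool $unique = true;

    // Each worker waits at least 30 seconds between ReportTask jobs.
    public int $rate = 30;

    // Restart a failed job up to 2 times before giving up
    // (instead of the Config::defaultworkerretries() default).
    public ?int $retries = 2;

    // Reassign to a new worker after 10 minutes without completion,
    // chosen well above the expected run time per the >> 2x guideline.
    public ?int $timeout = 600;

    public function run(array $data, int $jobId): void
    {
        // Generate the report ...
    }
}
```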

$logger

protected LoggerInterface|null $logger = null

Methods

__construct()

public __construct([Io|null $io = null ][, LoggerInterface|null $logger = null ]) : mixed
Parameters
$io : Io|null = null

IO

$logger : LoggerInterface|null = null

taskName()

public static taskName() : string
Tags
throws
InvalidArgumentException
Return values
string
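Usage sketch: since taskName() is static, it can be called on a concrete task class to obtain the shorthand name used when queueing jobs. The `EmailTask` class below is a hypothetical example; the exact string returned depends on the class's name and namespace:

```php
<?php
declare(strict_types=1);

use App\Queue\Task\EmailTask; // hypothetical custom task class

// Derive the shorthand task name from the class, e.g. for use
// when creating a queued job for this task. Throws
// InvalidArgumentException if the name cannot be derived.
$name = EmailTask::taskName();
```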
