Question regarding crawler Latency Factor/LF

Am I understanding it correctly that if I increase the Latency Factor (LF) in the crawler, it would simply request/download pages a bit more slowly?

And am I correct in guessing that the default of 0.5 is in seconds? Meaning that if I set it to 2.0, the crawler(s) would aim to download each page within 2 seconds rather than within half a second? (And thereby consume slightly less bandwidth?)

PS: I basically only use Auto-Crawling, so “slow-and-steady” is my preference 🙂


Same question

By reading the source code:

Latency.java:
...
waiting = Math.max(waiting, (int) (host.average() * Switchboard.getSwitchboard().getConfigFloat(SwitchboardConstants.CRAWLER_LATENCY_FACTOR, 0.5f)));
...

LF is a multiplier applied to the host's average response time. The result is compared with the robots.txt crawl delay and the agent delay, and the largest value is used as the actual delay time.

e.g.
LF = 0.5
average host response = 500 ms
robots.txt delay = 200 ms
agent delay = 10 ms

=> delay that will be used = MAX(0.5 × 500, 200, 10) = 250 ms
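
As a small standalone sketch of that selection (class and method names are hypothetical, not YaCy's actual code; the three candidate delays are assumed to already be known in milliseconds):

public class CrawlDelaySketch {

    // Largest of: LF * average host response time, robots.txt delay, agent delay.
    static long effectiveDelayMillis(float latencyFactor,
                                     long averageResponseMillis,
                                     long robotsDelayMillis,
                                     long agentDelayMillis) {
        long waiting = (long) (latencyFactor * averageResponseMillis);
        waiting = Math.max(waiting, robotsDelayMillis);
        return Math.max(waiting, agentDelayMillis);
    }

    public static void main(String[] args) {
        // Numbers from the example above: MAX(0.5 * 500, 200, 10) = 250 ms
        System.out.println(effectiveDelayMillis(0.5f, 500, 200, 10) + " ms");
    }
}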

There may also be other factors that affect the delay time; Orbiter can explain this better.
