feature: network/host rate limiting #1413

@atomGit

Description

Liferea suffers from the same problem as some other feed readers: it updates feeds too quickly, and this can cause various problems when a given host is hit with multiple requests in quick succession.

bitchute.com is one such example: if it receives more than x requests in n seconds (I don't know the exact values of x and n), feed fetching is temporarily blocked.

I ran into the same problem in a script I wrote to check a website for broken hyperlinks. I got around it by first shuffling the array of URLs, then keeping a rolling list of queried URLs with a timestamp for each query; if the next URL's domain had been checked less than x seconds ago, that URL was dropped to the bottom of the array to be retried later.
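As a rough illustration of that workaround (a hypothetical standalone sketch, not Liferea code; the function name, the 5-second interval, and the `fetch` callback are all my assumptions):

```python
import time
from collections import deque
from urllib.parse import urlparse

def rate_limited_fetch(urls, fetch, min_interval=5.0, now=time.monotonic):
    """Fetch each URL, deferring any whose host was queried less than
    min_interval seconds ago by pushing it to the back of the queue.
    (Shuffling urls beforehand, as described above, spreads hosts out
    and reduces how often deferral is needed.)"""
    queue = deque(urls)
    last_hit = {}  # host -> timestamp of the most recent request
    while queue:
        url = queue.popleft()
        host = urlparse(url).netloc
        last = last_hit.get(host)
        wait = 0.0 if last is None else min_interval - (now() - last)
        if wait > 0:
            if queue:
                # Same host queried too recently and other URLs are
                # pending: drop this one to the bottom and move on.
                queue.append(url)
                continue
            # Nothing else to do: wait out the remaining interval.
            time.sleep(wait)
        last_hit[host] = now()
        fetch(url)
```

With per-host timestamps rather than a single global delay, requests to different hosts can still proceed back-to-back; only same-host requests are spaced out.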
