PuppetDB stores only one catalog and one factset per node (but any number of reports), so deleting queued catalog and fact files older than an hour (given runinterval=30) lets PuppetDB catch up on the queue without affecting the data already in PostgreSQL. To delete older catalogs and factsets (which would have been superseded by newer ones) from the PuppetDB queue:
find /opt/puppetlabs/server/data/puppetdb/stockpile/cmd/q -name "*_catalog_9_*.json.gz" -mmin +60 -delete
find /opt/puppetlabs/server/data/puppetdb/stockpile/cmd/q -name "*_facts_5_*.json.gz" -mmin +60 -delete