For anyone running Akvorado, can you please comment on resource requirements? I'm most concerned with CPU and memory, with the assumption that resources scale somewhat linearly with flow rate, but I'm also secondarily curious about disk usage. Thanks, Graham
Graham, you may want to build a lab and fire a flow generator at it to gather metrics. A quick search landed on https://github.com/nerdalert/nflow-generator, but I am sure there are more.
--
Andrew "lathama" Latham
On 2023-03-24 15:01, Graham Johnston via NANOG wrote:
A VM with 64 GB of RAM and 24 vCPUs can sustain about 100k flows/s. 1 TB of disk seems enough to keep data for about 5 years at 30k flows/s with the default setup. This is, however, highly dependent on how well your flows compress. The main table with the default retention can use around 600 GB by itself; the data compresses well outside the main table. You should test on your setup and let it run for a few days. You can then check how much space each table uses and extrapolate based on the TTL set on each table.
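Vincent's "measure for a few days, then extrapolate from each table's TTL" step can be sketched as a back-of-the-envelope calculation. The table names, sizes, and TTLs below are hypothetical placeholders, not Akvorado defaults; substitute the per-table figures you actually observe:

```python
# Back-of-the-envelope disk extrapolation for per-table usage.
# All inputs are hypothetical examples; measure your own values
# after letting the setup run for a few days.

def steady_state_gb(observed_gb: float, observed_days: float, ttl_days: float) -> float:
    """Extrapolate a table's steady-state size from a short observation.

    Assumes ingest is roughly constant, so on-disk size grows
    linearly until rows start expiring at the table's TTL.
    """
    daily_gb = observed_gb / observed_days
    return daily_gb * ttl_days

# Hypothetical tables observed over 3 days:
# name -> (GB observed, days observed, TTL in days)
tables = {
    "flows_raw": (12.0, 3, 7),
    "flows_1m":  (3.0,  3, 90),
    "flows_1h":  (0.5,  3, 365),
}

total = sum(steady_state_gb(gb, days, ttl) for gb, days, ttl in tables.values())
for name, (gb, days, ttl) in tables.items():
    print(f"{name}: ~{steady_state_gb(gb, days, ttl):.0f} GB at steady state")
print(f"total: ~{total:.0f} GB")
```

This ignores merges, index overhead, and compression drift, so treat the result as a rough floor and leave headroom.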
Thanks, Vincent, I appreciate the feedback. Regards, Graham
participants (3):
- Andrew Latham
- Graham Johnston
- Vincent Bernat