A better IO controller (the H700, with its NV cache) will do a great job, especially if you have more SAS disks and some SSDs. For nfdump, a big SAS array built from six or more 900GB SAS HDDs in RAID 5 is much better (10k 2.5" disks are good for this task).

Pavel

On 17.1.2013 17:04, PC wrote:
I agree here with Christopher: an SSD to handle the high-IOPS requirements of real-time data logging, combined with a scheduled transfer that "moves" the stored data to ordinary spindles in large, linear block copies, would be a cost-effective hybrid solution.
This of course assumes the application can handle this separation of data; I know nothing about Nfsen.
On Thu, Jan 17, 2013 at 9:01 AM, Christopher Morrow <morrowc.lists@gmail.com> wrote:

On Thu, Jan 17, 2013 at 9:05 AM, Joe Loiacono <jloiacon@csc.com> wrote:
Tim Calvin <tcalvin@tlsn.net> wrote on 01/16/2013 05:51:11 PM:
PowerEdge R610 -
2x Intel E5540, 2.53GHz Quad Core Processor
32GB RAM
2x 300GB 10k 2.5" SAS HDD

Since netflow processing is generally I/O bound, you may want to invest in 15K drives.

I had suggested off-list that perhaps primary storage on SSD was a better path (with some larger storage on spinning media for historical storage/query). Is there a reason not to do that?
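To make the sizing trade-off in this thread concrete, here is a back-of-the-envelope comparison of the two spinning-disk options mentioned (the server's 2x 300GB pair as a mirror vs. Pavel's 6x 900GB RAID 5). The per-spindle IOPS figures are common rules of thumb for 10k SAS drives, not measurements of any specific model.

```python
# Rough usable capacity and aggregate random-read IOPS for the arrays
# discussed above. IOPS_10K is a rule-of-thumb estimate, not a datasheet
# value; 15K drives are commonly estimated around 175-200 IOPS per spindle.

IOPS_10K = 140  # assumed per-spindle random-read IOPS for a 10k SAS disk

def raid5_usable_tb(disks: int, disk_gb: int) -> float:
    """RAID 5 spends one disk's worth of space on parity."""
    return (disks - 1) * disk_gb / 1000

def raid1_usable_tb(disk_gb: int) -> float:
    """RAID 1 mirrors the pair, so usable space is one disk."""
    return disk_gb / 1000

six_sas = raid5_usable_tb(6, 900)   # 6x 900GB 10k in RAID 5
mirror = raid1_usable_tb(300)       # 2x 300GB as a mirror

print(f"6x 900GB RAID 5: {six_sas:.1f} TB usable, ~{6 * IOPS_10K} read IOPS")
print(f"2x 300GB RAID 1: {mirror:.1f} TB usable, ~{2 * IOPS_10K} read IOPS")
```

The six-disk array gives roughly 4.5 TB usable and about three times the aggregate spindle IOPS of the mirrored pair, which is why it suits nfdump's long sequential scans over large capture histories; RAID 5's write penalty is the price, which the controller's NV cache helps absorb.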