TBs of data is not really that much on average when you spread it across thousands of customers. The data is summarized. There are a ton of other things happening in the background that I've already explained in this thread, and they're really irrelevant to the task at hand, which is finding a facility in Africa that offers bare metal servers. I've had a lot of helpful people respond, despite the naysayers.

Thanks!

On Tue, 16 Jul 2019 at 11:23, Valdis Klētnieks <valdis.kletnieks@vt.edu> wrote:
On Tue, 16 Jul 2019 10:39:59 -0600, Ken Gilmour said:

> These are actual real problems we face. thousands of customers load and
> reload TBs of data every few seconds on their dashboards.

If they're reloading TBs of data every few seconds, you really should have been
doing summaries during data ingestion and only reloading the summaries.
(This overlooks the fact that, for dashboards, refreshing every few seconds is
usually pointless, because you end up looking at short-term statistical spikes
rather than anything you can react to at human speeds. If you *care* in
real time that the number of probes on a port spiked to 457% of average for 2
seconds, you need to be doing automated responses.)

Custom queries are more painful - but those don't happen "every few seconds".
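For what it's worth, the summarize-at-ingest idea above can be sketched in a few lines. This is only a minimal illustration, not anyone's actual pipeline; every name in it (the event fields, the bucket key, the functions) is hypothetical:

```python
from collections import defaultdict

# Sketch of summarize-at-ingest: instead of storing every raw event for
# the dashboard to re-scan, fold each event into a small per-minute
# aggregate as it arrives. All names here are hypothetical.

summaries = defaultdict(lambda: {"count": 0, "bytes": 0})

def ingest(event):
    """Fold one raw event into its (minute, port) bucket."""
    bucket = (event["ts"] // 60, event["port"])  # ts in epoch seconds
    agg = summaries[bucket]
    agg["count"] += 1
    agg["bytes"] += event["bytes"]

def dashboard_view(minute, port):
    """The dashboard reads only the tiny summary, never the raw events."""
    return summaries[(minute, port)]

# Ingest a few raw events...
for ts, port, nbytes in [(60, 22, 100), (65, 22, 300), (130, 22, 50)]:
    ingest({"ts": ts, "port": port, "bytes": nbytes})

print(dashboard_view(1, 22))  # {'count': 2, 'bytes': 400}
```

The dashboard then pulls a handful of aggregate rows per refresh rather than TBs of raw events; the raw data can still live elsewhere for the occasional custom query.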