You could modify a radix tree to include a consolidation function and a resulting confidence value. Then walk the nodes of the tree, check whether each subtree meets the requirements for consolidation, and if so, prune it and record the confidence. You would need to re-run the consolidation from the original data every time an individual IP was added to or removed from the list, because the consolidation function is lossy.
Alternatively, you could do the consolidation on the fly, losslessly, with a custom tree-walk algorithm. That's probably the way I would do it. I'm not a programmer, so I assume there are better ways out there.
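For what it's worth, Python's standard library already ships a lossless version of that merge: `ipaddress.collapse_addresses` only combines prefixes whose range is fully covered by the input, so it never introduces false-positive matches.

```python
import ipaddress

# Four /32s that exactly fill a /30, plus one stray address.
ips = [ipaddress.ip_address(a) for a in
       ("192.0.2.0", "192.0.2.1", "192.0.2.2", "192.0.2.3",
        "198.51.100.7")]

# Lossless merge: only fully covered ranges are combined,
# so the stray address stays a /32.
routes = list(ipaddress.collapse_addresses(ips))
print(routes)
# [IPv4Network('192.0.2.0/30'), IPv4Network('198.51.100.7/32')]
```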
Your processing time for 5k IPs should be measured in seconds (i.e., less than one) rather than minutes on any modern core. Based on your pseudocode (`sort -n | uniq`) I get the impression that you're using Bash, which isn't ideal for performing this sort of operation at high speed.
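As a rough sanity check (a hypothetical benchmark I made up; exact numbers will vary by machine), collapsing 5,000 addresses with Python's standard library finishes in a small fraction of a second:

```python
import ipaddress
import random
import time

# 5,000 pseudo-random IPv4 addresses (fixed seed for repeatability).
random.seed(1)
ips = [ipaddress.ip_address(random.getrandbits(32)) for _ in range(5000)]

start = time.perf_counter()
routes = list(ipaddress.collapse_addresses(sorted(set(ips))))
elapsed = time.perf_counter() - start
print(f"collapsed {len(ips)} IPs to {len(routes)} routes in {elapsed:.3f}s")
```

If even that is too slow at larger scales, a compiled language or a purpose-built radix-tree library would be the next step, but for 5k entries the interpreter overhead simply doesn't matter.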
On the flip side, I think an extra 100k routes isn't that much unless you're suffering from hardware routing table limitations. In my world the cost of a false positive match would far outweigh the cost of upgrading hardware. YMMV.
Do you have a git repo?