----- Original Message -----
From: "Robert E. Seastrom" <rs@seastrom.com>
Data point, which makes the rest of this discussion moot:
Since telcos are historically myopic and don't build (much) extra fiber into their plant to support future technologies, the only use for existing fiber in the ground in passive optical applications is to connect the COs. There is not enough running out towards the customers to support retrofitting it for PON.
It doesn't make it moot for me; I'm greenfield.
Some more data that may inform your conceptualization: split ratios of 128 and 64 only work in the lab. Last time I did the math, proper engineering (balancing the optical link budget in dB against bits/sec/customer) dictated split ratios of 16 or 32, depending on modulation scheme. And no, going to 10 Gbit/s modulation doesn't help; you still have the link budget problem.
Yeah, I sorta figured this.
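To make it concrete for myself, here's the back-of-the-envelope splitter math. The budget and loss figures below are typical class B+ GPON numbers I'm assuming, not anything from RS's actual spreadsheet:

    # Back-of-the-envelope GPON splitter math (my assumed numbers, not RS's):
    # ~28 dB optical budget (class B+), ~0.35 dB/km fiber loss at 1310 nm,
    # ~3.5 dB per 1:2 split stage, ~3 dB of connector/splice/aging margin.
    import math

    BUDGET_DB = 28.0
    FIBER_DB_PER_KM = 0.35
    MARGIN_DB = 3.0
    SPLIT_DB_PER_STAGE = 3.5

    def max_reach_km(split_ratio):
        """Rough fiber distance left over after the splitter eats its share."""
        splitter_loss = math.log2(split_ratio) * SPLIT_DB_PER_STAGE
        remaining_db = BUDGET_DB - MARGIN_DB - splitter_loss
        return max(remaining_db, 0.0) / FIBER_DB_PER_KM

    for ratio in (16, 32, 64, 128):
        print("1:%-3d split -> ~%.0f km of fiber budget left" % (ratio, max_reach_km(ratio)))

With those assumptions, a 1:128 split leaves essentially nothing for the actual fiber run, and 2.4 Gbit/s shared 128 ways is under 20 Mbit/s per customer anyway, so 16 or 32 is about where I'd land too.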
Still, the power budget improvements over a single-strand active Ethernet solution (which was another suggested technology, and one that has actually been deployed by some muni fiber folks like Clarkesville, TN) are huge. Imagine a 24-port switch that draws 100 watts; that's about 4 W per customer. With 30k customers from a served location, that's 120 kW (a $13k power bill, even if you had 100% efficient UPSes and zero-cost cooling, neither of which is true) just for the edge, not counting any aggregation devices or northbound switch gear.
Hmm. The optics don't have auto power control?
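Either way, spelling the arithmetic out for my own notes (the kWh rate and the per-month framing are my assumptions; RS didn't state them):

    # Active Ethernet edge power, per the figures above; rate and period assumed.
    PORTS_PER_SWITCH = 24
    WATTS_PER_SWITCH = 100.0
    CUSTOMERS = 30000
    USD_PER_KWH = 0.15        # assumed electricity rate
    HOURS_PER_MONTH = 730     # continuous draw for roughly a month

    watts_per_customer = WATTS_PER_SWITCH / PORTS_PER_SWITCH    # ~4.2 W
    edge_kw = watts_per_customer * CUSTOMERS / 1000.0            # ~125 kW
    monthly_usd = edge_kw * HOURS_PER_MONTH * USD_PER_KWH        # ~$13.7k

    print("%.1f W/customer, %.0f kW at the edge, ~$%.0f/month before UPS and cooling losses"
          % (watts_per_customer, edge_kw, monthly_usd))

Rounding to 4 W/customer the way he did gives the 120 kW and roughly $13k figures, so the numbers hang together even before you add real-world UPS and cooling overhead.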
Back at NN, we discounted this as a technology almost immediately based on energy efficiency alone.
Anyway, in summary: for PON deployments, the part that matters *is* the greenfield deployment, and if the fiber plant is planned and scaled accordingly, the cost differential is noise.
I assume you mean "the cost diff between GPON plant and home-run plant"; that's the answer I was hoping for.

Cheers,
-- jra
--
Jay R. Ashworth                  Baylink                       jra@baylink.com
Designer                     The Things I Think                       RFC 2100
Ashworth & Associates     http://baylink.pitas.com         2000 Land Rover DII
St Petersburg FL USA      #natog                      +1 727 647 1274