On 10-jan-05, at 12:26, Stephen J. Wilcox wrote:
Shifting topic a little... any idea why DF is used anyway? I've never understood what the purpose of not fragmenting is, and if DF didn't exist we wouldn't experience the issues with missing PMTU discovery ICMP messages.
Good question. According to RFC 791:

    If the Don't Fragment flag (DF) bit is set, then internet
    fragmentation of this datagram is NOT permitted, although it may be
    discarded. This can be used to prohibit fragmentation in cases where
    the receiving host does not have sufficient resources to reassemble
    internet fragments.

    One example of use of the Don't Fragment feature is to down line load
    a small host. A small host could have a boot strap program that
    accepts a datagram stores it in memory and then executes it.
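For what it's worth, here's a rough sketch of how a sender can turn DF on or off per socket on Linux, via the IP_MTU_DISCOVER socket option. The constant values are copied from <linux/in.h> because Python doesn't reliably export them, and the destination address/port in the example are just placeholders:

    # Sketch only: Linux-specific constants, defined by hand since the
    # socket module doesn't always expose them.
    import socket

    IP_MTU_DISCOVER  = 10   # setsockopt option name (Linux)
    IP_PMTUDISC_DONT = 0    # never set DF; let routers fragment
    IP_PMTUDISC_DO   = 2    # always set DF; rely on ICMP "frag needed" feedback

    def make_udp_socket(set_df):
        """Return a UDP socket whose outgoing datagrams carry (or omit) DF."""
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        mode = IP_PMTUDISC_DO if set_df else IP_PMTUDISC_DONT
        s.setsockopt(socket.IPPROTO_IP, IP_MTU_DISCOVER, mode)
        return s

    if __name__ == "__main__":
        s = make_udp_socket(set_df=True)
        # With DF set, a datagram bigger than the path MTU is never fragmented;
        # the sender gets EMSGSIZE locally or an ICMP error back from the path.
        s.sendto(b"x" * 1200, ("192.0.2.1", 9))  # documentation address, for illustration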
Windows appears to always set DF; is there a reason why they did that?
<msbash> Greed, what else? </msbash> Of course I wanted to see this for myself. I used QuickTime to generate some UDP, but saw no DF bits set, either on Win98 or XP.
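If anyone wants to repeat the experiment without firing up a full sniffer: DF lives in bit 0x4000 of the flags/fragment-offset word, bytes 6-7 of the IPv4 header. A quick sketch below; the sample header bytes and the function name are made up for illustration:

    # Sketch: check the DF flag in a raw IPv4 header.
    import struct

    def df_is_set(ip_header):
        """Return True if the Don't Fragment flag is set in a raw IPv4 header."""
        # Bytes 6-7 hold 3 flag bits plus a 13-bit fragment offset, big-endian.
        flags_frag = struct.unpack("!H", ip_header[6:8])[0]
        return bool(flags_frag & 0x4000)

    if __name__ == "__main__":
        # Minimal 20-byte IPv4 header with DF set (flags/frag offset = 0x4000).
        hdr = bytes([0x45, 0x00, 0x00, 0x54, 0x1c, 0x46, 0x40, 0x00,
                     0x40, 0x11, 0x00, 0x00, 0xc0, 0x00, 0x02, 0x01,
                     0xc0, 0x00, 0x02, 0x02])
        print(df_is_set(hdr))  # -> True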