On Tue, 11 Feb 2020 at 09:09, Ahmed Borno <amaged@gmail.com> wrote:
> So yeah iACLs, CoPP and all sorts of basic precautions are needed, but I'm thinking something more needs to be done, specially if these ancient code stacks are being imported into new age 'IoT' devices, multiplying the attack vector by a factor of too many.
I can't see the situation getting better. Why should a vendor invest
in high-quality code? The cultural shift would certainly cost
something, it's not free, and what is the upside? If IOS and JunOS
realistically were significantly less buggy, many of us would stop
buying support, because we either know how to configure these boxes
ourselves or can get help faster, for free, from the community. We
largely need the support because the software quality is so bad that
_everyone_ finds new bugs all the time, and we don't have the source
code to fix them as a community.
So I suspect significantly better-quality software would, at least
initially, cost more to produce, and it would reduce revenue through
lost support sales.
I also think the way we develop software needs to be fundamentally
rethought. We need to stop believing 'I am the person who can write
working C; it's the other people who are incompetent.' At some point
we should ask: are the tools we are using the right tools? Can we
move complexity from humans to computers at compile time, to get
stronger guarantees of correctness? MSFT claims 70% of their bugs are
memory-safety issues, a class of defect which could be solved almost
perfectly programmatically by making the compiler and language
smarter, rather than by making the person more resistant to mistakes.
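
To make that bug class concrete, here is a toy sketch of my own (not
from any vendor's code): perfectly legal C that a compiler will
typically accept without a diagnostic, yet it dereferences memory
that has already been handed back to the allocator. A memory-safe
language rejects the equivalent at compile time or makes it
impossible to express in the first place.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Toy peer table, shaped like the long-lived C one finds in a
     * network OS. Deliberately buggy to illustrate the point. */
    struct session {
        char peer[32];
    };

    static struct session *lookup(struct session *table, int n,
                                  const char *peer)
    {
        for (int i = 0; i < n; i++)
            if (strcmp(table[i].peer, peer) == 0)
                return &table[i];
        return NULL;
    }

    int main(void)
    {
        int n = 2;
        struct session *table = calloc(n, sizeof *table);
        if (!table)
            return 1;
        strcpy(table[0].peer, "192.0.2.1");
        strcpy(table[1].peer, "192.0.2.2");

        struct session *s = lookup(table, n, "192.0.2.2");

        /* The table grows; realloc may move the block, so `s` now
         * dangles. Nothing forces anyone to notice. */
        table = realloc(table, 3 * sizeof *table);
        if (!table)
            return 1;
        strcpy(table[2].peer, "192.0.2.3");

        /* Use-after-free: compiles cleanly and may even appear to
         * work until the allocator reuses the old block. */
        printf("peer: %s\n", s ? s->peer : "none");

        free(table);
        return 0;
    }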
I think ANET, at least in part, essentially writes their own DSL
which compiles to C++. I think a solution like this for any large,
long-lived project probably pays dividends in quality quickly,
because you can address a lot of the systematic errors at
compilation time and in the DSL design.
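
ANET's DSL is internal, so I can't show it; as a rough stand-in for
the idea, here is a much more modest, in-language version of
"generate the repetitive code mechanically instead of typing it": an
X-macro list in C. The schema is written exactly once, and the enum
and the name table are both expanded from it at compile time, so
they can never drift out of sync; one whole class of systematic
copy-paste error is gone by construction. A real DSL compiling to
C++ takes the same principle much further.

    #include <stdio.h>

    /* Single source of truth: the counter list. Everything below is
     * generated from it by the preprocessor. Adding a counter here
     * is the only edit ever needed. */
    #define COUNTER_LIST(X) \
        X(rx_packets)       \
        X(tx_packets)       \
        X(rx_errors)        \
        X(tx_errors)

    /* Expand the list into an enum of counter ids. */
    #define AS_ENUM(name) CTR_##name,
    enum counter_id { COUNTER_LIST(AS_ENUM) CTR_MAX };

    /* Expand the same list into a matching table of names. */
    #define AS_NAME(name) #name,
    static const char *counter_name[CTR_MAX] = { COUNTER_LIST(AS_NAME) };

    int main(void)
    {
        unsigned long counters[CTR_MAX] = {0};

        counters[CTR_rx_packets] = 42;

        for (int i = 0; i < CTR_MAX; i++)
            printf("%-12s %lu\n", counter_name[i], counters[i]);
        return 0;
    }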
--
++ytti