I remember a conversation with an executive one day, when I was enlightened about corporate greed.

I asked why there was no investment in quality code, and I was schooled.

The exec said: one dollar spent on fixing bugs returns zero dollars, but one dollar spent on new features brings in three dollars ;)

These vendors have tried different DSLs over the years, and it is way too difficult for them to execute on that shift; too many things are at stake. I mean their business, not the end customer... of course.

They are not even being 'creative'. They could easily pull some tricks like the one you mentioned (compiler-built sanity) in their system configuration logic, e.g. you can't turn on an interface without an iACL applied... but then why would they :)
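
To make it concrete, here is a toy sketch of that kind of
compiler-built sanity, in Rust purely for illustration; the
Interface/apply_iacl names are made up, not any vendor's API:

  // Type-state sketch: an interface without an iACL simply has no
  // enable(), so the bad configuration cannot be expressed at all.
  struct NoAcl;
  struct HasAcl {
      acl_name: String,
  }

  struct Interface<State> {
      name: String,
      state: State,
  }

  impl Interface<NoAcl> {
      fn new(name: &str) -> Self {
          Interface { name: name.to_string(), state: NoAcl }
      }

      // The only path to the HasAcl state is applying an ACL.
      fn apply_iacl(self, acl_name: &str) -> Interface<HasAcl> {
          Interface {
              name: self.name,
              state: HasAcl { acl_name: acl_name.to_string() },
          }
      }
  }

  impl Interface<HasAcl> {
      // enable() exists only once an iACL is attached, so the rule
      // "no iACL, no up" is checked by the compiler, not by audit.
      fn enable(&self) {
          println!("{} up with iACL {}", self.name, self.state.acl_name);
      }
  }

  fn main() {
      let eth = Interface::new("et-0/0/0");
      // eth.enable(); // compile error: Interface<NoAcl> has no enable()
      let eth = eth.apply_iacl("EDGE-IN-V4");
      eth.enable();
  }

The point is that the invalid configuration is unrepresentable, not
merely linted after the fact.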

Sorry for the sad tone; I just wish network operators would find a way to challenge these vendors and call out their less-than-optimal quality.

~A

On Tue, Feb 11, 2020 at 2:05 AM Saku Ytti <saku@ytti.fi> wrote:
On Tue, 11 Feb 2020 at 09:09, Ahmed Borno <amaged@gmail.com> wrote:

> So yeah, iACLs, CoPP and all sorts of basic precautions are needed, but I'm thinking something more needs to be done, especially if these ancient code stacks are being imported into new-age 'IoT' devices, multiplying the attack vector by a factor of too many.

I can't see the situation getting better. Why should a vendor invest
in high-quality code? The cultural shift would certainly cost
something, it's not zero cost, and what is the upside? If IOS and
JunOS realistically were significantly less buggy, many of us would
stop buying support, because we either know how to configure these
boxes or can get help faster, for free, from the community. We
largely need the support because the software quality is so bad that
_everyone_ finds new bugs all the time, and we don't have the source
code to fix them as a community.
So I suspect significantly better quality software would at least
initially cost more to produce, and it would reduce revenue through
lost support sales.

I also think the way we develop needs to be fundamentally rethought.
We need to stop believing 'I am the person who can write working C;
it's the other people who are incompetent.' At some point we should
ask: are the tools we are using the right tools? Can we move
complexity from humans to computers at compile time to create more
guarantees of correctness? MSFT claims 70% of their bugs are memory
safety issues, which could be solved almost perfectly
programmatically by making the compiler and language smarter, not the
person more resistant to mistakes.
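
As a toy illustration (my sketch, not MSFT's example), take the
classic use-after-free below: the equivalent C compiles silently and
fails at run time, while rustc refuses to build it at all:

  // This intentionally does not compile: the borrow checker flags
  // the freed-while-borrowed buffer at compile time (error E0505),
  // a bug class C/C++ toolchains largely leave to run time.
  fn main() {
      let buf = vec![0u8; 64];
      let view = &buf[..16]; // borrow part of the buffer
      drop(buf);             // free the buffer while still borrowed
      println!("{:?}", view);
  }
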
I think ANET, at least in part, essentially writes their own DSL
which compiles to C++. I think a solution like this probably pays
dividends in quality quickly for any large, long-lived project,
because you can address a lot of the systematic errors at compile
time and in the design of the DSL itself.
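
I don't know what ANET's DSL actually looks like, but the general
shape is something like this sketch, with Rust standing in for the
generated C++ and the message definition entirely hypothetical:

  // Hypothetical DSL input, what an engineer would write:
  //
  //   message ArpReply {
  //     field sender_ip  : Ipv4;
  //     field sender_mac : Mac;
  //   }
  //
  // Hypothetical generated code: the generator emits the length
  // check for every message once, so no hand-written parser can
  // forget it; that is the kind of systematic error a DSL kills.
  use std::convert::TryInto;

  struct Ipv4([u8; 4]);
  struct Mac([u8; 6]);

  struct ArpReply {
      sender_ip: Ipv4,
      sender_mac: Mac,
  }

  impl ArpReply {
      fn parse(buf: &[u8]) -> Option<ArpReply> {
          if buf.len() < 10 {
              return None; // generated bounds check
          }
          Some(ArpReply {
              sender_ip: Ipv4(buf[0..4].try_into().ok()?),
              sender_mac: Mac(buf[4..10].try_into().ok()?),
          })
      }
  }

  fn main() {
      let wire = [192u8, 0, 2, 1, 0x00, 0x1b, 0x21, 0x33, 0x44, 0x55];
      if let Some(msg) = ArpReply::parse(&wire) {
          println!("ip {:?} mac {:?}", msg.sender_ip.0, msg.sender_mac.0);
      }
  }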


--
  ++ytti