Vadim Antonov wrote:
> The lack of real strong typing and of built-in variable-size strings (so the compiler can actually optimize string ops), combined with uncontrollable pointer operations, is enough to guarantee that any complicated program will have buffer-overflow vulnerabilities.
Typing can be enforced if the programmer chooses to.
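To make the overflow point concrete, here is a minimal C sketch (the buffer size and function names are invented for the example) of the unchecked copy the language happily accepts, next to the check the programmer has to choose to add, since the language will not add it for him:

    #include <stdio.h>
    #include <string.h>

    /* Nothing in C stops this from writing past the end of 'name'
     * when the input is 16 bytes or longer. */
    void greet_unchecked(const char *input)
    {
        char name[16];
        strcpy(name, input);                  /* classic overflow */
        printf("hello, %s\n", name);
    }

    /* The fix is purely programmer discipline: carry the size along
     * and reject (or truncate) anything that does not fit. */
    void greet_checked(const char *input)
    {
        char name[16];
        if (strlen(input) >= sizeof(name)) {
            fprintf(stderr, "input too long, rejected\n");
            return;
        }
        strcpy(name, input);                  /* now known to fit */
        printf("hello, %s\n", name);
    }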
> So does assembler - way more than C.
I agree. I love both.
> Presumably, a non-idiot can produce ideal code in significant quantities. May I politely inquire whether you have ever written anything bigger than 10k lines - because anyone who has knows for sure that no program is
Of course, and yes.
> ideal, and that humans forget, make mistakes and cannot hold the entire project in mind, and that our minds tend to see how things are supposed to be, not how they are - which makes overlooking silly mistakes a certainty.
Correct. It cannot be held in mind, which is why a model must be fashioned and enforced so that the programmer minimizes mistakes. There are many tools available to help you do this, and it's not too difficult to write your own. In some cases, it's common sense.
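As one example of the kind of homegrown tool I mean (the struct and function names here are just an illustration): wrap the raw buffer in a small type and funnel every write through a function that carries and checks the capacity, so a forgotten bound fails loudly under test instead of silently corrupting memory.

    #include <assert.h>
    #include <stddef.h>
    #include <string.h>

    /* The "model" made executable: the buffer is only touched through
     * functions that know and check its capacity. */
    struct buf {
        char   data[64];
        size_t len;
    };

    static void buf_init(struct buf *b)
    {
        b->len = 0;
        b->data[0] = '\0';
    }

    static void buf_append(struct buf *b, const char *s)
    {
        size_t n = strlen(s);
        assert(b->len + n < sizeof(b->data));   /* invariant enforced here */
        memcpy(b->data + b->len, s, n + 1);     /* copies the trailing '\0' */
        b->len += n;
    }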
> C is a workable language, but it is nowhere close to a language that would incorporate support for known best practices for large-scale software engineering. C++ is somewhat better, but it fails horribly in
How do you figure? Best practices are what you do with what you have, not what you have itself. Ingress/egress filtering is considered a best practice, yet it isn't performed throughout the 'net. Solid programming can be done, but if the individuals or the company do not wish to take the time to do it right, then the code will have problems.
> Overhead. To get reasonable performance on bounds-checked arrays you need the compiler to do deeper optimization than is possible when calling library routines (or even when inlining them, because the semantics of a procedure call is restrictive).
Let me get this straight. You have a language which is very low on overhead, and adding any overhead is unacceptable, even though the result would still have less overhead than many other languages?
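For what it's worth, here is roughly what a bounds-checked access looks like in C (the helper name and the abort-on-error policy are my own invention). The argument above is that when the check lives in a small inline function the optimizer can see, it can often hoist or eliminate the test in a loop like the one below; when the check hides behind an opaque call into a separately compiled library, it is paid in full on every element.

    #include <stdio.h>
    #include <stdlib.h>

    /* Checked access as a helper the compiler can see and inline. */
    static inline int checked_get(const int *a, size_t len, size_t i)
    {
        if (i >= len) {
            fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, len);
            abort();
        }
        return a[i];
    }

    /* Here the compiler can prove i < len on every iteration and drop
     * the check; an out-of-line library call performing the same check
     * would have to execute it each time through the loop. */
    long sum(const int *a, size_t len)
    {
        long total = 0;
        for (size_t i = 0; i < len; i++)
            total += checked_get(a, len, i);
        return total;
    }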
> I don't use flowcharts - they're less compact than text, so they hinder comprehension of complex pieces of code (it is a well-known fact that splitting text onto separate pages which need to be flipped back and forth significantly degrades speed and accuracy of comprehension - check any textbook on cognitive psychology). There have been many graphical programming projects (this is a perennial mania in programming tool-smith circles); none of them yielded any significant improvement in productivity or quality.
I imagine that they didn't. The flowchart was more of a training tool than anything. I create three-dimensional flowcharts in my head when dealing with any process, programming or engineering. Nine times out of ten, I will beat someone else to the solution of a problem. Why? Because I know how to quickly break a process down from end to end into its tiniest pieces, rule out layers that don't apply to the problem, and quickly follow the path to where reality differs from theory.
> Sendmail is a horrible kludge, and, frankly, I'm amazed that it is still being supplied as a default MTA with Unix-like OSes.
Yeah, but a flowchart on the wall would really look cool. :P
> A professional programmer will choose a language which lets him do the required job with minimal effort. Since quality is not generally a project requirement in this industry (for reasons I already mentioned), the result is predictable - use of languages which allow quick and dirty programming; getting stuff to do something fast so it can be shipped, and screw the user, who is by now well-conditioned to do the three-finger salute instead of asking for a refund.
That's a social issue, not a technical one. Change the perspective of the programmer, the project manager, and so on, and the same language can be used to produce quality code. -Jack