On Wed, 2010-06-09 at 08:50 -0500, Joe Greco wrote:
Primarily because the product that they've been given to use is defective by design.
Indeed. So one approach is to remove the protection such defective designs currently enjoy.
That's not going to happen (but I'll be happy to be proven wrong). As it stands, were software manufacturers to be held liable for the damages caused by their products, think of what would happen. How much does it cost for NerdForce to disinfect a computer? How many man-hours did that MS-SQL Slammer worm cost us? How much is lost when a website is down? What legislator is going to vote for software liability reforms that will ruin major software companies? Especially when their own staff and experts will be willing to state that outcome, in no uncertain terms? What are the outcomes here? If we pass such legislation, it doesn't magically fix things. It just means that companies like Adobe and Microsoft are suddenly on the hook for huge liabilities if they continue to sell their current products. Do we expect them to *stop* selling Windows, etc.?
How is that supposed to play out for the single mom with a latchkey kid? Let's be realistic here. It's the computer that ought to be safer.
Fine. Agreed. Now what mechanisms do you suggest for achieving that? Technical suggestions are no good, because no one will implement them unless they have to, or unless implementing them in some way improves the product so it sells better.
That's the problem, isn't it? If we were serious about it, we could approach the problem differently: rather than trying to tackle it through the marketplace, perhaps we could instead tackle it through regulation. Could we mandate that the next generation of browsers must have certain qualities? It's an interesting discussion, and in some ways parallels the car safety examples I provided earlier.
We can expect modest improvements on the part of users, sure, but to place it all on them is simply a fantastic display of incredible naivete.
Indeed. And certainly not something I'd advocate, at least not without making sure that they, in turn, could pass the responsibility on.
That shows an incredible lack of understanding of how the market actually works. It's nice in theory.
It would be a lot more pleasant discussing things with you if you understood that people may disagree with you without necessarily being naive or stupid.
It's not a pleasant discussion, because every visible direction is pure suck. I'll call out naivete when I see it.
We (as technical people) have caused this problem because we've failed to design computers and networks that are resistant to this sort of thing.
And why did we do that? What allowed us to get away with it? Answer: Inadequate application of ordinary product liability law to the producers of software. Acceptance of ridiculous EULAs that in any sane legal system would not be worth the cellophane they are printed behind. And so forth. I know the ecosystem that arose around software is more complicated than that, but you get the idea.
I certainly agree, but it isn't going to be wished away in a minute. To do so would effectively destroy some major technology companies.
Trying to pin it on the users is of course easy, because users (generally speaking) are "stupid" and are "at fault" for not doing "enough" to "secure" their own systems, but that's a ridiculous smugness on our part.
You're right. And again, I am not advocating that. People are always going to be stupid (or ignorant, which is not the same thing as stupid). The trick is to give them a way out - whether it's insurance, education or effective legal remedy. That way they can choose how to handle the risk that *they* represent - in computers just as in any other realm of life.
Actually, IRL, we've been largely successful in making much safer cars. It's by no means a complete solution, but it seems to be the best case scenario at this time. Software is devilishly hard to make safer, of course, and companies with a decade of legacy sludge being dragged along for the ride do not have it easy. (I really do feel sorry for Microsoft, in a way.) That's one of the reasons I had predicted more appliance-like computers, and now they seem to be appearing in the form of app-running devices like the iPad. From a network operator's point of view, that's just great, because the chance of a user being able to do something bad to the device is greatly reduced.
I'm fine with that, but as long as we keep handing loaded guns without any reasonably-identifiable safeties to the end users, we can expect to keep getting shot at now and then.
You keep stating the problem, where what others are trying to do is frame a solution. Right now we are just absorbing the impact; that is not sustainable, as long as the people providing the avenues of attack (through ignorance or whatever) have no obligation at all to do better.
Right, but rewriting the product liability laws to hold software vendors accountable, by proxying through the end user, is kind of a crazy solution, and one that would appear not to be workable. Was there another solution being framed that I missed?
Yep! And the fastest way to get more secure systems is to make consumers accountable, so that they demand accountability from their vendors. And so it goes, all the way up the chain. Make people accountable. At every level.
Again, that shows an incredible lack of understanding of how the market actually works. It's still nice in theory.
There are whole industries built around vehicular safety. There are numerous varieties of insurance that protect people - at every level - from their own failures.
Where there is no accountability in a human system, failure is practically guaranteed - whether in the form of tyranny, monopoly, danger to life and limb or whatever. The idea of accountability and the drive to attain it forms the basis of most legal and democratic systems, and of uncountable numbers of smaller systems in democratic societies. Now, what were you saying about "theory"?
That's nice. How much accountability should one have for having visited a web site that was broken into by Russian script kiddies, though? And we're not talking about driving a PC through a field of pedestrians, as someone else so colorfully put it. Who is going to "insure" me against the possibility that Russian script kiddies sent me a virus via Flash on some web site, and even now are trying to break into British intel via my computer, so one fine day the FBI comes a'knockin'? How do I even find out what happened, when I'm in jail for a year for "hacking the Brits"? That's got to be one hell of an insurance plan.
Do you really think that the game of telephone works? Are we really going to be able to hold customers accountable? And if we do, are they really going to put vendor feet to the fire? Or is Microsoft just going to laugh and point at their EULA, and say, "our legal department will bankrupt you, you silly little twerp"?
Please, read more carefully. "At every level". If the consumer is made responsible, they must simultaneously get some avenue of recourse. Those ridiculous EULAs should be the first things against the wall :-)
Should be? Fine. Will be? Not fine. You won't manage to sell that to me without a lot of convincing. And if you can't get rid of those EULAs, we're back in the land of "end user holding the bag." So feel free to convince me of why Microsoft, Apple, Adobe, etc., are all going to just sit idly by while their EULA protections are legislated away.
Everyone has carefully made it clear that they're not liable to the users, so the users are left holding the bag, and nobody who's actually responsible is able to be held responsible by the end users.
Correct. That is the current situation, and it needs to be altered. On the one hand consumers benefit because they will finally have recourse for defective software, but with that gain comes increased responsibility.
Yes, "we" needs to include all the technical stakeholders, and "we" as network operators ought to be able to tell "we" the website operators to tell "we" the web designers to stop using Flash if it's that big a liability. This, of course, fails for the same reasons that expecting end users to hold vendors responsible does, but there are a lot less of us technical stakeholders than there are end users, so if we really want to play that sort of game, we should try it here at home first.
Try what?
Go tell every webmaster who is hosting Flash on your network that it's now prohibited, as a security risk, due to the bulletin issued last week, and that any website hosting Flash on your network a week from now will be null routed. And then follow through.
I mean, really, if we can't do that, we're just shoveling the responsibility off to the poor victim end-users. I'm just trying to frame this in a way that people can understand. It's great to say "end users should be responsible" and "end users need to be security-conscious." However, are we, as network operators, willing to be equally responsible and security-conscious?
... JG
--
Joe Greco - sol.net Network Services - Milwaukee, WI - http://www.sol.net
"We call it the 'one bite at the apple' rule. Give me one chance [and] then I won't contact you again." - Direct Marketing Ass'n position on e-mail spam (CNN)
With 24 million small businesses in the US alone, that's way too many apples.
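
For anyone who wants to see what "follow through" might look like in practice, here is a rough sketch of the kind of audit an operator could run against their own hosted sites before pulling the trigger. Everything in it is an illustrative assumption rather than anything from the thread: the hosted_sites.txt input file (one URL per line), the markers it greps for (.swf and the Flash MIME type), and the Cisco-style Null0 route syntax it prints.

#!/usr/bin/env python3
# Illustrative sketch only: check our own hosted sites for embedded Flash
# and print candidate blackhole routes for human review.
import socket
import sys
import urllib.request
from urllib.parse import urlparse

FLASH_MARKERS = (".swf", "application/x-shockwave-flash")

def sites_embedding_flash(url_file):
    """Yield (url, ip) for every listed page whose HTML appears to reference Flash."""
    with open(url_file) as f:
        for line in f:
            url = line.strip()
            if not url or url.startswith("#"):
                continue
            try:
                page = urllib.request.urlopen(url, timeout=10)
                html = page.read().decode("utf-8", "replace")
                ip = socket.gethostbyname(urlparse(url).hostname)
            except Exception as exc:
                print("skipping %s: %s" % (url, exc), file=sys.stderr)
                continue
            if any(marker in html.lower() for marker in FLASH_MARKERS):
                yield url, ip

if __name__ == "__main__":
    for url, ip in sites_embedding_flash("hosted_sites.txt"):
        # Exact syntax varies by router platform; this is one common form.
        print("! %s still embeds Flash" % url)
        print("ip route %s 255.255.255.255 Null0" % ip)

Obviously a real deployment would want the warning period described above and a human sanity check before any routes actually get installed; the point is only that the enforcement step is not technically hard.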