From: craig@jcb-sc.com
To: rms@gnu.org
Cc: craig@jcb-sc.com
Subject: Re: type based aliasing again
Date: Thu, 30 Sep 1999 18:02:00 -0000
Message-ID: <19990917081937.24996.qmail@deer>
References: <9909151630.AA36168@marc.watson.ibm.com> <19990915165930.15281.qmail@deer> <199909170623.CAA19786@psilocin.gnu.org>
X-SW-Source: 1999-09n/msg00755.html

>But if people accept the principle of making small and easy efforts to
>keep old code working, subsequent issues will only involve the
>comparatively easy job of figuring out how to do that.

Yes, having made this decision in *this* direction tempts one to believe future decisions will be made more easily.  For all intents and purposes, that's not true -- this decision was probably *well* within the bounds set by many prior decisions to make similar, or even more justifiable, accommodations, yet the discussion ensued regardless.

(I can't think of any such decisions that ultimately went the other way, offhand.  So those of us arguing against the accommodations have been, in effect, trying to rein in this historical willingness to make them.)

I believe that's because not all of us agree on the importance of having GCC capable of strictly conforming to standards; the importance of it doing so in a bug-free manner; and the degree to which doing so should be the default behavior of GCC, versus defaulting to dialectal/historical behaviors.

(There are other factors, like short memories, but it seems like most of those objecting to the accommodating proposals have demonstrated both better understanding of what ISO C actually says *and* long-term memories of GCC developments and discussions in the past, as compared to those calling for GCC to make the proposed accommodations.)

So I don't see how accepting "the principle of making small and easy efforts to keep old code working" simplifies things substantially.
(Particularly revealing is that I, myself, probably would have been much more willing to accept it in this case a few years ago -- which indicates the possibility that others might change their views as they, as I have, continue to try and maintain a complicated product in an ever-changing environment while coping with all sorts of demands from users.)

Being a *strong* enthusiast for the *concept* of some accommodations (the engineering concept of tolerance, which gave us CDs with well over 74 minutes of music), I nevertheless accept that, for N individuals working on a project, there will be, at any given time, N definitions of "small", N of "easy", and N of "old code" (as versus "broken code").

The reason I increasingly promote the idea of taking the public, and the default internal, stance of serving up ISO C and not a jot more, in some manner (e.g. no discussion of adding non-ISO-C accommodations; but a patch that plainly simplifies the ISO-C code-paths of GCC, and *happens* to include code that can make such accommodations, is not ruled out on that basis), is that, among those same N people, there are likely to be far *fewer* than N definitions of "ISO C", especially as it applies to any given construct, such as this one.

Sure enough, nearly everyone gave different criteria and/or indicated different "decision envelopes", including: make GCC more actively break broken aliasing code; drop -fno-alias-analysis; status quo; warn when trivially detectable; warn when sophisticatedly detectable; make GCC more actively support broken aliasing code out of the box; make GCC always default to supporting it; etc.  (Yes, I probably made at least two or three of these myself.  ;-)

Yet almost everyone made the *same* judgement about whether the code, itself, was broken.  I believe all the active GCC developers, especially core developers, said "yes" or abstained from the discussion, anyway.
Now *that's* a simplifying assumption: work to a standard most people are likely to understand and agree on.  (I still accept that ISO C isn't ideal as a standard, but it's better than anything else we have for a C-like language, AFAIK.)

In other words, I believe I can more reliably predict how Mark Mitchell and Craig Burley (in the latter case, independent of the fact that I am he *now*; I'm speaking objectively here), among the few others campaigning for no accommodation citing ISO C, will assess most *future* requests for similar accommodations.  But I cannot predict how you or many others will assess those, because your "simple"/"easy"/"old" variables are just too fuzzily defined.

So what we'll learn from future discussions is how those variables are defined for pro-accommodation people at that moment, for the issue du jour, even assuming nobody actually *changes* their particular variables (i.e. they're constants, just fuzzily defined to the outside world).  Whereas we basically already *know* these constants for the anti-accommodation people.  (Who knows, maybe we already knew those of *others*, who have not participated, from past discussions.)

If you haven't done a thorough analysis of the efforts needed to discuss this issue now and in the past, I suggest you do one, or, better yet, hire an experienced consultant from the management-efficiency field (or whatever they're calling it these days) to do so.  Then put up the results here for us to discuss, to encourage us all to ask the question "is this amount of effort appropriate for an issue like this?", as well as "could this effort have been better directed towards other, more core, issues regarding GCC?"

That involves substantial time and effort, of course, but the result *could* be a *huge* improvement in reducing the time and effort taken to discuss issues like this.
(Whether it'll convince more or fewer of the anti-accommodation crowd than the pro-accommodation crowd doesn't really matter, as long as the data being presented was honestly collected and presented.)

>In effect, your argument is that this proposal is a bad idea because
>Craig Burley and others will spend time arguing against it.  I do not
>think that the proposal can be held responsible for that ;-).

If you view it that way, especially as a spokesman for the GNU project, which controls GCC, you are saying that the only real cost, in terms of added complexity, to being willing to consider changing the compiler to better accommodate broken code is that there will be some people who openly disagree with that policy and/or with the specifics of the proposal.  (Assuming my argument is correct, of course...which it is, since I made it.  ;-)

In effect, you are telling me that if I don't want GCC to become even more complicated and hard to maintain than necessary, or have bugs fixed in it less speedily due to long-drawn-out discussions like this, I should no longer participate in discussions like this, except perhaps to grunt things like "okay by me" or "no way".

That's certainly something I've considered, so having the GNU project leader basically tell it to me outright simplifies *my* world -- I no longer need to make an assessment *myself* as to how valuable my input *might* be considered to keep GCC simple, maintainable, and to increase its quality.

In a sense, your statement takes on the role of the ISO C standard vis-a-vis this discussion, and I obediently follow it, so I certainly appreciate the argument for simplicity you are making.  However, I will point out that you have effectively told not just me, but several others, the same thing.
If they interpret your statement the same way as I do, and thus avoid objecting to future requests for accommodations in GCC, someday the day will come when lots of users, not yet weaned from depending on GCC as some sort of "free-software install tool", demand things of GCC that even *you* wish to not (or cannot) accommodate.  At that point, you might find it suddenly very difficult to gain support for your unwillingness to accommodate those demands, and, I predict, find it similarly difficult to find people enthusiastic about doing what is necessary to meet them (since a larger percentage of GCC users will be of the unweaned variety, thus not really capable of improving GCC themselves, than if we were to have drawn the line in the sand *here* and *now*).

And a big part of the reason I've participated so forcefully in this discussion is that there have been quite a number of statements made that are either incorrect in fact or incomplete in portraying cost/benefit analyses vis-a-vis this issue.  Since *most* of those statements have been made to promote accommodation, I've leaned towards no accommodation (or at least presenting myself as such).  That is, as I'm sure Linux developers will attest from *their* experiences (based on my impressions), submitting a request to come to an agreement runs rather different risks than simply submitting a patch to implement it, of course.  ;-)

> So people can either fix their code now, or fix it later.
>
>I notice a pattern that people arguing against concern for the users
>tend to exaggerate the situation in this particular way: they change
>"some of this code may break, may need to be changed" into "all of
>this code will break, will need to be changed."

I didn't say that.  But, for the purposes of assessing these issues, that's how people maintaining their code should *think*.  (Ever hear of Murphy's Law?)
In particular, nobody (but hackers) using GCC should seriously entertain *any* hopes that GCC will manage to accommodate bugs hiding in their code, when it comes to assessing the long-term viability of their project, even though they might recognize (as most of us do) how the practicalities of compiler engineering make it *unlikely* certain bugs will be exposed over a certain lifetime.  (And hackers shouldn't care what GCC does in the long haul, as I pointed out earlier.)

Programmers should view bugs in their code like termites.  It doesn't matter that, within one square mile of their home, billions of termites will *never* contribute to the downfall of that home.  What *does* matter is that there are probably enough to bring it down, and that there's no way, ahead of time, to predict exactly which ones will, and which ones won't.  All one can do is make educated guesses, and use those to best direct one's efforts towards eradicating "suspect" termites.

But if one wishes the home to not fall, one either does not build it out of wood in the first place or adopts a zero-tolerance approach to termites (meaning constant watchfulness, for example, or locating one's home in a climate that has no termites).  That way, one does not blame one's friend for happening to let in *the* termites that finally brought down the home when he was feeding the pets during one's vacation.  Nor does one waste even two minutes trying to convince his friends that *they* must become termite-free, despite their having chosen to live in more secure dwellings themselves, because that's two minutes that could have been spent fighting the termites in or very near the home.

> having) to accommodate Merced or McKinley amounts to entertaining
> the same hopes, and therefore similar scaling of costs, as the
> Y2K problem.
>
>Compilers did not give warnings for Y2K problems, but we will
>make GCC give a warning for most of these problems.  So these situations
>are hardly similar.
Compilers *have* given warnings for Y2K problems.  g77 surely does, as of EGCS 1.1 IIRC.  And, I've heard proprietary compilers (like Compaq's, formerly Digital's) do quite a bit more in that direction than g77, though I haven't researched the issue further than to figure out what g77 could easily do.

But that wasn't my point, anyway.  What these situations *do* share *is* related to my point: in both cases, no automaton can reliably warn about all instances of the problem.  So there's no silver bullet other than programmers fixing their code or *knowing* (not just hoping, guessing, pretending, etc.) that their programs will be mothballed before the bug wall hits.  (For Y2K, that's usually around 2000-01-01.  For this bug, that's probably Merced or McKinley on desktops in quantities of a few million or so.)

From the point of view of the free-software industry, the analysis includes considering the risk of losing expertise over time as the bug wall approaches versus the opportunity to not be hit by it at all due to mothballing.  But, I believe the analysis for GCC *itself* is *much* simpler: conform to the pertinent ANSI/ISO standards, etc.

(People designing new languages and environments -- like Guile, GNOME, KDE, and so on -- are the ones especially needing to evaluate the larger context in which their *designs* will be used, since long-term acceptance of them depends on the *aggregate* usability, reliability, etc. of systems built on them.  The advantages of using off-the-shelf languages and environments like ISO C and POSIX include not having to bother with such evaluations; the dangers include thinking one can get away with "mini-evaluations" for extensions and accommodations outside of those specified for, and by, off-the-shelf components.)

tq vm,
(burley)