* Re: Re: Where did the warning go?
From: Tom St Denis @ 2009-02-25  0:23 UTC
To: Tom St Denis, Eivind LM; +Cc: gcc-help

On Tue 24/02/09 7:09 PM, "Eivind LM" <eivliste@online.no> sent:

> I'm sure there are several opinions on what "annoying" and "useless" warnings are. A warning can only annoy me if I don't do anything about it. And "unlikely to be wrong" is not sufficient for me to consider the warning to be useless.

Yeah, the problem is that you can have warnings for things that don't need warnings. For example:

    int a;
    a = 4;

    warning: the variable name "a" is too short, and doesn't contain the letter 'e'

That is a "valid" warning, since a conforming compiler may produce ANY warning it wants (the spec only specifies what warrants a diagnostic, i.e. an error). Now, if GCC emitted that warning, would you find it useful? Or would it perhaps get in the way of real work?

Now onto something less trivial ...

    char a = 'b';

You're assigning an "int" type to a char. splint will warn you about this, even though it's perfectly reasonable and well-understood [not to mention portable] code. Is that useful?

In your own example, passing 2.5 to a function declared with an int parameter has well-defined behaviour as well. It's no less defined than, say, passing 5 to a function that accepts a long.

> I take all warnings as an opportunity to learn some specific details about the language. I am a novice, and trust the GCC developers have a good reason to write each warning. It is usually easy to get the idea of the possible problem when a warning triggers in my own code. Then I either find out that the problem can absolutely never ever affect me (and then I would disable the warning), or I change my code. I have never been sure enough yet to disable any of the warnings I have seen so far.

The goal of good warnings is to detect things that are likely to be bugs, or are at least undefined behaviour, not to warn about things that have clearly defined behaviour. If you want to learn more about C, pick up the ISO C draft and read it. Don't rely on the warnings from GCC to teach you what is and isn't good C code.

> I do of course think differently if I work with old code that has too many warnings to fix. But encouraging everyone to compile their new code with as many warning flags as possible could eventually solve that problem? :)

It is indeed a good goal to have warning-free code, but adding mindless and useless casts everywhere (for instance) is just as annoying a coding practice. I wouldn't accept code that reads like

    char a = (char)'b';

It's superfluous and will make reading the code harder, not easier. It's why a lot of people avoid tools like splint (if their corporate masters don't dictate its use): 99 times out of 100 the warnings produced don't lead to anything that could even remotely be a bug. It's irresponsible to trace down and "fix" what isn't broken, especially when there are better tools out there, like valgrind, to help debug your apps.

> Anyway, we don't have to agree on this. I just wish to have a flag that lets me compile my own code with as many warnings as possible. And then the name of the -Wall flag is in the way.

I'm trying to help you by persuading you that that's not a good idea. Learn the C standard and code within its boundaries. Don't rely on superfluous warnings to avoid bugs, because at the end of the day being splint-warning-free is not a sufficient condition for bug-free code.

Tom
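
To make the conversions under discussion concrete, here is a minimal sketch (not from the original thread); assuming a GCC new enough to carry the reworked -Wconversion (roughly 4.3 and later), the value-changing conversions are flagged while the value-preserving ones compile silently, though the exact diagnostic wording varies by version:

    /* conv.c - the conversions discussed above; illustrative only.
     * Compile with:  gcc -Wall -Wconversion -c conv.c
     */
    void take_int(int x);

    void demo(int any)
    {
        char a = 'b';    /* 'b' is an int constant (98) that fits in a char: no warning */
        char c = any;    /* an arbitrary int may not fit in a char: -Wconversion flags this */

        take_int(5L);    /* long 5 passed as int: value preserved, no warning */
        take_int(2.5);   /* double 2.5 truncated to 2: flagged, the value changes */

        (void)a; (void)c;
    }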

* Re: Re: Where did the warning go?
From: Eivind LM @ 2009-02-25 13:53 UTC
To: tstdenis; +Cc: gcc-help

On Wed, 25 Feb 2009 01:23:16 +0100, Tom St Denis <tstdenis@ellipticsemi.com> wrote:

> On Tue 24/02/09 7:09 PM, "Eivind LM" <eivliste@online.no> sent:
>
>> I'm sure there are several opinions on what "annoying" and "useless" warnings are. A warning can only annoy me if I don't do anything about it. And "unlikely to be wrong" is not sufficient for me to consider the warning to be useless.
>
> Yeah, the problem is that you can have warnings for things that don't need warnings. For example:
>
>     int a;
>     a = 4;
>
>     warning: the variable name "a" is too short, and doesn't contain the letter 'e'

That is an example of a meaningless warning, and I would simply disable it if it were implemented. No big deal. If I felt like it, I would write an email to the gcc list and ask why anyone had implemented such a warning. Anyway, I am sure the GCC developers do their best to write warnings that point at actual potential problems, likely problems as well as unlikely ones.

> That is a "valid" warning, since a conforming compiler may produce ANY warning it wants (the spec only specifies what warrants a diagnostic, i.e. an error). Now, if GCC emitted that warning, would you find it useful? Or would it perhaps get in the way of real work?

It would probably take me 1 minute to disable the warning, and then it would never bother me again.

> Now onto something less trivial ...
>
>     char a = 'b';
>
> You're assigning an "int" type to a char. splint will warn you about this, even though it's perfectly reasonable and well-understood [not to mention portable] code. Is that useful?

I'll repeat myself: if the compiler can guarantee that I don't lose precision in the assignment, then I don't want a warning. In this case 'b' is a symbolic constant for the integer value 98, which can be perfectly represented as a char, so I don't want a warning. However, if I have

    int a;
    ask_user(&a);
    char b = a;

then I think a warning is in place. If splint gives a warning for the first case, then I would say that is a problem with splint. But I have never tried splint, and probably won't either, since you so strongly discourage it :)

> In your own example, passing 2.5 to a function declared with an int parameter has well-defined behaviour as well. It's no less defined than, say, passing 5 to a function that accepts a long.

As I wrote earlier, I consider these two totally different things. In the first case, the value is changed. In the second case, the value is not changed. The first case might have well-defined behaviour. But anyway, my value is changed by 20%. If I wanted to drop the decimals from 2.5, I would have cast the value to an int explicitly. That's why I want a warning in the cases where any of my values are implicitly changed.

This is my personal preference. I am not telling you which warnings you should use when you compile your own code, or which should be default in GCC. I am just saying that 1) I would like to have a warning whenever an implicit conversion happens that might be "value destroying", and 2) since I consider this a serious issue, I expect the other warnings in GCC (probably also those warnings that I am not aware of) to be serious as well. That's why I would like to enable the whole lot to find out what they can teach me.

>> I take all warnings as an opportunity to learn some specific details about the language. I am a novice, and trust the GCC developers have a good reason to write each warning. It is usually easy to get the idea of the possible problem when a warning triggers in my own code. Then I either find out that the problem can absolutely never ever affect me (and then I would disable the warning), or I change my code. I have never been sure enough yet to disable any of the warnings I have seen so far.
>
> The goal of good warnings is to detect things that are likely to be bugs, or are at least undefined behaviour, not to warn about things that have clearly defined behaviour.

So you are saying that the unlikely cases are less serious? Like the int to char assignment, which works fine because the int is *likely* to be in [0,255]? Then it turns out that the int can be 256 before assignment to char, in a very special corner case. How serious this is does not depend on how likely it is.

Generally, I would rather say the less likely cases are more serious than the highly likely ones. The highly likely cases are usually easy to discover while testing the software anyway. The less likely cases are the ones that are hard to find when you test, and the hardest-to-debug problems you receive after release. So I won't say no thanks if GCC has the ability to warn me about the less likely cases.

> If you want to learn more about C, pick up the ISO C draft and read it. Don't rely on the warnings from GCC to teach you what is and isn't good C code.

I have Bjarne's book for C++, and think it is a great reference. But I can't go about reading the whole thing and expect to be a fluent C++ programmer the next day. There are several ways to learn. One good way for me is if possible problems in my own code are pointed out to me as early as possible. That way I can look up in the book what the problem is, and consider whether it is a real issue or not. Afterwards, I will actually remember what I read in the spec, since it was directly related to my own code.

>> I do of course think differently if I work with old code that has too many warnings to fix. But encouraging everyone to compile their new code with as many warning flags as possible could eventually solve that problem? :)
>
> It is indeed a good goal to have warning-free code, but adding mindless and useless casts everywhere (for instance) is just as annoying a coding practice. I wouldn't accept code that reads like
>
>     char a = (char)'b';

I think I understand your concern. But once again, I don't think a cast is mindless or useless if it actually changes the data value. The above cast does not change the data value, and I agree it should not be necessary.

> It's superfluous and will make reading the code harder, not easier.
>
> It's why a lot of people avoid tools like splint (if their corporate masters don't dictate its use): 99 times out of 100 the warnings produced don't lead to anything that could even remotely be a bug. It's irresponsible to trace down and "fix" what isn't broken, especially when there are better tools out there, like valgrind, to help debug your apps.
>
>> Anyway, we don't have to agree on this. I just wish to have a flag that lets me compile my own code with as many warnings as possible. And then the name of the -Wall flag is in the way.
>
> I'm trying to help you by persuading you that that's not a good idea. Learn the C standard and code within its boundaries. Don't rely on superfluous warnings to avoid bugs, because at the end of the day being splint-warning-free is not a sufficient condition for bug-free code.

I agree it takes more than just warning-free to be bug-free. But some of the hard-to-debug bugs can be avoided by warnings, so I want to use the warnings for all they are worth. But we definitely have very different ideas about this, and probably won't get any closer to agreeing. But thanks for your opinions though, I learned a lot! :)

Eivind
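
A small sketch of the distinction drawn here, assuming ask_user() is a hypothetical helper that fills in an int from user input; the implicit narrowing is the case a warning is wanted for, while an explicit cast or a prior range check states the intent so that -Wconversion stays quiet:

    /* narrowing.c - illustrative only.
     * Compile with:  gcc -Wall -Wconversion -c narrowing.c
     */
    #include <limits.h>

    void ask_user(int *out);

    void demo(void)
    {
        int a;
        char b, c, d = 0;

        ask_user(&a);

        b = a;                               /* implicit narrowing: -Wconversion warns */
        c = (char)a;                         /* explicit cast: the truncation is stated intent */

        if (a >= CHAR_MIN && a <= CHAR_MAX)  /* or check the range first, then convert */
            d = (char)a;

        (void)b; (void)c; (void)d;
    }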

* Re: Where did the warning go?
From: Tom St Denis @ 2009-02-25 14:20 UTC
To: Eivind LM; +Cc: gcc-help

Eivind LM wrote:

>> You're assigning an "int" type to a char. splint will warn you about this, even though it's perfectly reasonable and well-understood [not to mention portable] code. Is that useful?
>
> I'll repeat myself: if the compiler can guarantee that I don't lose precision in the assignment, then I don't want a warning.

Don't mix types, then? I know of no reason to use "char" unless you're dealing with strings or octet data. I'd never use a char in a day-to-day expression (e.g. as an index to an array, a counter, etc.).

> However, if I have
>
>     int a;
>     ask_user(&a);
>     char b = a;
>
> then I think a warning is in place.

Why? It's defined behaviour. Your real problem is mixing types, not the promotion problems.

> As I wrote earlier, I consider these two totally different things. In the first case, the value is changed. In the second case, the value is not changed.

But it has defined behaviour.

> The first case might have well-defined behaviour. But anyway, my value is changed by 20%. If I wanted to drop the decimals from 2.5, I would have cast the value to an int explicitly. That's why I want a warning in the cases where any of my values are implicitly changed.

Don't mix types? If you're writing a DSP or other math library, chances are you wouldn't have random functions that take int and some that take float.

> I am just saying that 1) I would like to have a warning whenever an implicit conversion happens that might be "value destroying", and 2) since I consider this a serious issue, I expect the other warnings in GCC (probably also those warnings that I am not aware of) to be serious as well. That's why I would like to enable the whole lot to find out what they can teach me.

Ok, but -Wconversion exists. Don't go tacking that onto -Wall so those of us who know what we're doing get stuck with it.

> So you are saying that the unlikely cases are less serious? Like the int to char assignment, which works fine because the int is *likely* to be in [0,255]? Then it turns out that the int can be 256 before assignment to char, in a very special corner case. How serious this is does not depend on how likely it is.

No, it's less serious because it's defined behaviour.

> Generally, I would rather say the less likely cases are more serious than the highly likely ones. The highly likely cases are usually easy to discover while testing the software anyway. The less likely cases are the ones that are hard to find when you test, and the hardest-to-debug problems you receive after release.

I have yet to really have any defects found by trivial and hypersensitive syntax checking. Wait till you have a 60,000-line project with hundreds of interdependencies between functions; then you'll start worrying about something a little more serious than defined behaviour.

> So I won't say no thanks if GCC has the ability to warn me about the less likely cases.

I have to ask you, what percentage of bugs do you suppose are attributed to storing ints in chars (or similar)? 10%? 1%? 0.001%? And how much will you miss because you spend time worrying about things like this instead of just developing properly to start with?

Just like micro-optimizations can be time consuming and wasteful, so can micro-linting.

>> If you want to learn more about C, pick up the ISO C draft and read it. Don't rely on the warnings from GCC to teach you what is and isn't good C code.
>
> I have Bjarne's book for C++, and think it is a great reference. But I can't go about reading the whole thing and expect to be a fluent C++ programmer the next day. There are several ways to learn. One good way for me is if possible problems in my own code are pointed out to me as early as possible. That way I can look up in the book what the problem is, and consider whether it is a real issue or not. Afterwards, I will actually remember what I read in the spec, since it was directly related to my own code.

Yeah, but again, you want warnings for things that aren't errors or undefined behaviour. Where do you draw the line?

If you want to learn how to develop software, just pick problems and solve them with software. Then test and verify, document and support. GCC won't teach you how to be a good developer. And frankly, there is a heck of a lot more to being a software developer than knowledge of the syntax of a given language.

> I think I understand your concern. But once again, I don't think a cast is mindless or useless if it actually changes the data value. The above cast does not change the data value, and I agree it should not be necessary.

But it's your type of thinking that leads to those warnings in the first place. Then customers get wind of that and *demand* that we address them. It's really annoying.

> I agree it takes more than just warning-free to be bug-free. But some of the hard-to-debug bugs can be avoided by warnings, so I want to use the warnings for all they are worth.

Ok, but while you're wasting time chasing down every useless warning, you're *not* learning about proper defensive coding, you're *not* learning about common defects, and you're *not* becoming a good software developer.

If you really want to learn how to debug/fix software, get familiar with gdb, valgrind, and the like. Learn about common defects like buffer overflows/overruns, race conditions, etc.

> But we definitely have very different ideas about this, and probably won't get any closer to agreeing. But thanks for your opinions though, I learned a lot! :)

Just wait till you have customers with "coding standards" like MISRA or whatever that say things like "goto can never be used", right after you put together a package which uses them exclusively (for error handling). Pointless coding rules (among which I lump useless warnings) lead people to miss the bigger picture, and in the end the real defects that plague large software projects. You don't see it now, maybe because you haven't been on the working end of a large project, but trust me: you won't gain experience until you actually work on projects, and those projects will have defects, and your defects will likely not be syntax-related.

Tom
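
For contrast, a sketch of the kind of defect being pointed at here (file name and build line are illustrative): the following typically compiles without any warning under -Wall -Wconversion, yet valgrind's memcheck reports the out-of-bounds write on the very first run.

    /* overrun.c - an off-by-one heap overrun that no conversion warning will catch.
     * Build and run:  gcc -Wall -Wconversion -g overrun.c && valgrind ./a.out
     */
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *buf = malloc(8);
        if (!buf)
            return 1;
        strcpy(buf, "12345678");   /* needs 9 bytes including the terminating NUL */
        free(buf);
        return 0;
    }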

* Re: Where did the warning go?
From: Eivind LM @ 2009-02-25 15:56 UTC
To: Tom St Denis; +Cc: gcc-help

On Wed, 25 Feb 2009 15:20:34 +0100, Tom St Denis <tstdenis@ellipticsemi.com> wrote:

> Eivind LM wrote:
>
>>> You're assigning an "int" type to a char. splint will warn you about this, even though it's perfectly reasonable and well-understood [not to mention portable] code. Is that useful?
>>
>> I'll repeat myself: if the compiler can guarantee that I don't lose precision in the assignment, then I don't want a warning.
>
> Don't mix types, then? I know of no reason to use "char" unless you're dealing with strings or octet data. I'd never use a char in a day-to-day expression (e.g. as an index to an array, a counter, etc.).
>
>> However, if I have
>>
>>     int a;
>>     ask_user(&a);
>>     char b = a;
>>
>> then I think a warning is in place.
>
> Why? It's defined behaviour. Your real problem is mixing types, not the promotion problems.

So you think it is a problem to mix types? Then we agree on something. The code example was a response to your paragraph above, where you wrote that assigning an int type to a char is "perfectly reasonable and well understood". I would not write such code. If I mistakenly assign an int to a char, then I would like a warning, no matter how well defined the behaviour is.

>> As I wrote earlier, I consider these two totally different things. In the first case, the value is changed. In the second case, the value is not changed.
>
> But it has defined behaviour.
>
>> The first case might have well-defined behaviour. But anyway, my value is changed by 20%. If I wanted to drop the decimals from 2.5, I would have cast the value to an int explicitly. That's why I want a warning in the cases where any of my values are implicitly changed.
>
> Don't mix types? If you're writing a DSP or other math library, chances are you wouldn't have random functions that take int and some that take float.

Exactly: don't mix types. Don't send a double as a parameter to a function that takes int (which you wrote is well-defined behaviour). Don't compare a float to an int (which you earlier wrote is perfectly valid). If I do something like that by mistake, then I would like the compiler to warn me, no matter if it's valid or well defined, because the code might not do what I intended.

>> I am just saying that 1) I would like to have a warning whenever an implicit conversion happens that might be "value destroying", and 2) since I consider this a serious issue, I expect the other warnings in GCC (probably also those warnings that I am not aware of) to be serious as well. That's why I would like to enable the whole lot to find out what they can teach me.
>
> Ok, but -Wconversion exists. Don't go tacking that onto -Wall so those of us who know what we're doing get stuck with it.

Yes, I found -Wconversion to be very useful. I wonder how many other flags there are in GCC that might prove to be just as useful for me. If there was a -Weverything flag, then it would be easy to find out. I will not go about tacking any warnings on anyone. The only thing I'm saying about -Wall is that the name is confusing and should be changed.

>> So you are saying that the unlikely cases are less serious? Like the int to char assignment, which works fine because the int is *likely* to be in [0,255]? Then it turns out that the int can be 256 before assignment to char, in a very special corner case. How serious this is does not depend on how likely it is.
>
> No, it's less serious because it's defined behaviour.

We are talking about behaviour which is possibly unintended, right? That's when I would like a warning. I don't understand why you think the consequence (or seriousness) of the unintended behaviour is related to its likelihood to fail, or to whether the behaviour is well defined or not.

>> Generally, I would rather say the less likely cases are more serious than the highly likely ones. The highly likely cases are usually easy to discover while testing the software anyway. The less likely cases are the ones that are hard to find when you test, and the hardest-to-debug problems you receive after release.
>
> I have yet to really have any defects found by trivial and hypersensitive syntax checking. Wait till you have a 60,000-line project with hundreds of interdependencies between functions; then you'll start worrying about something a little more serious than defined behaviour.

Ok. I have about 50,000 lines of C++ code so far. The lines are spread over different libraries though, so it's not the same project.

>> So I won't say no thanks if GCC has the ability to warn me about the less likely cases.
>
> I have to ask you, what percentage of bugs do you suppose are attributed to storing ints in chars (or similar)? 10%? 1%? 0.001%? And how much will you miss because you spend time worrying about things like this instead of just developing properly to start with?

Less likely does not mean less serious. I am a human, and make mistakes from time to time. I expect to keep making mistakes, even after the next 50 years of experience with C++. But I don't want to make a mistake 50 years from now that a GCC warning today could have taught me to avoid.

> Just like micro-optimizations can be time consuming and wasteful, so can micro-linting.
>
>>> If you want to learn more about C, pick up the ISO C draft and read it. Don't rely on the warnings from GCC to teach you what is and isn't good C code.
>>
>> I have Bjarne's book for C++, and think it is a great reference. But I can't go about reading the whole thing and expect to be a fluent C++ programmer the next day. There are several ways to learn. One good way for me is if possible problems in my own code are pointed out to me as early as possible. That way I can look up in the book what the problem is, and consider whether it is a real issue or not. Afterwards, I will actually remember what I read in the spec, since it was directly related to my own code.
>
> Yeah, but again, you want warnings for things that aren't errors or undefined behaviour. Where do you draw the line?
>
> If you want to learn how to develop software, just pick problems and solve them with software. Then test and verify, document and support. GCC won't teach you how to be a good developer. And frankly, there is a heck of a lot more to being a software developer than knowledge of the syntax of a given language.

But that does not make the syntax part less important for me.

>> I think I understand your concern. But once again, I don't think a cast is mindless or useless if it actually changes the data value. The above cast does not change the data value, and I agree it should not be necessary.
>
> But it's your type of thinking that leads to those warnings in the first place. Then customers get wind of that and *demand* that we address them. It's really annoying.

By "those warnings", you mean a warning for something that is absolutely sure not to be a problem under any circumstance? Could you please write and send me some example code that causes a non-trivial warning with gcc, and where you can prove that there are no potential problems with the code? I have yet to see such a warning, and it would be very educational for me to see one.

>> I agree it takes more than just warning-free to be bug-free. But some of the hard-to-debug bugs can be avoided by warnings, so I want to use the warnings for all they are worth.
>
> Ok, but while you're wasting time chasing down every useless warning, you're *not* learning about proper defensive coding, you're *not* learning about common defects, and you're *not* becoming a good software developer.
>
> If you really want to learn how to debug/fix software, get familiar with gdb, valgrind, and the like. Learn about common defects like buffer overflows/overruns, race conditions, etc.

I use gdb and valgrind. I have done my time debugging writes outside array boundaries. I have used pthreads and debugged race conditions. But I still care about compiler warnings. I don't think there is a contradiction there.

>> But we definitely have very different ideas about this, and probably won't get any closer to agreeing. But thanks for your opinions though, I learned a lot! :)
>
> Just wait till you have customers with "coding standards" like MISRA or whatever that say things like "goto can never be used", right after you put together a package which uses them exclusively (for error handling). Pointless coding rules (among which I lump useless warnings) lead people to miss the bigger picture, and in the end the real defects that plague large software projects. You don't see it now, maybe because you haven't been on the working end of a large project, but trust me: you won't gain experience until you actually work on projects, and those projects will have defects, and your defects will likely not be syntax-related.

You keep using the word "likely". If there is only the slightest chance that one of the warnings can save me one of the really hard debugging sessions, then I will keep caring about compiler warnings.

Eivind
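
One candidate answer to that request, sketched from commonly reported behaviour of -Wconversion in GCC of that era rather than taken from this thread: arithmetic on a narrow type is performed in int and narrowed back on assignment, so even a plain increment can be flagged, although the unsigned wrap-around is well defined and may be exactly what the caller wants.

    /* counter.c - illustrative only; the warning text is paraphrased.
     * Compile with:  gcc -Wconversion -c counter.c
     */
    unsigned char next_counter(unsigned char c)
    {
        c = c + 1;   /* c + 1 is computed as int; narrowing it back to unsigned char
                        typically draws "conversion ... may alter its value" */
        return c;
    }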

* Re: Where did the warning go?
From: Tom St Denis @ 2009-02-25 16:16 UTC
To: Eivind LM; +Cc: gcc-help

Eivind LM wrote:

> So you think it is a problem to mix types? Then we agree on something. The code example was a response to your paragraph above, where you wrote that assigning an int type to a char is "perfectly reasonable and well understood". I would not write such code. If I mistakenly assign an int to a char, then I would like a warning, no matter how well defined the behaviour is.

I think it's more important not to make the mistake in the first place. If you're writing code where you freely interchange data types all willy-nilly, you have bigger problems than what warnings GCC emits. It's like saying you need spell check in your email client to write well.

> Exactly: don't mix types. Don't send a double as a parameter to a function that takes int (which you wrote is well-defined behaviour). Don't compare a float to an int (which you earlier wrote is perfectly valid).

But I would expect known behaviour. For example, if I were writing a FIR or IIR function and happened to have int data, I wouldn't expect a warning from passing an int to a function that accepts float. If I passed an "int *" to a function that takes "float *", I would expect a warning, because that code is clearly wrong and won't work properly.

> If I do something like that by mistake, then I would like the compiler to warn me, no matter if it's valid or well defined, because the code might not do what I intended.

But you shouldn't be in a position where you're freely interchanging data types in random expressions anyway. If you are, you need to re-write your algorithm from scratch.

> I will not go about tacking any warnings on anyone. The only thing I'm saying about -Wall is that the name is confusing and should be changed.

Except for everyone else who lives with it and is getting on just fine. I'm ok with an additional flag, I just don't want -Wall to change (in this respect, anyway).

>>> So you are saying that the unlikely cases are less serious? Like the int to char assignment, which works fine because the int is *likely* to be in [0,255]? Then it turns out that the int can be 256 before assignment to char, in a very special corner case. How serious this is does not depend on how likely it is.
>>
>> No, it's less serious because it's defined behaviour.
>
> We are talking about behaviour which is possibly unintended, right? That's when I would like a warning. I don't understand why you think the consequence (or seriousness) of the unintended behaviour is related to its likelihood to fail, or to whether the behaviour is well defined or not.

Because not everyone accidentally mixes types. If I store a long in an unsigned char that I know is in range [or where I don't care about the higher-order bits], I don't want my compiler bitching and whining at me over something that has clearly defined behaviour.

Let me put it this way: you can write perfectly syntactically correct code that has the complete opposite meaning of what you want, for example "if (a = 3) { ... }". I'm for catching that one, because it's a typo in 99% of cases and is good to find. Whereas storing a long in a char is *not* a typo; it's a design flaw, and it means you don't know what you're doing if you're worried about losing precision.

> Ok. I have about 50,000 lines of C++ code so far. The lines are spread over different libraries though, so it's not the same project.

And you think loss of precision is your biggest problem? ... Ok.

> Less likely does not mean less serious.

It's an irresponsible use of time to hunt down and fix things that aren't actually bugs when you can very likely have real bugs in your software.

> I am a human, and make mistakes from time to time. I expect to keep making mistakes, even after the next 50 years of experience with C++. But I don't want to make a mistake 50 years from now that a GCC warning today could have taught me to avoid.

And what I'm trying to tell you is that you're not better served by having pedantic warnings about things that aren't undefined behaviour or obvious typos.

>> Just like micro-optimizations can be time consuming and wasteful, so can micro-linting.

I like how you didn't reply to this.

> But that does not make the syntax part less important for me.

The syntax should be second nature to you. I resort to looking at the draft or the ANSI C spec maybe once a year, and even then it's over very obscure things that you don't see in day-to-day development tasks. You should know your orders of precedence and associativity off the top of your head; you should know the type promotions of expressions and whatnot right away.

> By "those warnings", you mean a warning for something that is absolutely sure not to be a problem under any circumstance?

Not everyone is as unsure about the syntax and language as you are.

> Could you please write and send me some example code that causes a non-trivial warning with gcc, and where you can prove that there are no potential problems with the code? I have yet to see such a warning, and it would be very educational for me to see one.

Well, the warning you desired that started this thread is a good example. The sort of things splint warns about are good examples, and so on.

> You keep using the word "likely". If there is only the slightest chance that one of the warnings can save me one of the really hard debugging sessions, then I will keep caring about compiler warnings.

And you will miss a whole slew of real problems because you're worried about micro-linting your code. More warnings are only a good idea if the warnings are in fact useful and likely to represent real-life bugs. Warning about the type promotion of expressions is just annoying and, frankly, a complete waste of time.

Put it this way: for every warning you want to see, try to imagine what percentage of bugs in the real world are attributed to it. Now you're gonna say "but if there is a chance ...", but then I'll say the time you waste on it is time not spent shoring up your code, then you'll say "but if there is a chance ..." and I'm just going to give up now.

Tom
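
For completeness, a sketch of the two diagnostics endorsed just above, with hypothetical names and paraphrased warning text (not quoted from a specific GCC release): the assignment-as-condition typo that -Wall catches via -Wparentheses, and the incompatible pointer argument that GCC flags even without -Wall.

    /* typo.c - illustrative only.
     * Compile with:  gcc -Wall -c typo.c
     */
    void take_floats(float *p);

    int demo(int a)
    {
        int v[4] = {0};

        if (a = 3)          /* -Wparentheses: assignment used as a truth value,
                               almost certainly a typo for a == 3 */
            a++;

        take_floats(v);     /* int * passed where float * is expected:
                               warned about as an incompatible pointer type */
        return a;
    }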

* Re: Where did the warning go?
From: John Z. Bohach @ 2009-02-25 17:16 UTC
To: gcc-help

On Wednesday 25 February 2009 08:16:23 am Tom St Denis wrote:
> Eivind LM wrote:
<...snip...>

I've been reading this thread, and there is an important point that hasn't been made yet, or at least that I would like to emphasize if it has: compiler default behavior changes are _REALLY_ annoying. Even when it's done to fix a gcc bug, or for other good reasons, it still causes a lot of churn, either in fixing Makefiles or in fixing source code.

I build custom distributions as well as various other software for a living, and just upgrading my toolchain from 4.0.2 to 4.2.2 caused almost half (of the over 200) purely open-source packages that I build to need either a patch or an upgrade to a new version... and this is especially true with C++ code.

Those clamoring for default behavior changes should consider that many (millions, probably) of source packages would likely need modifications if and when basic default behaviors change. And -Wall changes are as basic as it gets. I think it is a legitimate gripe that the warnings in -Wall are not set in stone already, and sometimes change even now, but the solution is certainly not to change them some more.

However, I recognize that people may want a -Weverything flag, and that does seem like a reasonable compromise, as it could be used as a poor man's splint or other static-analysis tool. I happen to agree with Tom that lint and such is not a substitute for good programming practices, but what the heck... if the gcc developers can be convinced to add a -Weverything, why not. AS LONG AS THE CURRENT DEFAULTS STOP CHANGING.