public inbox for gcc-bugs@sourceware.org
* [Bug c/59128] New: I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA
@ 2013-11-14 10:46 jpmct01 at gmail dot com
  2013-11-14 11:01 ` [Bug c/59128] " ktkachov at gcc dot gnu.org
  2013-11-14 11:01 ` glisse at gcc dot gnu.org
  0 siblings, 2 replies; 3+ messages in thread
From: jpmct01 at gmail dot com @ 2013-11-14 10:46 UTC (permalink / raw)
  To: gcc-bugs

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59128

            Bug ID: 59128
           Summary: I use #define to set ALPHA to a constant and then (for
                    convenience) define ALPHA2 = ALPHA*ALPHA
           Product: gcc
           Version: 4.8.3
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: c
          Assignee: unassigned at gcc dot gnu.org
          Reporter: jpmct01 at gmail dot com

Created attachment 31215
  --> http://gcc.gnu.org/bugzilla/attachment.cgi?id=31215&action=edit
zip file containing code and results (with line ordered file)

I use #define as follows

#define ALPHA = 10.
#define ALPHA2 ALPHA*ALPHA


In my code I assign
f47 = ALPHA2;
which gives
ALPHA2 = 100.000000

but when I calculate f47/ALPHA2 I get the result
 f47/ALPHA2 = 100.000000

whereas when I use the form f47/(1.*ALPHA2) I get
 f47/(1.*ALPHA2) = 1.000000

see attached code and result



* [Bug c/59128] I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA
  2013-11-14 10:46 [Bug c/59128] New: I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA jpmct01 at gmail dot com
  2013-11-14 11:01 ` [Bug c/59128] " ktkachov at gcc dot gnu.org
@ 2013-11-14 11:01 ` glisse at gcc dot gnu.org
  1 sibling, 0 replies; 3+ messages in thread
From: glisse at gcc dot gnu.org @ 2013-11-14 11:01 UTC (permalink / raw)
  To: gcc-bugs

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59128

Marc Glisse <glisse at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|UNCONFIRMED                 |RESOLVED
                 CC|                            |ktkachov at gcc dot gnu.org
         Resolution|---                         |INVALID

--- Comment #2 from Marc Glisse <glisse at gcc dot gnu.org> ---
> #define ALPHA = 10.

No = there.

> #define ALPHA2 ALPHA*ALPHA

You forgot parentheses.

This has nothing to do with gcc. Look at the output of gcc ZED3.c -E and try to
understand why your code is wrong.



* [Bug c/59128] I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA
  2013-11-14 10:46 [Bug c/59128] New: I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA jpmct01 at gmail dot com
@ 2013-11-14 11:01 ` ktkachov at gcc dot gnu.org
  2013-11-14 11:01 ` glisse at gcc dot gnu.org
  1 sibling, 0 replies; 3+ messages in thread
From: ktkachov at gcc dot gnu.org @ 2013-11-14 11:01 UTC (permalink / raw)
  To: gcc-bugs

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59128

ktkachov at gcc dot gnu.org changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|UNCONFIRMED                 |RESOLVED
                 CC|                            |ktkachov at gcc dot gnu.org
         Resolution|---                         |INVALID

--- Comment #1 from ktkachov at gcc dot gnu.org ---
That's expected behaviour.
ALPHA2 expands to 10.*10.

f47/ALPHA2 is then 100.0 / 10.0 * 10.0

The * and / operators have equal precedence and associate left to right, so this
is evaluated as (100.0 / 10.0) * 10.0 = 100.0

That's why it's usually good practice to put parentheses in your #defines:

#define ALPHA2 ((ALPHA) * (ALPHA))


end of thread, other threads:[~2013-11-14 11:01 UTC | newest]

Thread overview: 3+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2013-11-14 10:46 [Bug c/59128] New: I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA jpmct01 at gmail dot com
2013-11-14 11:01 ` [Bug c/59128] " ktkachov at gcc dot gnu.org
2013-11-14 11:01 ` glisse at gcc dot gnu.org

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox;
as well as URLs for read-only IMAP folder(s) and NNTP newsgroup(s).