comp.lang.c

#define with semicolon

cc

7/13/2011 6:20:00 PM

Is it acceptable practice to have a #define with a semicolon in it,
such as:

#define SMALL 1;

I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).
59 Answers

Keith Thompson

7/13/2011 7:38:00 PM

cc <scatnubbs@hotmail.com> writes:
> Is it acceptable practice to have a #define with a semicolon in it,
> such as:
>
> #define SMALL 1;
>
> I didn't think it was, but a very good friend of mine claims it's
> perfectly acceptable if you want to prevent the #define from being
> used in an expression like if(SMALL).

Why would you want to prevent it from being used in an expression?
I think "1;" is a poor example of what your friend is talking about.
I'd be interested in seeing a better example.

A #define can contain any token sequence you like. The macro name
will be expanded to that token sequence every time you use it.
If you want that token sequence to include a semicolon, then you
should have a semicolon in the definition.

But most of the time, a macro expansion is used either in an
expression context (in which case it *shouldn't* have any semicolons,
and it should be protected by parentheses where necessary), or
in a statement context (in which case, if it consists of multiple
substatements, you need to use the "do { ... } while (0)" trick).

--
Keith Thompson (The_Other_Keith) kst-u@mib.org <http://www.ghoti.ne...
Nokia
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"

cc

7/13/2011 7:57:00 PM

On Jul 13, 3:38 pm, Keith Thompson <ks...@mib.org> wrote:
> cc <scatnu...@hotmail.com> writes:
> > Is it acceptable practice to have a #define with a semicolon in it,
> > such as:
>
> > #define SMALL 1;
>
> > I didn't think it was, but a very good friend of mine claims it's
> > perfectly acceptable if you want to prevent the #define from being
> > used in an expression like if(SMALL).
>
> Why would you want to prevent it from being used in an expression?
> I think "1;" is a poor example of what your friend is talking about.
> I'd be interested in seeing a better example.

That was his example. That was also his explanation of why he did it
(so the compiler would complain if he used it as an expression).

Another example was from the linux kernel.

/usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
#define LDO_MAX_VOLT 3300;


> A #define can contain any token sequence you like.  The macro name
> will be expanded to that token sequence every time you use it.
> If you want that token sequence to include a semicolon, then you
> should have a semicolon in the definition.

I know what #define does. I was asking about coding standards more or
less, and if a #define with a semicolon was commonly used and accepted
practice.

> But most of the time, a macro expansion is used either in an
> expression context (in which case it *shouldn't* have any semicolons,
> and it should be protected by parentheses where necessary), or
> in a statement context (in which case, if it consists of multiple
> substatements, you need to use the "do { ... } while (0)" trick).
>

Right. So you see no logical reason to ever use something like #define
SMALL 1;? I don't either, but I was just making sure there wasn't
something I missed.

Ben Pfaff

7/13/2011 8:05:00 PM

cc <scatnubbs@hotmail.com> writes:

> Is it acceptable practice to have a #define with a semicolon in it,
> such as:
>
> #define SMALL 1;

No, I'd assume that was a typo.
--
Ben Pfaff
http://be...

Nick

7/13/2011 8:09:00 PM

cc <scatnubbs@hotmail.com> writes:

> On Jul 13, 3:38 pm, Keith Thompson <ks...@mib.org> wrote:
>> cc <scatnu...@hotmail.com> writes:
>> > Is it acceptable practice to have a #define with a semicolon in it,
>> > such as:
>>
>> > #define SMALL 1;
>>
>> > I didn't think it was, but a very good friend of mine claims it's
>> > perfectly acceptable if you want to prevent the #define from being
>> > used in an expression like if(SMALL).
>>
>> Why would you want to prevent it from being used in an expression?
>> I think "1;" is a poor example of what your friend is talking about.
>> I'd be interested in seeing a better example.
>
> That was his example. That was also his explanation of why he did it
> (so the compiler would complain if he used it as an expression).
>
> Another example was from the linux kernel.
>
> /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
> #define LDO_MAX_VOLT 3300;

Flippin' heck. I hope I'm nowhere near the keyboard when anything the
kernel controls gets close to that. Mainline or not.

>> A #define can contain any token sequence you like.  The macro name
>> will be expanded to that token sequence every time you use it.
>> If you want that token sequence to include a semicolon, then you
>> should have a semicolon in the definition.
>
> I know what #define does. I was asking about coding standards more or
> less, and if a #define with a semicolon was commonly used and accepted
> practice.
>
>> But most of the time, a macro expansion is used either in an
>> expression context (in which case it *shouldn't* have any semicolons,
>> and it should be protected by parentheses where necessary), or
>> in a statement context (in which case, if it consists of multiple
>> substatements, you need to use the "do { ... } while (0)" trick).
>>
>
> Right. So you see no logical reason to ever use something like #define
> SMALL 1;? I don't either, but I was just making sure there wasn't
> something I missed.

I can't think of one.

I had a quick look through my source collection and the only example I
could find where I had a #define ending with a ; was one of those things
where you define a macro one way then include a file, then define it
another and include the file again.
--
Online waterways route planner | http://ca...
Plan trips, see photos, check facilities | http://canalp...

Keith Thompson

7/13/2011 8:30:00 PM

cc <scatnubbs@hotmail.com> writes:
> On Jul 13, 3:38 pm, Keith Thompson <ks...@mib.org> wrote:
>> cc <scatnu...@hotmail.com> writes:
>> > Is it acceptable practice to have a #define with a semicolon in it,
>> > such as:
>>
>> > #define SMALL 1;
>>
>> > I didn't think it was, but a very good friend of mine claims it's
>> > perfectly acceptable if you want to prevent the #define from being
>> > used in an expression like if(SMALL).
>>
>> Why would you want to prevent it from being used in an expression?
>> I think "1;" is a poor example of what your friend is talking about.
>> I'd be interested in seeing a better example.
>
> That was his example. That was also his explanation of why he did it
> (so the compiler would complain if he used it as an expression).

How else would he use it?

> Another example was from the linux kernel.
>
> /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
> #define LDO_MAX_VOLT 3300;

I suspect that's just an error. Perhaps it's only used in contexts
where the extra semicolon is harmless, such as
voltage = LDO_MAX_VOLT;
which expands to
voltage = 3300;;
which is an assignment statement followed by an expression statement.

Or, worse, if it's used like this:
voltage = LDO_MAX_VOLT + 1;
then it expands to
voltage = 3300; + 1;
where the "+ 1;" is an expression statement that discards the result
(and voltage gets the wrong value).

>> A #define can contain any token sequence you like.  The macro name
>> will be expanded to that token sequence every time you use it.
>> If you want that token sequence to include a semicolon, then you
>> should have a semicolon in the definition.
>
> I know what #define does. I was asking about coding standards more or
> less, and if a #define with a semicolon was commonly used and accepted
> practice.

I'd say no. It's more commonly a mistake -- and if you're unlucky,
the compiler won't warn you about it.

>> But most of the time, a macro expansion is used either in an
>> expression context (in which case it *shouldn't* have any semicolons,
>> and it should be protected by parentheses where necessary), or
>> in a statement context (in which case, if it consists of multiple
>> substatements, you need to use the "do { ... } while (0)" trick).
>
> Right. So you see no logical reason to ever use something like #define
> SMALL 1;? I don't either, but I was just making sure there wasn't
> something I missed.

I won't say there's *never* a reason to do something like that.
There are cases where macros will expand to something other than
an expression or a statement. It usually means you're messing with
the language syntax, which is dangerous but *sometimes* useful.

Many years ago, I wrote something like:

#define EVER ;;
...
for (EVER) {
...
}

but I got better.

--
Keith Thompson (The_Other_Keith) kst-u@mib.org <http://www.ghoti.ne...
Nokia
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"

Harald van Dijk

7/13/2011 8:51:00 PM

On Jul 13, 8:56 pm, cc <scatnu...@hotmail.com> wrote:
> Right. So you see no logical reason to ever use something like #define
> SMALL 1;? I don't either, but I was just making sure there wasn't
> something I missed.

I do, though it does not apply to your case. There are lint-like tools
that allow you to declare that a macro expands to a statement. The
tool will verify that it is only ever used as a statement, but in
return, it has to actually *be* a statement, or it gets very confused.
A macro expansion that would be a statement if you add a semicolon
does not qualify.

Keith Thompson

7/13/2011 9:01:00 PM

Harald van Dijk <truedfx@gmail.com> writes:
> On Jul 13, 8:56 pm, cc <scatnu...@hotmail.com> wrote:
>> Right. So you see no logical reason to ever use something like #define
>> SMALL 1;? I don't either, but I was just making sure there wasn't
>> something I missed.
>
> I do, though it does not apply to your case. There are lint-like tools
> that allow you to declare that a macro expands to a statement. The
> tool will verify that it is only ever used as a statement, but in
> return, it has to actually *be* a statement, or it gets very confused.
> A macro expansion that would be a statement if you add a semicolon
> does not qualify.

From your description, it sounds like there are some bad lint-like
tools out there.

A macro that's intended to expand to a statement (and not to an
expression) should use the "do { ... } while (0)" trick to avoid
problems when used with if/else.

--
Keith Thompson (The_Other_Keith) kst-u@mib.org <http://www.ghoti.ne...
Nokia
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"

Harald van Dijk

7/13/2011 9:14:00 PM

On Jul 13, 10:01 pm, Keith Thompson <ks...@mib.org> wrote:
> Harald van Dijk <true...@gmail.com> writes:
> > On Jul 13, 8:56 pm, cc <scatnu...@hotmail.com> wrote:
> >> Right. So you see no logical reason to ever use something like #define
> >> SMALL 1;? I don't either, but I was just making sure there wasn't
> >> something I missed.
>
> > I do, though it does not apply to your case. There are lint-like tools
> > that allow you to declare that a macro expands to a statement. The
> > tool will verify that it is only ever used as a statement, but in
> > return, it has to actually *be* a statement, or it gets very confused.
> > A macro expansion that would be a statement if you add a semicolon
> > does not qualify.
>
> From your description, it sounds like there are some bad lint-like
> tools out there.
>
> A macro that's intended to expand to a statement (and not to an
> expression) should use the "do { ... } while (0)" trick to avoid
> problems when used with if/else.

That depends. As long as it warns for empty statements, which includes
the cases where the macro is immediately followed by a semicolon, it
is fine. Regardless of whether the macro appears in an if statement,
it expects the macro to always be used by itself. And if it is always
used by itself, it causes no problems before an else: it looks just as
you would normally use it. It is just as valid as far as C is
concerned. The main thing it has going against it is that it gets very
confusing when you mix it with macros that do expect to be followed by
a semicolon. (I don't use it myself, by the way.)

Joe Pfeiffer

7/13/2011 11:11:00 PM

cc <scatnubbs@hotmail.com> writes:

> On Jul 13, 3:38 pm, Keith Thompson <ks...@mib.org> wrote:
>> cc <scatnu...@hotmail.com> writes:
>> > Is it acceptable practice to have a #define with a semicolon in it,
>> > such as:
>>
>> > #define SMALL 1;
>>
>> > I didn't think it was, but a very good friend of mine claims it's
>> > perfectly acceptable if you want to prevent the #define from being
>> > used in an expression like if(SMALL).
>>
>> Why would you want to prevent it from being used in an expression?
>> I think "1;" is a poor example of what your friend is talking about.
>> I'd be interested in seeing a better example.
>
> That was his example. That was also his explanation of why he did it
> (so the compiler would complain if he used it as an expression).
>
> Another example was from the linux kernel.
>
> /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
> #define LDO_MAX_VOLT 3300;

I was curious enough I went and looked that one up -- it's the only
#define in the file that ends with a semicolon (even LDO_MIN_VOLT
doesn't), and a recursive grep fails to turn the symbol up anywhere else
in the kernel. I'm guessing the reason for this one was an
overly-clever way of keeping anybody from using it (for anything!) in
what seems to be a fairly new driver.

Keith Thompson

7/13/2011 11:20:00 PM

Joe Pfeiffer <pfeiffer@cs.nmsu.edu> writes:
> cc <scatnubbs@hotmail.com> writes:
>> On Jul 13, 3:38 pm, Keith Thompson <ks...@mib.org> wrote:
>>> cc <scatnu...@hotmail.com> writes:
>>> > Is it acceptable practice to have a #define with a semicolon in it,
>>> > such as:
>>>
>>> > #define SMALL 1;
>>>
>>> > I didn't think it was, but a very good friend of mine claims it's
>>> > perfectly acceptable if you want to prevent the #define from being
>>> > used in an expression like if(SMALL).
>>>
>>> Why would you want to prevent it from being used in an expression?
>>> I think "1;" is a poor example of what your friend is talking about.
>>> I'd be interested in seeing a better example.
>>
>> That was his example. That was also his explanation of why he did it
>> (so the compiler would complain if he used it as an expression).
>>
>> Another example was from the linux kernel.
>>
>> /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
>> #define LDO_MAX_VOLT 3300;
>
> I was curious enough I went and looked that one up -- it's the only
> #define in the file that ends with a semicolon (even LDO_MIN_VOLT
> doesn't), and a recursive grep fails to turn the symbol up anywhere else
> in the kernel. I'm guessing the reason for this one was an
> overly-clever way of keeping anybody from using it (for anything!) in
> what seems to be a fairly new driver.

I'm guessing that it's just a mistake that nobody has fixed yet.

Adding the semicolon won't keep it from being used. In many cases, it
won't change anything:

voltage = LDO_MAX_VOLT;

and in others it can silently change the meaning of the code:

voltage = LDO_MAX_VOLT + 1;

--
Keith Thompson (The_Other_Keith) kst-u@mib.org <http://www.ghoti.ne...
Nokia
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"