
comp.lang.c

a doubt about c

Ya Shou

6/11/2011 1:12:00 PM

Hello, friends.

While reading some open-source C code, I noticed something that puzzles me.
All of those programs use macros to define constants rather than enums, even though I find enumeration constants very convenient.
Enumerations are used to define constants in C++, but I have rarely seen them used in C; why?

Thank you.

---- ---- ---- ----
Ya Shou (@54c3)
54c3.blogspot.com
64 Answers

James Kuyper

6/11/2011 1:36:00 PM


On 06/11/2011 09:11 AM, Ya Shou wrote:
> Hello, friends.
>
> While reading some open-source C code, I noticed something that puzzles me.
> All of those programs use macros to define constants rather than enums, even though I find enumeration constants very convenient.
> Enumerations are used to define constants in C++, but I have rarely seen them used in C; why?

This is mostly a cultural thing; it has little to do with any
differences between the way the two languages work. There are definite
advantages to using enumeration constants over macros, but most of those
advantages apply equally well in both languages. However, those
advantages are also relatively minor, which is why it's still
commonplace to use macros instead.

Enumeration constants have the type of the enumeration in C++, whereas
they always have a type of 'int' in C. Because that affects overload
resolution and template instantiation, that can be an important
difference in C++, whereas if C adopted such a rule, it wouldn't make
much difference.
--
James Kuyper

Angel

6/11/2011 2:37:00 PM


On 2011-06-11, Ya Shou <zxyzxy12321@gmail.com> wrote:
> Hello, friends.
>
> While reading some open-source C code, I noticed something that puzzles
> me. All of those programs use macros to define constants rather than
> enums, even though I find enumeration constants very convenient.

There are some advantages of using enums over #defines in C (such as
allowing compilers to do some additional checks) but unlike C++, the
advantages in C are fairly minor.

> Enumerations are used to define constants in C++, but I have rarely seen them used in C; why?

It's mainly historical. Enums were originally not a part of the C
language when K&R designed it; they were first added in C89. Many
sources still use #defines over enums, and both forms are acceptable
as in C there is functionally very little difference between them.


--
"C provides a programmer with more than enough rope to hang himself.
C++ provides a firing squad, blindfold and last cigarette."
- seen in comp.lang.c

Shao Miller

6/11/2011 4:24:00 PM


On 6/11/2011 8:11 AM, Ya Shou wrote:
>...
> While reading some open-source C code, I noticed something that puzzles me.
> All of those programs use macros to define constants rather than enums, even though I find enumeration constants very convenient.
> Enumerations are used to define constants in C++, but I have rarely seen them used in C; why?
> ...

If you mean "integer constant expressions," I prefer to use 'enum's in C
wherever possible.

If an integer constant expression needs to be used as part of an '#if'
pre-processing directive, then I will '#define' and use a macro whose
definition will not contain any identifiers (except other macros) or
keywords.

#define ALL_ONES_CHAR ((1 << CHAR_BIT) - 1)

enum cv {
    cv_minute = 60,
    cv_timeout = 2 * cv_minute,
    cv_sector_size = 1 << 9,
    cv_max_buf_size = 4 * cv_sector_size,
    /* ... */
    cv_zero = 0
};

pete

6/11/2011 5:57:00 PM


Shao Miller wrote:
>
> On 6/11/2011 8:11 AM, Ya Shou wrote:
> >...
> > While reading some open-source C code, I noticed something that puzzles me.
> > All of those programs use macros to define constants rather than enums, even though I find enumeration constants very convenient.
> > Enumerations are used to define constants in C++, but I have rarely seen them used in C; why?
> > ...
>
> If you mean "integer constant expressions," I prefer to use 'enum's in C
> wherever possible.
>
> If an integer constant expression needs to be used as part of an '#if'
> pre-processing directive, then I will '#define' and use a macro whose
> definition will not contain any identifiers (except other macros) or
> keywords.
>
> #define ALL_ONES_CHAR ((1 << CHAR_BIT) - 1)

That's undefined if sizeof (int) equals one.

--
pete

Keith Thompson

6/11/2011 6:55:00 PM


Angel <angel+news@spamcop.net> writes:
> On 2011-06-11, Ya Shou <zxyzxy12321@gmail.com> wrote:
[...]
>> Enumerations are used to define constants in C++, but I have rarely seen them used in C; why?
>
> It's mainly historical. Enums were originally not a part of the C
> language when K&R designed it; they were first added in C89. Many
> sources still use #defines over enums, and both forms are acceptable
> as in C there is functionally very little difference between them.

Enums are older than that. If I recall correctly, they were added
shortly after K&R1, which was published in 1978.

One drawback of enums is that they're only of type int; you can't use
them to define constants of other types, including floating-point types.

Arguably using an enum to define a single constant:

enum { max = 1000 };

is an abuse of the feature, which is intended to be used to define a
group of closely related constants. Of course that doesn't stop me from
using it that way.

What I'd really like to see is for C to adopt C++'s feature, where
a const object of numeric type whose initializer is a constant
expression can be used as a constant:

const int max = 1000;
const double pi = 3.14159265358979;

But it's probably too late to add that to C201X.

(Incidentally, the word "doubt" is not synonymous with "question",
at least not in most dialects of English. A "doubt" generally
implies that you disbelieve something.)

--
Keith Thompson (The_Other_Keith) kst-u@mib.org <http://www.ghoti.ne...
Nokia
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"

dr.oktopus

6/11/2011 7:53:00 PM


On 11 Giu, 18:56, pete <pfil...@mindspring.com> wrote:
> Shao Miller wrote:
>
> >    #define ALL_ONES_CHAR ((1 << CHAR_BIT) - 1)
>
> That's undefined if sizeof (int) equals one.
>
> --

I think that ints are required to be at least 16 bits wide, so
sizeof (int) is always at least 2 (in all ANSI-conforming
implementations).

Lew Pitcher

6/11/2011 8:02:00 PM


On June 11, 2011 15:53, in comp.lang.c, blindwilly@freeonline.zzn.com wrote:

> On 11 Giu, 18:56, pete <pfil...@mindspring.com> wrote:
>> Shao Miller wrote:
>>
>> > #define ALL_ONES_CHAR ((1 << CHAR_BIT) - 1)
>>
>> That's undefined if sizeof (int) equals one.
>>
>> --
>
> I think that ints are required to be at least 16 bits wide, so
> sizeof (int) is always at least 2 (in all ANSI-conforming
> implementations).

No.

sizeof (int) may be 1 in an ISO- (and ANSI-) conforming implementation.
Granted, CHAR_BIT would have to be greater than 8 in such a configuration.

--
Lew Pitcher
Master Codewright & JOAT-in-training | Registered Linux User #112576
Me: http://pitcher.digitalfr... | Just Linux: http://jus...
---------- Slackware - Because I know what I'm doing. ------


Keith Thompson

6/11/2011 8:08:00 PM


"dr.oktopus" <blindwilly@freeonline.zzn.com> writes:
> On 11 Giu, 18:56, pete <pfil...@mindspring.com> wrote:
>> Shao Miller wrote:
>>
>> >    #define ALL_ONES_CHAR ((1 << CHAR_BIT) - 1)
>>
>> That's undefined if sizeof (int) equals one.
>
> I think that ints are required to be at least 16 bits wide,

Yes. Specifically INT_MIN <= -32767, and INT_MAX >= 32767,
which implies that you need at least 16 bits.

> so
> sizeof (int) is always at least 2 (in all ANSI-conforming
> implementations).

No. sizeof (int) is at least 2 on systems with CHAR_BIT == 8,
but if CHAR_BIT >= 16, sizeof (int) can be 1.

Almost all modern implementations, and probably all actual hosted
implementations, have CHAR_BIT==8, but I understand that there are
implementations for DSPs (digital signal processors) with larger
CHAR_BIT values.

Note that a "byte" is defined in C by the value of CHAR_BIT; if
CHAR_BIT is 16, then a byte is 16 bits, not the more common 8 bits.

Incidentally, it makes more sense to talk about ISO conforming
implementations. ISO issued the C99 standard; ANSI, the US standards
body, adopted it after publication, as did other national standards
bodies. (The previous standard was originally published by ANSI
in 1989 and adopted by ISO in 1990, and ANSI members have a strong
presence on the ISO committee, but ANSI itself has no real standing
outside the US.) Referring to ISO C can also avoid confusion since,
for historical reasons, the phrase "ANSI C" commonly refers to the
1989/1990 standard, not the current 1999 standard.

--
Keith Thompson (The_Other_Keith) kst-u@mib.org <http://www.ghoti.ne...
Nokia
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"

Joe Pfeiffer

6/11/2011 8:14:00 PM


"dr.oktopus" <blindwilly@freeonline.zzn.com> writes:

> On 11 Giu, 18:56, pete <pfil...@mindspring.com> wrote:
>> Shao Miller wrote:
>>
>> >    #define ALL_ONES_CHAR ((1 << CHAR_BIT) - 1)
>>
>> That's undefined if sizeof (int) equals one.
>>
>> --
>
> I think that ints are required to be at least 16 bits wide, so
> sizeof (int) is always at least 2 (in all ANSI-conforming
> implementations).

A brief search of the C99 standard failed to turn this up. I was able
to find (in 6.2.5):

A ``plain'' int object has the natural size suggested by the
architecture of the execution environment (large enough to contain
any value in the range INT_MIN to INT_MAX as defined in the header
<limits.h>).

(which doesn't say you can't have a 1-byte int). That said, I don't
think I've ever encountered a compiler (even for an 8-bit processor)
that didn't use at least a 16-bit int.


Angel

6/11/2011 8:42:00 PM


On 2011-06-11, Keith Thompson <kst-u@mib.org> wrote:
> Angel <angel+news@spamcop.net> writes:
>> On 2011-06-11, Ya Shou <zxyzxy12321@gmail.com> wrote:
> [...]
>>> Enumerations are used to define constants in C++, but I have rarely seen them used in C; why?
>>
>> It's mainly historical. Enums were originally not a part of the C
>> language when K&R designed it; they were first added in C89. Many
>> sources still use #defines over enums, and both forms are acceptable
>> as in C there is functionally very little difference between them.
>
> Enums are older than that. If I recall correctly, they were added
> shortly after K&R1, which was published in 1978.

I was going by Wikipedia, which claims it was first introduced by ANSI.
But it is quite possible that you are right. I was only five back then
so I don't remember 1978 too well. ;-)

> One drawback of enums is that they're only of type int; you can't use
> them to define constants of other types, including floating-point types.

Very true.

> Arguably using an enum to define a single constant:
>
> enum { max = 1000 };
>
> is an abuse of the feature, which is intended to be used to define a
> group of closely related constants. Of course that doesn't stop me from
> using it that way.

Yeah, I tend to use enums to define groups of closely related constants,
like the byte offsets in my program to parse Infinity Engine files, which
looks like this:

// Constants to help parse header blocks of *.itm files v1 and v1.1.
enum item_v1
{
    itm_v1_signature = 0x0000,
    itm_v1_version = 0x0004,
    itm_v1_generic_name = 0x0008,
    itm_v1_specific_name = 0x000c,
    [...]
    itm_v1_size = 0x0072,
};

That replaces that ugly, non-portable packed structure I first used.

> What I'd really like to see is for C to adopt C++'s feature, where
> a const object of numeric type whose initializer is a constant
> expression can be used as a constant:
>
> const int max = 1000;
> const double pi = 3.14159265358979;
>
> But it's probably too late to add that to C201X.

Yeah, #defines and macros have their place, but this seems like a much
more natural way to define a numerical constant to me.


--
"C provides a programmer with more than enough rope to hang himself.
C++ provides a firing squad, blindfold and last cigarette."
- seen in comp.lang.c