Nick Keighley
10/23/2009 8:18:00 AM
On 23 Oct, 01:11, "bartc" <ba...@freeuk.com> wrote:
> "Alan Curry" <pac...@kosh.dhis.org> wrote in message
> news:hbqq7r$ut6$1@aioe.org...
> > In article <hbqib9$ln...@aioe.org>, jacob navia <j...@nospam.org> wrote:
> >>I am writing bit strings now, and the problem is to find if a
> >>sequence of bits is inside a larger sequence of bits.
> >>Let's say the signature is
>
> >>bool bit_strstr(char *source, size_t source_len,
> >> char *sequence, size_t sequence_len);
>
> > When you write the documentation for this function, please clarify the bit
> > ordering. Bit endianness is not obvious enough to leave unstated. Some
> > people think the least-significant bit naturally comes first, ordering the bits
> > from (1<<0) to (1<<7). Others think the most-significant bit naturally comes
> > first, since that matches the order in which multi-digit numbers are
> > usually written. And when a byte is partially filled, it's also not obvious which
> > end will be unused.
>
> > Is the 4-bit sequence 1101 encoded as:
> > "\x0d" (high bits unused, sequence ends at low bit)
> > "\x0b" (high bits unused, sequence starts at low bit)
> > "\xd0" (low bits unused, sequence starts at high bit)
> > "\xb0" (low bits unused, sequence ends at high bit)
>
> Wouldn't it be better if this information was not disclosed? You seem to
> have explained why it isn't a good idea to directly access the bit data
> rather than going through the interface.
>
> If you imagine these bitstrings can be manipulated like arrays, and indexed
> from 0 to N-1 bits, then it may not be necessary to know the bit order
> within a char value. (And a slice of such an array could even start part-way
> through a char value.)
While I'm all for hiding stuff that shouldn't be seen, this seems to
be going too far. Consider a bit-oriented protocol. I read a bunch of
bits off a channel and then want to find a pattern in that bit
sequence (a frame marker or some such). The order the bits appear in
is kind of important!
> Whatever the order, the chances are that a 32-bit slice of such a
> bitstring, even if it was aligned to a char, would not have the bits
> laid out in memory the same way as an equivalent 32-bit int value,
> assuming you even knew which end was the least significant bit.
I'm not sure this matters.