
comp.lang.ruby

Re: consider "..." as ellipsis (Re: Range syntax theory)

Peña, Botp

10/2/2004 2:17:00 AM

Bob Sidebotham [mailto:bob@windsong.bc.ca] wrote:
> Zach Dennis wrote:
> > How range operators work seems sort of backwards to me. It would
> > appear
> > as if the "..." (three dots) would be the inclusive one,
> including the
> > last value and the ".." (two dots) would be exclusive and
> exclude the
> > last value.
> If you think of "..." as ellipsis (an English language punctuation
> mark), then these definitions seem to match the Ruby interpretation:
>
> http://www.yaelf.com/punctua...
> Ellipsis: From the Greek for "to come short", originally used
> in geometry.
>
> http://www.u-aizu.ac.jp/~tripp...
> Ellipsis (...) is used to show that something has been omitted.
>
> http://en.wikipedia.org/wik...
> an ellipsis (plural: ellipses) is a row of three dots (...) or
> asterisks
> (* * *) indicating an intentional omission.

consider:

from "I, Peter, Paul, James, and Mary joined the rubyconf last year."
to "I, Peter,... Mary joined the rubyconf last year."

1. "Paul, James" were omitted in the notation, but they did indeed join
2. and Mary is _not_ excluded!

I think the ".." notation came from "...". But ".." was easier to type
(you know *nix/pl guys better), hence the history. But now we also use "...",
hence the confusion the OP raised.

One could think the other way too (reminds me of someone mentioning lateral
thinking). Maybe the English notation is _not_ so good after all. Maybe the
ellipsis should be two dots ".." (hey, we have computers now, right?). That
would make the ruby "..." a logical representation of end
exclusion...

In the end, it is not advisable to think in English when using a
programming language. If you program in ruby, just think in ruby.
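In Ruby itself, the two operators under discussion behave like this (a quick illustration of the semantics, not part of the original post):

```ruby
# ".." is inclusive: the end value is part of the range.
(1..5).to_a    # => [1, 2, 3, 4, 5]

# "..." is exclusive: the end value is omitted --
# the mnemonic being that *something* is elided.
(1...5).to_a   # => [1, 2, 3, 4]
```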

>
> Bob Sidebotham
>

kind regards -botp


2 Answers

Bob Sidebotham

10/2/2004 3:00:00 AM


Peña, Botp wrote:

> I think the ".." notation came from "...". But ".." was easier to type
> (you know *nix/pl guys better), hence the history. But now we also use "...",
> hence the confusion the OP raised.

Actually, ".." for a range predates nix/pl--it goes all the way back to
Pascal, at least... Or it may come from the fact that Algol60 had a
reference language that was distinct from the so-called implementation
language--this had to map more-or-less one-to-one with the reference
language, for any given implementation. The reference language used ":"
for ranges, but, if I remember correctly, the implementation I used
allowed ".." as a synonym, just in case you couldn't type ":" on your
favourite key punch equipment...

So, I disagree: I think that ".." came from trying to represent ":" in
ancient software/hardware environments. Why ":" was used by languages
like Algol, in the first place, I have no idea.

> In the end, it is not advisable to think in English when using a
> programming language. If you program in ruby, just think in ruby.

I'm inclined to agree. I had, however, been reading why's (poignant)
guide just the other day, and so I was thinking in natural language
terms about ruby... His reasoning around ".." vs. "..." is not the most
compelling, and I prefer the idea that, as a mnemonic, at least, I can
remember that "..." means *something* is elided, whether or not this
works in English sentence analogues.

Of course, I've never actually had occasion to *use* the "..." operator,
but then I'm still a ruby nubie.
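For what it's worth, one place "..." does earn its keep is ranging over the valid indices of a collection, since the exclusive end lines up with the collection's length (a hypothetical example, not from the thread):

```ruby
words = %w[alpha beta gamma delta]

# 0...words.length covers exactly the indices 0, 1, 2, 3 --
# the exclusive end avoids the usual length-minus-one dance.
(0...words.length).map { |i| words[i] }  # => ["alpha", "beta", "gamma", "delta"]
```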

Bob

John W Kennedy

10/2/2004 4:34:00 AM


Bob Sidebotham wrote:
> The reference language used ":"
> for ranges, but, if I remember correctly, the implementation I used
> allowed ".." as a synonym, just in case you couldn't type ":" on your
> favourite key punch equipment...

You are correct, sir!

> So, I disagree: I think that ".." came from trying to represent ":" in
> ancient software/hardware environments. Why ":" was used by languages
> like Algol, in the first place, I have no idea.

Algol paid no attention to technical restrictions at all. It had left
and right quotes, for example, and assumed availability of lower case
letters in two different alphabets. (ALGOL keywords were represented in
bold. This wasn't just a publishing convention; the language didn't have
reserved words, but, unlike FORTRAN and PL/I, did not have syntax rules
that allowed keywords to be distinguished from variable names, so the
bold distinction was syntactically necessary.) Practical ALGOL,
therefore, was a hideous mess that looked something like this:

'PROCEDURE' SUM10.,
'INTEGER' I, T.,
'BEGIN'
T .= 0.,
'FOR' I .= 1 'STEP' 1 'TO' 10 'DO'
T .= T + I.,
OUTSTRING (1, '('THE SUM OF INTEGERS 1 TO 10 IS ')').,
OUTINTEGER (1, T)
'END'

Some systems that relied heavily on ALGOL used more attractive
representations, but at the cost of no longer being isomorphic to
standard ALGOL.

--
John W. Kennedy
"The poor have sometimes objected to being governed badly; the rich have
always objected to being governed at all."
-- G. K. Chesterton. "The Man Who Was Thursday"