comp.lang.ruby

Ada vs Ruby

Marc Heiler

4/15/2008 11:26:00 AM

Hi,

On http://www.gcn.com/print/27_8/46... Ada is touted briefly.

The sentences that most jumped out at me (and hurt my brain a bit)
were these:

"[...] Ada has a feature called strong typing. This means that for every
variable a programmer declares, he or she must also specify a range of
all possible inputs.[...]"

"[...] This ensures that a malicious hacker canâ??t enter a long string of
characters as part of a buffer overflow attack or that a wrong value
wonâ??t later crash the program. [...]"

But clearly that is simple to do in Ruby as well (and I have never heard
of a buffer overflow outside of the C world anyway): just specify which
input range is allowed and discard the rest, warn the programmer, or
simply convert it to the nearest allowed value (a rough sketch below) -
am I missing something? Maybe there are some other reasons why Ada is
still so en vogue for aviation software, but I don't really get it
(other than legacy code that has been sitting there for thousands of
years already). Maybe it is a paradigm that is only possible in Ada.
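
Something like this minimal sketch is what I mean - the names are made
up, just to illustrate the idea:

  ALLOWED = 1..31

  # reject anything outside the allowed range...
  def accept(value)
    raise ArgumentError, "#{value} not in #{ALLOWED}" unless ALLOWED.include?(value)
    value
  end

  # ...or silently convert it to the nearest allowed value
  def clamp(value)
    [[value, ALLOWED.min].max, ALLOWED.max].min
  end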

Ruby being too slow would be something I could not quite understand,
insofar as you could write parts in C anyway, or you could use (in the
case of replacing Ada) Lua - I'd figure Lua would be quite fast.
Somehow, despite that, Ada is still in use; to me it seems like a
"dead" language (meaning no one really learns it because there are
better alternatives available).

The biggest confusion I get here is simply that strong typing is touted
as a very good thing to have. I don't know if this is the case or not,
but it seems to me that this is more behaviour that is imposed onto
the programmer anyway (as in, he must do extra work to ensure his
variables are a certain way etc.).
For example, the "strong typing" as described here appears to me more
like "force the programmer to do this and that". This may have
advantages in the long run, I don't know - maybe fewer bugs or no
buffer overflow problems - but to me it is still forcing the programmer
to comply. I don't get what is so great about having to worry about
many details. And on blogs you do sometimes see proponents of this
solution scold people who use another solution (not only typing, but
also test-driven development and so on...)
--
Posted via http://www.ruby-....

95 Answers

Robert Dober

4/15/2008 11:53:00 AM


On Tue, Apr 15, 2008 at 1:26 PM, Marc Heiler <shevegen@linuxmail.org> wrote:
> Hi,
>
> On http://www.gcn.com/print/27_8/46... Ada is touted briefly.
>
> The sentence(s) that most jumped into my eye (and hurt my brain a bit)
> was this:
>
> "[...] Ada has a feature called strong typing. This means that for every
> variable a programmer declares, he or she must also specify a range of
> all possible inputs.[...]"
>
> "[...] This ensures that a malicious hacker can't enter a long string of
> characters as part of a buffer overflow attack or that a wrong value
> won't later crash the program. [...]"
>
> But clearly that is simple to do in ruby as well (and I never heard of a
> buffer overflow outside of the C world anyway): Just specify which input
> range would be allowed and discard the rest, warn the programmer, or
> simply convert it to the nearest allowed value - am I missing on
> something? Maybe there are some other reasons why Ada is still so en
> vogue for aviation software but I dont really get it (other than legacy
> code that was sitting there for thousand of years already). Maybe it is
> a paradigm that is only possible in Ada.
I was lucky enough to write an Ada debugger in Ada for Ada83 in 1986,
and I have to tell you that it was indeed revolutionary for its safety
concepts. Agility was of course not at all a design requirement of the
DoD, which chose the final design of the language as proposed by Jean
Ichbiah.

http://en.wikipedia.org/wiki/Ada_%28programming_l...

As you can read above there is some discussion about the real value of
Ada, but I have to admit that living in the Ada world and being paid
for nothing else than using and studying it was a nice time and put me
into a mindset of its own.

It is for sure the champion of early failure (its compiler probably
detects more potential runtime errors, especially in multitasking,
than any other), and I believe that this makes it very valuable in
mission-critical domains.
> Ruby being too slow would be something I could not quite understand
> insofar that, after all you could write parts in C anyway, or you could
> use (in the case of replacing ADA) Lua - I'd figure Lua would be quite
> fast. Somehow despite that Ada is still in use, to me it seems like a
> "dead" language (means noone really learns it because there are better
> alternatives available)
Dead? I would be very much surprised, just restricted to a domain
where it is useful.
>
> The biggest confusion I get here is simply that strong typing is touted
> as a very good thing to have.
Under some conditions it is.
>I dont know if this is the case or not,
> but it seems to me that this is more "behaviour" that is imposed onto
> the programmer anyway (as in, he must do extra work to ensure his
> variables are a certain way etc..)
Oh, it is an awful lot of work, but less than in C++, I feel.
> For example, the "strong typing" as described here appears to me more a
> "force the programmer to do this and that".
Wait a second it is still the programmer who is defining the types ;)
>This may have advantages in
> the long run, I dont know, maybe fewer bugs or no buffer overflow
> problems, but to me it still is forcing the programmer to comply. I dont
> get what is so great about having to worry about many details. And on
> blogs you do sometimes see proponents of this solution scold on the
> people that use another solution (not only typing, but also test driven
> development and so on...)
If I had been an Ada programmer for the last 20 years, I definitely
would not know about the other domains and the usefulness of duck
typing and agile development.
It is an old story repeating itself, like history. There were people
programming in assembler (or even machine code) for a living, and then
they were asked about Fortran - what do you think they said?

Robert


--
http://ruby-smalltalk.blo...

---
Whereof one cannot speak, thereof one must be silent.
Ludwig Wittgenstein

Michael Neumann

4/15/2008 12:29:00 PM


Marc Heiler wrote:
> Hi,
>
> On http://www.gcn.com/print/27_8/46... Ada is touted briefly.
>
> The sentence(s) that most jumped into my eye (and hurt my brain a bit)
> was this:
>
> "[...] Ada has a feature called strong typing. This means that for every
> variable a programmer declares, he or she must also specify a range of
> all possible inputs.[...]"
>
> "[...] This ensures that a malicious hacker canâ??t enter a long string of
> characters as part of a buffer overflow attack or that a wrong value
> wonâ??t later crash the program. [...]"
>
> But clearly that is simple to do in ruby as well (and I never heard of a
> buffer overflow outside of the C world anyway): Just specify which input
> range would be allowed and discard the rest, warn the programmer, or
> simply convert it to the nearest allowed value - am I missing on
> something? Maybe there are some other reasons why Ada is still so en
> vogue for aviation software but I dont really get it (other than legacy
> code that was sitting there for thousand of years already). Maybe it is
> a paradigm that is only possible in Ada.

You're right. The problem in C is that C strings do not have a length;
they are just pointers, and strings have to be zero-terminated. That is
a very bad thing. Imagine there is no terminating zero: any call to a
string-related function will then read through memory and most likely
result in an exception. And determining the length of a string is
O(n). But the real security issue is that some functions that read
input don't take a maximum length. gets(3) is one example: it reads a
line into a buffer regardless of how long the buffer is.

But this is more a library related problem, not so much language
related. There are string libraries out there for C that are safe.
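
By contrast, a Ruby string always carries its own length and indexing
is bounds-checked - a trivial illustration:

  s = "hello"
  s.length       # => 5; the length is stored with the string, no scan for a terminating zero
  s[100]         # => nil; an out-of-range index returns nil instead of reading stray memory
  s << " world"  # the buffer grows as needed, so there is no fixed-size buffer to overflow

(Of course that only covers the Ruby level; the interpreter itself is
written in C.)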

Ada compilers have to pass a lot of tests before they get a
certificate. A huge problem in general is that you can't trust the
compiler, especially not an optimizing compiler: it might produce
buggy code even if your program is correct. That's where Ada shines.

Then, the C language is not type-safe: you can do all kinds of type
casts, and there are numerous constructs in C that increase the
possibilities for errors. Ada is a lot better here too. For example,
you can limit the range of an integer.

Furthermore, Ada has built-in support for processes and synchronization
primitives. C and C++ just can't do that reliably, as there is no
language support. That's partly why C++0x, the upcoming version of
C++, exists: one of its goals is to make C++ multi-thread safe.

And Ada's language specification is very detailed, whereas that of C
leaves many things open, which is not desirable - you don't want any
surprises here. This problem came up recently in the GNU Compiler
Collection (GCC), where the behaviour of the generated code changed
just because the C spec didn't specify it. This broke some
applications and operating systems, and possibly introduced a lot
of unknown bugs. Not something you can build reliable software on.

> Ruby being too slow would be something I could not quite understand
> insofar that, after all you could write parts in C anyway, or you could
> use (in the case of replacing ADA) Lua - I'd figure Lua would be quite
> fast. Somehow despite that Ada is still in use, to me it seems like a
> "dead" language (means noone really learns it because there are better
> alternatives available)

You will never ever be able to use Ruby for aviation software, neither
Lua, Python, Perl etc.

It's not about slowness. Realtime systems can be slow as long as they
meet their deadlines. Indeed, a lot of real-time systems are very slow.
They use 20-year-old technology, no caches, no speculation etc., just
because in real-time systems you always have to calculate with the
longest possible execution time, and modern processors only improve
average execution time.

Ada is not that bad at all. It's a beautiful language, maybe a bit
verbose, but very powerful. Personally, I like it more than C++.

> The biggest confusion I get here is simply that strong typing is touted
> as a very good thing to have. I dont know if this is the case or not,
> but it seems to me that this is more "behaviour" that is imposed onto
> the programmer anyway (as in, he must do extra work to ensure his
> variables are a certain way etc..)
> For example, the "strong typing" as described here appears to me more a
> "force the programmer to do this and that". This may have advantages in
> the long run, I dont know, maybe fewer bugs or no buffer overflow
> problems, but to me it still is forcing the programmer to comply. I dont
> get what is so great about having to worry about many details. And on
> blogs you do sometimes see proponents of this solution scold on the
> people that use another solution (not only typing, but also test driven
> development and so on...)

Well, in the case of safety-critical software, you don't want to have
runtime exceptions. This software must not have errors - at least
that's desirable ;-)

Duck-typing doesn't guarantee you anything at compile-time.
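
A trivial illustration - the mistake below is only caught when the
line actually runs, never earlier:

  def add(a, b)
    a + b
  end

  add(3, 4)    # => 7
  add("3", 4)  # raises TypeError at run time; nothing flags it beforehand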

Regards,

Michael


Robert Dober

4/15/2008 1:30:00 PM


On Tue, Apr 15, 2008 at 2:28 PM, Michael Neumann <mneumann@ntecs.de> wrote:
> Marc Heiler wrote:
<snip>
>
> You will never ever be able to use Ruby for aviation software, neither
> Lua, Python, Perl etc.
Wanna bet?

Robert



--
http://ruby-smalltalk.blo...

---
Whereof one cannot speak, thereof one must be silent.
Ludwig Wittgenstein

Avdi Grimm

4/15/2008 2:15:00 PM


On Tue, Apr 15, 2008 at 9:29 AM, Robert Dober <robert.dober@gmail.com> wrote:
> > You will never ever be able to use Ruby for aviation software, neither
> > Lua, Python, Perl etc.
> Wanna bet?

I think it depends on what is meant by "aviation software". I
wouldn't use Ruby for embedded avionics, for several reasons. But I
might use it (or Lua, or...) to power a visual display of the state of
that avionics, for example.

--
Avdi

Home: http:...
Developer Blog: http:.../devblog/
Twitter: http://twitte...
Journal: http://avdi.livej...

Robert Dober

4/15/2008 2:22:00 PM


On Tue, Apr 15, 2008 at 4:15 PM, Avdi Grimm <avdi@avdi.org> wrote:
> On Tue, Apr 15, 2008 at 9:29 AM, Robert Dober <robert.dober@gmail.com> wrote:
> > > You will never ever be able to use Ruby for aviation software, neither
> > > Lua, Python, Perl etc.
> > Wanna bet?
>
> I think it depends on what is meant by "aviation software". I
> wouldn't use Ruby for embedded avionics, for several reasons. But I
> might use it (or Lua, or...) to power a visual display of the state of
> that avionics, for example.
>
You know one can bet any value on statements like "X will never
happen". When am I going to pay? I can only win.
Sorry could not resist ;).
R.


--
http://ruby-smalltalk.blo...

---
Whereof one cannot speak, thereof one must be silent.
Ludwig Wittgenstein

Britt

4/15/2008 4:31:00 PM


On Apr 15, 6:26 am, Marc Heiler <sheve...@linuxmail.org> wrote:
> Hi,
>
> On http://www.gcn.com/print/27_8/46116... Ada is touted briefly.
>
> The sentence(s) that most jumped into my eye (and hurt my brain a bit)
> was this:
>
> "[...] Ada has a feature called strong typing. This means that for every
> variable a programmer declares, he or she must also specify a range of
> all possible inputs.[...]"
>

I am an Ada programmer. The quoted statement from the GCN article is
not correct as written - "must" should be "may". Many languages,
including C++ and Java, claim to be strongly typed. Strong typing is
a very desirable language feature. One key difference with Ada is
that Ada supports strong typing and optional range constraints of
primitive (e.g. integer, fixed-point and floating-point) types.

> "[...] This ensures that a malicious hacker can't enter a long string of
> characters as part of a buffer overflow attack or that a wrong value
> won't later crash the program. [...]"
>
> But clearly that is simple to do in ruby as well (and I never heard of a
> buffer overflow outside of the C world anyway): Just specify which input
> range would be allowed and discard the rest, warn the programmer, or
> simply convert it to the nearest allowed value - am I missing on
> something? Maybe there are some other reasons why Ada is still so en
> vogue for aviation software but I dont really get it (other than legacy
> code that was sitting there for thousand of years already). Maybe it is
> a paradigm that is only possible in Ada.
>
> Ruby being too slow would be something I could not quite understand
> insofar that, after all you could write parts in C anyway, or you could
> use (in the case of replacing ADA) Lua - I'd figure Lua would be quite
> fast. Somehow despite that Ada is still in use, to me it seems like a
> "dead" language (means noone really learns it because there are better
> alternatives available)

Ada is far from dead - it's a great general-purpose language and is
currently being used on new projects. In the high-assurance domains
where it is principally used, there is currently nothing better,
certainly not C++ or Java. There is also the SPARK
(www.sparkada.com) subset of Ada and its associated set of
formal-methods-based static analysis tools. I use SPARK and, though it
requires a certain mindset to use effectively, I think it's the "real
deal" for producing the highest-quality code (i.e., free of initial
defects). We really don't expect to find many bugs during debugging
or formal testing, at least not many that can't be traced back to
a missing or ambiguous requirement.

>
> The biggest confusion I get here is simply that strong typing is touted
> as a very good thing to have. I dont know if this is the case or not,
> but it seems to me that this is more "behaviour" that is imposed onto
> the programmer anyway (as in, he must do extra work to ensure his
> variables are a certain way etc..)
> For example, the "strong typing" as described here appears to me more a
> "force the programmer to do this and that". This may have advantages in
> the long run, I dont know, maybe fewer bugs or no buffer overflow
> problems, but to me it still is forcing the programmer to comply. I dont
> get what is so great about having to worry about many details. And on
> blogs you do sometimes see proponents of this solution scold on the
> people that use another solution (not only typing, but also test driven
> development and so on...)
> --
> Posted via http://www.ruby-....

"worry about many details" isn't great fun but its necessary for
safety and/or security critical software. If a well specified
programming language and its associated compilers/ static analysis
tools help me to manage the details all the way from the big picture
design down to bit-level ASIC interfaces, then I welcome the help.

- Britt

mockturtle

4/15/2008 5:20:00 PM


Interesting thread... also because I use both Ruby and Ada. No,
better: since I _love_ both Ruby and Ada. Yes, they could not be more
different and... no, I do not have any split-personality problem
(at least, none that I am aware of... :-)

In my personal experience, they are both great languages and each one
"shines" in its field. I use Ruby for small to medium-large
applications, where "duck typing" allows you to write good and
flexible software in little time. However, I discovered that when I go
to large or very large applications, a pedantic language like Ada
(which would not allow you to write sqrt(5) because "5" is an integer
and not a float... my first Ada program...) is a better choice, since
many errors are caught at compile time and many others at the first
few runs by the checks automatically inserted by the compiler. For
example, if you write

type Month_Day is new Integer range 1..31;

MD : Month_Day := 30;

MD := MD + 3;

you will get a runtime error because MD exits the allowed range.
In C this bug could comfortably sleep for centuries...

Moreover, if you define

type Counter is new Integer;

Ada's strong typing will prevent you from assigning a value of type
Month_Day to a variable of type Counter (the magic word is "new"), and
this makes a lot of sense, unless in your application it makes sense
to convert a day into a counter. I discovered that when your software
grows larger, this kind of constraint, which you _ask the compiler_ to
enforce on you, can really help. [There are *lots* of discussions
about the usefulness of introducing new incompatible types. The
sentence above is just my opinion, based on some personal experience.
I hope I did not open a new can of worms...]
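
If you wanted to approximate this in Ruby, you could wrap the check in
a small class - a rough sketch, not equivalent to what the Ada
compiler gives you for free:

  class MonthDay
    RANGE = 1..31

    attr_reader :value

    def initialize(value)
      raise RangeError, "#{value} not in #{RANGE}" unless RANGE.include?(value)
      @value = value
    end

    def +(n)
      MonthDay.new(@value + n)   # the range is re-checked on every operation
    end
  end

  md = MonthDay.new(30)
  md + 3   # raises RangeError at run time, like the Ada example above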

Maybe your initial productivity (measured in lines of code written per
unit of time) will be smaller because of the loss of flexibility,
but if your software is very large you gain in debugging and
maintenance time.

Of course, if you just want to extract data from a CSV file, or write
a wget-like program, Ada can be a "gun for mosquitos."

Todd Benson

4/15/2008 5:46:00 PM


On Tue, Apr 15, 2008 at 12:25 PM, framefritti@gmail.com
<framefritti@gmail.com> wrote:
> Interesting thread... also because I use both Ruby and Ada. No,
> better...
> since I _love_ both Ruby and Ada. Yes, they could not be more
> different and...
> no, I do not have any split-personality problem (at least, non that I
> am
> aware of it...:-)
>
> In my personal experience, they are both great languages and each one
> "shines"
> in its field. I use Ruby for small to medium-large applications where
> "duck
> typing" allows you to write good and flexible software in little
> time. However,
> I discovered that when I go to large to very large applications, a
> pedantic language
> as Ada (which would not allow you to write sqrt(5) because "5" is an
> integer and
> not a float... my first Ada program...) is a better choice since many
> errors
> are caught at compile time and many others just at the first few runs
> by the
> checks automatically inserted by the compiler. For example, if you
> write
>
> type Month_Day is new Integer range 1..31;
>
> MD : Month_Day := 30;
>
> MD := MD + 3;
>
> you will get a runtime error because MD exit from the allowed range.
> In C this bug could comfortably sleeps for centuries...
>
> Moreover, if you define
>
> type Counter is new Integer;
>
> Ada strong typing will prevent you to assign a value of type Month_Day
> to
> a variable of type Counter (the magic word is "new") and this makes a
> lot
> of sense, unless in your application you can convert a day into a
> counter.
> I discovered that when your software grows larger, this kind of
> constraints
> that you _ask to the compiler_ to enforce on you, can really help.
> [there
> are *lots* of discussion about the usefulness of the introduction of
> new incompatible types. The sentece above is just my opinion,
> based on some personal experience. I hope I did not open a new
> can of worms...]

You can "type" your variables in Ruby if you have to. I don't think
that's the problem. It's the possibly reckless meta-programming in
libraries you use (I'm not talking about you, Trans, I think Facets is
great).

Being an engineer and a db guy, you would think that Ruby is the most
god-awful thing I've ever seen. Well, it has its place.

For realtime, Michael is right about the "time of execution" being
_the_ important thing. I would like to see in the future, however, a
Ruby that talks to the hardware like RTLinux or QNX. I'd take up such
a project myself, except I don't know enough C or assembly. I suppose
you'd have to allow certain objects to have free rein over the
processor/memory. Like an Object#become_real, though that's a little
scary :)

Todd

Bill Kelly

4/15/2008 8:00:00 PM



From: <framefritti@gmail.com>
>
> For example, if you write
>
> type Month_Day is new Integer range 1..31;
>
> MD : Month_Day := 30;
>
> MD := MD + 3;
>
> you will get a runtime error because MD exit from the allowed range.
> In C this bug could comfortably sleeps for centuries...

The example you've provided causes me to wonder whether such
language-level range limiting could instill a false sense of
security in the programmer.

Please have your Ada program send me an email on February 31st!

<grin>

Seems like range checking would work well for Month range 1..12;
but not so well for Month_Day... ?
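
In Ruby I'd reach for the date library rather than a bare range -
roughly:

  require 'date'

  Date.valid_date?(2008, 2, 31)  # => false; a plain 1..31 check lets this through
  Date.valid_date?(2008, 2, 29)  # => true, 2008 is a leap year

...but of course that's validation logic, not a type constraint.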


Regards,

Bill



Eleanor McHugh

4/15/2008 11:55:00 PM


On 15 Apr 2008, at 13:28, Michael Neumann wrote:
> You will never ever be able to use Ruby for aviation software, neither
> Lua, Python, Perl etc.

You provide the budget, I'll provide the code ;) Having designed and
implemented avionics systems, I see nothing in Ruby or any other
scripting language that would stand in the way of using it to do the
same thing. In fact, Lua began its life as a language for device
control. That's not to say that MRI is particularly suited to the
task, but the necessary changes could be made, if anyone wanted to,
without having to change the language's syntax and semantics.

> It's not about slowness. Realtime systems can be slow as long as
> they meet their deadlines. Indeed, a lot of real-time systems are
> very slow.
> They use 20 year old technology, no caches, no speculation etc.,
> just because in real-time systems, you always have to calculate with
> the
> longest possible execution time, and modern processors only improve
> average execution time.

It's true that realtime execution is easier when you get the execution
windows balanced, but it's mostly about coding defensively and knowing
how to handle failure states and recover when calculations exceed
their desired execution budget. The latter is particularly important
as many calculations have unpredictable run-time characteristics.
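
In MRI the crude version of that is a timeout plus an explicit
fallback - a sketch only, with made-up names and numbers:

  require 'timeout'

  def attitude_estimate
    Timeout.timeout(0.02) { expensive_refinement }  # hypothetical 20 ms budget
  rescue Timeout::Error
    last_known_good_estimate                        # fail safe rather than fail silent
  end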

As for the reason 20-year-old technology is so popular, you don't have
to look much further than the low cost of that generation of
processors and the low computational requirements of many problems: a
PIC17C42, for example, has all the grunt you could ever want for
steering a light aircraft, and a Dragonball is more than adequate for
real-time GPS navigation. Chucking even a Pentium at these jobs would
be overkill unless you want to run a Windows kernel.

> Well, in the case of safety critical software, you don't want to
> have runtime exceptions. This software must not have errors, at
> least it's desirable ;-)

There's nothing wrong with runtime exceptions so long as you figure
out what the correct fail-safe behaviour of the system is and make
sure it takes it. In fact, for high-spec aviation systems, where
there's a statistical risk of cosmic-ray interference flipping bits at
run-time, I'd want to see the fail-safe strategy before I even
considered the rest of the system design (although admittedly that was
a consideration that always made me laugh when I was doing my CAA
certifications ;).

> Duck-typing doesn't guarantee you anything at compile-time.

True. But nothing guarantees you anything at run-time, including 100%
compliance at compile-time. That's why most CS and IS degrees have
lectures explaining the difference between Verification (what your
compiler does) and Validation (what you do before you start coding).

As a rule of thumb, even the highest-quality systems will have one bug
for every 30,000 lines of source code (only 1% of the bug density of
standard shrink-wrap applications), which can amount to tens of
thousands of defects. These are not 'errors' in the sense that a
compiler understands them, but genuine misunderstandings of the
problem space in question that will lead to actively dangerous
software states.


Ellie

Eleanor McHugh
Games With Brains
http://slides.games-with-...
----
raise ArgumentError unless @reality.responds_to? :reason