Subject: Re: C Language Standard(s)
To: None <current-users@NetBSD.ORG>
From: Simon J. Gerraty <sjg@zen.void.oz.au>
List: current-users
Date: 01/10/1996 08:50:23
> Right. The K&R style definition breaks things. .. except that some
> compilers chose to ignore this when a prototype is in scope.
No, the incorrect prototype breaks things. Or you could argue that the
programmer broke things by thinking that
int foo(x)
short x;
{
meant anything.
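To spell out why (a sketch with made-up names, shown as two separate
files): with an old-style definition the caller applies the default
argument promotions, so the value actually passed for x is an int, and
the only compatible prototype is int foo(int).

/* old.c - old-style (K&R) definition; callers promote x to int */
int
foo(x)
	short x;
{
	return x + 1;
}

/* caller.c - an *incorrect* prototype in scope */
int foo(short x);	/* wrong: the promoted type is int */

int
bar(void)
{
	/* a compiler that honours this prototype may pass a genuine
	 * short here, while the definition expects a promoted int,
	 * so the call can silently misbehave on some ABIs */
	return foo(42);
}

Declaring int foo(short) is the error; the K&R definition itself is
well defined.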
I've written plenty of code that deals with real-world interfaces like
network protocols and a few hardware devices. I don't recall the last
time I wrote a function that took a sub-int as an argument.
Thus old-style definitions cost me nothing - my code always behaves
correctly, and I get maximum portability. I regularly use systems
that do not have an ANSI compiler unless I install gcc - which I
don't always have time to do.
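For what it's worth, the style I mean looks roughly like this (a
made-up example, not from any real interface): the wire format may
have 8- and 16-bit fields, but the function takes plain unsigned ints
and narrows explicitly.

void
put_port(buf, port)
	unsigned char *buf;
	unsigned int port;	/* not "unsigned short" */
{
	buf[0] = (unsigned char)((port >> 8) & 0xff);
	buf[1] = (unsigned char)(port & 0xff);
}

void put_port(unsigned char *, unsigned int); then describes the same
calling interface as the definition, with or without an ANSI compiler.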
> Doing the whole thing ANSI style guarantees the best possible chance of
> being able to pass arguments of the types we want to consistently. I
Yes. Though with the probable exception of the kernel, most of the
functions I've seen with sub-int args have been ill conceived.
> to the other issues we'd face porting this code to a non ANSI environment,
> it's nothing, and further, it's a sort of nothing that can be done
> automatically if we want it.
Not if you fill the code base with
char foo(short x) {...}
When you convert that sort of thing back to K&R you will have a _lot_
of debugging work to do.
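Roughly (hypothetical names; the two forms are given different names
only so that both compile in one file):

/* ANSI form: a caller with the prototype in scope passes x as short */
char
foo_ansi(short x)
{
	return (char)(x & 0x7f);
}

/* mechanically converted K&R form: the default promotions now apply,
 * so every caller passes an int; a leftover prototype saying
 * char foo(short); no longer matches this definition, and nothing
 * warns you - that is where the debugging time goes */
char
foo_knr(x)
	short x;
{
	return (char)(x & 0x7f);
}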
I would not even consider trying to run unproto (or whatever) over the kernel.
Porting gcc to the platform as a first step is far more likely to
work.
--sjg