Subject: Re: a new KNF (and some comments)
To: None <tech-kern-owner@netbsd.org, tech-misc@netbsd.org>
From: Peter Seebach <seebs@plethora.net>
List: tech-misc
Date: 01/21/2000 00:53:02
In message <200001210641.PAA05868@srapc342.sra.co.jp>, Noriyuki Soda writes:
>But what I'm talking about is "keeping the ABI".
Yes, but the ABI is crufty and inefficient.
I really think we have to be willing to make a little progress here; the ABI
is spending a lot of time converting things around for no good reason.
>But for global functions which are related to the ABI, we should not use
>"short" and "char" arguments, since "short" and "char" arguments have
>ABI problems.
Well, if we don't use them at all, then we might as well use the ANSI
definition, because there's no difference in ABI.
We've changed ABIs in the past (e.g., a.out vs. ELF); I don't see
it mattering *that* much.
>And if we don't use "short" and "char" arguments, the "inefficient"
>issue doesn't matter, because a K&R style function produces exactly
>the same performance as an ANSI style one does.
But by the same token, they produce the same code, so we might as well
give the compiler the more standard specification.
>> I'd have to disagree with Chris.
>You are wrong, then. :-)
Quite possibly, but last time I disagreed with him, I was actually right.
:)
>Yes, thus, prototype should be defined as follows. (as Chris said.)
> int foo __P((int));
Ahh, but that's not what we normally do - and it's misleading, because it
implies that the function will understand the normal range of ints, and it
won't.
>If you try to define as follows:
> int foo __P((short));
>then *YOU HAVE MADE AN ABI PROBLEM*.
No, then the idea that our modern systems should handle argument passing
based on the characteristics of early PDP and VAX systems has created an
ABI problem.
ABI problems are like which side of the road you drive on; no one side is
intrinsically more right than the other. As it happens, there's a standard,
so it's probably best if we conform to the standard.
>> If, elsewhere, you say
>> int foo(short);
>> you are allowed to be using different calling conventions.
>And breaks ABI compatibility.
It's just as accurate to claim that the functions declared in the K&R style
are breaking ABI compatibility. They're both "wrong".
As it happens, gcc currently does something very interesting; if you write

    int foo(short);
    int foo(s)
    short s;
    {
    }

it pretends you always did it in the ANSI style, as I recall. Which, as you
note, would break ABI compatibility with someone who did

    extern int foo();

but we knew that already. :)
>So, it is better to use K&R style for global functions which are
>related to the ABI (e.g. functions declared in /usr/include, and kernel
>functions which can be called from device drivers and 3rd party
>filesystems). Because K&R style automatically detects ABI problems
>like the above.
No, it automatically ignores them, and/or creates them.
ANSI allows you to tell, looking at the declaration of a function, what
arguments it takes. K&R doesn't.
We *MUST* provide prototypes. They are not optional. If you are providing a
correct prototype, you can't use a K&R definition unless the default
promotions don't change anything.
Now, as it happens, almost nothing in the standard library is affected. I
seem to recall, in fact, that there were only a couple of functions in our
entire source tree that are affected either way. Given that, it won't
make much of a difference... But if we have to pick, how about we pick the
current standard, or at least the decade-old one, because those are the
ones people want to see on conformance checklists.
If we can switch from a.out to ELF, we can have something like three functions
in the entire library change to modern calling conventions. ;)
-s