NetBSD-Bugs archive
Re: standards/42828: Almquist shell always evaluates the contents of ${ENV} even if non-interactive
Now that you know how NetBSD applies $ENV, I would assume that you have
no further problems - ie: for this usage, the difference from other systems
is a minor annoyance, not a serious impediment.
Correct; it's an annoyance and not a serious impediment. I'm arguing
for a fix to remove that annoyance.
If you had examples of the latter where NetBSD's application of $ENV
was a problem, I'd be much more tempted to agree to a change for the sake
of compatibility; for just user setup files, I'm not.
The annoyance is the problem. It's not a severe or urgent problem, but
it is a problem.
I would like to reach consensus on whether the benefits of fixing the
annoyance outweigh the costs. The benefits aren't huge because the
annoyance isn't huge, but I argue that the costs of fixing this are small.
Benefits:
* reduced risk of login shell freezes caused by fork bombs
* faster shell invocation for scripts
* standards compliance/meets user expectations
* reliable initial execution environment for scripts
Costs:
* implementing the fix (already done)
* backward incompatibility (only affects users that have code in
their ${ENV} file that is evaluated by non-interactive shells)
* makes some tasks less convenient
Is there anything I'm forgetting?
Would you be willing to take this up with the POSIX working group?
I don't have the time either ... and in any case, I've pretty much given
up on the current set of standards bodies - they've all become full of
people for whom the standard is what is most important, rather than the
systems that are built using the standard (ie: making the standard itself
seem better looks to me to be of greater importance than making the system
that uses the standard actually work better.)
This criticism brings up a broader question: How much should NetBSD
care about POSIX compliance?
I agree that standards committees can sometimes fail to appreciate
real-world issues in their zeal to create a theoretically better world
(see the controversy around C++98's 'export' feature [1]), but I do
appreciate the uniformity and long-term perspective they provide.
I think a POSIX violation should be classified as a bug. It might not
be fixed if there isn't enough manpower or user interest, but it should
still be considered a bug.
[1] <http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2003/n1426.pdf>
Flawed implies that there are legitimate use cases that are not
supported,
flawed just implies there is a defect - something that could be improved,
or which is less than perfect.
I like to distinguish "design flaw" from "implementation flaw". I
define a design flaw as something that precludes any implementation from
meeting a requirement (such as a use case that must be supported).
I don't think there is a requirement to evaluate ${ENV} for
non-interactive shells. It might be handy in some cases, but I can't
think of a case where it would be needed. That's why I prefer the word
"suboptimal" over "flawed".
but I have not yet come across any such use cases. Can you
provide an example where a user needs ${ENV} evaluated upon invocation
of every non-interactive shell?
"of every non-interactive shell" ? No, of course not, but that's not
what is needed - the negation of "used by no non-interactive shell"
is "used by at least one non-interactive shell" (ie: the opposite of
"none" is "one", not "all").
Let me rephrase by using an example: Suppose a user needed to change
the behavior of one particular shell script. In NetBSD, there are three
ways this might be done:
1. modify the script
2. via ${ENV}
3. change the invoker's environment (e.g., export environment
variables, open/close file descriptors, etc.) before running the script
Option #1 may not be available. Maybe the file is not writable, or
maybe modifications are too impractical (e.g., upgrading the package
providing the script will revert any changes). Option #2 is not
available on other platforms and may have unintended consequences in
other scripts. Option #3 may not be powerful enough.
The big question: Do options #1 and #3 provide enough flexibility in
practice? If so, then option #2 is not needed and we should fix this
bug. Certainly options #1 and #3 are insufficient in some cases, but I
would argue that those cases are rare and the result of an exceptional
flaw that should be dealt with in another way.
And that's easy - I make scripts (personal use scripts) all the time
by simply cutting and pasting from my history - that is, whenever I detect that
I'm going to re-execute a set of commands that I have done in the not
too distant past, I assume that if I'm doing it twice, I'm going to be
doing it again, so I simply take what I did and stick it in a script, then
run the script. That's a non-interactive shell running the script, and
in order to run correctly, it needs to run with the same environment that
my interactive shell where I first ran the commands had - that is, it
needs to have access to whatever is in ${ENV} because my interactive shell
had that access as well.
And yes, of course, I can ". ${ENV}" in the script, but then I need to
remember I have to add that, every time.
This is an example of where NetBSD's behavior is handy but not needed.
If this bug was fixed, you could still make your scripts work.
For this particular example, current behavior vs. fixed is a tradeoff
between having to remember to add the following at the top of the ${ENV}
file:
case "$-" in *i*);; *) return;; esac
versus having to remember to add the following to the scripts:
[ -r "${ENV}" ] && . "${ENV}"
Either one is annoying, but the fixed behavior is standards compliant and
won't cause your login shell to lock up if you forget the magic line.
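To make the tradeoff concrete, here is a small sketch (my own, using only a throwaway file from mktemp) showing that the guard line really does stop a non-interactive shell from reading past it when the file is dotted:

```shell
#!/bin/sh
# Sketch: write the guard line into a throwaway rc file, then have a
# non-interactive shell dot it. Nothing past the guard runs, so MARKER
# stays unset. The file name is temporary, not real configuration.
rcfile=$(mktemp) || exit 1
cat >"$rcfile" <<'EOF'
case "$-" in *i*) ;; *) return ;; esac
MARKER=reached          # interactive-only setup would go below the guard
EOF
# The dotted file returns at the guard, so MARKER is never set:
sh -c ". '$rcfile'; echo \"MARKER=\${MARKER:-unset}\""
rm -f "$rcfile"
```

Without the guard line, a shell that sources the file non-interactively (as NetBSD's current behavior does with ${ENV}) would reach the assignment and print MARKER=reached instead.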
Another use, that is less common for me, but I have done on occasion,
is to define shell functions that replace standard commands, and either do
something compatible, but in a different way (like turning "rm" into
a "mv" to a trash directory) or add trace/debugging to what is happening,
to allow debugging of complex sets of scripts to find out just where the
script is doing something it shouldn't be doing, and why. For that, I
can set ENV so a suitable environment is established for the scripts that
I want to investigate - but only if they respect it.
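As a sketch of that kind of setup, an ${ENV} file might define a function like the following; the trash location, the TRASH_DIR variable, and the lack of option handling are simplifications of mine, not details from the thread:

```shell
#!/bin/sh
# Sketch: a function an ${ENV} file could define so that "rm" moves
# files to a trash directory instead of deleting them. TRASH_DIR and
# the ~/.trash default are hypothetical, and options such as -f or -r
# are not handled in this simplified version.
trash=${TRASH_DIR:-"$HOME/.trash"}
rm() {
    mkdir -p "$trash" || return 1
    mv -- "$@" "$trash/"    # the real rm stays reachable via: command rm
}

# demonstration: create a file, "remove" it, and show where it went
tmp=$(mktemp)
rm "$tmp"
[ -e "$trash/${tmp##*/}" ] && echo "moved to trash"
```

A function (unlike a PATH entry) also intercepts calls made from other functions defined in the same shell, which is part of why this style of replacement is attractive for tracing.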
This is another example of where NetBSD's behavior is handy but not
needed. The same results can be achieved by prepending a special
directory to PATH and populating that directory with custom versions of
standard commands.
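A minimal sketch of that PATH-based approach (directory names are throwaway ones from mktemp, and the wrapper only adds tracing before delegating to the real command):

```shell
#!/bin/sh
# Sketch: put a tracing "rm" wrapper in a directory that is prepended
# to PATH. Scripts then pick up the wrapper with no help from ${ENV}.
wrapdir=$(mktemp -d) || exit 1
cat >"$wrapdir/rm" <<'EOF'
#!/bin/sh
echo "rm called with: $*" >&2     # trace, then run the real command
exec /bin/rm "$@"
EOF
chmod +x "$wrapdir/rm"

# any script started with this PATH resolves "rm" to the wrapper:
PATH="$wrapdir:$PATH" sh -c 'command -v rm'
rm -rf "$wrapdir"
```

The `command -v rm` line prints the wrapper's path, confirming that every shell started with the modified PATH (and every script it runs) sees the replacement, regardless of whether ${ENV} is read.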
Perhaps the flaw is in NetBSD's design: The ${ENV} file can arbitrarily
modify the shell execution environment (change the working directory,
modify positional parameters, close file descriptors, trap signals,
define functions, set shell options, etc.). This means that shell
script writers can't rely on a consistent initial environment.
Huh? They can't rely upon most of that anyway - when the shell that starts
the script begins, the environment can be anything.
You're right about file descriptors, but the others have well-defined
initial values. Changing them in ${ENV} could cause problems. Here are
some (somewhat silly) examples to illustrate the potential danger:
* working directory: Initially the current directory matches the
working directory of the invoker. If the user adds 'cd "${HOME}"' to
the ${ENV} file, then every shell script will assume it was invoked from
${HOME}.
* positional parameters: Initially the positional parameters match
the command line arguments passed by the invoker. If the user adds
'shift' somewhere in the ${ENV} file, all shell scripts will ignore the
first command line argument.
* shell options: Initially they're all turned off. If the user adds
'set -C' to ${ENV}, '>' will no longer work as expected for every shell
script.
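The 'set -C' case can be demonstrated in a few lines. This sketch (using a throwaway file of mine) mimics what a script would see if the option leaked in from an ${ENV} file:

```shell
#!/bin/sh
# Sketch: with noclobber (set -C) in effect, ">" refuses to overwrite
# an existing file, which would surprise any script that rewrites
# files with plain redirection.
tmp=$(mktemp) || exit 1
echo first > "$tmp"              # default behavior: overwrite works
set -C                           # as if inherited from an ${ENV} file
if echo second > "$tmp" 2>/dev/null; then
    echo "overwrite succeeded"
else
    echo "overwrite refused (noclobber)"
fi
set +C
rm -f "$tmp"
```

With the option set, the second redirection fails and the script takes the else branch; a script author who never asked for noclobber would have no reason to expect that.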
Any of that that is important to the script needs to be explicitly either
tested, or set, in the script, to ensure the environment is as it expects.
Shell scripts could use $- to test for sane shell options, but none of
them do. (Why would they? The POSIX spec says they're all turned off
by default.) Shell scripts can't tell if the current working directory
or positional parameters were modified so there's nothing to test or set.
It is doubtful this would be a significant problem in practice, but in
theory this could lead to bugs that appear to be in a script file but are
actually caused by the user's ${ENV} file.
Yes, of course it can. If you put stuff in there that breaks things, you
get what you deserve (just the same as if you put commands in directories early
in your PATH with standard names but which don't do what the standard commands
with the same names do). You're entitled to break things if you want to.
So is any other user.
If a user's ${ENV} file follows the POSIX spec, it is reasonable for the
user to expect it to work; the user doesn't "deserve" to have the login
shell lock up.
but only if the
community respectfully deals with legitimate user complaints.
I agree, but you need to understand that there's more than one way to
deal with a problem - simply making every change everyone asks for,
regardless of whether that change makes things better or not would
hardly be a responsible attitude, would it? Nor does it mean that
whatever POSIX says is necessarily correct for us.
I do not mean for respect to entail capitulation, only that the tone be
professional and not condescending or dismissive.
-Richard