Most shell scripts are quick 'n dirty solutions to non-complex problems. As such, optimizing them for speed is not much of an issue. Consider the case, though, where a script carries out an important task, does it well, but runs too slowly. Rewriting it in a compiled language may not be a palatable option. The simplest fix would be to rewrite the parts of the script that slow it down. Is it possible to apply principles of code optimization even to a lowly shell script?
Check the loops in the script. Time consumed by repetitive operations adds up quickly. If at all possible, remove time-consuming operations from within loops.
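For instance, a command whose result does not change between iterations can be hoisted out of the loop. A minimal sketch (the header.txt file and the *.txt glob are purely illustrative):

# Inefficient: 'wc' runs on every pass through the loop,
#+ even though its result never changes.
for file in *.txt
do
  header_lines=$(wc -l < header.txt)
  echo "$file (header is $header_lines lines)"
done

# Better: compute the constant value once, before the loop.
header_lines=$(wc -l < header.txt)
for file in *.txt
do
  echo "$file (header is $header_lines lines)"
done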
Use builtin commands in preference to system commands. Builtins execute faster and usually do not launch a subshell when invoked.
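A common case is arithmetic: the external expr command forks a new process on every call, while the shell's builtin arithmetic does not. A rough sketch:

z=0

# Slow: invokes the external 'expr' command.
z=$(expr $z + 1)

# Fast: builtin arithmetic expansion, no subprocess launched.
((z++))

# 'let' is another builtin alternative.
let z+=1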
Avoid unnecessary commands, particularly in a pipe.
cat "$file" | grep "$word" grep "$word" "$file" # The above command-lines have an identical effect, #+ but the second runs faster since it launches one fewer subprocess. |
Use the time and times tools to profile computation-intensive commands. Consider rewriting time-critical code sections in C, or even in assembler.
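For example, the time keyword can compare two equivalent command lines, and the times builtin reports cumulative shell usage (this sketch reuses $word and $file from the example above):

# Time each variant; discard the output, keep the timing report.
time grep "$word" "$file" > /dev/null
time awk -v w="$word" '$0 ~ w' "$file" > /dev/null

#  'times' reports the cumulative user and system time
#+ consumed by the shell and its child processes.
times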
Try to minimize file I/O. Bash is not particularly efficient at handling files, so consider using more appropriate tools for this within the script, such as awk or Perl.
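As a rough illustration, summing the first column of a (hypothetical) datafile of numbers is faster in a single awk pass than in a pure-Bash read loop:

# Slow: Bash reads and processes the file line by line.
sum=0
while read -r first rest
do
  sum=$((sum + first))
done < datafile

# Faster: awk handles the file I/O and arithmetic in one pass.
sum=$(awk '{ s += $1 } END { print s }' datafile)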
Write your scripts in a modular and coherent form, [1] so they can be reorganized and tightened up as necessary. Some of the optimization techniques applicable to high-level languages may work for scripts, but others, such as loop unrolling, are mostly irrelevant. Above all, use common sense.
For an excellent demonstration of how optimization can dramatically reduce the execution time of a script, see Example 15-47.
[1] This usually means liberal use of functions.