
How can we run a command stored in a variable?

Solutions:


This has been discussed in a number of questions on unix.SE; I’ll try to collect all the issues I can come up with here. Below is a description of why and how the various attempts fail, a way to do it properly with a function (for a fixed command), with shell arrays (Bash/ksh/zsh), or with the $@ pseudo-array (POSIX sh), and some notes about using eval to do this. Some references are at the end.


Why it fails

The reason you face those problems is word splitting and the fact that quotes expanded from variables don’t act as quotes, but are just ordinary characters.

(Note that this is similar to every other programming language: e.g. char *s = "foo()"; printf("%s\n", s) does not call the function foo() in C, but just prints the string foo(). The shell is a programming language, not a macro processor.)
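
You can see both effects in the shell itself; a minimal demonstration (not taken from the question):

$ var='"a b"'
$ printf '<%s>\n' $var
<"a>
<b">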

The cases presented in the question:

The assignment here assigns the single string ls -l "/tmp/test/my dir" to abc:

$ abc="ls -l "/tmp/test/my dir""

Below, $abc is split on whitespace, and ls gets the three arguments -l, "/tmp/test/my and dir" (with a quote at the front of the second and another at the back of the third). The option works, but the path gets incorrectly processed:

$ $abc
ls: cannot access '"/tmp/test/my': No such file or directory
ls: cannot access 'dir"': No such file or directory

Here, the expansion is quoted, so it’s kept as a single word. The shell tries to find a program literally called ls -l "/tmp/test/my dir", spaces and quotes included.

$ "$abc"
bash: ls -l "/tmp/test/my dir": No such file or directory

And here, $abc is split, and only the first resulting word is taken as the argument to -c, so Bash just runs ls in the current directory. The other words are arguments to bash, and are used to fill $0, $1, etc.

$ bash -c $abc
'my dir'
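
As an aside, a small made-up test shows where those extra words end up:

$ bash -c 'echo "0=$0 1=$1"' first second
0=first 1=second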

With bash -c "$abc", and eval "$abc", there’s an additional shell processing step, which does make the quotes work, but also causes all shell expansions to be processed again, so there’s a risk of accidentally running e.g. a command substitution from user-provided data, unless you’re very careful about quoting.
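
A harmless, hypothetical illustration of that extra expansion step:

$ abc='echo "today is $(date)"'
$ bash -c "$abc"    # the command substitution is expanded again, in the child shell
$ eval "$abc"       # ...and here it runs in the current shell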


Better ways to do it

The two better ways to store a command are (a) to use a function instead, or (b) to use an array variable (or the positional parameters).

Using a function:

Simply declare a function with the command inside, and run the function as if it were a command. Expansions in commands within the function are only processed when the command runs, not when it’s defined, and you don’t need to quote the individual commands.

# define it
myls() {
    ls -l "/tmp/test/my dir"
}

# run it
myls
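
If part of the command needs to vary, the function can also take it as an argument and quote it internally (a small sketch; the function name lsdir is made up):

# the thing that varies is passed in as an argument and quoted inside
lsdir() {
    ls -l -- "$1"
}

lsdir "/tmp/test/my dir"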

Using an array:

Arrays allow creating multi-word variables where the individual words contain white space. Here, the individual words are stored as distinct array elements, and the "${array[@]}" expansion expands each element as separate shell words:

# define the array
mycmd=(ls -l "/tmp/test/my dir")

# run the command
"${mycmd[@]}"

The syntax is slightly horrible, but arrays also allow you to build the command line piece-by-piece. For example:

mycmd=(ls)               # initial command
if [ "$want_detail" = 1 ]; then
    mycmd+=(-l)          # optional flag
fi
mycmd+=("$targetdir")    # the filename

"${mycmd[@]}"

or keep parts of the command line constant and use the array to fill in just part of it, such as the options or filenames:

options=(-x -v)
files=(file1 "file name with whitespace")
target=/somedir

transmutate "${options[@]}" "${files[@]}" "$target"

The downside of arrays is that they’re not a standard feature, so plain POSIX shells (like dash, the default /bin/sh in Debian/Ubuntu) don’t support them (but see below). Bash, ksh and zsh do, however, so it’s likely your system has some shell that supports arrays.

Using "$@"

In shells with no support for named arrays, one can still use the positional parameters (the pseudo-array "$@") to hold the arguments of a command.

The following should be portable script bits that do the equivalent of the code bits in the previous section. The array is replaced with "$@", the list of positional parameters. Setting "$@" is done with set, and the double quotes around "$@" are important (these cause the elements of the list to be individually quoted).

First, simply storing a command with arguments in "$@" and running it:

set -- ls -l "/tmp/test/my dir"
"$@"

Conditionally setting parts of the command line options for a command:

set -- ls
if [ "$want_detail" = 1 ]; then
    set -- "$@" -l
fi
set -- "$@" "$targetdir"

"$@"

Only using "$@" for options and operands:

set -- -x -v
set -- "$@" file1 "file name with whitespace"
set -- "$@" /somedir

transmutate "$@"

(Of course, "$@" is usually filled with the arguments to the script itself, so you’ll have to save them somewhere before re-purposing "$@".)
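
One way around that (a sketch, not from the answer above) is to do the set -- work inside a function: per POSIX, a function call gets its own positional parameters, and the caller's are restored when it returns:

run_ls() {
    set -- ls -l "/tmp/test/my dir"
    "$@"
}

run_ls    # the script's own "$@" is untouched afterwards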


Using eval (be careful here!)

eval takes a string and runs it as a command, just as if it had been entered on the shell command line. This includes all quote and expansion processing, which is both useful and dangerous.

In the simple case, it allows doing just what we want:

cmd='ls -l "/tmp/test/my dir"'
eval "$cmd"

With eval, the quotes are processed, so ls eventually sees just the two arguments -l and /tmp/test/my dir, like we want. eval is also smart enough to concatenate any arguments it gets, so eval $cmd could also work in some cases, but e.g. all runs of whitespace would be changed to single spaces. It’s still better to quote the variable there, since that ensures it reaches eval unmodified.
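
A small, hypothetical demonstration of that difference:

$ cmd='echo "a      b"'
$ eval "$cmd"      # the string reaches eval unmodified
a      b
$ eval $cmd        # word splitting collapses the run of spaces first
a b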

However, it’s dangerous to include user input in the command string to eval. For example, this seems to work:

read -r filename
cmd="ls -ld '$filename'"
eval "$cmd";

But if the user gives input that contains single quotes, they can break out of the quoting and run arbitrary commands! E.g. with the input '$(whatever)'.txt, your script happily runs the command substitution. It could have been rm -rf (or worse) instead.

The issue there is that the value of $filename was embedded in the command line that eval runs. It was expanded before eval ran, which then saw e.g. the command ls -ld ''$(whatever)'.txt'. You would need to pre-process the input to be safe.
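
To see the problem harmlessly, try something like this made-up input:

$ read -r filename          # type:  '$(echo INJECTED >&2)'.txt
$ cmd="ls -ld '$filename'"
$ eval "$cmd"               # prints INJECTED: the substitution ran before ls did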

If we do it the other way, keeping the filename in the variable, and letting the eval command expand it, it’s safer again:

read -r filename
cmd='ls -ld "$filename"'
eval "$cmd";

Note the outer quotes are now single quotes, so expansions within do not happen. Hence, eval sees the command ls -ld "$filename" and expands the filename safely itself.

But that’s not much different from just storing the command in a function or an array. With functions or arrays, there is no such problem, since the words are kept separate the whole time, and there’s no quote or other processing of the contents of $filename.

read -r filename
cmd=(ls -ld -- "$filename")
"${cmd[@]}"

Pretty much the only reason to use eval is one where the varying part involves shell syntax elements that can’t be brought in via variables (pipelines, redirections, etc.). However, you’ll then need to quote/escape everything else on the command line that needs protection from the additional parsing step (see link below). In any case, it’s best to avoid embedding input from the user in the eval command!
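
For example, if the varying part is a redirection, one approach (a sketch with made-up variable names) is to pass only the syntax element through eval's parsing and keep the data in quoted variables so it isn't parsed again:

outfile='my log.txt'
redir='>>'                  # the varying shell-syntax element
eval 'ls -l "/tmp/test/my dir" '"$redir"' "$outfile"'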


References

  • Word Splitting in BashGuide
  • BashFAQ/050 or “I’m trying to put a command in a variable, but the complex cases always fail!”
  • The question Why does my shell script choke on whitespace or other special characters?, which discusses a number of issues related to quoting and whitespace, including storing commands.
  • Escape a variable for use as content of another script

The safest way to run a (non-trivial) command is eval. Then you can write the command as you would do on the command line and it is executed exactly as if you had just entered it. But you have to quote everything.

Simple case:

abc="ls -l "/tmp/test/my dir""
eval "$abc"

not so simple case:

# command: awk '! a[$0]++ { print "foo: " $0; }' inputfile
abc="awk "''! a[$0]++ { print "foo: " $0; }''' inputfile'
eval "$abc"

The single quotes inside the command have to be written as '\'' here; otherwise the second quote sign would end the outer quoting and break the command.
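
A quick way to check what eval will see (not part of the original answer) is to print the variable first:

printf '%s\n' "$abc"
# should print exactly:  awk '! a[$0]++ { print "foo: " $0; }' inputfile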

When I run:

abc="ls -l '/home/wattana/Desktop'"
$abc

It gave me an error.

But when I run:

abc="ls -l /home/wattana/Desktop"
$abc

There is no error at all.

There is no way to fix this at the moment (for me), but you can avoid the error by not having spaces in the directory name.

This answer says the eval command can be used to fix this, but it doesn’t work for me 🙁
