
<- Tests and Conditionals | Input and Output ->


Arrays

As mentioned earlier, BASH provides three types of parameters: Strings, Integers and Arrays.

Strings are without a doubt the most used parameter type. But they are also the most misused parameter type. It is important to remember that a string holds just one element. Capturing the output of a command, for instance, and putting it in a string parameter means that parameter holds just one string of characters, regardless of whether that string represents twenty filenames, twenty numbers or twenty names of people.

And as is always the case when you put multiple items in a single string, these items must somehow be delimited from each other. We, as humans, can usually decipher what the different filenames are when looking at a string. We assume that, perhaps, each line in the string represents a filename, or each word represents a filename. While this assumption is understandable, it is also inherently flawed: a filename can contain every character you might want to use to separate filenames from each other in a string. That means there's technically no telling where the first filename in the string ends, because whatever character you pick to mean "this filename ends here" could itself be part of a filename.

Often, people make this mistake:

    # This does NOT work in the general case
    $ files=$(ls ~/*.jpg); cp $files /backups/

This would be a much better idea (using array notation, which is explained in the next section):

    # This DOES work in the general case
    $ files=(~/*.jpg); cp "${files[@]}" /backups/

The first attempt at backing up the JPEG files in our home directory is flawed. We put the output of ls in a string called files and then use the unquoted $files parameter expansion to cut that string into arguments (relying on Word Splitting). As mentioned before, word splitting cuts a string into pieces wherever there is whitespace. Relying on it means we assume that none of our filenames contains any whitespace. If one does, that filename gets cut into two or more pieces. Conclusion: bad.
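
To see the breakage for yourself, try it in a scratch directory with a made-up filename that contains a space; printf shows exactly which words cp would have received:

    $ touch "my picture.jpg" "vacation.jpg"
    $ files=$(ls *.jpg)
    $ printf '<%s> ' $files; echo
    <my> <picture.jpg> <vacation.jpg>

The filename my picture.jpg has been cut in two: cp would go looking for files called my and picture.jpg, which don't exist.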

The only safe way to represent multiple string elements in Bash is through the use of arrays. An array is a type of variable that maps integers to strings. That basically means that it holds a numbered list of strings. Since each of these strings is a separate entity (element), it can safely contain any character, even whitespace.

For the best results and the least headaches, remember that if you have a list of things, you should always put it in an array.

Unlike some other programming languages, Bash does not offer lists, tuples, etc. Just arrays, and associative arrays (which are new in Bash 4).



Creating Arrays

There are several ways you can create or fill your array with data. There is no one single true way: the method you'll need depends on where your data comes from and what it is.

The easiest way to create a simple array with data is by using the =() syntax:

    $ names=("Bob" "Peter" "$USER" "Big Bad John")

This syntax is great for creating arrays with static data or a known set of string parameters, but it gives us very little flexibility for adding lots of array elements. If you need more flexibility, you can also specify explicit indexes:

    $ names=([0]="Bob" [1]="Peter" [20]="$USER" [21]="Big Bad John")
    # or...
    $ names[0]="Bob"

Notice that there is a gap between indices 1 and 20 in this example. An array with holes in it is called a sparse array. Bash allows this, and it can often be quite useful.
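
If you ever want to see exactly which indices are in use, declare -p prints the whole array, indices and all (the exact output format differs slightly between Bash versions):

    $ names=([0]="Bob" [1]="Peter" [20]="$USER" [21]="Big Bad John")
    $ declare -p names
    declare -a names=([0]="Bob" [1]="Peter" [20]="lhunath" [21]="Big Bad John")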

If you want to fill an array with filenames, then you'll probably want to use Globs in there:

    $ photos=(~/"My Photos"/*.jpg)

Notice here that we quoted the My Photos part because it contains a space. If we hadn't quoted it, Bash would have split it up into photos=('~/My' 'Photos/'*.jpg ) which is obviously not what we want. Also notice that we quoted only the part that contained the space. That's because we cannot quote the ~ or the *; if we do, they'll become literal and Bash won't treat them as special characters anymore.

Unfortunately, it's really easy to build broken arrays of filenames in ways like the following:

    $ files=$(ls)    # BAD, BAD, BAD!
    $ files=($(ls))  # STILL BAD!

Remember to always avoid parsing the output of ls. The first line creates a string containing the output of ls; that string cannot possibly be used safely, for the reasons mentioned in the introduction above. The second comes closer, but it still splits up filenames that contain whitespace.

This is the right way to do it:

    $ files=(*)      # Good!

This statement gives us an array where each filename is a separate element. Perfect!
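
One caveat worth mentioning: if the glob matches nothing at all (say, an empty directory), Bash leaves the pattern unexpanded by default, and the array ends up containing the literal pattern. If that matters for your script, the nullglob shell option makes a non-matching glob expand to nothing instead; a small sketch:

    $ shopt -s nullglob
    $ files=(*.jpg)      # an empty array if no .jpg files match
    $ shopt -u nullglob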

The rest of this section covers some more advanced concepts. If you get lost, you may want to come back here after you've read the whole guide. You can skip ahead to Using Arrays if you want to keep things simple.

Now, sometimes we want to build an array from a string or the output of a command. Commands (generally) just output strings: for instance, running a find command will enumerate filenames, and separate these filenames with newlines (putting each filename on a separate line). So to parse that one big string into an array we need to tell Bash where each element ends. (Note, this is a bad example, because filenames can contain a newline, so it is not safe to delimit them with newlines! But see below.)

Breaking up a string is what IFS is used for:

    $ IFS=. read -r -a ip_elements <<< "127.0.0.1"

Here we use IFS with the value . to cut the given IP address into array elements wherever there's a ., resulting in an array with the elements 127, 0, 0 and 1.
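
The same trick works for any string with a known single-character delimiter. For example, this small sketch splits your PATH into one directory per array element (the name path_dirs is just for the example; this is safe because a PATH entry cannot itself contain a colon):

    $ IFS=: read -r -a path_dirs <<< "$PATH"

Each element of path_dirs now holds one directory from your PATH.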

(The builtin command read and the <<< operator will be covered in more depth in the Input and Output chapter.)

We could do the same thing with a find command, by setting IFS to a newline. But then our script would fail when someone creates a filename with a newline in it (either accidentally or maliciously).

So, is there any way to get a list of elements from an external program (like find) into a Bash array? In general, the answer is yes, provided there is a reliable way to delimit the elements.

In the specific case of filenames, the answer to this problem is NUL bytes. A NUL byte is a byte which is just all zeros: 00000000. Bash strings can't contain NUL bytes, because of an artifact of the "C" programming language: NUL bytes are used in C to mark the end of a string. Since Bash is written in C and uses C's native strings, it inherits that behavior.

A data stream (like the output of a command, or a file) can contain NUL bytes. Streams are like strings with three big differences: they are read sequentially (you usually can't jump around); they're unidirectional (you can read from them, or write to them, but typically not both); and they can contain NUL bytes.

File names cannot contain NUL bytes (since they're implemented as C strings by Unix), and neither can the vast majority of human-readable things we would want to store in a script (people's names, IP addresses, etc.). That makes NUL a great candidate for separating elements in a stream. Quite often, the command whose output you want to read will have an option that makes it output its data separated by NUL bytes rather than newlines or something else. find (on GNU and BSD, anyway) has the option -print0, which we'll use in this example:

    files=()
    while read -r -d $'\0'; do
        files+=("$REPLY")
    done < <(find /foo -print0)

This is a safe way of parsing a command's output into strings. Understandably, it looks a little confusing and convoluted at first. So let's take it apart:

The first line files=() creates an empty array named files.

We're using a while loop that runs a read command each time. The read command uses the -d $'\0' option, which means that instead of reading a line at a time (up to a newline), we're reading up to a NUL byte (\0). It also uses -r to prevent it from treating backslashes specially.

Once read has read some data and encountered a NUL byte, the while loop's body is executed. We put what we read (which is in the parameter REPLY) into our array.

To do this, we use the +=() syntax. This syntax adds one or more element(s) to the end of our array.

Finally, the < <(..) syntax is a combination of File Redirection (<) and Process Substitution (<(..)). Omitting the technical details for now, we'll simply say that this is how we send the output of the find command into our while loop.

The find command itself uses the -print0 option as mentioned before to tell it to separate the filenames it finds with a NUL byte.
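
As a side note: if you can rely on Bash 4.4 or newer, the readarray builtin (also known as mapfile) can do the same job in a single line, because its -d option lets you choose NUL as the delimiter. A minimal sketch of that approach:

    readarray -t -d '' files < <(find /foo -print0)

The -t option strips the delimiter from each element, giving the same result as our read loop.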





Using Arrays

Walking over array elements is really easy. Because an array is such a safe medium of storage, we can simply use a for loop to iterate over its elements:

    $ for file in "${myfiles[@]}"; do
    >     cp "$file" /backups/
    > done

Notice the syntax used to expand the array here. We use the quoted form: "${myfiles[@]}". Bash replaces this syntax with every single element in the array, properly quoted.

The following two examples have the same effect:

    $ names=("Bob" "Peter" "$USER" "Big Bad John")
    $ for name in "${names[@]}"; do echo "$name"; done

    $ for name in "Bob" "Peter" "$USER" "Big Bad John"; do echo "$name"; done

The first example creates an array named names which is filled up with a few elements. Then the array is expanded into these elements, which are then used by the for loop. In the second example, we skipped the array and just passed the list of elements directly to for.

Remember to quote the ${arrayname[@]} expansion properly. If you don't, you'll lose all benefit of having used an array at all: leaving arguments unquoted means you're telling Bash it's OK to wordsplit them into pieces and break everything again.

The above example expanded the array in a for-loop statement. But you can expand the array anywhere you want to put its elements as arguments; for instance in a cp command:

    myfiles=(db.sql home.tbz2 etc.tbz2)
    cp "${myfiles[@]}" /backups/

This runs the cp command, replacing the "${myfiles[@]}" part with every filename in the myfiles array, properly quoted. After expansion, Bash will effectively run this:

    cp "db.sql" "home.tbz2" "etc.tbz2" /backups/

cp will then copy the files to your /backups/ directory.

You can also expand single array elements by referencing their element number (called index). Remember that by default, arrays are zero-based, which means that their first element has the index zero:

    $ echo "The first name is: ${names[0]}"
    $ echo "The second name is: ${names[1]}"

(You could create an array with no element 0. Remember what we said about sparse arrays earlier -- you can have "holes" in the sequence of indices, and this applies to the beginning of the array as well as the middle. It's your responsibility as the programmer to know which of your arrays are potentially sparse, and which ones are not.)
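
If you need to know which indices actually exist, sparse or not, the "${!arrayname[@]}" expansion gives you the list of indices rather than the elements, and you can loop over it just like the elements themselves. A short sketch using the sparse array from earlier:

    $ names=([0]="Bob" [1]="Peter" [20]="$USER" [21]="Big Bad John")
    $ for i in "${!names[@]}"; do echo "Index $i holds: ${names[$i]}"; done
    Index 0 holds: Bob
    Index 1 holds: Peter
    Index 20 holds: lhunath
    Index 21 holds: Big Bad John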

There is also a second form of expanding all array elements, which is "${arrayname[*]}". This form is ONLY useful for converting arrays into a single string with all the elements joined together. The main purpose for this is outputting the array to humans:

    $ names=("Bob" "Peter" "$USER" "Big Bad John")
    $ echo "Today's contestants are: ${names[*]}"
    Today's contestants are: Bob Peter lhunath Big Bad John

Notice that in the resulting string, there's no way to tell where the names begin and end! This is why we keep everything separate as long as possible.

Remember to still keep everything nicely quoted! If you don't keep ${arrayname[*]} quoted, once again Bash's Wordsplitting will cut it into bits.

You can combine IFS with "${arrayname[*]}" to indicate the character to use to delimit your array elements as you merge them into a single string. This is handy, for example, when you want to comma delimit names:

    $ names=("Bob" "Peter" "$USER" "Big Bad John")
    $ ( IFS=,; echo "Today's contestants are: ${names[*]}" )
    Today's contestants are: Bob,Peter,lhunath,Big Bad John

Notice how in this example we put the IFS=,; echo ... statement in a Subshell by wrapping ( and ) around it. We do this because we don't want to change the default value of IFS in the main shell. When the subshell exits, IFS still has its default value and no longer just a comma. This is important because IFS is used for a lot of things, and changing its value to something non-default will result in very odd behavior if you don't expect it!

Alas, the "${array[*]}" expansion only uses the first character of IFS to join the elements together. If we wanted to separate the names in the previous example with a comma and a space, we would have to use some other technique (for example, a for loop).
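
For example, here is one way to do it with a loop, using the ${parameter:+word} expansion so that the separator is only added once something is already in the accumulator (the variable name joined is just for this sketch):

    $ names=("Bob" "Peter" "$USER" "Big Bad John")
    $ joined=
    $ for name in "${names[@]}"; do joined+="${joined:+, }$name"; done
    $ echo "Today's contestants are: $joined"
    Today's contestants are: Bob, Peter, lhunath, Big Bad John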

The printf command deserves special mention here, because it's a supremely elegant way to dump an array:

    $ names=("Bob" "Peter" "$USER" "Big Bad John")
    $ printf "%s\n" "${names[@]}"
    Bob
    Peter
    lhunath
    Big Bad John

Of course, a for loop offers the ultimate flexibility, but printf and its implicit looping over arguments can cover many of the simpler cases. It can even produce NUL-delimited streams, perfect for later retrieval:

    $ printf "%s\0" "${myarray[@]}" > myfile

One final tip: you can get the number of elements of an array by using ${#array[@]}:

    $ array=(a b c)
    $ echo ${#array[@]}
    3
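
Relatedly, putting an index inside instead of @ gives you the length in characters of that single element:

    $ array=(a bc def)
    $ echo ${#array[1]}
    2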



Associative Arrays

Until recently, BASH could only use numbers (more specifically, non-negative integers) as keys of arrays. This means you could not "map" or "translate" one string to another. This is something a lot of people missed. People began to (ab)use variable indirection as a means to address the issue.

Since BASH 4 was released, there is no longer any excuse to use indirection (or worse, eval) for this purpose. You can now use full-featured associative arrays.

To create an associative array, you need to declare it as such (using declare -A). This is to guarantee backward compatibility with the standard indexed arrays. Here's how you do that:

    $ declare -A fullNames
    $ fullNames=( ["lhunath"]="Maarten Billemont" ["greycat"]="Greg Wooledge" )
    $ echo "Current user is: $USER.  Full name: ${fullNames[$USER]}."
    Current user is: lhunath.  Full name: Maarten Billemont.

With the same syntax as for indexed arrays, you can iterate over the keys of associative arrays:

    $ for user in "${!fullNames[@]}"
    > do echo "User: $user, full name: ${fullNames[$user]}."; done
    User: lhunath, full name: Maarten Billemont.
    User: greycat, full name: Greg Wooledge.

Two things to remember, here: First, the order of the keys you get back from an associative array using the ${!array[@]} syntax is unpredictable; it won't necessarily be the order in which you assigned elements, or any kind of sorted order.

Second, you cannot omit the $ if you're using a parameter as the key of an associative array. With standard indexed arrays, the [...] part is actually an arithmetic context (really, you can do math there without an explicit $((...)) markup). In an arithmetic context, a Name can't possibly be a valid number, and so BASH assumes it's a parameter and that you want to use its content. This doesn't work with associative arrays, since a Name could just as well be a valid associative array key.

Let's demonstrate with examples:

    $ indexedArray=( "one" "two" )
    $ declare -A associativeArray=( ["foo"]="bar" ["alpha"]="omega" )
    $ index=0 key="foo"
    $ echo "${indexedArray[$index]}"
    one
    $ echo "${indexedArray[index]}"
    one
    $ echo "${indexedArray[index + 1]}"
    two
    $ echo "${associativeArray[$key]}"
    bar
    $ echo "${associativeArray[key]}"

    $ echo "${associativeArray[key + 1]}"

As you can see, both $index and index work fine with indexed arrays. They both evaluate to 0. You can even do math on it to increase it to 1 and get the second value. No go with associative arrays, though. Here, we need to use $key; the others fail.
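
One more tip, for recent shells: on Bash 4.3 and newer you can use the -v test to check whether a key exists at all (as opposed to merely holding an empty value), and unset removes a single key. Quoting the subscript is a good habit, so the brackets can never be treated as a glob:

    $ declare -A fullNames=( ["lhunath"]="Maarten Billemont" ["greycat"]="Greg Wooledge" )
    $ if [[ -v 'fullNames[greycat]' ]]; then echo "We know greycat's full name."; fi
    We know greycat's full name.
    $ unset 'fullNames[greycat]'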


<- Tests and Conditionals | Input and Output ->

