
I'm reading a file line by line and running ssh or ffmpeg, but everything after the first line is eaten!

When reading a file line by line, if a command inside the loop also reads from standard input, it can consume the rest of the loop's input. For example:

  # Non-working example
  while IFS= read -r file; do
    ffmpeg -i "$file" -vcodec libxvid -acodec libfaac -ar 32000 "${file%.avi}".mkv
  done < <(find . -name '*.avi')

  # Non-working example
  while IFS= read -r host; do
    ssh "$host" some command
  done <hostslist

What's happening here? Let's take the first example. read reads a line from standard input (FD 0), puts it in the file parameter, and then ffmpeg is executed. Like any program you execute from Bash, ffmpeg inherits standard input, and ffmpeg actually reads it: while encoding, it watches stdin for interactive keyboard commands (such as q to abort). In doing so, it sucks up all the remaining input from the find command, starving the loop. ssh behaves the same way, because it forwards its standard input to the remote command.
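
You can see the same effect with any stdin-reading command; here is a minimal demonstration with cat standing in for ffmpeg:

  # Only "Read: one" is printed; cat silently swallows the remaining lines
  printf '%s\n' one two three | while IFS= read -r line; do
    echo "Read: $line"
    cat >/dev/null
  done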

Here's how you make it work:

  while IFS= read -r file; do
    ffmpeg -i "$file" -vcodec libxvid -acodec libfaac -ar 32000 "${file%.avi}".mkv </dev/null
  done < <(find . -name '*.avi')

Notice the redirection on the ffmpeg line: </dev/null. With its stdin pointed at /dev/null, ffmpeg gets an immediate EOF instead of the loop's data (recent ffmpeg versions also offer a -nostdin option for the same purpose). The ssh example can be fixed the same way, or with the -n switch (at least with OpenSSH), which redirects stdin from /dev/null for you.
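
For example, here is the second loop again, fixed with ssh's -n switch:

  while IFS= read -r host; do
    ssh -n "$host" some command
  done <hostslist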

Sometimes with large loops it might be difficult to work out what's reading from stdin, or a program might change its behaviour when you add </dev/null to it. In that case you can make read use a different FileDescriptor, one that a random program is less likely to read from:

  while IFS= read -r line <&3; do
    ......
  done 3<file

or use read's -u option (not POSIX):

  # Bash
  while IFS= read -r -u 3 line; do
    ......
  done 3<file
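
Putting it together, here is a sketch of the ssh loop with the host list arriving on FD 3, so stdin stays available to ssh:

  # Bash: hosts are read from FD 3; ssh is free to use stdin
  while IFS= read -r -u 3 host; do
    ssh "$host" some command
  done 3<hostslist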
