1. Introduction

Pipelines are an extremely powerful and versatile feature of Bash. They are most commonly and easily used with native commands provided by Bash or the operating system. Pipelines, or pipes, may not always be the most efficient method, though, and can even be redundant when a command natively supports processing its own results, as with the find command:

find . -exec some_script {} \;

However, when a command doesn’t support native processing of standard output and we need to stream data to a shell script, then piping the output to a Bash function is the right option.

In this tutorial, we’ll explore this functionality by first laying a foundation through piping out to a script, then to a function inside that script, and lastly, how to do so safely.

2. How to Pipe to a Script

Let’s revisit some fundamentals. A pipe in Bash takes the standard output of one process and passes it as standard input into another process.
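For instance, the output of printf here becomes the input of wc -l, which counts the lines it receives on standard input:

$ printf 'one\ntwo\nthree\n' | wc -l
3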

Bash scripts also support positional arguments that can be passed in at the command line. These arguments can then be retrieved inside the script using the Bash-defined variables $1 to $9 ($0 holds the name of the script itself):

$ more myscript.sh
#!/bin/bash 
echo $1

Now, when we call the script with a parameter, we can see the first argument captured in the variable $1 and output to the shell:

$ ./myscript.sh hello
hello

Guiding principle #1: Commands executed in Bash receive their standard input from the process that starts them.

We can see this by defining a basic script sample_one.sh that will cat some undefined data to a file. The data is undefined at this point because cat normally expects one or more file arguments, but none is given here:

$ more sample_one.sh
#!/bin/bash 
cat > file_one.txt

If we execute some command, in this case, date, and pipe the output to our script, we can then view the result in file_one.txt:

$ date | ./sample_one.sh 
$ more file_one.txt 
Sat Oct 23 22:03:33 SAST 2021

This may appear a little magical, but it’s the combination of two things:

  • sample_one.sh receives its standard input from the process that started it, in this case date on the other side of the pipe.
  • cat, by design, copies standard input to standard output when it’s run without any file arguments, as the snippet after this list demonstrates.
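We can observe this standalone behavior of cat directly at the prompt: with no file arguments, it simply copies whatever arrives on its standard input back to standard output:

$ echo hello | cat
hello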

Taking what we’ve learned, we can now apply that to a function inside our own custom script.

3. How to Pipe to a Function

Guiding principle #2: Standard input passed into a script is accessible by the first function called.

We can validate this by re-using our code from earlier, except this time we’ll do so inside a function. Don’t forget to call the function (read_stdin) at the end of the script, as the function body won’t run otherwise:

$ more sample_two.sh
#!/bin/bash
function read_stdin()
{
  cat > file_two.txt
}
read_stdin

It’s important to note that the function doesn’t need an argument or variable declared to accept or hold standard input. The script inherits its standard input from the parent process, and the function in turn reads from it automatically. We can use the date command again and review the contents of file_two.txt, which will be the same as before – of course, with the time updated and in accordance with the operating system’s defined locale:

$ date | ./sample_two.sh
$ more file_two.txt
Sat Oct 23 23:03:33 SAST 2021
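To see why the principle singles out the first function, consider a hypothetical script, two_readers.sh (our own illustration, not one of the numbered samples), in which two functions both try to read standard input. The first one consumes all of the piped data, so the second finds standard input already exhausted and second.txt ends up empty (the timestamp is illustrative):

$ more two_readers.sh
#!/bin/bash
function first_reader()
{
  cat > first.txt    # consumes all of standard input
}
function second_reader()
{
  cat > second.txt   # standard input is already at end-of-file here
}
first_reader
second_reader

$ date | ./two_readers.sh
$ more first.txt
Sat Oct 23 23:05:10 SAST 2021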

Great work: we’ve piped output to a Bash function. But this isn’t very readable, it’s hard to maintain for someone with less experience, and it has the potential for bugs. Let’s improve it.

4. How to Pipe Safely

4.1. Readability with /dev/stdin

In pursuit of readability and in accordance with the Unix design (everything is a file), we can reference standard input using the file /dev/stdin:

$ more sample_three.sh
#!/bin/bash
function read_stdin()
{
  cat < /dev/stdin > file_three.txt
}
read_stdin

The output in file_three.txt will be no different from what we’ve already seen. The advantage of this method is that the source of the data isn’t shrouded in hidden knowledge of cat’s default behavior but is explicitly redirected from the file /dev/stdin. This could save a new developer or administrator some time debugging and troubleshooting.

4.2. Maintainability with read

Another mechanism for receiving input from a pipe into a function is the read utility. This reads a single logical line from standard input into one or more shell variables:

$ more sample_four.sh
#!/bin/bash
function getInput()
{
  read in
  echo "$in" > file_four.txt
}
getInput

We read standard input into the user-defined shell variable $in. Then, instead of using cat, we use echo to write the contents of that variable into file_four.txt. Since read consumes only a single line, this works well for one-line output like that of date; multi-line input calls for a loop, as sketched below.
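Here’s a minimal sketch of that looping pattern (the script name sample_four_multi.sh and its output file are purely illustrative additions):

$ more sample_four_multi.sh
#!/bin/bash
function getInput()
{
  # read line by line until standard input is exhausted
  while IFS= read -r line; do
    echo "$line"
  done > file_four_multi.txt
}
getInput

$ printf 'first\nsecond\n' | ./sample_four_multi.sh
$ more file_four_multi.txt
first
second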

4.3. Robustness with test

There’s one common problem with the sample scripts thus far – they’re brittle. If a user executes any of them without piping in any standard input, the script simply hangs, waiting indefinitely for input from the terminal. We can simulate this as per the example below:

$ ./sample_one.sh

To remedy this, we can validate the input first using test. With it, we can check whether a positional argument was supplied and whether standard input is coming from a pipe rather than from the terminal.

For example, we can write a script that uses test to do nothing more than check file types and compare values:

$ more sample_five.sh
#!/bin/bash
function getInput() {
    if test -n "$1"; then
        echo "Read from positional argument $1";
    elif test ! -t 0; then
        echo "Read from stdin if file descriptor /dev/stdin is open"
        cat > file_five.txt
    else
        echo "No standard input."
    fi
}
getInput "$@"

Let’s take a closer look at this script:

  • The first conditional use of test, with the -n operator, checks that the first positional argument is a non-empty string (note that the script forwards its arguments to the function with getInput "$@").
  • The second conditional use of test, with the negated -t operator (test ! -t 0), checks that file descriptor 0 (standard input) is not attached to a terminal, which is the case when input arrives through a pipe or a redirection.
  • The else block confirms that there was no standard input, as the sample runs below illustrate.
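Assuming the script is saved as sample_five.sh and made executable, exercising each branch should look roughly like this (the timestamp is illustrative):

$ ./sample_five.sh hello
Read from positional argument hello
$ date | ./sample_five.sh
Read from stdin if file descriptor /dev/stdin is open
$ more file_five.txt
Sat Oct 23 23:12:41 SAST 2021
$ ./sample_five.sh
No standard input.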

This is definitely a more robust solution for verifying standard input, and it’s simple to use. While it covers most use cases, it does have a shortcoming: the -t check only tells us whether standard input is attached to a terminal, not whether a pipe actually delivers any data. For more advanced input validation, we could consider moreutils, specifically the ifne binary.
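For instance, assuming the moreutils package is installed, ifne runs the command it’s given only when standard input is non-empty, so an empty stream never reaches our script at all:

$ date | ifne ./sample_two.sh      # date produces output, so the script runs
$ printf '' | ifne ./sample_two.sh # empty input, so the script is never started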

5. Conclusion

In this tutorial, we incrementally learned how to pipe standard output to a Bash function, first by piping to a script and then to a function inside that script. Thereafter, we focused our attention on piping standard input safely by improving the readability, maintainability, and robustness of our code.