Yesterday I ported the tear-down part of MicroFW from Python back to Bash. (It was just unhappy in Python.) In doing so, I tested it by uploading it to my server, running it and seeing what happened. This is obviously unsatisfactory, but since the script runs a bunch of `iptables` calls and the like, how am I supposed to test it? It integrates pretty deeply into the system, so there's no way of mocking that, right?
Well, yes way: you just need to shamelessly abuse the shell's function mechanism.
Mocking system commands
As you may or may not know, bash allows the definition of functions:
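Something like this (the function name is just for illustration):

```shell
# a function definition -- "greet" is a made-up example name
greet() {
    echo "Hello, $1!"
}
```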
These functions can then be invoked like any other command, at least from inside the script:
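For example (again with a made-up function):

```shell
greet() {
    echo "Hello, $1!"
}

greet world    # prints "Hello, world!"
```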
So if I have a script that heavily invokes `iptables`, like so:
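Something along these lines (a sketch of the shape such a script might have, not MicroFW's actual code):

```shell
#!/bin/bash
set -e

tear_down() {
    iptables -P INPUT  ACCEPT   # reset default policies to ACCEPT
    iptables -P OUTPUT ACCEPT
    iptables -F                 # flush all rules
    iptables -X                 # delete user-defined chains
}
```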
If I now were to define a function named `iptables`, that function would get called instead of the actual `iptables` command, right?
Right! So now I can just run my original script by sourcing it, and whenever it tries to run `iptables`, it will instead just append the invocation to a logfile without actually doing anything. I can then `grep` through that logfile to see if the invocation I'm looking for is actually there. Pretty nice.
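In code, the whole trick might look like this (the stand-in script and all file names are mine, for demonstration):

```shell
# the mock: a function that shadows the real iptables and only logs
iptables() {
    echo "iptables $*" >> iptables.log
}

# a stand-in for the real script, so this snippet runs on its own
cat > microfw.sh <<'EOF'
iptables -F
iptables -X
EOF

rm -f iptables.log
source ./microfw.sh        # all iptables calls end up in the log

grep -- '-F' iptables.log  # prints "iptables -F"
```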
Did you know `source` can take args?
Alright, so after defining the mock functions, let's source the original script:
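With the mock in place, that's just a `source` call. (I'm using a tiny stand-in script here so the snippet runs on its own; the real script obviously does more.)

```shell
iptables() { echo "iptables $*" >> iptables.log; }

# stand-in for the original script: it dispatches on $1
cat > microfw.sh <<'EOF'
case "$1" in
    setup)     echo "setting up" ;;
    tear_down) echo "tearing down" ;;
    *)         echo "usage: setup|tear_down" ;;
esac
EOF

source ./microfw.sh    # $1 is empty here, so this prints the usage line
```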
Ahh damn, it checks `$1` for a command that tells it what to do. How do we pass one? We can't just define `$1` with some value:
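That is, this doesn't work:

```shell
# "1" is not a valid variable name, so bash does not parse this as an
# assignment -- it looks for a command literally named "1=tear_down":
1=tear_down source ./microfw.sh
```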
This just results in `1=tear_down: command not found`. But it turns out we can pass arguments to `source` itself:
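Like this (again with a tiny stand-in script so it runs standalone):

```shell
# stand-in script that just reports its first argument
cat > microfw.sh <<'EOF'
echo "command: $1"
EOF

# bash forwards extra source arguments as positional parameters
source ./microfw.sh tear_down    # prints "command: tear_down"
```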
This will work! However, I didn't end up using it, because I needed a `RUNNING_IN_CI` variable eventually anyway, so I just added code that lets me define the command via a variable directly.
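Roughly like this; `RUNNING_IN_CI` is the variable from above, while `MICROFW_COMMAND` and the exact dispatch logic are made up for illustration:

```shell
RUNNING_IN_CI=1              # the test setup would set this...
MICROFW_COMMAND=tear_down    # ...and this

if [ -n "$RUNNING_IN_CI" ]; then
    command="$MICROFW_COMMAND"   # in CI: take the command from a variable
else
    command="$1"                 # normal operation: take it from the CLI
fi

echo "dispatching: $command"     # prints "dispatching: tear_down"
```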
I can haz test runner?
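Here's a first cut; the test body and all names are illustrative:

```shell
echo "iptables -F" > iptables.log    # pretend a mocked run already happened

test_tear_down_flushes() {
    grep -q -- '-F' iptables.log     # the actual assertion
}

run_test() {                         # crude first cut of a runner
    echo "running $1"
    "$1"
    echo "$1 returned $?"
}

run_test test_tear_down_flushes
```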
This is hella crude, but you get the idea: Define the test as a function, pass it to a test runner, and the test runner will then take care of telling you that it ran the test and what the result was. But can we actually pass a function to another function, to have it called by that function instead of calling it ourselves? Turns out yes, we can:
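A minimal demonstration (names made up):

```shell
greet() { echo "hello from greet"; }

call_it() {
    "$1"    # $1 contains the string "greet"; bash resolves it to the function
}

call_it greet    # prints "hello from greet"
```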
See how I use a string variable in the place of a command there, and it just works? Coming from a more high-level language this feels really, really wrong (I'm passing a string, for crying out loud, how can I just treat it like a function!?), but that's how it is. So, adding just a tiny bit more ceremony, like, you know, an `if` statement that checks the result and some result printery, we end up with this:
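Something like this (my reconstruction of the idea, with throwaway example tests):

```shell
run_test() {
    if "$1"; then
        echo "PASS: $1"
    else
        echo "FAIL: $1"
    fi
}

test_that_passes() { true; }
test_that_fails()  { false; }

run_test test_that_passes    # prints "PASS: test_that_passes"
run_test test_that_fails     # prints "FAIL: test_that_fails"
```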
And nicely enough, since `set -e` causes functions to return early when commands fail, I can just add a bunch of tests into my test functions and it'll just work:
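A self-contained sketch of such a test; the stand-in script, file names, and exact assertions are mine:

```shell
set -e

# stand-in for the code under test: it does nothing, and in
# particular creates no out.txt
cat > microfw.sh <<'EOF'
:
EOF

run_test() {
    if "$1"; then echo "PASS: $1"; else echo "FAIL: $1"; fi
}

test_no_state() {
    source ./microfw.sh tear_down   # run the code under test
    test ! -e out.txt               # nonzero (test fails) if out.txt exists
}

rm -f out.txt
run_test test_no_state    # prints "PASS: test_no_state"
```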
This function runs the code and checks that no `out.txt` file has been created by it, effectively testing that the code didn't do anything. I don't need to add any ceremony to the `test_no_state` function to check the result of that check, because if it fails, `set -e` tells bash to just `return 1` from the function, and `run_test` will declare the test failed.
So, unit tests in bash are possible. I'm quite flabbergasted, but oh well :D