This post describes how you can ensure that only one instance of a script is running at a time, which is useful if your script:
- uses significant CPU or IO and running multiple instances at the same time would risk overloading the system, or
- writes to a file or other shared resource and running multiple instances at the same time would risk corrupting the resource
In order to prevent multiple instances of a script from running, your script must first acquire a "lock" and hold on to that lock until the script completes. If the script cannot acquire the lock, it must wait until the lock becomes available. So, how do you acquire a lock? There are different ways, but the simplest is to use the lockfile command to create a "semaphore file", as shown in the snippet below:
#!/bin/bash
set -e

# Waits until a lock is acquired and deletes the lock on exit.
# Prevents multiple instances of the script from running.
acquire_lock() {
    lock_file=/var/tmp/foo.lock
    echo "Acquiring lock ${lock_file}..."
    lockfile "${lock_file}"
    trap "rm -f ${lock_file} && echo Released lock ${lock_file}" INT TERM EXIT
    echo "Acquired lock"
}

acquire_lock

# do stuff
The acquire_lock function first invokes the lockfile command to create the lock file. If lockfile cannot create the file, it keeps retrying forever until it succeeds. You can use the -r option if you only want to retry a certain number of times. Once the file has been created, we need to ensure that it is deleted when the script completes or is terminated. This is done using the trap command, which deletes the file when the script completes or when the shell receives an interrupt or terminate signal. I also like to use set -e in all my scripts, which makes the script exit if any command fails. In this case, if lockfile fails, the script exits and the trap is never set.
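For example, a bounded-retry version of the function might look like the sketch below. The retry count of 5 and the error message are illustrative, not part of the original script:

# Variant of acquire_lock that gives up after a fixed number of retries.
acquire_lock_with_retries() {
    lock_file=/var/tmp/foo.lock
    echo "Acquiring lock ${lock_file}..."
    # lockfile sleeps between attempts (8 seconds by default) and -r
    # limits the number of retries before it fails.
    if ! lockfile -r 5 "${lock_file}"; then
        echo "Could not acquire lock ${lock_file}, giving up" >&2
        exit 1
    fi
    trap "rm -f ${lock_file} && echo Released lock ${lock_file}" INT TERM EXIT
    echo "Acquired lock"
}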
lockfile can be used in other ways as well. For example, instead of preventing multiple instances of the entire script from running, you may want to take a more granular approach and use locks only around those parts of your script that are not safe to run concurrently.
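As a rough sketch, assuming the acquire_lock function from above and a hypothetical shared file, that could look like this:

# Releases the lock and clears the trap so the rest of the script can
# continue without holding the lock.
release_lock() {
    rm -f "${lock_file}"
    trap - INT TERM EXIT
    echo "Released lock ${lock_file}"
}

# ... work that is safe to run concurrently ...

acquire_lock
# Critical section: appending to a shared file (hypothetical example)
echo "$(date): processed batch" >> /var/tmp/shared.log
release_lock

# ... more work that is safe to run concurrently ...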
Note that if you cannot use lockfile, there are other alternatives such as using mkdir or flock, as described in BashFAQ/045.
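For instance, with flock the same effect can be achieved by taking an exclusive lock on a file descriptor, roughly as follows (the descriptor number and lock path are just illustrative):

#!/bin/bash
set -e

(
    # Try to take an exclusive lock on file descriptor 9 without waiting;
    # exit if another instance already holds it. Drop -n to wait instead.
    flock -n 9 || { echo "Another instance is already running" >&2; exit 1; }

    # do stuff

) 9>/var/tmp/foo.lock

Unlike the lockfile approach, no trap is needed here: the kernel releases the lock automatically when the file descriptor is closed, even if the script is killed.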
Other posts you might like:
Shell Scripting - Best Practices
Retrying Commands in Shell Scripts
Executing a Shell Command with a Timeout