I use Ubuntu 16.04 and I execute a list of remote scripts that are in the same directory (a GitHub repository):
curl -s https://raw.githubusercontent.com/${user}/${repo}/master/1.sh | tr -d '\r' | bash
curl -s https://raw.githubusercontent.com/${user}/${repo}/master/2.sh | tr -d '\r' | bash
curl -s https://raw.githubusercontent.com/${user}/${repo}/master/3.sh | tr -d '\r' | bash
curl -s https://raw.githubusercontent.com/${user}/${repo}/master/4.sh | tr -d '\r' | bash
curl -s https://raw.githubusercontent.com/${user}/${repo}/master/5.sh | tr -d '\r' | bash
curl -s https://raw.githubusercontent.com/${user}/${repo}/master/6.sh | tr -d '\r' | bash
How would you cope with the awful redundancy?
I'm thinking of a for loop, but I have no idea how to construct it. None of the for loops I've seen so far give me a clue how to reuse a curl pattern (and its piped output) for different files in the same remote directory.
You are more than welcome to share an example.
For two or more curl operations you could use Unix seq:
for var in $(seq 6)
do
    curl -s "https://raw.githubusercontent.com/${user}/${repo}/master/${var}.sh" | tr -d '\r' | bash
done
Explanation:
- Use seq to attain a count up to 6 (as the question lists 6 curl operations).
- Each number is stored in var and used in your curl command.