I am trying to measure the time it takes for a Kubernetes object to be deployed in a Kubernetes cluster by using the time
utility. I run the measurement several times in a loop, with a sleep between runs, to simulate multiple deployments.
This is the script:
#!/bin/bash
function time_check {
  i=$1
  time kubectl apply -f deploy.yml --dry-run=client
}

for i in {1..3}
do
  time_check "$i" &
  sleep 2
done
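As a side note on the script above: bash's TIMEFORMAT variable controls what the builtin time prints, and '%R' limits it to the elapsed (real) time in seconds, which avoids having to strip the 0m prefix later. A sketch of that variant (times.csv is an illustrative filename):

```shell
#!/bin/bash
# TIMEFORMAT controls the bash builtin `time` output; '%R' prints only
# the elapsed (real) time in seconds, e.g. "0.421".
TIMEFORMAT='%R'

echo "real" > times.csv   # illustrative output file
for i in {1..3}; do
  # `time` reports on stderr; append it to the CSV while discarding
  # the kubectl dry-run message on stdout.
  { time kubectl apply -f deploy.yml --dry-run=client >/dev/null; } 2>> times.csv
  sleep 2
done
```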
This is the output:
deployment.apps/nginx-raw created (dry run)
real 0m0.421s
user 0m0.359s
sys 0m0.138s
deployment.apps/nginx-raw created (dry run)
real 0m0.359s
user 0m0.443s
sys 0m0.158s
deployment.apps/nginx-raw created (dry run)
real 0m0.138s
user 0m0.412s
sys 0m0.122s
deployment.apps/nginx-raw created (dry run)
real 1.483s
user 0m0.412s
sys 0m0.122s
deployment.apps/nginx-raw created (dry run)
real 1.456s
user 0m0.234s
sys 0m0.567s
deployment.apps/nginx-raw created (dry run)
real 2.345
user 0m0.435s
sys 0m0.123s
Goal
I want to pipe the output and take the first row of each iteration (the "real 0m0.421s" line), then take the number part (0m0.421s), strip the "0m" when the value is in seconds, leave it as-is when it is already a plain number like 1.483, and strip the trailing "s". The final results should be written to a CSV file to be plotted. The expected output in CSV:
real
0.421
0.359
0.138
1.483
1.456
2.345
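The stripping described above can be done with grep and awk. A sketch, assuming the time output was captured to a file (times.log is a hypothetical name); it also converts any minutes component to seconds so the column stays uniform:

```shell
# Extract the "real" lines from captured `time` output and emit a CSV
# column. times.log is a placeholder for wherever the output was saved.
echo "real" > times.csv
grep '^real' times.log \
  | awk '{ n = $2
           sub(/s$/, "", n)         # strip trailing "s"
           if (n ~ /m/) {           # "0m0.421" -> 0.421, "1m30" -> 90
             split(n, a, "m")
             n = a[1] * 60 + a[2]
           }
           printf "%.3f\n", n }' >> times.csv
```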
Add-on
I will do this for another deployment and plot both sets of times in a line graph to compare the time each deployment takes.
2 Answers
You are using the shell builtin command time. If you switch to Linux's /usr/bin/time command you can control the output and get just the data you want. See

man time

for more details. You can take the output and pipe it into

grep -v deployment | tr '\n' ','

That will strip the dry-run lines and convert the remaining newlines into commas. This is a quick and dirty way to slice the data; I'm sure there are other solutions as well.
I just used a randomized sub-second sleep to generate the output stream, but the principle should work.
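A sketch of what this answer describes, assuming GNU time is installed at /usr/bin/time (its -f '%e' format prints only the elapsed seconds); the grep/tr stage is the one quoted above, with the newline properly escaped:

```shell
# Time each dry-run with GNU time, which prints elapsed seconds ("%e")
# on stderr, then drop the "deployment..." lines and join the timing
# values into a single comma-separated row.
for i in {1..3}; do
  /usr/bin/time -f '%e' kubectl apply -f deploy.yml --dry-run=client
  sleep 2
done 2>&1 | grep -v deployment | tr '\n' ','
```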
Separate issue
So what you are saying is that if it takes 1.5 seconds you want it to output 1.500, but if it takes 90.0 seconds (1.5 minutes) you also want it to output 1.500. How will you tell which it is without either specifying units or standardizing on one unit?
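One way to resolve that ambiguity is to standardize on seconds before writing the CSV. A sketch (to_seconds is a hypothetical helper; it assumes the builtin time's XmY.YYYs form):

```shell
# Convert a bash `time` value like "1m30.000s" into plain seconds,
# so 1.5 s becomes 1.500 and 90 s becomes 90.000, never an ambiguous 1.500.
to_seconds() {
  local t=${1%s}       # strip trailing "s"
  local min=${t%%m*}   # minutes part, before the "m"
  local sec=${t#*m}    # seconds part, after the "m"
  awk -v m="$min" -v s="$sec" 'BEGIN { printf "%.3f\n", m * 60 + s }'
}

to_seconds 0m1.500s    # prints 1.500
to_seconds 1m30.000s   # prints 90.000
```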