This story will give you an idea of how to use expect as part of your CI/CD process.
Here’s the scenario: you need to update file(s) on your server, and you cannot host those files in the cloud. The only way to get the updated files onto the server is to transfer them from another machine.
Hence, you would need to copy the files over and replace them on the server itself. Most of us would use the scp command, which does exactly that. But scp prompts for a password, and that prompt breaks a fully automated process: the command waits for interactive input, and there is no option to supply the password inline.
Thus, I went digging around and found the solution best suited for me: expect. This handy tool solves the issue just mentioned by letting us enter the password programmatically.
Here, we are going to use expect to help us automate copying files from the CI runner instance to the deployment servers. First, install it by running apt-get update -y && apt-get install -y expect on your machine.
Let us look at a normal scp command to better understand what we need to put in our script.
$ scp -P $DEPLOYMENT_TARGET_PORT -r $DEPLOYMENT_FILE_PATH $DEPLOYMENT_TARGET_USER@$DEPLOYMENT_TARGET_HOST:$DEPLOYMENT_TARGET_FILE_PATH
We would first need to define the variables used in this scp command, mainly those listed below:
DEPLOYMENT_TARGET_PORT: the SSH port of the server
DEPLOYMENT_TARGET_USER: the user to log in as
DEPLOYMENT_TARGET_PASSWORD: that user’s password
DEPLOYMENT_TARGET_HOST: the server’s hostname or IP address
DEPLOYMENT_TARGET_FILE_PATH: the destination path on the server
DEPLOYMENT_FILE_PATH: the local path of the file(s) to copy
Depending on whether you want to copy a single file, multiple files, or folders, you may want to include the -r (recursive) flag in the scp command.
Now that we have defined the variables, we can test by setting them and running the command. The output would still be the same: the password prompt appears.
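For a quick local test, you might set the variables like this before running the command. All of the values below are placeholders for illustration; substitute your own server details.

```shell
# Hypothetical values -- replace with your real server details (or CI secrets).
export DEPLOYMENT_TARGET_PORT=22
export DEPLOYMENT_TARGET_USER=admin
export DEPLOYMENT_TARGET_HOST=123.123.123.123
export DEPLOYMENT_TARGET_PASSWORD='change-me'
export DEPLOYMENT_FILE_PATH=./build
export DEPLOYMENT_TARGET_FILE_PATH=/var/www/app

# Print the command first to confirm the variables expand as intended,
# before actually running scp against a server.
echo "scp -P $DEPLOYMENT_TARGET_PORT -r $DEPLOYMENT_FILE_PATH $DEPLOYMENT_TARGET_USER@$DEPLOYMENT_TARGET_HOST:$DEPLOYMENT_TARGET_FILE_PATH"
```

In a CI pipeline, these would typically be defined as protected variables or secrets rather than exported in the script itself.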
Now comes the magic. We need to create the expect script that is going to automate entering the password. It is going to look something like this:
expect -c '
set timeout 1000
spawn bash -c "scp -P $env(DEPLOYMENT_TARGET_PORT) -r $env(DEPLOYMENT_FILE_PATH) $env(DEPLOYMENT_TARGET_USER)@$env(DEPLOYMENT_TARGET_HOST):$env(DEPLOYMENT_TARGET_FILE_PATH)"
expect {
  "(yes/no)?" {
    send "yes\r"
    expect "*?assword:*"
    send "$env(DEPLOYMENT_TARGET_PASSWORD)\r"
  }
  "*?assword:*" {
    send "$env(DEPLOYMENT_TARGET_PASSWORD)\r"
  }
}
expect eof
'
The -c flag tells expect to execute the script given in the quotes instead of reading it from a file, which is why the whole scp interaction is passed this way. Now comes the important part: we need to handle two cases.
1. A new instance requires you to first confirm the host’s authenticity before entering the password
2. A previously used instance prompts only for the password
We wrap the two cases inside an expect block as follows:
expect {
"case1" {
}
"case2" {
}
}
The first case would look something like this:
The authenticity of host '123.123.123.123 (123.123.123.123)' can't be established.
ECDSA key fingerprint is SHA256:xxxxxxxxxxx.
Are you sure you want to continue connecting (yes/no)?
Hence, our case 1 pattern is (yes/no)?, since the prompt ends with that. After responding with yes, you would see the next prompt:
Warning: Permanently added '123.123.123.123' (ECDSA) to the list of known hosts.
admin@123.123.123.123's password:
Here, you expect a password prompt, at which the script enters your password and completes the scp.
Next is case 2, which is pretty straightforward: you handle the password prompt exactly the same way as before.
One last thing to note is the timeout setting here:
set timeout 1000
This is important, as we expect the whole scp command to take 1000 seconds or less. If we do not set it, the default timeout is 10 seconds, which may not be enough when transferring large amounts of data or when there are latency issues.