
How to set up SFTP for local integration testing

Richard Kettle edited this page May 10, 2022 · 3 revisions

The following explains how to set up a local SFTP server and start an integration in order to test the SFTP import tool written for BookingBug.

There are multiple ways of testing the integration; these include:

  • Standard login using username and password
  • Login using username and PEM file
  • The case where we are not permitted to move files on the remote server
  • Encrypted import files (for this to work you will need GPG installed, a JSON file encrypted with the public key, knowledge of the password, and the private key imported)

Setup

The local setup has a single requirement (apart from the main BB application): Docker must be installed. Once Docker is installed, download this shell script locally and give it execute permissions with chmod +x (unknown)

Then run the script as follows

./{scriptname} -f {base_folder_of_your_choosing}

for example: ./start_sftp.sh -f /Users/alan/sftp

The script will:

  1. create the base folder if it doesn't exist
  2. create a subfolder called uploads inside the base folder
  3. start a Docker container mapping the base folder to the container
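The three steps above might be implemented roughly like this (a sketch only: the atmoz/sftp image, the container name, and the volume layout are assumptions, and the docker command is echoed rather than run so the sketch is safe to dry-run; the real script may differ):

```shell
#!/bin/sh
# Sketch of what start_sftp.sh does. Remove the `echo` to actually start
# the container.
start_sftp() {
  base_folder="$1"
  [ -n "$base_folder" ] || { echo "usage: start_sftp <base_folder>" >&2; return 1; }
  mkdir -p "$base_folder/uploads"   # steps 1 and 2: base folder plus uploads subfolder
  # step 3: map the base folder into the container (atmoz/sftp is an assumption)
  echo docker run -d --name local_sftp -p 2222:22 \
    -v "$base_folder:/home/dockeruser/$(basename "$base_folder")" \
    atmoz/sftp "dockeruser:areallysecurepassword1:::$(basename "$base_folder")"
}
```

The credentials in the user spec match the ones listed below for connecting with Filezilla.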

To check that this has worked correctly you should be able to use something such as Filezilla to connect to:

  • localhost
  • port: 2222
  • username: dockeruser
  • password: areallysecurepassword1

Once connected, you should be able to put a file into the uploads directory locally and see it appear on the SFTP server in Filezilla.

NOTE: a sample import can be found here, but be aware that the import API is a work in progress and this file might need to be updated; speak to a member of the team to get an updated version if required.

Testing simple username & password access

"Reactive Mode"

To test the integration using the simple method of using just a username and password do the following:

  1. In terminal do the following:
    • bundle exec zeus start
    • bundle exec zeus s
    • bundle exec sidekiq -e development
    • tail -f log/dev.log | grep "SFTP Integration"
    • (optional, but useful if making changes; see the notes on editing the code below) bundle exec zeus c
  2. Log in to the local environment at http://localhost:3000/integrations/connections and create a new integration:
    • type: SFTP
    • settings:
{
  "host": "localhost",
  "username": "dockeruser",
  "password": "areallysecurepassword1",
  "port": "2222",
  "schedule_cron": "* * * * *",
  "log_level": 0,
  "base_folder": "{set this to the last folder name given when starting the docker container}"
}

(for example, if you started the container with the -f option as /Users/alan/sftp the "base_folder" option here would be "sftp")
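In other words, base_folder is just the final component of the path you passed to -f:

```ruby
# The "base_folder" setting is the last path component of the -f argument.
File.basename("/Users/alan/sftp")  # => "sftp"
```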

  3. Change its state to "up".
  4. Open http://localhost:3000/sidekiq_web/cron/ where you should see your integration.
  5. Click "enqueue now", which will trigger the process. If you haven't put a JSON file in the uploads directory, you can prove that it has worked because new folders should be created on the SFTP server:
    • in_progress
    • completed
    • completed_with_issues
    • failed
    • reports
      • in_progress
      • completed
  6. Put the sample import in and run it again; you should see that a report file is generated and the file moves around on the remote server, as well as output in the tail -f log/dev.log terminal window.
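As a mental model, the folder an import file ends up in depends on the outcome of the run. A hypothetical sketch (destination_folder and the status symbols are illustrative, not the tool's actual API):

```ruby
# Hypothetical sketch: map an import run's outcome to the remote folder the
# file is moved into. The folder names match those the integration creates.
def destination_folder(status)
  case status
  when :running then "in_progress"
  when :success then "completed"
  when :partial then "completed_with_issues"
  else               "failed"
  end
end
```

Report files get a parallel in_progress/completed pair under the reports folder.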

Sequence Diagram of how the above "works"

Reactive Sequence Diagram

"Observer" Mode

Testing this mode is very similar to the above, with a couple of changes to the settings and to the JSON file we drop in.

To explain this concept: this mode is used (in the Morgan Stanley use case) where we do not have permission to move files around on the remote server, so we need to make some assumptions:

  • A file will be uploaded daily and the date will be in the filename
  • We will not attempt to create folders on the remote file system or move files
  • We will not generate reports
  1. Change the integration to have the following settings (I find this easier to do using the Rails console; more on that in a later section):
{
  "host": "localhost",
  "username": "dockeruser",
  "password": "areallysecurepassword1",
  "port": "2222",
  "schedule_cron": "* * * * *",
  "log_level": 0,
  "base_folder": "{set this to the last folder name given when starting the docker container}",
  "sftp_mode": "observer",
  "date_format": "%d-%m-%Y"
}

The "date_format" setting is optional; it defaults to %Y-%m-%d, but it can be useful to prove the setting works by changing it and then updating the format of the filename accordingly.

  2. When dropping in a JSON file for import, make sure the name contains today's date in the format specified, i.e.

Given today is 29th of June 2018
And the format is "%Y-%m-%d"
Name the file: "sample_json_2018-06-29.json"
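A quick way to generate a correctly named file (expected_import_name is a hypothetical helper that mirrors how the observer matches filenames):

```ruby
require "date"

# Hypothetical helper: build the filename the observer-mode integration will
# look for, given the integration's "date_format" setting.
def expected_import_name(prefix, date_format: "%Y-%m-%d", date: Date.today)
  "#{prefix}_#{date.strftime(date_format)}.json"
end

expected_import_name("sample_json", date: Date.new(2018, 6, 29))
# => "sample_json_2018-06-29.json"
```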

Sequence Diagram showing the above

Observer Mode Sequence Diagram

Notes on editing the code

When editing the code, I find the terminal window running Sidekiq has to be restarted to pick up the changes, which means the integration needs to be dropped and put back into an "up" state. To make this easier I tend to do the following once the Rails console window is loaded:

job = Integrations::Connection.find_by_id {put the id of integration here!}
def down(job:); job.state = 2; job.save; end
def up(job:); job.state = 0; job.save; end
def drop_create(job:); down(job: job); up(job: job); end
# then drop-create using
drop_create(job: job)

I would also change settings here using:

job.settings = {new settings hash}
job.save
drop_create(job: job)

Testing login using PEM file

This is possible with the local SFTP server, but that part has not been added; I find it easier to prove this by connecting as a test user to sftp.bookingbug.com.

The only real change needed to test this is to remove the password section from the settings and add the location of the local PEM file to the LOCAL_SETTINGS file using the keys:

importer:
 ftp_ssh_key: '~/.ssh/somekey.pem'

Testing PGP encryption

Testing this locally has been done using the PGP encryption keys and files for Morgan Stanley; speak to a member of the team to get these.

To test this, you will need to import the private key into your local GPG suite using gpg --import {key}. You will also need to add the "secret" key to LOCAL_SETTINGS using:

importer:
 file_decryption_key: 'somesecret'

NOTE: when using macOS to test this method, you will get a popup prompting for the password, unlike Ubuntu or CentOS where it is echoed into the command.
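Before wiring GPG into the integration, it can help to sanity-check your local GPG install by round-tripping a file. The sketch below uses symmetric encryption so it needs no imported keys (the real integration uses the Morgan Stanley public/private key pair instead); the passphrase "somesecret" is the example value from the settings above:

```shell
#!/bin/sh
# Sketch: round-trip a small JSON file through GPG to confirm the local
# install works. Symmetric mode is a stand-in for the real key pair.
roundtrip_gpg() {
  dir="$(mktemp -d)"
  printf '%s' '{"bookings": []}' > "$dir/sample.json"
  gpg --batch --yes --pinentry-mode loopback --passphrase "somesecret" \
      --symmetric -o "$dir/sample.json.gpg" "$dir/sample.json"
  gpg --batch --yes --pinentry-mode loopback --passphrase "somesecret" \
      --decrypt -o "$dir/decrypted.json" "$dir/sample.json.gpg" 2>/dev/null
  cat "$dir/decrypted.json"
}
```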