
#14 TOP TIP 

for Globalscape EFT Server

EFT and Amazon S3 storage

Oct 19, 2017 | EFT, News, Technical, Top Tips

This Top Tip shows you how to get started with EFT and Amazon S3 storage. You can reference Amazon S3 storage, RDS databases and EC2 servers from within the AWE engine in EFT Enterprise. The functionality is hidden by default, but not disabled.

Getting started

Firstly, to enable the functionality, you need to edit C:\Program Files (x86)\Globalscape\EFT Server Enterprise\AWE\bin\automatedworkflow.xml

Look for the XML node <AvailableActions> and add the following lines:

<ActionName>S3</ActionName>
<ActionName>EC2</ActionName>
<ActionName>RDS</ActionName>

Once you’ve saved the file, you’ll see an ‘Amazon’ menu, where you can reference Amazon web services.

Please note that this file is overwritten during upgrades, so you’ll need to re-add these lines after each upgrade to see the options again. You won’t be able to add new Amazon elements to an AWE until you have done this; existing AWEs that contain Amazon elements will continue to work.
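For context, here is roughly how the edited node might look in automatedworkflow.xml. The surrounding action names are illustrative only; your file will already contain the full list of actions installed with EFT, and you simply append the three new entries inside the existing node:

```xml
<AvailableActions>
  <!-- ...existing action names already present in your file... -->
  <ActionName>S3</ActionName>
  <ActionName>EC2</ActionName>
  <ActionName>RDS</ActionName>
</AvailableActions>
```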

Writing data to an S3 bucket is broken down into a few stages, similar to database and SFTP connections.

Step 1: Create a session

  • Add an S3 element to your AWE, then select ‘Create session’ from the Activity option.
  • Add the access key pair, which is defined against the S3 bucket as security authentication.
  • You can leave all other options at their default values.

 

Step 2: Write a file up to S3 storage

  • For ‘Activity’, select ‘Put object’.
  • The file to be written to S3 can use variables such as %FS_PATH% or a physical file path.
  • The bucket name is the defined name of your S3 bucket.
  • The Key Name is the name you wish the file to be called. Using the variable %FS_FILE_NAME% would apply the same file name defined in the FS_PATH variable. You could also build a variable to contain other strings such as dates.
  • S3 does not have a file structure as such. You can create and reference folders inside it though, to make it act like a file system. To write a file to a folder, simply add the folder name at the start of the Key Name. Be aware that folders use a / character and that Key Names are case sensitive. So /mark/mark.txt is not the same file as /Mark/Mark.txt, and \Mark\Mark.txt is invalid.
  • Please be aware that this does not handle wildcards particularly well. If you want to put multiple files up to an S3 bucket, use a file loop and write each file one at a time; this lets you define each file name as it is uploaded. Almost every other setting on the ‘Put object’ dialog box can be left at its default value. Just make sure the S3 session created in the previous step is selected.
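The file-loop idea above can be sketched outside of AWE. This Python sketch (standard library only) shows the per-file Key Name construction such a loop performs; the `mark` folder prefix and the date stamp are illustrative assumptions, and the actual upload step is deliberately left out:

```python
import glob
import os
from datetime import date

def build_key_names(pattern, folder="mark"):
    """Expand a local wildcard and build one S3 Key Name per file,
    mirroring an AWE file loop that calls 'Put object' once per file."""
    stamp = date.today().strftime("%Y%m%d")  # e.g. date-stamp each key
    keys = {}
    for path in sorted(glob.glob(pattern)):
        name = os.path.basename(path)
        # Key Names use '/' separators and are case sensitive.
        keys[path] = f"{folder}/{stamp}-{name}"
    return keys
```

Each local path maps to a distinct Key Name, which is exactly what the one-file-at-a-time loop buys you over a single wildcard put.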

 

Step 3: Getting a file

  • Pulling a file down from an S3 bucket is a similar process, using the ‘Get object’ activity.
  • Take extra care with the file paths and file names, which are case sensitive.
  • ‘Get object’ can use wildcards to define multiple files.
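The kind of case-sensitive wildcard matching described above can be sketched with Python’s fnmatch module (the key names here are made up for illustration; `fnmatchcase` is used because, like S3 keys, it never ignores case):

```python
from fnmatch import fnmatchcase

def match_keys(keys, pattern):
    """Return the Key Names that match a wildcard pattern.
    fnmatchcase keeps the comparison case sensitive, as S3 keys are."""
    return [k for k in keys if fnmatchcase(k, pattern)]

sample = ["mark/mark.txt", "Mark/Mark.txt", "mark/notes.zip"]
# Only the lower-case key matches: wildcard matching respects case.
match_keys(sample, "mark/*.txt")  # → ["mark/mark.txt"]
```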

 

Once the AWE script has been built, you can call it from an Event rule in the normal way.

Sample

The AML for a sample ‘write and get’ is below. You will need to define the access key and secret key for the script to work.

<AMAWSS3 ACTIVITY="create_session" ACCESSKEY="Access Key" SECRETKEY="AM2iAWndJRgKZS+BbZ012AQlL4Fu3T3YFuQaME" PROTOCOL="https" />
<AMAWSS3 ACTIVITY="put_object" BUCKETNAME="eft-test-bucket" KEYNAME="mark/*.zip" FILE="C:\Users\Mark\Desktop\*.zip" />
<AMAWSS3 ACTIVITY="get_object" BUCKETNAME="eft-test-bucket" KEYNAME="mark/*.*" FILE="C:\Users\Mark\Desktop\Sample files\%FS_FILE_NAME%" />
<AMAWSS3 ACTIVITY="end_session" />