Using Remote Connections

You can import data into and export data out of your Rulex process either by selecting a file from your local file system or through a remote connection.

The following remote connections are currently supported:

  • HTTP remote API (import only)

  • SharePoint

  • Apache Hadoop (HDFS)

  • Amazon Web Services S3 (AWS S3)

  • File Transfer Protocol (FTP)

  • Azure Storage (blob service) FROM VERSION 4.6

Rulex's technical documentation does not and cannot provide comprehensive guidelines on the use of third-party software, beyond how Rulex integrates with it. Please consult the third-party software's own technical documentation for up-to-date information.

Remote Connections are supported in the following tasks:


Prerequisites

Procedure

  1. Click the Remote Connections tab in your import or export task.

  2. Select the source type (URI) from the list on the left.

  3. Enter the user name and password for your source type, along with the following source-specific parameters:

Remote connection parameters

Each entry below lists the parameter name, the connection type(s) it applies to, the corresponding PO, and a description of the parameter.

Authentication
Connection type: HTTP API
PO: httpauth

The authentication scheme supported by the HTTP Server API. Possible options are:

  • Basic

  • NTLM

  • Digest

  • Insecure (user and password are not required).
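For reference only, the sketch below shows how these schemes map onto a plain Python HTTP request outside Rulex, using the widely available requests library (and, for NTLM, the optional requests-ntlm package). The endpoint, credentials and output file are placeholders, not Rulex parameter values.

    import requests
    from requests.auth import HTTPBasicAuth, HTTPDigestAuth
    # NTLM requires the optional requests-ntlm package:
    # from requests_ntlm import HttpNtlmAuth

    url = "https://api.example.com/data"       # HTTP API endpoint (placeholder)
    user, password = "myuser", "mypassword"    # User name / Password parameters

    # Pick the object matching the Authentication parameter.
    auth_schemes = {
        "Basic": HTTPBasicAuth(user, password),
        "Digest": HTTPDigestAuth(user, password),
        # "NTLM": HttpNtlmAuth(user, password),
        "Insecure": None,                      # no credentials are sent
    }

    response = requests.get(url, auth=auth_schemes["Basic"], timeout=30)
    response.raise_for_status()

    # The body is written to a local file, much like the Response file path
    # entry further down this list.
    with open("response.json", "wb") as f:
        f.write(response.content)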

SharePoint URL
Connection type: SharePoint
PO: spurl

The SharePoint URL, which must have the following structure: https://[domain_or_server]/sites/[web_site] (e.g. https://rulexanalytics.sharepoint.com/sites/RulexSharepointConnectorTest).
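As an illustration only, the snippet below checks that a candidate URL follows the required https://[domain_or_server]/sites/[web_site] structure; the function name and pattern are illustrative, not part of Rulex.

    import re

    # Expected structure: https://[domain_or_server]/sites/[web_site]
    SP_SITE_PATTERN = re.compile(r"^https://[^/]+/sites/[^/]+/?$")

    def looks_like_sharepoint_site(url: str) -> bool:
        """Return True if the URL matches the required SharePoint site structure."""
        return bool(SP_SITE_PATTERN.match(url))

    print(looks_like_sharepoint_site(
        "https://rulexanalytics.sharepoint.com/sites/RulexSharepointConnectorTest"))  # True
    print(looks_like_sharepoint_site("https://rulexanalytics.sharepoint.com"))        # False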

HDFS URL
Connection type: HDFS
PO: hdfsurl

The Apache Hadoop URL, which must have the following structure: http://[domain_or_server]:[port] (the port is usually 50070). A file system path can be appended, as in: http://[domain_or_server]:[port]/folder/subfolder.
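Port 50070 is the default WebHDFS REST port on Hadoop 2.x, so a URL of this form is typically a WebHDFS endpoint. The sketch below lists a directory through WebHDFS with plain Python; host, user and path are placeholders.

    import requests

    hdfs_url = "http://namenode.example.com:50070"   # HDFS URL parameter (placeholder)
    user = "hdfsuser"                                # User name parameter
    path = "/folder/subfolder"                       # optional file system path

    # List the directory through the WebHDFS REST API.
    resp = requests.get(
        f"{hdfs_url}/webhdfs/v1{path}",
        params={"op": "LISTSTATUS", "user.name": user},
        timeout=30,
    )
    resp.raise_for_status()
    for entry in resp.json()["FileStatuses"]["FileStatus"]:
        print(entry["pathSuffix"], entry["type"])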

S3 bucket
Connection type: AWS S3
PO: s3bucket

The AWS S3 storage area (bucket) that will be used for data exchange.
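A minimal sketch of the same kind of access with the boto3 library, assuming the bucket name from the S3 bucket parameter and the Access Key ID / Secret Access Key credentials covered by the User name and Password entries below; all values are placeholders.

    import boto3

    s3 = boto3.client(
        "s3",
        aws_access_key_id="AKIA...",           # User name / Access Key ID
        aws_secret_access_key="secret...",     # Password / Secret Access Key
    )

    bucket = "my-rulex-bucket"                 # S3 bucket parameter (placeholder)

    # List the objects in the bucket, then download one of them.
    for obj in s3.list_objects_v2(Bucket=bucket).get("Contents", []):
        print(obj["Key"])

    s3.download_file(bucket, "data/input.csv", "input.csv")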

FTP host
Connection type: FTP
PO: ftpurl

The FTP host server that will be used for data exchange. The host must be entered with the following prefix: ftp:/

Port
Connection type: FTP, HDFS
PO: ftpport, hdfsport

The port required for data exchange with the host server.
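The sketch below performs an equivalent exchange with Python's standard ftplib, using the FTP host and Port parameters above; the ftp:/ prefix belongs in the Rulex field, while the library takes the bare host name. Host, credentials and file names are placeholders.

    from ftplib import FTP

    host = "ftp.example.com"             # FTP host (bare name, placeholder)
    port = 21                            # Port parameter

    ftp = FTP()
    ftp.connect(host, port)
    ftp.login("myuser", "mypassword")    # User name / Password parameters

    # Download a remote file into the local working directory.
    with open("input.csv", "wb") as f:
        ftp.retrbinary("RETR input.csv", f.write)

    ftp.quit()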

User name
Connection type: HTTP API, SharePoint, FTP, HDFS, AWS S3
PO: httpusr, spusr, hdfsusr, s3usr, ftpusr

Unique user name for the connection. When connecting to AWS S3, this option may be called Access Key ID.

Password
Connection type: HTTP API, SharePoint, FTP, HDFS, AWS S3
PO: htppwd, sppwd, hdfspwd, s3pwd, ftppwd

A valid login password for the specified user. When connecting to AWS S3, this option may be called Secret Access Key.

Response file path
Connection type: HTTP API
PO: httpresponse

The file path where the HTTP response will be saved.

Account Storage Name
Connection type: Azure Storage
PO: asurl

The Azure Storage Account Name for the BLOB service. This name is unique within Azure and, together with the blob name, forms part of the address of every object in your storage account.

Access Key
Connection type: Azure Storage
PO: aspwd

The Azure Storage Access Key for the BLOB service. This can be either:

  • one of the account access keys (Account Key), which are 512-bit storage account access keys generated by Azure when you created your account, or

  • a restricted shared access signature (SAS key), valid only for a limited number of specific objects, with specific permissions and for a limited time frame.

Type Key
Connection type: Azure Storage
PO: asauth

The type of Azure Storage key you entered in the Access Key parameter. Possible options are:

  • Account Key

  • SAS Key (Shared Access Signature)
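For orientation, the sketch below shows how these Azure Storage parameters map onto the azure-storage-blob SDK, where the account name becomes part of the account URL and the credential is either the Account Key or the SAS Key; all names and keys are placeholders.

    from azure.storage.blob import BlobServiceClient

    account_name = "myrulexstorage"    # Account Storage Name (placeholder)
    account_url = f"https://{account_name}.blob.core.windows.net"

    # Type Key = Account Key: pass the 512-bit account access key directly.
    client = BlobServiceClient(account_url=account_url, credential="<account-key>")

    # Type Key = SAS Key: pass the shared access signature token instead.
    # client = BlobServiceClient(account_url=account_url, credential="?sv=...&sig=...")

    # List the blobs in one container of the storage account.
    container = client.get_container_client("my-container")
    for blob in container.list_blobs():
        print(blob.name)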