Using Remote Connections
You can import data into and export data out of your Rulex process either by selecting a file from your local file system or through a remote connection.
The following remote connections are currently supported:
- HTTP remote API (import only)
- SharePoint
- Apache Hadoop (HDFS)
- Amazon Web Services S3 (AWS S3)
- File Transfer Protocol (FTP)
- Azure Storage (Blob service), from version 4.6
Rulex's technical documentation does not and cannot provide comprehensive guidelines on the use of third-party software beyond how Rulex integrates with it. Please consult the third-party software's own technical documentation for up-to-date information.
Remote connections are supported in import and export tasks.
Prerequisites
- The required datasets have been imported into the process.
- For the Hadoop connector with Kerberos secure authentication, the connecting user must hold a valid Kerberos ticket (see the sketch after this list).
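Before opening the connection, you can confirm that a ticket is present by querying the Kerberos `klist` utility. This is a minimal sketch, assuming an MIT Kerberos client is installed and on the PATH; it is not part of Rulex itself.

```python
import subprocess

def has_kerberos_ticket() -> bool:
    """Return True if the current user holds a valid Kerberos ticket.

    With MIT Kerberos, `klist -s` runs silently and exits with status 0
    when a non-expired credentials cache exists, non-zero otherwise.
    """
    try:
        result = subprocess.run(["klist", "-s"], capture_output=True)
        return result.returncode == 0
    except FileNotFoundError:
        # No Kerberos client installed on this machine.
        return False

if __name__ == "__main__":
    if has_kerberos_ticket():
        print("Valid Kerberos ticket found - the HDFS connection can proceed.")
    else:
        print("No valid ticket - run `kinit <user>@<REALM>` first.")
```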
Procedure
1. Click the Remote Connections tab in your import or export task.
2. Select the source type (URI) from the list on the left.
3. Enter the user name and password for your source type, along with the following source-specific parameters (a stand-alone connectivity check is sketched after the table):
Remote connection parameters

Parameter | Connection type | Option name | Description
---|---|---|---
Authentication | HTTP API | httpauth | The authentication scheme supported by the HTTP Server API. Possible options are:
SharePoint URL | SharePoint | spurl | The SharePoint URL, which must have the following structure: https://[domain_or_server]/sites/[web_site] (e.g. https://rulexanalytics.sharepoint.com/sites/RulexSharepointConnectorTest).
HDFS URL | HDFS | hdfsurl | The Apache Hadoop URL, which must have the following structure: http://[domain_or_server]:[port] (the port is usually 50070).
S3 bucket | AWS S3 | s3bucket | The name of the AWS S3 bucket used as the storage area.
FTP host | FTP | ftpurl | The FTP host server that will be used for data exchange. Before inserting your FTP host, you need to add the following prefix:
Port | FTP, HDFS | ftpport, hdfsport | The port required for data exchange with the host server.
User name | HTTP API, SharePoint, FTP, HDFS, AWS S3 | httpusr, spusr, ftpusr, hdfsusr, s3usr | Unique user name for the connection. When connecting to AWS S3 this option may be called Access Key ID.
Password | HTTP API, SharePoint, FTP, HDFS, AWS S3 | httppwd, sppwd, ftppwd, hdfspwd, s3pwd | A valid login password for the specified user. When connecting to AWS S3 this option may be called Secret Access Key.
Response file path | HTTP API | httpresponse | The file path where the HTTP response will be saved.
Storage Account Name | Azure Storage | asurl | The Azure Storage account name for the Blob service. This name is unique within Azure and, together with the blob name, forms part of the address of every object in your storage account.
Access Key | Azure Storage | aspwd | The Azure Storage access key for the Blob service. This can either be:
Type Key | Azure Storage | asauth | The type of key entered in the Access Key parameter. Possible options are:
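If a connection fails inside the task, it can help to verify the same parameters outside Rulex first. The sketch below is a minimal, illustrative check of the S3, HDFS (WebHDFS), FTP and Azure Storage endpoints using common Python client libraries (boto3, requests, ftplib, azure-storage-blob). All host names, bucket names and credentials are placeholders standing in for the values you would enter in the Remote Connections tab; this code is not part of Rulex itself.

```python
"""Stand-alone connectivity checks for the remote sources listed above.

Replace every placeholder value with the parameters you would enter in
the Remote Connections tab of your import or export task.
"""
from ftplib import FTP

import boto3                                       # pip install boto3
import requests                                    # pip install requests
from azure.storage.blob import BlobServiceClient   # pip install azure-storage-blob

# --- AWS S3: User name = Access Key ID, Password = Secret Access Key ---
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",                   # placeholder
    aws_secret_access_key="secret...",             # placeholder
)
s3.head_bucket(Bucket="my-bucket")                 # raises ClientError on failure

# --- HDFS: http://[domain_or_server]:[port], port usually 50070 ---
# WebHDFS must be enabled on the cluster for this REST call to succeed;
# a Kerberos-secured cluster additionally needs a valid ticket.
resp = requests.get(
    "http://hadoop.example.com:50070/webhdfs/v1/?op=LISTSTATUS",
    timeout=10,
)
resp.raise_for_status()

# --- FTP: host, port, user name and password ---
ftp = FTP()
ftp.connect("ftp.example.com", 21)
ftp.login("user", "password")
ftp.quit()

# --- Azure Storage (Blob service): storage account name + access key ---
blob_service = BlobServiceClient(
    account_url="https://myaccount.blob.core.windows.net",
    credential="base64accesskey...",               # placeholder account key
)
for container in blob_service.list_containers():   # listing proves the key works
    print(container.name)
    break
```

Each block raises an exception on bad credentials or an unreachable host, so a clean run confirms that the corresponding parameters can be copied into the task as they are.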