Schedule and automate your product imports/exports

Overview

This feature allows you to automate your product imports and exports, and is available only to Enterprise Edition and Growth Edition users. You will be able to:

  • Connect your product imports/exports to remote storages (Amazon S3, Microsoft Azure, Google Cloud Storage or any SFTP server).
  • Automate them by scheduling their executions.

Connect your import/export profile to a remote storage

To connect your product import/export to a remote storage:

  1. Go to Imports or Exports
  2. Select the profile you would like to automate
  3. Click Edit in the top right corner
  4. Click Properties
  5. Under "Connection", select any choice from the dropdown list. For instance: SFTP
  6. Fill in all mandatory fields: Filepath, Host, Port, Login and Password
  7. Click Save in the top right corner of the screen

 

  • If you want to automate your product exports, we support the patterns %job_label% and %datetime%. You can combine them to create a filepath like "/myfolder1/mysubfolder2/export_%job_label%_%datetime%.xlsx".
  • Both relative (e.g. "mysubfolder/myexport.xlsx") and absolute (e.g. "/myfolder/mysubfolder/export.xlsx") filepaths are supported.
  • You can use the Test connection settings button to check your settings and ensure your connection is valid.
  • If you want to authenticate your SFTP server, paste its fingerprint in the optional field Host fingerprint. The accepted format of the fingerprint depends on the server's public key format: MD5 for ssh-rsa signatures, SHA-512 for others.
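To illustrate how the filepath patterns above resolve, here is a minimal sketch of the substitution. The timestamp format used below is an assumption for illustration only; the PIM defines its own %datetime% format.

```python
from datetime import datetime, timezone

def resolve_filepath(pattern: str, job_label: str) -> str:
    """Substitute the %job_label% and %datetime% placeholders in a filepath.

    The timestamp format (YYYYmmdd-HHMMSS, UTC) is illustrative only;
    the actual format is defined by the PIM.
    """
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    return pattern.replace("%job_label%", job_label).replace("%datetime%", stamp)

print(resolve_filepath(
    "/myfolder1/mysubfolder2/export_%job_label%_%datetime%.xlsx",
    "product_export",
))
```

Both placeholders are optional: a pattern without them resolves to itself, so plain relative or absolute filepaths keep working unchanged.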
 

 

  • You can only import media files with a .ZIP archive. Your spreadsheet should contain a filepath column.
  • If you export media files to a remote server, the PIM creates a folder containing the spreadsheet and the related media files. This folder is not compressed (no .ZIP archive).
 

Please note that if you're importing from a remote server, your configured host will be displayed.

Connect to an SFTP remote server

To connect your product import/export to an SFTP remote storage:

  1. Go to Imports or Exports
  2. Select the profile you would like to automate
  3. Click Edit in the top right corner
  4. Click Properties
  5. Under "Type", select SFTP
  6. Fill in all mandatory fields: Filepath, Host, and Port
  7. Select the authentication method: Login & Password or Private key
    * If you're using Login & Password, enter a login and a password, then proceed to step 8.
    * If you're using Private key, the PIM provides a public key that you need to install on your SFTP server. In that case, ask your PIM administrator. Then proceed to step 8.
  8. Click Save in the top right corner of the screen

If you want to authenticate your SFTP server, you can paste its fingerprint in the optional field Host fingerprint.
The accepted format of the fingerprint depends on the format of the server's public key: MD5 for ssh-rsa signatures, SHA-512 for others.
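As a sketch of what an MD5 fingerprint looks like for an ssh-rsa key, the snippet below computes it from an OpenSSH public key line (the same colon-separated format OpenSSH prints with `ssh-keygen -l -E md5`). The key line is whatever your server exposes; the function name is illustrative.

```python
import base64
import hashlib

def md5_fingerprint(public_key_line: str) -> str:
    """MD5 fingerprint of an OpenSSH public key line
    (e.g. "ssh-rsa AAAA... comment"), as colon-separated hex pairs.

    The fingerprint is the MD5 digest of the base64-decoded key blob,
    which is the second field of the public key line.
    """
    blob = base64.b64decode(public_key_line.split()[1])
    digest = hashlib.md5(blob).hexdigest()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))
```

You can compare the result against the value you paste into Host fingerprint to make sure both refer to the same server key.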

 

 

Connect to Amazon S3

To connect your product import/export to Amazon S3:

  1. Go to Imports or Exports
  2. Select the profile you would like to automate
  3. Click Edit in the top right corner
  4. Click Properties
  5. Under "Type", select Amazon S3
  6. Fill in all mandatory fields: Filepath, Region, Bucket name, Key, and Secret
  7. Click Save in the top right corner of the screen

Permissions for Amazon S3

Grant the following permissions, depending on your needs:

  • To check the connection settings:
s3:ListBucket
  • To import files from Amazon S3:
s3:GetObject
  • To export files to Amazon S3:
s3:ListBucket
s3:PutObject
s3:GetObject
s3:GetObjectAcl
s3:DeleteObject 
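The export permissions above can be granted with an IAM policy along these lines. The bucket name is hypothetical; note that `s3:ListBucket` applies to the bucket itself, while the object-level actions apply to its contents.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-pim-bucket"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::my-pim-bucket/*"
    }
  ]
}
```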

Connect to Microsoft Azure

To connect your product import/export to Microsoft Azure:

  1. Go to Imports or Exports
  2. Select the profile you would like to automate
  3. Click Edit in the top right corner
  4. Click Properties
  5. Under "Type", select Microsoft Azure
  6. Fill in all mandatory fields: Filepath, Connection string, and Container name
  7. Click Save in the top right corner of the screen
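For reference, an Azure Storage connection string follows this shape; the account name and key placeholders must be replaced with your own values from the Azure portal.

```
DefaultEndpointsProtocol=https;AccountName=<your-account-name>;AccountKey=<your-account-key>;EndpointSuffix=core.windows.net
```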

Permissions for Microsoft Azure

The following permissions need to be granted for Azure SAS (Shared Access Signature):

  • Allowed services
    • Blob
  • Allowed resource types
    • Service
    • Container
    • Object
  • Allowed permissions
    • Read
    • Write
    • Delete
    • List
    • Add
    • Create

Connect to Google Cloud Storage

To connect your product import/export to Google Cloud Storage:

  1. Go to Imports or Exports
  2. Select the profile you would like to automate
  3. Click Edit in the top right corner
  4. Click Properties
  5. Under "Type", select Google Cloud Storage
  6. Fill in all mandatory fields: Filepath, Project ID, Service account, and Bucket

    The Service account field expects the JSON key of your service account, which can only be retrieved when the key is created in the Google Cloud Console. Please paste the whole JSON key.
    Note that once created, the key cannot be downloaded again. If it is lost, a new key must be created directly in the Google Cloud Console.

     
  7. Click Save in the top right corner of the screen

Permissions for GCS (service account)

Grant the following permissions, depending on your needs:

  • To check the connection settings:
storage.buckets.get
  • To import files from GCS:
storage.objects.get
  • To export files to GCS:
storage.objects.create
storage.objects.delete
storage.objects.get
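The permissions above can be bundled into a GCS custom role and bound to the service account. The role definition below uses the Cloud IAM role resource format; the title and description are hypothetical, and `storage.buckets.get` is included so the connection check also works.

```json
{
  "title": "PIM Export Writer",
  "description": "Minimal permissions for automated PIM exports",
  "stage": "GA",
  "includedPermissions": [
    "storage.buckets.get",
    "storage.objects.create",
    "storage.objects.delete",
    "storage.objects.get"
  ]
}
```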

Schedule and automate your import/export profile

To automatically execute an import/export profile:

  1. Go to Imports or Exports
  2. Select the profile you would like to automate
  3. Click Edit in the top right corner
  4. Click Properties
  5. Under "Automation", switch "Enable scheduling" button to Yes
  6. Under "Scheduling", select your preferred frequency of execution
  7. Then, select the user group whose permissions will apply to this automated job (EE only)
  8. Finally, select the user groups and/or users to be notified when a job completes successfully, fails, or cannot be launched. One e-mail and one in-app notification will be sent for each job status.
  9. Click Save in the top right corner of the screen

 

  • The minimum frequency of execution is set to every 2 hours. If you need more frequent executions, please consider using the API to ensure the best performance. Read more about the Event API.
  • Frequencies executed multiple times a day start at midnight (UTC).
  • Frequencies are displayed in UTC.
  • Automated jobs are executed by a system user, so if you want to see them in the Process Tracker, enable the permission View all jobs in process tracker under the user roles' permissions.
  • To automate a job, you need to configure a remote storage.
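The midnight-UTC anchoring described above can be sketched as follows. The function is an illustration of how "every N hours" frequencies line up with midnight UTC, not part of the PIM.

```python
from datetime import datetime, timedelta, timezone

def next_run(now: datetime, every_hours: int) -> datetime:
    """Next execution time for an every-N-hours frequency anchored at
    midnight UTC: runs happen at 00:00, 02:00, 04:00, ... for N=2.

    Illustrative only; the PIM's scheduler is not exposed as an API.
    """
    midnight = now.astimezone(timezone.utc).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    # Number of full slots already elapsed since midnight, plus one.
    slots = (now - midnight) // timedelta(hours=every_hours) + 1
    return midnight + slots * timedelta(hours=every_hours)
```

For example, with an every-2-hours frequency, a check at 03:30 UTC finds the next run at 04:00 UTC, and late-evening slots roll over to midnight of the next day.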
 

 

  • Queued jobs might take a few minutes to start, so a scheduled job may begin slightly after its scheduled time. This is normal.
  • If a scheduled import job is still running when the next occurrence of the same job is due to start, the new occurrence will not be started. It will be marked as FAILED in the Process Tracker, with a dedicated error message.
 

 

Manually execute an import/export profile connected to a remote storage

Once you have connected your import/export profile to a remote server, go back to the root of the profile by clicking its name in the breadcrumb, then click Import now to import from the remote server or Export now to export to it.