Import Salesforce data into Sitefinity Insight using Azure Data Factory

Overview

Complex data interactions among multiple data sources in different business systems are a major challenge for enterprises. To succeed, a marketer needs to tap into all these sources, understand the audience journey across enterprise properties, and build and execute successful campaigns. Bringing this data together is often a technical challenge.

In the following tutorials, you learn how to import contacts and their activity data from Salesforce into Sitefinity Insight using Azure Data Factory. You can use these tutorials to build connections to other enterprise data systems, such as Eloqua.

PREREQUISITES: You have created a blank Azure Data Factory version 2 and an Azure storage account.

NOTE: This tutorial assumes the following:
  • When a visitor visits your website and downloads a brochure, a “lead record” is created in Salesforce.
  • The actions of your sales team, such as phone calls and outbound mail messages, are recorded in Salesforce as activities.

Terminology

The two systems use different terms for the same concepts:

  • Salesforce attributes, such as Name or Company, map to Sitefinity Insight contact properties.
  • The Salesforce Activity stream maps to Sitefinity Insight interactions.
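To make the correspondence concrete, the attribute-to-property translation can be sketched in code. The following Python sketch is illustrative only; the Insight property names used here are assumptions, not an official schema:

```python
# Illustrative mapping from Salesforce lead attributes to Sitefinity Insight
# contact properties. The Insight property names are assumptions, not an
# official schema.
FIELD_MAP = {
    "Email": "Email",
    "FirstName": "First name",
    "LastName": "Last name",
    "Company": "Company",
    "Status": "Status",
    "Industry": "Industry",
}

def to_insight_contact(lead: dict) -> dict:
    """Translate a Salesforce lead record into Insight contact properties,
    dropping any attributes that have no mapping."""
    return {FIELD_MAP[key]: value for key, value in lead.items() if key in FIELD_MAP}

lead = {"Email": "jane@example.com", "Company": "Acme", "Id": "00Q000"}
print(to_insight_contact(lead))  # {'Email': 'jane@example.com', 'Company': 'Acme'}
```

In the Azure Data Factory pipeline below, this translation happens in the Copy data activity's mapping step rather than in code.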
    Tutorial video

    To set up a data import from Salesforce using Azure Data Factory, watch the following video:

    Import contacts

    Use this tutorial to create an Azure Data Factory pipeline that imports Salesforce leads as Sitefinity Insight contacts, together with their contact properties.

    Perform the following:

    1. Download the Sitefinity Insight pipeline from the Progress website and save it on your computer.
    2. Log in to Microsoft Azure Data Factory.
    3. Navigate to your data factory.
    4. Under Open Azure Data Factory Studio, click Open.
      The Azure Data Factory Studio opens in a new browser tab.
    5. Click New » Pipeline.
    6. Click Add new resource » Pipeline » Import from pipeline template.
      The Open dialog appears.
    7. Select the adf-template-sitefinity-insight-salesforce-integration.zip file that you have downloaded in Step 1.
      The SalesforceLeads2Insight_Template page opens.
    8. In the Linked service for SalesforceSource (Salesforce dataset) dropdown box, click New.
    9. To configure the connection to your Salesforce instance, in the right pane, perform the following:
      1. Name your Salesforce connection.
      2. Fill out your Salesforce login details: User name, Password, and Security token.
      3. Click Create.
    10. In the Linked service For CsvDrop (DelimitedText dataset) dropdown box, click New.
    11. To configure the Azure Blob storage that will temporarily hold the CSV file with the transferred data, in the right pane, select the existing storage account and click Create.
    12. Click Use this template.
    13. To configure the connection to your Sitefinity Insight datacenter, perform the following:
      1. Navigate to the pipeline’s Variables tab.
      2. In the apiKey variable, enter the API key of the Sitefinity Insight data center where you want to import the data.
        For more information, see Data centers and data sources » API key.
      3. In the AccessKey variable, enter the access key of your Sitefinity Insight account.
        For more information, see Connect your sites to Insight » Access keys.
      4. In the DataSource variable, enter a descriptive name of the data source.
        For more information, see Data centers and data sources » Data sources.
      5. In the ImportJobName variable, enter the identifier of the import job.
        This identifier is used by Sitefinity Insight in future runs of the pipeline to get only the data after the last successful run of the pipeline. Therefore, this value must be unique for every pipeline you create.
        For example, enter SalesforceLeads.
      6. In the StartFrom variable, enter the date when the leads were last modified.
        This is the date when the first import will start. This date is used only for the first import.
      7. In the BlobStorageAccountName variable, enter the name of the blob storage account that will contain the CSV file, which temporarily stores the data transferred from Salesforce to Sitefinity Insight.
      8. In the BlobContainer variable, enter the name of an existing Blob storage container.
      9. In the BlobContainerSasToken variable, enter the Blob SAS token of the storage container you are using.
    14. Click Publish all » Publish.

    RESULT: Your Azure Data Factory pipeline is ready to run, and you can use it to import your Salesforce data manually.
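The incremental behavior described for the ImportJobName and StartFrom variables, where each run picks up only records modified since the last successful run, can be sketched as a query builder. This Python sketch illustrates the idea only; the field list and the way the template actually constructs its query are assumptions:

```python
from datetime import datetime, timezone

def build_lead_query(modified_after: datetime) -> str:
    """Build an incremental SOQL query for Salesforce leads.

    SOQL datetime literals are unquoted ISO-8601 values, so the cutoff is
    formatted as e.g. 2023-01-01T00:00:00Z. The field list is illustrative.
    """
    cutoff = modified_after.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return (
        "SELECT Id, Email, FirstName, LastName, Company, Status, Industry, "
        "LastModifiedDate FROM Lead "
        f"WHERE LastModifiedDate > {cutoff} "
        "ORDER BY LastModifiedDate"
    )

# The first run uses the StartFrom value; later runs would use the timestamp
# of the last successful run recorded under the ImportJobName identifier.
print(build_lead_query(datetime(2023, 1, 1, tzinfo=timezone.utc)))
```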

    Import user activity

    Use this tutorial to learn how to create an Azure Data Factory pipeline that imports Salesforce activity streams as Sitefinity Insight interactions.

    Perform the following:

    1. Download the Sitefinity Insight pipeline from the Progress website and save it on your computer.
    2. Log in to Microsoft Azure Data Factory.
    3. Navigate to your data factory.
    4. Under Open Azure Data Factory Studio, click Open.
      The Azure Data Factory Studio opens in a new browser tab.
    5. Click New » Pipeline.
    6. Click Add new resource » Pipeline » Import from pipeline template.
      The Open dialog appears.
    7. Select the adf-template-sitefinity-insight-salesforce-integration.zip file that you have downloaded in Step 1.
      The SalesforceLeads2Insight_Template page opens.
    8. In the Linked service for SalesforceSource (Salesforce dataset) dropdown box, click New.
    9. To configure the connection to your Salesforce instance, in the right pane, perform the following:
      1. Name your Salesforce connection.
      2. Fill out your Salesforce login details: User name, Password, and Security token.
      3. Click Create.
    10. In the Linked service For CsvDrop (DelimitedText dataset) dropdown box, click New.
    11. To configure the Azure Blob storage that will temporarily hold the CSV file with the transferred data, in the right pane, select the existing storage account and click Create.
    12. Click Use this template.
    13. To configure the connection to your Sitefinity Insight datacenter, perform the following:
      1. Navigate to the pipeline’s Variables tab.
      2. In the apiKey variable, enter the API key of the Sitefinity Insight data center where you want to import the data.
        For more information, see Data centers and data sources » API key.
      3. In the AccessKey variable, enter the access key of your Sitefinity Insight account.
        For more information, see Connect your sites to Insight » Access keys.
      4. In the DataSource variable, enter a descriptive name of the data source.
        For more information, see Data centers and data sources » Data sources.
      5. In the ImportJobName variable, enter the identifier of the import job.
        This identifier is used by Sitefinity Insight in future runs of the pipeline to get only the data after the last successful run of the pipeline. Therefore, this value must be unique for every pipeline you create.
        For example, enter SalesforceActivities.
      6. In the StartFrom variable, enter the date when the activities were last modified.
        This is the date when the first import will start. This date is used only for the first import.
      7. In the BlobStorageAccountName variable, enter the name of the blob storage account that will contain the CSV file, which temporarily stores the data transferred from Salesforce to Sitefinity Insight.
      8. In the BlobContainer variable, enter the name of an existing Blob storage container.
      9. In the BlobContainerSasToken variable, enter the Blob SAS token of the storage container you are using.
      10. Click the Copy data activity, then click its Source tab.
      11. Click the Query field.
        The Add dynamic content pane opens on the right.
      12. Replace the SQL query in the Add dynamic content field with the following code:

        For more information, see Salesforce Object Query Language (SOQL).
      13. Click OK.
      14. Click the Mapping tab of the Copy data activity.
      15. To remove the contact properties, select the checkboxes of the following fields: Email, FirstName, LastName, Company, Status, and Industry, and then click Delete.
      16. Rename the following fields:
        1. The Source of the Id field to WhoId
        2. The Source of LastModifiedDate to CreatedDate
        3. The Source of Phone to Subject
        4. The Destination of Phone to Predicate
        5. The Source of LeadSource to Description
        6. The Destination of LeadSource to Object
    14. Click Publish all » Publish.

    RESULT: Your Azure Data Factory pipeline is ready to run, and you can use it to import your Salesforce data manually.
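The activity query replaced in step 13.12 ships with the downloaded template and is not reproduced in this article. As a rough, unverified sketch only, a SOQL query over the Task object that is consistent with the fields mapped in steps 15 and 16 might look like the following (the field list and the WHERE clause are assumptions; verify against your template and Salesforce org):

```sql
SELECT WhoId, Subject, Description, Status, CreatedDate
FROM Task
WHERE LastModifiedDate > 2023-01-01T00:00:00Z
ORDER BY LastModifiedDate
```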

    Automate pipeline execution

    The examples above describe how to create Azure Data Factory pipelines that import Salesforce data into Sitefinity Insight. However, these pipelines are dormant: you can run them manually for one-off data imports, but in production scenarios you usually want them to run automatically.

    To learn how to run them automatically on a schedule, see the Azure Data Factory documentation » Pipeline execution and triggers in Azure Data Factory or Azure Synapse Analytics.
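For example, a schedule trigger that runs a pipeline once a day can be defined in Azure Data Factory with a JSON definition along these lines (the trigger name, pipeline name, and start time are placeholders):

```json
{
  "name": "DailySalesforceImport",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2023-01-01T06:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "SalesforceLeads2Insight",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```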

    Import data from sources other than Salesforce

    You can use procedures analogous to the ones described above to import data from other data sources. To do this, after creating the pipelines as described above, perform the following:

    1. In the pipeline’s Dataset pane, click Dataset actions » New dataset.
    2. To create a new dataset, in the pane on the right, select the system that you want to import data from.
      You need this dataset in Step 4 below.
      For example, select Oracle Eloqua and configure it.
    3. Click the pipeline’s Copy data activity.
    4. In the Source tab, in the Source dataset dropdown box, select the source you created in Step 2.
    5. In the Query field, enter the query that extracts data from the system you want to import data from.
      The query is specific to this system. For more information, see the respective documentation article about how to get the data you need.
