<p><em>Solution Recipes are tutorials to achieve specific objectives in Klaviyo. They can also help you master Klaviyo, learn new third-party technologies, and come up with creative ideas. They are written mainly for developers and technically advanced users.</em></p>
<p>What you’ll learn:</p>
<p>In this Solution Recipe, we will outline how to connect AWS S3 to Klaviyo’s SFTP to trigger profile ingestion when a new file is added to an S3 bucket. While this recipe covers AWS S3, you can apply this solution to connect any cloud storage platform to Klaviyo’s SFTP.</p>
<p>Why it matters:</p>
<p>Increasingly, customers need a reliable solution that allows them to effortlessly import large amounts of data. Many Klaviyo customers rely on SFTP ingestion to quickly and accurately ingest data. By leveraging AWS, specifically S3 and Lambda, users can trigger ingestion when a new file is added to an S3 bucket, creating a more automatic ingestion schedule that scales for large data sets.</p>
<p>Level of sophistication:</p>
<p>Moderate</p>
<h2><strong>Introduction</strong></h2>
<p>It is time-consuming to update and import a large number of profiles (e.g., 500K+). We will outline an AWS-based solution that utilizes an S3 bucket and Klaviyo’s SFTP import tool to reduce the time required to make updates and automate key parts of the process.</p>
<p>When using an S3-triggered SFTP ingestion, you can streamline and automate the process of importing data, which ultimately improves the overall experience of accurately and effectively maintaining customer data and leveraging Klaviyo’s powerful marketing tools. Some use cases that may require this type of solution include a bulk daily sync to update profile properties with rewards data or updating performance results from a third-party integration.</p>
<p>Other key profile attributes you may want to update in bulk and leverage for greater personalization in Klaviyo include:</p>
<ul><li>Birthdate</li>
<li>Favorite brands</li>
<li>Loyalty</li>
<li>Properties from offline sources</li>
</ul><p>In this Solution Recipe, we’ll walk through:</p>
<ol><li>Setting up your SFTP credentials</li>
<li>Configuring your AWS account with the required IAM, S3, and Lambda settings</li>
<li>Deploying code to programmatically ingest a CSV file into Klaviyo via our SFTP server</li>
</ol><p>The goal is to streamline and accelerate the process of ingesting profile data into Klaviyo using AWS services and Klaviyo’s SFTP import tool.</p>
<h2><strong>Ingredients</strong></h2>
<ul><li>GitHub Link: <a>Repo</a></li>
<li>Klaviyo <a>SFTP</a> import tool</li>
<li><a>AWS Lambda with S3</a></li>
<li>Python Libraries: <a>pandas</a>, <a>pysftp</a>, <a>boto3</a></li>
</ul><h2><strong>Instructions</strong></h2>
<p>We will provide in-depth, step-by-step instructions throughout this Solution Recipe. A broad overview of the steps:</p>
<ol><li>Create an SSH key pair and prepare for SFTP ingestion.</li>
<li>Set up an AWS account with access to <a>IAM</a>, <a>S3</a>, and <a>Lambda</a>:<ol type="a"><li>Create an IAM execution role for Lambda</li>
<li>Create and record security credentials</li>
<li>Create an S3 bucket</li>
<li>Configure a Lambda function with the IAM execution role and set the S3 bucket as the trigger</li>
<li>Configure AWS environment variables</li>
</ol></li>
<li>Deploy the Lambda function and monitor progress to ensure profiles are updated and displayed in Klaviyo’s UI as anticipated.</li>
</ol><h3><strong>Instructions for SFTP configuration</strong></h3>
<h4><strong>Step 1: Create SSH key pair and prepare for SFTP ingestion</strong></h4>
<p>SFTP is only available to Klaviyo users with an Owner, Admin, or Manager role. To start, you’ll need to generate a public/private key pair on your local machine using <a>ssh-keygen</a> or a tool of your choice. Refer to this <a>documentation</a> for the supported SSH key formats.</p>
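<p>For example, with <code>ssh-keygen</code> a pair can be generated like this (the key type, file path, and comment below are illustrative choices, not Klaviyo requirements; confirm the type against the supported-formats documentation first):</p>

```shell
# generate a 4096-bit RSA key pair; -f sets the output path, -C adds a label
ssh-keygen -t rsa -b 4096 -f ~/.ssh/klaviyo_sftp -C "klaviyo-sftp"

# print the public half, which is what you will paste into Klaviyo
cat ~/.ssh/klaviyo_sftp.pub
```

<p>You will be prompted for an optional passphrase; the private half of the pair stays on the machine that runs the upload.</p>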
<p>Once you have generated your keys:</p>
<ol><li>In Klaviyo, click your account name in the lower left corner and select “<strong>Integrations</strong>”</li>
<li>On the Integrations page, click “Manage Sources” in the upper right, then select “<strong>Import via SFTP</strong>”</li>
<li>Click “<strong>Add a new SSH Key</strong>”</li>
<li>Paste your public key into the “SSH Key” box</li>
<li>Click “Add key”</li>
</ol><p>Record the following details so you can add them into your AWS Environment Variables in the next step:</p>
<ul><li><strong>Server</strong>: sftp.klaviyo.com</li>
<li><strong>Username</strong>: Your_SFTP_Username (e.g., abc123_def456)</li>
<li><strong>SSH Private Key</strong>: the private key associated with the public key generated in previous steps</li>
</ul><p>You can read up on our SFTP tool by visiting our <a>developer portal</a>.</p>
<h3><strong>Instructions for AWS implementation</strong></h3>
<h4><strong>Step 2A. Create an IAM execution role for Lambda</strong></h4>
<ol><li>Create an <a>IAM role</a> with AWS service as the trusted entity and Lambda as the use case.</li>
<li>The Lambda will require the following managed policies:<ul><li>AmazonS3FullAccess</li>
<li>AmazonAPIGatewayInvokeFullAccess</li>
<li>AWSLambdaBasicExecutionRole</li>
</ul></li>
</ol><h4><strong>Step 2B. Set up your access key ID and secret access key</strong></h4>
<p>You will need to set up an access key ID and secret access key so that the code can authenticate to AWS and access the files uploaded from your local machine.</p>
<ol><li>Navigate to “Users” in the <a>IAM console</a></li>
<li>Choose your IAM username</li>
<li>Open the “Security credentials” tab, and then choose “Create access key”</li>
<li>To see the new access key, choose “Show”</li>
<li>To download the key pair, choose “<strong>Download .csv file</strong>”. Store the file in a secure location. You will add these values into your AWS Environment Variables.</li>
</ol><p><strong>Note</strong>: You can retrieve the secret access key only when you <strong><em>create</em></strong> the key pair. Like a password, you can’t retrieve it later. If you lose it, you must create a new key pair.</p>
<h4><strong>Step 2C. Create S3 bucket</strong></h4>
<p>Navigate here to <a>create the S3 bucket</a>.</p>
<h4><strong>Step 2D. Configure a Lambda function</strong></h4>
<p>The Lambda function is the component of this setup that opens the SFTP connection and ingests the CSV file.</p>
<ol><li>Create a new Lambda function with the execution role configured in step 2A.</li>
<li>Update the Trigger settings with the S3 bucket created in step 2C.</li>
<li>Add the corresponding files into Lambda from GitHub.</li>
</ol><p>We’ll do a code deep dive in step 3.</p>
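<p>When the S3 trigger fires, Lambda receives an event payload describing the new object. The code in step 3 reads its file names from environment variables instead, but you could also pull the bucket and key straight from the event. A minimal sketch (the helper name is ours; the payload shape follows the standard S3 event notification structure):</p>

```python
from urllib.parse import unquote_plus

# extract the bucket name and object key from an S3 event notification
def bucket_and_key_from_event(event):
    record = event['Records'][0]
    bucket = record['s3']['bucket']['name']
    # object keys arrive URL-encoded; spaces become '+'
    key = unquote_plus(record['s3']['object']['key'])
    return bucket, key

# example payload in the shape S3 sends to Lambda
sample_event = {
    'Records': [{
        's3': {
            'bucket': {'name': 'my-profile-bucket'},
            'object': {'key': 'unmapped_profiles.csv'}
        }
    }]
}
print(bucket_and_key_from_event(sample_event))  # ('my-profile-bucket', 'unmapped_profiles.csv')
```

<p>Reading the key from the event is useful if several files can land in the same bucket and you want the Lambda to process whichever one triggered it.</p>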
<h4><strong>Step 2E. Configure AWS environment variables</strong></h4>
<ol><li>Navigate to the “Configuration” tab and add the following resources into your Environment Variables with their corresponding values.</li>
</ol><pre><code># AWS access and secret keys
ACCESS_KEY_ID
SECRET_ACCESS_KEY
S3_BUCKET_NAME

# folder path to where you are saving your S3 file locally
FOLDER_PATH        Example: /Users/tyler.berman/Documents/SFTP/
S3_FILE_NAME       Example: unmapped_profiles.csv
LOCAL_FILE_NAME    Example: unmapped_profiles_local.csv
MAPPED_FILE_NAME   Example: mapped_profiles.csv

# add your 6-digit List ID found in the URL of the subscriber list
LIST_ID            Example: ABC123

PROFILES_PATH = /profiles/profiles.csv

# folder path to where your SSH private key is stored
PRIVATE_KEY_PATH   Example: /Users/tyler.berman/.ssh/id_rsa

HOST = sftp.klaviyo.com

# add your assigned username found in the UI of Klaviyo's SFTP import tool
USERNAME           Example: abc123_def456</code></pre><h4><strong>Step 3: Deploy your code</strong></h4>
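<p>Because every script below reads its settings from <code>os.environ</code>, a missing variable only surfaces as a <code>KeyError</code> midway through a run. One way to fail fast is a small check before deploying; a sketch (the helper name is ours; the variable names mirror the list above):</p>

```python
import os

REQUIRED_VARS = [
    'ACCESS_KEY_ID', 'SECRET_ACCESS_KEY', 'S3_BUCKET_NAME',
    'FOLDER_PATH', 'S3_FILE_NAME', 'LOCAL_FILE_NAME', 'MAPPED_FILE_NAME',
    'LIST_ID', 'PROFILES_PATH', 'PRIVATE_KEY_PATH', 'HOST', 'USERNAME',
]

def check_required_env(env=os.environ):
    """Return the names of any required variables that are missing or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# example: only HOST is set, so every other name comes back as missing
missing = check_required_env({'HOST': 'sftp.klaviyo.com'})
print(missing)
```

<p>Calling this at the top of the handler and raising if the list is non-empty gives a clear error in CloudWatch instead of a mid-run failure.</p>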
<p>Let’s review the code deployed in your AWS instance.</p>
<ol><li>Programmatically add CSV file to S3:</li>
</ol><pre><code>import boto3
import os

session = boto3.Session(
    aws_access_key_id=os.environ['ACCESS_KEY_ID'],
    aws_secret_access_key=os.environ['SECRET_ACCESS_KEY']
)
s3 = session.resource('s3')

# define the bucket name and file name
bucket_name = os.environ['S3_BUCKET_NAME']
# the name you want to give to the file in S3
s3_file_name = os.environ['S3_FILE_NAME']
local_file = os.environ['LOCAL_FILE_NAME']

s3.meta.client.upload_file(Filename=local_file, Bucket=bucket_name, Key=s3_file_name)</code></pre><ol><li>Download S3 file:</li>
</ol><pre><code>import boto3

def download_file_from_s3(aws_access_key_id, aws_secret_access_key, bucket_name, s3_file_name, downloaded_file):
    session = boto3.Session(
        aws_access_key_id=aws_access_key_id,
        aws_secret_access_key=aws_secret_access_key
    )
    s3 = session.resource('s3')
    s3.meta.client.download_file(Bucket=bucket_name, Key=s3_file_name, Filename=downloaded_file)</code></pre><ol><li>Prepare CSV file for ingestion:</li>
</ol><pre><code>import pandas as pd
import os

# anticipate column mapping based on commonly used headers
def suggest_column_mapping(loaded_file):
    column_mapping = {
        'Email': ['EmailAddress', 'person.email', 'Email', 'email', 'emailaddress', 'email address', 'Email Address', 'Emails'],
        'PhoneNumber': ['Phone#', 'person.number', 'phone', 'numbers', 'phone number', 'Phone Number']
    }
    suggested_mapping = {}
    for required_header, old_columns in column_mapping.items():
        for column in old_columns:
            if column in loaded_file.columns:
                suggested_mapping[column] = required_header
    return suggested_mapping

# map column headers of S3 file and add List ID column
def map_column_headers(loaded_file):
    mapped_file = loaded_file.rename(columns=suggest_column_mapping(loaded_file), inplace=False)
    mapped_file['List ID'] = os.environ['LIST_ID']
    final_file = os.environ['FOLDER_PATH'] + os.environ['MAPPED_FILE_NAME']
    mapped_file.to_csv(final_file, index=False)
    return final_file</code></pre><ol><li>Establish SFTP server connection:</li>
</ol><pre><code>import pysftp
import os

# ingest S3 file via SFTP
def connect_to_sftp_and_import_final_csv(final_file):
    with pysftp.Connection(host=os.environ['HOST'], username=os.environ['USERNAME'],
                           private_key=os.environ['PRIVATE_KEY_PATH']) as sftp:
        print(f"Connected to {os.environ['HOST']}!")
        sftp.put(final_file, os.environ['PROFILES_PATH'])
        print(f"Imported {final_file}. Check your inbox for SFTP job details. View progress at https://www.klaviyo.com/sftp/set-up")
    # the with block closes the connection automatically on exit
    print(f"Connection to {os.environ['HOST']} has been closed.")</code></pre><ol><li>Put it all together in a lambda_handler function to ingest the S3 file via SFTP:</li>
</ol><pre><code>import configure_csv
import sftp_connection
import s3_download
import os
import pandas as pd
import json

def lambda_handler(event, context):
    # pull the unmapped CSV from S3 to local storage
    s3_download.download_file_from_s3(os.environ['ACCESS_KEY_ID'], os.environ['SECRET_ACCESS_KEY'],
                                      os.environ['S3_BUCKET_NAME'], os.environ['S3_FILE_NAME'],
                                      os.environ['FOLDER_PATH'] + os.environ['LOCAL_FILE_NAME'])
    # load the file, remap its headers, and add the List ID column
    loaded_file = pd.read_csv(os.environ['FOLDER_PATH'] + os.environ['LOCAL_FILE_NAME'])
    final_file = configure_csv.map_column_headers(loaded_file)
    # push the mapped CSV to Klaviyo's SFTP server
    sftp_connection.connect_to_sftp_and_import_final_csv(final_file)
    return {
        'statusCode': 200,
        'body': json.dumps('successfully ran lambda'),
    }</code></pre><h2><strong>Impact</strong></h2>
<p>Using Klaviyo’s SFTP tool makes data ingestion faster and more efficient. When coupled with the power of AWS’s S3 and Lambda services, you can boost its automation and scalability. With this configuration, your team will be able to manage data and execute ingestion with speed and accuracy, reducing the time and burden of updating profiles manually. Moreover, you can significantly improve data accuracy and mitigate the risk of errors during the ingestion process, ensuring the reliability and integrity of the data you’re using.</p>
<p>Overall, this solution optimizes the efficiency and effectiveness of leveraging relevant data in Klaviyo while streamlining operations and enhancing overall performance.</p>