
Amplitude Migration Guide

Migrating from Amplitude to Statsig is a strategic choice. Statsig is an all-in-one platform that offers analytics, experimentation, and feature flagging under one umbrella, and using these products together in a single tool is far more powerful than running them separately.

Migrating Amplitude data into Statsig usually involves three steps: export, transform, and ingest. This guide provides the essentials. For anything beyond these basics, please contact us.

Step 1. Export your data from Amplitude

Amplitude offers a few different export methods. Pick the one that matches your data size and setup:

1. S3 Export

For high-volume backfills, you can export your Amplitude data into an S3 bucket.

2. Warehouse Export

If your Amplitude data is already in Snowflake, BigQuery, or Redshift, you can skip file downloads. Statsig can ingest directly from these warehouses (see Step 3).

3. Export API

Use Amplitude's Export API to pull gzipped JSON files.

  • Limit: 4 GB per request, so use hourly windows for large ranges (see the sketch below)
  • Example:
curl --location --request GET 'https://amplitude.com/api/2/export?start=<starttime>&end=<endtime>' \
-u '{api_key}:{secret_key}'
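
If you're pulling a long date range, the same request is easy to loop over hourly windows. Here's a minimal Python sketch using the requests library; the API key placeholders, output paths, and date range are illustrative, and it assumes the Export API's inclusive YYYYMMDDTHH hour bounds.

import datetime
import pathlib
import requests

API_KEY = "AMPLITUDE_API_KEY"        # placeholder: your project API key
SECRET_KEY = "AMPLITUDE_SECRET_KEY"  # placeholder: your project secret key

out_dir = pathlib.Path("amplitude_export")
out_dir.mkdir(exist_ok=True)

current = datetime.datetime(2023, 8, 1)
end = datetime.datetime(2023, 8, 2)
while current < end:
    hour = current.strftime("%Y%m%dT%H")  # Export API hour format
    resp = requests.get(
        "https://amplitude.com/api/2/export",
        params={"start": hour, "end": hour},  # one hour per request
        auth=(API_KEY, SECRET_KEY),
    )
    if resp.status_code != 404:  # 404 indicates no data for this window
        resp.raise_for_status()
        # Each response is a zip archive of gzipped JSON files.
        (out_dir / f"{hour}.zip").write_bytes(resp.content)
    current += datetime.timedelta(hours=1)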

4. UI Download (CSV/JSON)

Go to Organization Settings → Project → Export Data

  • Best for small datasets or initial testing

Step 2. Transform your data

Amplitude and Statsig store events in slightly different formats. To make your Amplitude exports work in Statsig, you'll need to map your Amplitude data to Statsig's format. This step is required irrespective of how you choose to import data into Statsig in the next step.

| Amplitude field | Statsig field |
| --- | --- |
| event_type | event |
| event_time | timestamp (ms since epoch) |
| user_id | user.userID |
| device_id | user.stableID |
| event_properties | metadata |
| user_properties | user fields |

Before transform

// Amplitude event
{
  "event_type": "purchase",
  "user_id": "123",
  "device_id": "device_abc",
  "event_time": "2023-08-17T00:00:00Z",
  "event_properties": {
    "amount": 25,
    "currency": "USD"
  },
  "user_properties": {
    "plan": "premium"
  }
}

After transform

// Statsig event
{
  "event": "purchase",
  "user": {
    "userID": "123",
    "stableID": "device_abc",
    "plan": "premium"
  },
  "timestamp": 1692230400000,
  "metadata": {
    "amount": 25,
    "currency": "USD"
  }
}
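
If you're scripting the transform, here's a minimal Python sketch of the mapping above. The field names come from the table; the function name and the timestamp parsing (which assumes the ISO-8601 event_time shown in the example) are illustrative.

import datetime

def to_statsig_event(amp_event: dict) -> dict:
    # user_id / device_id map to userID / stableID, and
    # user_properties become top-level fields on the user object.
    user = {
        "userID": amp_event.get("user_id"),
        "stableID": amp_event.get("device_id"),
    }
    user.update(amp_event.get("user_properties") or {})

    # event_time (ISO-8601 string) -> milliseconds since epoch.
    event_time = datetime.datetime.fromisoformat(
        amp_event["event_time"].replace("Z", "+00:00")
    )
    return {
        "event": amp_event["event_type"],
        "user": user,
        "timestamp": int(event_time.timestamp() * 1000),
        "metadata": amp_event.get("event_properties") or {},
    }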

Step 3. Import into Statsig

Once your data looks like Statsig events, you can start bringing them in. There are a few import paths depending on how you exported:

| If you exported from Amplitude via... | Import into Statsig using... | Best when... |
| --- | --- | --- |
| S3 export | S3 ingestion | You're backfilling large datasets |
| Warehouse (Snowflake/BQ/Redshift) | Warehouse ingestion | Your Amplitude data already lives in a warehouse |
| Export API | Event Webhook | You're moving a few days/weeks of data programmatically |
| UI download (CSV/JSON) | Event Webhook | You're testing or moving a small slice of data |

S3 ingestion

  • Ensure the files are transformed to the Statsig schema, in Parquet/JSON/CSV form, then follow Statsig's S3 ingestion steps.
  • Note that you need to shard your raw Amplitude data into one day's worth of data per directory for Statsig to be able to ingest it; see the sketch below.
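
A rough sketch of that sharding, assuming events already transformed to the Statsig shape; the root directory and file naming are illustrative, not a Statsig requirement beyond one day's data per directory.

import datetime
import json
import pathlib
from collections import defaultdict

def shard_by_day(events, root="statsig_export"):
    # Bucket events by the UTC calendar day of their timestamp (ms).
    by_day = defaultdict(list)
    for event in events:
        day = datetime.datetime.fromtimestamp(
            event["timestamp"] / 1000, tz=datetime.timezone.utc
        ).strftime("%Y-%m-%d")
        by_day[day].append(event)

    # One directory per day, newline-delimited JSON inside.
    for day, day_events in by_day.items():
        day_dir = pathlib.Path(root) / day
        day_dir.mkdir(parents=True, exist_ok=True)
        with open(day_dir / "events.json", "w") as f:
            for e in day_events:
                f.write(json.dumps(e) + "\n")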

Warehouse ingestion

If your Amplitude data already lives in Snowflake, BigQuery, or Redshift, transform it into the Statsig schema in a table or view, then follow Statsig's warehouse ingestion steps.

UI download or Export API

The simple way to get these events into Statsig is to replay them through the Event Webhook.

Think of it as a direct POST call: you take each row or JSON object, reshape it, and send it to Statsig one at a time or in small batches.

This is best for test runs or initial migrations, not for millions of events.

curl -X POST https://api.statsig.com/v1/webhooks/event_webhook \
  -H "Content-Type: application/json" \
  -H "STATSIG-API-KEY: $STATSIG_SERVER_SECRET" \
  -d '{
    "event": "signup",
    "user": { "userID": "abc" },
    "timestamp": 1692230400000
  }'
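
To replay more than a handful of events, the same call is easy to script. A minimal Python sketch, posting one event per request exactly as in the curl above; the throttle value is an arbitrary placeholder to tune for your volume.

import os
import time
import requests

WEBHOOK_URL = "https://api.statsig.com/v1/webhooks/event_webhook"

def replay(events):
    headers = {
        "Content-Type": "application/json",
        "STATSIG-API-KEY": os.environ["STATSIG_SERVER_SECRET"],
    }
    for event in events:
        resp = requests.post(WEBHOOK_URL, json=event, headers=headers)
        resp.raise_for_status()  # stop on the first rejected event
        time.sleep(0.05)  # light throttle between requests

Paired with the transform sketch from Step 2, a replay is one line: replay(to_statsig_event(e) for e in amplitude_events).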

Not sure where to start or need help?

If you're unsure how to approach an Amplitude migration, please reach out to our team. We have worked closely with ex-Amplitude customers to offer hands-on migration support.

We're always happy to discuss your team's individual needs or any other questions you have. Drop us a line at support@statsig.com or reach out in our Slack community.