
Hi all

What's best practice to export user-level data to a cloud data warehouse?

I can see you can export user events via the API (/api/export/userEvents), but this requires iterating through all user events, and I can't see a time filter - so we would have to export all data each day and merge it into the destination.

You can also export campaign data, but this is too high-level. We need event data at the user level, but only want to export new data each day.

Thanks

Luke

From @Alejandra Perez 

Hi Luke! Your team can use the /api/export/data.csv endpoint to pull all custom event data. You can specify a startDateTime and endDateTime so that you only pull new data each day. See the Iterable API documentation for details.
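A daily pull of yesterday's events could be sketched like this. Note this is an illustration, not official sample code: the parameter names come from the /api/export/data.csv endpoint, but the exact date format it accepts and the `"customEvent"` dataTypeName value are assumptions to verify against the docs.

```python
from datetime import date, timedelta

BASE_URL = "https://api.iterable.com/api/export/data.csv"

def build_export_params(day: date, data_type: str = "customEvent") -> dict:
    """Build query parameters for a one-day export window.
    startDateTime is treated as inclusive and endDateTime as exclusive;
    check the export API docs for the exact semantics."""
    next_day = day + timedelta(days=1)
    return {
        "dataTypeName": data_type,            # which event type to export
        "startDateTime": day.isoformat(),     # e.g. "2024-01-01"
        "endDateTime": next_day.isoformat(),  # e.g. "2024-01-02"
    }

if __name__ == "__main__":
    import urllib.parse, urllib.request
    params = build_export_params(date.today() - timedelta(days=1))
    url = BASE_URL + "?" + urllib.parse.urlencode(params)
    req = urllib.request.Request(url, headers={"Api-Key": "YOUR_API_KEY"})
    # Uncomment to actually fetch the CSV:
    # with urllib.request.urlopen(req) as resp:
    #     csv_bytes = resp.read()
```

Running this once per day (via cron or an orchestrator) gives you non-overlapping daily files to load into the warehouse, instead of re-exporting everything each run.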

Best,

Alejandra 😊


From @Luke Harrison 

Thank you - that should work well


*This response has been manually added from our former community platform to ensure the full conversation is captured. Please note that the date of this post will appear significantly later than the original.*


Hi, and what about campaign metrics data? Would webhooks be a preferred option over the API?


Hi @IrinaELX,

Yes, system webhooks would be the best option for sending these events live.
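To make the webhook route concrete, here is a minimal stdlib-only receiver sketch. It is not an official integration: the payload field names ("eventName", "email", "dataFields") are assumptions based on typical system-webhook payloads, so verify them against the system webhooks documentation.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_webhook_event(body: bytes) -> dict:
    """Flatten an incoming webhook payload into a row suitable for a
    warehouse staging table. Field names are assumed, not confirmed."""
    payload = json.loads(body)
    return {
        "event_name": payload.get("eventName"),
        "email": payload.get("email"),
        "details": payload.get("dataFields", {}),
    }

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        row = parse_webhook_event(self.rfile.read(length))
        # ...insert `row` into the warehouse's staging table here...
        self.send_response(200)  # acknowledge so the event isn't retried
        self.end_headers()

# HTTPServer(("", 8080), WebhookHandler).serve_forever()  # run behind HTTPS
```

Since webhooks deliver events live, this avoids the daily-batch lag of the export endpoints, at the cost of running a receiving service.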

If you prefer to export historic data, you can use our API export endpoints to get this data. All of these endpoints allow you to specify the “startDateTime” and “endDateTime” for the time range of system events you’d like to isolate.
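For a historic backfill, one common pattern is to walk day-sized startDateTime/endDateTime windows per event type so each export stays small and restartable. A sketch (the event-type names are illustrative; the accepted values are listed in the export API docs):

```python
from datetime import date, timedelta

# Event types to backfill; names are illustrative, not a confirmed list.
EVENT_TYPES = ["emailSend", "emailOpen", "emailClick"]

def daily_windows(start: date, end: date):
    """Yield (startDateTime, endDateTime) string pairs, one per day,
    covering [start, end) without gaps or overlaps."""
    day = start
    while day < end:
        yield day.isoformat(), (day + timedelta(days=1)).isoformat()
        day += timedelta(days=1)

# Each (event type, window) pair then becomes one export call, e.g.:
# GET /api/export/data.csv?dataTypeName=...&startDateTime=...&endDateTime=...
```

Keeping track of the last completed window lets you resume a backfill after a failure instead of starting over.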

Alternatively, you might consider using our Snowflake Data Share. Here’s more information on this:

https://support.iterable.com/hc/en-us/articles/4404943746708-Snowflake-Secure-Data-Sharing-Iterable-Integration

If you’d like to explore a similar type of connection to your DWH but with a different service, I recommend reaching out to your CSM so they can better advise.

