How Can I Use AzCopy to Transfer Data from SQL Server to Snowflake?

Asked By DataDynamo84

I'm trying to find a straightforward, cost-effective way to transfer data from my on-premises SQL Server to Snowflake. I have transactional data totaling about 15 MB per day, updated every 15 minutes. My plan is to do a one-time manual load to backfill my Snowflake database, then set up a SQL Server Agent job that sends CSV files to Azure Blob Storage using AzCopy. Is this a good approach, or are there limitations to using AzCopy? I haven't used it before and would love some insights.

4 Answers

Answered By TechWhiz22

If you've got a data gateway or a VPN into your Azure tenant, what you're considering sounds like a reasonable way to go. A Logic App could simplify things and let you manage the data flow more seamlessly.

Answered By FileSaverX

Instead of dealing with small files and all the hassle that comes with blob storage and pipelines, have you thought about using a direct SQL connector to insert data into your Snowflake tables? This could streamline your process significantly.

DataDynamo84 -

What SQL connector would you recommend, though? My on-premises server doesn't have an external IP.

Answered By NoneOfThat

No.

Answered By CloudGuru98

It's great that you're planning to use AzCopy! If you're on SQL Server 2019, keep in mind that there's upcoming support for streaming change data capture output to an event hub and then on to blob storage, which could help with change propagation. For now, you can write to blob storage directly from SQL Server, but how important is it for you to have near-real-time data in Snowflake?
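Whichever way the CSVs land in blob storage, the Snowflake side is essentially an external stage plus a COPY INTO, scheduled or run after each upload. A rough sketch via snowsql follows; the stage name, table name, storage account, container, and SAS token are all placeholders you'd replace with your own.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Requires snowsql installed and configured with your Snowflake connection.
snowsql <<'SQL'
-- One-time setup: an external stage pointing at the blob container
-- (account, container, and SAS token are placeholders).
CREATE STAGE IF NOT EXISTS azure_stage
  URL = 'azure://mystorageacct.blob.core.windows.net/sqlserver-exports'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>');

-- Per-run load: COPY INTO skips files it has already loaded,
-- so rerunning is safe.
COPY INTO orders
  FROM @azure_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  ON_ERROR = 'ABORT_STATEMENT';
SQL
```

If you later want loads triggered automatically as files arrive rather than on a schedule, Snowpipe with an event notification on the container is the managed version of the same COPY INTO.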
