Processing huge data from SQL Server

Problem Description

I have a stored procedure (SQL Server 2016) that currently returns 100K to 200K rows, depending on the parameters passed to that SP.

Each row can be 100KB to 200KB in size, so the total result set can be around 10GB to 20GB.

My client (a background job) has to call this SP, process all of the rows, and send them to another client.

What is the best approach to handling such a scenario?

Currently I am thinking of streaming the results with an enumerator that uses yield.

Each time dataReader.Read() reads a row, I prepare the object, process it, and send it to the other client:

using (var dataReader = command.ExecuteReader())
{
    while (dataReader.Read())
    {
        var obj = new SomeClass();

        // populate SomeClass from the current row

        yield return obj;
    }
}
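
For reference, a more complete, self-contained version of this streaming pattern might look like the sketch below. The connection string, the SP name dbo.MyLargeSp, and the SomeClass fields are all placeholders. CommandBehavior.SequentialAccess is worth considering when individual rows are this large, since it streams column values instead of buffering each full row:

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class SomeClass
{
    public int Id { get; set; }
    public string Payload { get; set; }
}

public static class SpStreamer
{
    public static IEnumerable<SomeClass> StreamRows(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.MyLargeSp", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.CommandTimeout = 0; // 0 = no timeout; the SP may run for a long time

            connection.Open();

            // SequentialAccess streams large column values rather than
            // buffering every whole row; columns must then be read in
            // ordinal order.
            using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                while (reader.Read())
                {
                    yield return new SomeClass
                    {
                        Id = reader.GetInt32(0),
                        Payload = reader.GetString(1)
                    };
                }
            }
        }
    }
}

Because the method uses yield return, rows are materialized one at a time as the caller enumerates, so memory use stays roughly constant regardless of the total result size.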

Is this approach sufficient to handle such large data?

Is there a better approach (such as multi-threading, etc.)?

If so, how should I approach it? Any pointers to refer to?

Edit: The SP has multiple joins and runs a couple of times a day.
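
For reference, one way the multi-threaded variant could be shaped is a producer/consumer pipeline: the streaming enumerator above feeds a bounded queue, and a few worker tasks process and forward rows concurrently. The queue size, consumer count, and SendToOtherClient below are illustrative placeholders, not part of the original question:

using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class Pipeline
{
    // Hypothetical downstream call; replace with the real client sender.
    static void SendToOtherClient(SomeClass row) { /* forward the row */ }

    public static void ProcessInParallel(IEnumerable<SomeClass> rows)
    {
        // A bounded queue keeps the reader from racing ahead of the
        // senders and buffering gigabytes of rows in memory.
        using (var queue = new BlockingCollection<SomeClass>(boundedCapacity: 1000))
        {
            var producer = Task.Run(() =>
            {
                foreach (var row in rows)
                    queue.Add(row); // blocks when the queue is full
                queue.CompleteAdding(); // signal consumers to drain and stop
            });

            // A handful of consumers process and forward rows concurrently.
            var consumers = new Task[4];
            for (int i = 0; i < consumers.Length; i++)
            {
                consumers[i] = Task.Run(() =>
                {
                    foreach (var row in queue.GetConsumingEnumerable())
                        SendToOtherClient(row);
                });
            }

            producer.Wait();
            Task.WaitAll(consumers);
        }
    }
}

The bounded capacity is the important knob here: it caps how many rows can sit in memory between the reader and the senders at any one time.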

Tags: c#, sql-server, multithreading, yield-return

Solution


Based on your description, I think this is a good scenario for SSIS (Integration Services): it can manage the whole transfer, write the final result to a CSV file, and let the clients exchange that file.
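
If SSIS is not an option, the same idea of landing the result in a flat file for exchange can be sketched directly in C#, reusing the SomeClass shape and streaming enumerator from the question. This is only an illustration; the quoting below is deliberately naive:

using System.Collections.Generic;
using System.IO;

public static class CsvExport
{
    public static void WriteRows(IEnumerable<SomeClass> rows, string path)
    {
        using (var writer = new StreamWriter(path))
        {
            writer.WriteLine("Id,Payload"); // header row
            foreach (var row in rows)
            {
                // Naive quoting; a real export must escape quotes,
                // commas, and newlines inside Payload.
                writer.WriteLine($"{row.Id},\"{row.Payload}\"");
            }
        }
    }
}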

