Wednesday, October 8, 2008

ASP.NET - Efficiently Importing DataTable

ASP.NET developers often use a DataTable to store data received from a SQL stored procedure, a web service, the data access layer (DAL), etc.

A DataTable acts as a temporary container where data is stored, refined (trimming white-space, formatting numbers), transformed (e.g., calculating a running total), and then consumed.
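As an illustration of a typical refinement pass, the sketch below trims stray white-space from a text column in place (the table and column names are made up for the example):

```csharp
using System;
using System.Data;

class RefineExample
{
    static void Main()
    {
        //build a small sample table (hypothetical schema, for illustration only)
        DataTable orders = new DataTable("Orders");
        orders.Columns.Add("CustomerName", typeof(string));
        orders.Rows.Add("  John Smith ");

        //refine: trim white-space in place on every row
        foreach (DataRow dr in orders.Rows)
        {
            dr["CustomerName"] = dr["CustomerName"].ToString().Trim();
        }

        Console.WriteLine(orders.Rows[0]["CustomerName"]); //prints: John Smith
    }
}
```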

For more details on DataTable visit: MSDN

This article demonstrates how to efficiently create one DataTable from another based on a row-by-row comparison.

Suppose DataTable (A) holds 1000 rows of Order data placed by the web site's customers. Our goal is to efficiently create a new DataTable (B) from DataTable (A) containing only the Orders with customer IDs 101 and 102.

DataTable (A) is populated by a call to the DAL, which returns all the Orders placed to date. (Ideally the DAL would expose a method that returns pre-filtered data, but for now we will assume we must filter the data out of DataTable (A) ourselves.)
Code:


//DataTable (B) is cloned from DataTable (A);
//Clone copies the schema only, so B starts out with no rows
DataTable B = A.Clone();

//Clear and AcceptChanges are not strictly needed here,
//since Clone already returns an empty, unchanged table
B.Clear();
B.AcceptChanges();

Note that we use the DataTable 'Clone' method instead of 'Copy' because our objective is to retain only the structure of DataTable (A), not its data.
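The difference is easy to verify: Clone reproduces the schema with zero rows, while Copy brings the data along. A minimal sketch (with a throwaway one-column table):

```csharp
using System;
using System.Data;

class CloneVsCopy
{
    static void Main()
    {
        //throwaway source table standing in for DataTable (A)
        DataTable A = new DataTable("A");
        A.Columns.Add("CustomerID", typeof(int));
        A.Rows.Add(101);
        A.Rows.Add(102);

        DataTable clone = A.Clone(); //structure only
        DataTable copy = A.Copy();   //structure + data

        Console.WriteLine(clone.Rows.Count); //prints: 0
        Console.WriteLine(copy.Rows.Count);  //prints: 2
    }
}
```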

Now we need to populate DataTable (B) with the Orders having customer IDs 101 and 102.

//iterate over every row in DataTable (A)
foreach (DataRow dr in A.Rows)
{
    int customerID = Convert.ToInt32(dr["CustomerID"]);

    //match customer ID
    if (customerID == 101 || customerID == 102)
    {
        //customer ID matched; copy the row to DataTable (B)
        B.ImportRow(dr);
    }
}

//commit the imported rows
B.AcceptChanges();

DataTable's ImportRow method copies a row from DataTable (A) into DataTable (B), preserving the original row's values and state. Performance-wise, ImportRow is faster than creating and adding each row manually, especially when dealing with a large amount of data.
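An alternative worth mentioning is DataTable's Select method, which accepts a filter expression and saves writing the comparison by hand; the rows it returns can still be copied over with ImportRow. A sketch, using a small sample table in place of the real DataTable (A):

```csharp
using System;
using System.Data;

class SelectExample
{
    static void Main()
    {
        //sample source table standing in for DataTable (A)
        DataTable A = new DataTable("Orders");
        A.Columns.Add("CustomerID", typeof(int));
        A.Rows.Add(100);
        A.Rows.Add(101);
        A.Rows.Add(102);

        //B keeps A's structure but none of its data
        DataTable B = A.Clone();

        //the filter expression replaces the manual if comparison
        foreach (DataRow dr in A.Select("CustomerID = 101 OR CustomerID = 102"))
        {
            B.ImportRow(dr);
        }
        B.AcceptChanges();

        Console.WriteLine(B.Rows.Count); //prints: 2
    }
}
```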

