Srijith Sarman

Time, space and living

Sunday, January 14, 2007

Bulk inserting is not a problem for most developers. Lots of examples are out there: simply doing the inserts inside a transaction will serve the purpose. Another common method is to call DataAdapter.Update().
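As a reference point, here is a minimal sketch of the DataAdapter approach. The connection string, table and columns are hypothetical; I am using the System.Data.SQLite provider since that is the database this post ends up using, but any ADO.NET provider works the same way.

using System.Data;
using System.Data.SQLite;

// Hypothetical example: let a command builder generate the INSERT command,
// add rows to the DataTable, then push them all back with one Update() call.
SQLiteDataAdapter adapter = new SQLiteDataAdapter(
    "SELECT Id, Name FROM Items", "Data Source=client.db");
SQLiteCommandBuilder builder = new SQLiteCommandBuilder(adapter);

DataTable table = new DataTable();
adapter.Fill(table);

table.Rows.Add(new object[] { 1, "first" });
table.Rows.Add(new object[] { 2, "second" });

adapter.Update(table);   // sends an INSERT for every added row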

The other day I stumbled a little when I tested the time taken by the bulk delete and bulk update code I had written. Each time, the SQL query with a WHERE condition killed the performance. The operation was also being done against a client-side database (had it been an enterprise database like Oracle, we could have used a tool like sqlldr.exe to load the data). I tried putting all the delete operations in a single transaction, but even that was not sufficient. Finally I made up my mind to do all the operations in memory and to perform a bulk insert after truncating the table.

Like this:

DataSet dsUpdate = getData();       // the data to update
DataSet ds = FetchTableData();      // returns the current data from the database

// Sometimes I may have to update all the records in the table, so I delete all
// the records which have to be updated and then insert the new dataset into the table.
DataTable DtChange = Changes(ds.Tables[0], dsUpdate.Tables[0], uniqueColumns);

// After getting the difference, add the records which have to be inserted to
// DtChange, and then perform the bulk insert operation.
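Putting the pieces together, a sketch of the truncate-and-reinsert step. The table name Items, its columns and the connection string are hypothetical; the Changes function itself is shown just below.

using System.Data;
using System.Data.SQLite;

// DtChange holds the unchanged rows; append the new/updated rows from dsUpdate
// so it becomes the complete image of the table.
foreach (DataRow updated in dsUpdate.Tables[0].Rows)
    DtChange.ImportRow(updated);

using (SQLiteConnection con = new SQLiteConnection("Data Source=client.db"))
{
    con.Open();
    using (SQLiteTransaction tx = con.BeginTransaction())
    {
        // SQLite has no TRUNCATE; an unqualified DELETE empties the table.
        SQLiteCommand clear = new SQLiteCommand("DELETE FROM Items", con);
        clear.Transaction = tx;
        clear.ExecuteNonQuery();

        // Prepare one INSERT command and reuse it for every row.
        SQLiteCommand ins = new SQLiteCommand(
            "INSERT INTO Items (Id, Name) VALUES (@id, @name)", con);
        ins.Transaction = tx;
        ins.Parameters.Add(new SQLiteParameter("@id"));
        ins.Parameters.Add(new SQLiteParameter("@name"));

        foreach (DataRow row in DtChange.Rows)
        {
            ins.Parameters["@id"].Value = row["Id"];
            ins.Parameters["@name"].Value = row["Name"];
            ins.ExecuteNonQuery();
        }

        tx.Commit();   // one commit for the delete and all the inserts
    }
}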


public DataTable Changes(DataTable dt1, DataTable dt2, string[] uniColumns)
{
    // Copy both tables into one DataSet so a DataRelation can be built
    // between them on the unique-key columns.
    DataSet ds = new DataSet();
    ds.Tables.AddRange(new DataTable[] { dt1.Copy(), dt2.Copy() });

    int keyLen = uniColumns.Length;
    DataColumn[] dt1Cols = new DataColumn[keyLen];
    DataColumn[] dt2Cols = new DataColumn[keyLen];
    for (int i = 0; i < keyLen; i++)
    {
        dt1Cols[i] = ds.Tables[0].Columns[uniColumns[i]];
        dt2Cols[i] = ds.Tables[1].Columns[uniColumns[i]];
    }

    // Relate the two copies on the key columns (no constraints enforced).
    DataRelation r = new DataRelation(string.Empty, dt1Cols, dt2Cols, false);
    ds.Relations.Add(r);

    // The result table gets the same schema as dt1.
    DataTable resultTable = new DataTable();
    for (int i = 0; i < dt1.Columns.Count; i++)
    {
        resultTable.Columns.Add(dt1.Columns[i].ColumnName, dt1.Columns[i].DataType);
    }

    // Keep only the rows of dt1 which have no matching row in dt2.
    foreach (DataRow row in ds.Tables[0].Rows)
    {
        DataRow[] childRows = row.GetChildRows(r);
        if (childRows == null || childRows.Length == 0)
            resultTable.ImportRow(row);
    }

    return resultTable;
}
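The trick is the DataRelation: after relating the two copies on the unique-key columns, any row of the first table with no child rows in the second has no counterpart in the update set, and those are exactly the rows to keep. A hypothetical call, keyed on a single Id column, would look like:

DataTable DtChange = Changes(ds.Tables[0], dsUpdate.Tables[0], new string[] { "Id" });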


I used SQLite as the client-side database; it is a very fast, zero-configuration database. Bulk insert is easy as long as all the inserts run inside one transaction. It hardly took a minute for me to insert 50,000 records in a single transaction. When I take away the transaction, it takes nearly 15 minutes!
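That difference makes sense: without an explicit transaction, SQLite wraps every statement in an implicit one, so each of the 50,000 inserts commits, and flushes the journal to disk, on its own. In the sketch above, the slow variant is simply the same loop with the BeginTransaction/Commit pair removed:

// Slow variant (hypothetical, same names as the sketch above): no surrounding
// transaction, so every ExecuteNonQuery() commits, and hits the disk, by itself.
foreach (DataRow row in DtChange.Rows)
{
    ins.Parameters["@id"].Value = row["Id"];
    ins.Parameters["@name"].Value = row["Name"];
    ins.ExecuteNonQuery();   // one implicit transaction per row
}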