How to insert data from Excel into a database in .NET?
There are many ways to bulk insert data from Excel into a database. Most of them work correctly, but the biggest concern for developers is the time they take.
In an ASP.NET web application, state management limits how long the data can be held in the Cache or Session, so developers usually fall back on a bulk insert in this scenario. To reduce the load time and improve performance, SQL*Loader can be used instead.
This article explains how to bulk insert data through SQL*Loader.
Step 1.
Get the data from Excel into a DataSet:
string strFileName = @"C:\temp\test.xls";
System.Data.OleDb.OleDbConnection excelConn = new System.Data.OleDb.OleDbConnection("provider=microsoft.jet.oledb.4.0;data source='" + strFileName + "';extended properties=excel 8.0;");
string strsql = "Select Name, Age, Marks from [sheet1$]";
System.Data.OleDb.OleDbDataAdapter excelDataAdap = new System.Data.OleDb.OleDbDataAdapter(strsql, excelConn);
DataSet dsExcel = new DataSet();
if (excelConn.State == ConnectionState.Closed)
{
    excelConn.Open();
}
// Select the values from Excel into the DataSet
excelDataAdap.Fill(dsExcel);
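The ToCSV helper in Step 3 expects a DataTable, so as a small bridging sketch (dsExcel and excelConn come from the code above; dtStudent is just an illustrative name), pull out the first table and close the Excel connection:
DataTable dtStudent = dsExcel.Tables[0];
excelConn.Close();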
Step 2.
Now that the data is in a DataSet, there are different ways to insert the records into the database:
1. Record-wise insertion, which suits a very small number of records.
2. Bulk insert, which suits a larger number of records but takes time.
3. SQL*Loader, which suits a large number of records in the least time.
Since most developers have already worked with the first two options, this article concentrates on the third one; a quick sketch of option 1 follows for contrast.
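A minimal record-wise insertion sketch, assuming an Oracle STUDENT(NAME, AGE, MARKS) table as used later in this article and the era-appropriate System.Data.OracleClient provider; the connection string, method name and column types are assumptions for illustration only:
using System.Data;
using System.Data.OracleClient;

private void InsertRecordWise(DataTable dtStudent, string connectionString)
{
    using (OracleConnection conn = new OracleConnection(connectionString))
    using (OracleCommand cmd = new OracleCommand(
        "INSERT INTO STUDENT (NAME, AGE, MARKS) VALUES (:pName, :pAge, :pMarks)", conn))
    {
        cmd.Parameters.Add(new OracleParameter("pName", OracleType.VarChar));
        cmd.Parameters.Add(new OracleParameter("pAge", OracleType.Number));
        cmd.Parameters.Add(new OracleParameter("pMarks", OracleType.Number));
        conn.Open();
        foreach (DataRow row in dtStudent.Rows)
        {
            // One round trip per row - acceptable only for a handful of records.
            cmd.Parameters["pName"].Value = row["Name"];
            cmd.Parameters["pAge"].Value = row["Age"];
            cmd.Parameters["pMarks"].Value = row["Marks"];
            cmd.ExecuteNonQuery();
        }
    }
}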
Step 3.
Using SQL*Loader
1. Copy all the data into CSV format.
Converting the data into CSV format takes little time and improves the performance of the whole application, and SQL*Loader reads delimited text files directly, so converting to CSV is recommended.
To reduce memory usage, a StringBuilder is used to append the data. Every field is delimited with the character '@'. The reason for choosing '@' is that real data frequently contains the usual delimiters (commas, tabs and so on) but very rarely contains '@', so using it keeps fields from being split incorrectly during the load.
private string ToCSV(DataTable dataTable)
{
    // Create the StringBuilder that will hold our data
    StringBuilder sb = new StringBuilder();
    // Check if there are columns in our DataTable
    if (dataTable.Columns.Count != 0)
    {
        foreach (DataRow row in dataTable.Rows)
        {
            // Loop through each column in our DataTable
            foreach (DataColumn column in dataTable.Columns)
            {
                // Get the value for that row in the specified column
                // and append our separator
                sb.Append(row[column].ToString() + '@');
            }
            // Append a carriage return
            sb.Append("\r\n");
        }
    }
    // Return our values
    return sb.ToString();
}
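A small usage sketch: convert the first table of the DataSet and write it to disk so SQL*Loader can read it (the path is illustrative and must match the INFILE path written into the control file in the next step):
string strCSVPath = @"C:\temp\test.dat";
System.IO.File.WriteAllText(strCSVPath, ToCSV(dsExcel.Tables[0]));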
2. Call the function below with the following parameters:
CSVPath, LoaderFile, LogFile
CSVPath is the path of the file containing the CSV-formatted data.
LoaderFile is the SQL*Loader control file; just pass a file name and the function writes the control commands into it.
LogFile is the log file; just pass a file name and SQL*Loader updates it while executing. It tells you how many records were inserted, how many were rejected and the reason for each rejection, so passing a LogFile is recommended.
Syntax of the SQL*Loader command:
sqlldr username/password@database control=controlfile.ctl log=logfile.log
Example:
sqlldr abc/abc@dataschema control=C:\temp\test.ctl log=C:\temp\test.log
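For reference, the one-line control file that the function below generates looks roughly like this (the data file path and table definition follow this article's example; adjust them to your environment):
LOAD DATA INFILE 'C:\temp\test.dat' APPEND INTO TABLE STUDENT FIELDS TERMINATED BY '@' TRAILING NULLCOLS (NAME, AGE, MARKS)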
private void UploadDatabase(string strCSVPath, string strLoaderFile, string strLogFile)
{
    string strCmd, strSQLLoader;
    strCmd = "sqlldr abc/abc@dataschema control=" + strLoaderFile + " LOG=" + strLogFile;
    try
    {
        System.Diagnostics.ProcessStartInfo cmdProcessInfo = new System.Diagnostics.ProcessStartInfo("cmd.exe");
        // Write the one-line control file that tells SQL*Loader where the data file is
        // and how the '@'-delimited fields map onto the STUDENT table.
        strSQLLoader = "LOAD DATA INFILE '" + strCSVPath.Trim() + "' APPEND INTO TABLE STUDENT FIELDS TERMINATED BY '@' TRAILING NULLCOLS (NAME, AGE, MARKS)";
        System.IO.StreamWriter writer = new System.IO.StreamWriter(strLoaderFile);
        writer.WriteLine(strSQLLoader);
        writer.Flush();
        writer.Close();
        // Redirect both streams so we can write/read them.
        cmdProcessInfo.RedirectStandardInput = true;
        cmdProcessInfo.RedirectStandardOutput = true;
        cmdProcessInfo.UseShellExecute = false;
        // Start the process.
        System.Diagnostics.Process pro = System.Diagnostics.Process.Start(cmdProcessInfo);
        // Issue the sqlldr command.
        pro.StandardInput.WriteLine(strCmd);
        // Exit the command prompt.
        pro.StandardInput.WriteLine("exit");
        // Read all the output generated from it (useful when diagnosing failures).
        string strOutput;
        strOutput = pro.StandardOutput.ReadToEnd();
        pro.Dispose();
    }
    catch (Exception ex)
    {
        // Errors are swallowed here; the SQL*Loader log file holds the details.
        return;
    }
}
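Putting the pieces together, a call looks like this (the paths are the illustrative ones used throughout this article, and the username/password/database hard-coded in strCmd above must match your own environment):
UploadDatabase(@"C:\temp\test.dat", @"C:\temp\test.ctl", @"C:\temp\test.log");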
Step 4.
Verifying the data
To verify the inserted data, check the log file, which gives the following information.
The "Rows successfully loaded" and "Rows not loaded" lines near the end show how many records were inserted and how many were rejected.
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Oct 26 20:39:42 2007
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: C:\temp\test.ctl
Data File: C:\temp\test.dat
Bad File: C:\temp\test.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table STUDENT, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- -------------
NAME FIRST * @ CHARACTER
AGE NEXT * @ CHARACTER
MARKS NEXT * @ CHARACTER
value used for ROWS parameter changed from 64 to 32
Record 11: Discarded - all columns null.
Table STUDENT:
10 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
1 Row not loaded because all fields were null.
Space allocated for bind array: 255936 bytes(32 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 11
Total logical records rejected: 0
Total logical records discarded: 1
Run began on Fri Oct 26 20:39:42 2007
Run ended on Fri Oct 26 20:39:42 2007
Elapsed time was: 00:00:00.44
CPU time was: 00:00:00.03
Summary
This article explains a fast and reliable way to bulk insert data into the database. If any record is invalid (for example, a datatype mismatch), SQL*Loader rejects it into the bad file and records the reason in the log, and the load aborts once the allowed error limit is exceeded (50 by default, as shown in the log above).
Thus the approach works well and meets the requirement.
Regards
Nellaikumar
nellaikumar2002@rediffmail.com