Array binding is a powerful feature that significantly improves performance by executing similar SQL statements in a single batch rather than sending them to the database one by one.
The feature is especially useful for batch operations of the same type, such as INSERT, UPDATE, or DELETE.
This article demonstrates batch data insertion and update with SQLiteCommand using a practical example. To update your database with large volumes of data, follow the steps below:
1. Create a database and its table.
With any SQL tool, you can create a database and its table by using the following DDL statement:
CREATE TABLE batch_test
(
  id INTEGER PRIMARY KEY,
  f_integer INTEGER,
  f_varchar VARCHAR(100)
);
Alternatively, you can achieve the same result programmatically with the following C# code:
using (SQLiteConnection connection = new SQLiteConnection("DataSource=batch.db3;FailIfMissing=False;"))
{
    connection.Open();
    // create the test table
    SQLiteCommand createCommand = new SQLiteCommand(
        "CREATE TABLE IF NOT EXISTS batch_test(" +
        "id INTEGER PRIMARY KEY, " +
        "f_integer INTEGER, " +
        "f_varchar VARCHAR(100))",
        connection);
    createCommand.ExecuteNonQuery();
}
2. Create the SQLiteCommand and set a DML statement.
Define the desired DML statement (e.g. INSERT, UPDATE, or DELETE) by setting the CommandText property of the SQLiteCommand object.
SQLiteCommand command = new SQLiteCommand(
    "INSERT INTO batch_test(id, f_integer, f_varchar) " +
    "VALUES(:id, :f_integer, :f_varchar)",
    connection);
3. Add parameters.
Specify the parameter names and their types.
command.Parameters.Add("id", SQLiteType.Int32, 4, "id");
command.Parameters.Add("f_integer", SQLiteType.Int32, 4, "f_integer");
command.Parameters.Add("f_varchar", SQLiteType.Text, 100, "f_varchar");
Array binding with SQLiteCommand doesn't support explicitly setting the batch size. The optimal size is calculated automatically.
4. Fill parameter values.
Assign an array of values to each parameter of the SQLiteCommand. Each array element corresponds to a separate execution of the SQL statement within the same batch.
command.Parameters["id"].Value = new int[5] { 1, 2, 3, 4, 5};
command.Parameters["f_integer"].Value = new int[5] { 1, 2, 3, 4, 5 };
command.Parameters["f_varchar"].Value = new string[5] { "string 1", "string 2", "string 3", "string 4", "string 5" };
5. Execute the batch insert.
Call the ExecuteArray() method to execute the command for all sets of parameter values in a single batch.
Below is a complete code sample that executes several INSERT operations using array binding.
using (SQLiteConnection connection = new SQLiteConnection("DataSource=batch.db3;FailIfMissing=False;"))
{
    connection.Open();
    // create the test table if it does not exist yet (see step 1)
    SQLiteCommand createCommand = new SQLiteCommand(
        "CREATE TABLE IF NOT EXISTS batch_test(" +
        "id INTEGER PRIMARY KEY, " +
        "f_integer INTEGER, " +
        "f_varchar VARCHAR(100))",
        connection);
    createCommand.ExecuteNonQuery();
    // create SQLiteCommand with the INSERT statement
    SQLiteCommand command = new SQLiteCommand(
        "INSERT INTO batch_test(id, f_integer, f_varchar) " +
        "VALUES(:id, :f_integer, :f_varchar)",
        connection);
    command.Parameters.Add("id", SQLiteType.Int32, 4, "id");
    command.Parameters.Add("f_integer", SQLiteType.Int32, 4, "f_integer");
    command.Parameters.Add("f_varchar", SQLiteType.Text, 100, "f_varchar");
    // fill SQLiteCommand parameter values
    command.Parameters["id"].Value = new int[5] { 1, 2, 3, 4, 5 };
    command.Parameters["f_integer"].Value = new int[5] { 1, 2, 3, 4, 5 };
    command.Parameters["f_varchar"].Value = new string[5] { "string 1", "string 2", "string 3", "string 4", "string 5" };
    // execute the batch insert
    command.ExecuteArray();
}
This example demonstrates how to insert multiple records with a single command execution using array binding in dotConnect for SQLite.
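As a quick sanity check, you can read the inserted rows back with an ordinary query. The following sketch is illustrative only; it assumes the batch.db3 database created above and uses the standard ADO.NET ExecuteScalar() method:
using (SQLiteConnection connection = new SQLiteConnection("DataSource=batch.db3;FailIfMissing=False;"))
{
    connection.Open();
    // count the rows written by the batch insert above
    SQLiteCommand countCommand = new SQLiteCommand("SELECT COUNT(*) FROM batch_test", connection);
    long rowCount = Convert.ToInt64(countCommand.ExecuteScalar());
    Console.WriteLine("Rows in batch_test: " + rowCount);
}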
To use the array binding feature, assign arrays of values to the parameters of a SQLiteCommand object and call the ExecuteArray() method. Each array element corresponds to a separate execution of the SQL statement within the same batch.
Array binding with SQLiteCommand doesn't support explicitly setting the batch size; the optimal size is calculated automatically based on the input data.
This approach is especially beneficial when working with large datasets, as it reduces the number of round trips between the application and the database, significantly improving performance.
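The same pattern works for batch updates. The following sketch is illustrative only and assumes the five rows inserted above; it binds arrays of values to an UPDATE statement, reusing the parameter setup shown earlier:
using (SQLiteConnection connection = new SQLiteConnection("DataSource=batch.db3;FailIfMissing=False;"))
{
    connection.Open();
    // set an UPDATE statement instead of an INSERT
    SQLiteCommand updateCommand = new SQLiteCommand(
        "UPDATE batch_test SET f_varchar = :f_varchar WHERE id = :id",
        connection);
    updateCommand.Parameters.Add("f_varchar", SQLiteType.Text, 100, "f_varchar");
    updateCommand.Parameters.Add("id", SQLiteType.Int32, 4, "id");
    // each array element produces one UPDATE execution within the batch
    updateCommand.Parameters["f_varchar"].Value = new string[5] { "updated 1", "updated 2", "updated 3", "updated 4", "updated 5" };
    updateCommand.Parameters["id"].Value = new int[5] { 1, 2, 3, 4, 5 };
    // execute the batch update
    updateCommand.ExecuteArray();
}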