Insert Data

FreeSql provides methods for inserting data singly or in batches, and on databases that support it, the inserted records can also be returned after execution.

static IFreeSql fsql = new FreeSql.FreeSqlBuilder()
    .UseConnectionString(FreeSql.DataType.MySql, connectionString)
    .UseAutoSyncStructure(true) //Automatically synchronize the entity structure to the database
    .Build(); //Be sure to define fsql as a singleton

class Topic {
    [Column(IsIdentity = true, IsPrimary = true)]
    public int Id { get; set; }
    public int Clicks { get; set; }
    public string Title { get; set; }
    public DateTime CreateTime { get; set; }
}

var items = new List<Topic>();
for (var a = 0; a < 10; a++) items.Add(new Topic { Id = a + 1, Title = $"newtitle{a}", Clicks = a * 100 });

1. Single Insert

var t1 = fsql.Insert(items.First()).ExecuteAffrows();
//INSERT INTO `Topic`(`Clicks`, `Title`, `CreateTime`) 
//VALUES(@Clicks0, @Title0, @CreateTime0)

If the table has an auto-increment column, the identity value can be retrieved after inserting the data.

Method 1: (native API)

long id = fsql.Insert(blog).ExecuteIdentity();
blog.Id = id;

Method 2: (depends on FreeSql.Repository)

var repo = fsql.GetRepository<Blog>();
repo.Insert(blog);

Internally, after the insert completes, the auto-increment value is assigned back to blog.Id.
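
On databases that support returning inserted rows (for example PostgreSQL RETURNING or SqlServer OUTPUT), ExecuteInserted (listed in the API reference below) returns the inserted records directly; a minimal sketch:

List<Blog> inserted = fsql.Insert(blog).ExecuteInserted();
//The returned entities contain the values generated by the database, such as the identity id.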

2. Batch Insert

var t2 = fsql.Insert(items).ExecuteAffrows();
//INSERT INTO `Topic`(`Clicks`, `Title`, `CreateTime`) 
//VALUES(@Clicks0, @Title0, @CreateTime0), (@Clicks1, @Title1, @CreateTime1), 
//(@Clicks2, @Title2, @CreateTime2), (@Clicks3, @Title3, @CreateTime3), 
//(@Clicks4, @Title4, @CreateTime4), (@Clicks5, @Title5, @CreateTime5), 
//(@Clicks6, @Title6, @CreateTime6), (@Clicks7, @Title7, @CreateTime7), 
//(@Clicks8, @Title8, @CreateTime8), (@Clicks9, @Title9, @CreateTime9)

This resolves the error that commonly occurs when batch-inserting into SqlServer:

The incoming request has too many parameters. The server supports a maximum of 2100 parameters. Reduce the number of parameters and resend the request.

Principle: the insert is split into multiple batches and executed within a transaction.

When inserting a large amount of data, the insert is split internally and executed in batches. The splitting rules are as follows:

Database   | Quantity | Size of Parameters
MySql      | 5000     | 3000
PostgreSQL | 5000     | 3000
SqlServer  | 1000     | 2100
Oracle     | 500      | 999
Sqlite     | 5000     | 999

Quantity: the maximum number of rows per batch. For example, inserting 10,000 rows into MySql is executed in two batches.
Size of Parameters: the maximum number of SQL parameters per batch. For example, when batch-inserting 10,000 rows with 5 parameters per row, each MySql batch holds at most 3000 / 5 = 600 rows.

When the insert is split and no external transaction is provided, an internal transaction is opened to guarantee the integrity of the insert. These values can also be tuned through BatchOptions.

FreeSql supports both parameterized and non-parameterized SQL for every data type. For batch inserts it is recommended to turn parameterization off by calling .NoneParameter() before executing.
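
For example, the batch insert above can be executed with parameterization turned off:

var rows = fsql.Insert(items).NoneParameter().ExecuteAffrows();
//Values are inlined as literals instead of @ parameters, which avoids per-batch parameter limits.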

3. ExecuteSqlBulkCopy, ExecutePgCopy, ExecuteMySqlBulkCopy and ExecuteOracleBulkCopy

The Bulk Copy operations are implemented as extension methods. For SqlServer/PostgreSQL/MySql/Oracle, the required packages are FreeSql.Provider.SqlServer / FreeSql.Provider.PostgreSQL / FreeSql.Provider.MySqlConnector / FreeSql.Provider.Oracle.
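
A minimal usage sketch, assuming the corresponding provider package is referenced:

fsql.Insert(items).ExecuteSqlBulkCopy();      //SqlServer (FreeSql.Provider.SqlServer)
//fsql.Insert(items).ExecutePgCopy();         //PostgreSQL (FreeSql.Provider.PostgreSQL)
//fsql.Insert(items).ExecuteMySqlBulkCopy();  //MySql (FreeSql.Provider.MySqlConnector)
//fsql.Insert(items).ExecuteOracleBulkCopy(); //Oracle (FreeSql.Provider.Oracle)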

bulk insert test reference (52 fields)

                                     | 18W     | 1W     | 5K     | 2K    | 1K    | 500 | 100 | 50
MySql 5.5 ExecuteAffrows             | 38,481  | 2,234  | 1,136  | 284   | 239   | 167 | 66  | 30
MySql 5.5 ExecuteMySqlBulkCopy       | 28,405  | 1,142  | 657    | 451   | 435   | 592 | 47  | 22
SqlServer Express ExecuteAffrows     | 402,355 | 24,847 | 11,465 | 4,971 | 2,437 | 915 | 138 | 88
SqlServer Express ExecuteSqlBulkCopy | 21,065  | 578    | 326    | 139   | 105   | 79  | 60  | 48
PostgreSQL 10 ExecuteAffrows         | 46,756  | 3,294  | 2,269  | 1,019 | 374   | 209 | 51  | 37
PostgreSQL 10 ExecutePgCopy          | 10,090  | 583    | 337    | 136   | 88    | 61  | 30  | 25

Explanation: 18W means 180,000 rows inserted; the numbers in the table are execution times in milliseconds.

bulk insert test reference (10 fields)

                                     | 18W    | 1W    | 5K    | 2K  | 1K  | 500 | 100 | 50
MySql 5.5 ExecuteAffrows             | 11,171 | 866   | 366   | 80  | 83  | 50  | 24  | 34
MySql 5.5 ExecuteMySqlBulkCopy       | 6,504  | 399   | 257   | 116 | 87  | 100 | 16  | 16
SqlServer Express ExecuteAffrows     | 47,204 | 2,275 | 1,108 | 488 | 279 | 123 | 35  | 16
SqlServer Express ExecuteSqlBulkCopy | 4,248  | 127   | 71    | 30  | 48  | 14  | 11  | 10
PostgreSQL 10 ExecuteAffrows         | 9,786  | 568   | 336   | 157 | 102 | 34  | 9   | 6
PostgreSQL 10 ExecutePgCopy          | 4,081  | 167   | 93    | 39  | 21  | 12  | 4   | 2

All tests were run on the same operating system, after warm-up.

4. Insert the specified columns

var t3 = fsql.Insert(items).InsertColumns(a => a.Title).ExecuteAffrows();
//INSERT INTO `Topic`(`Title`) 
//VALUES(@Title0), (@Title1), (@Title2), (@Title3), (@Title4), 
//(@Title5), (@Title6), (@Title7), (@Title8), (@Title9)

var t4 = fsql.Insert(items).InsertColumns(a =>new { a.Title, a.Clicks }).ExecuteAffrows();
//INSERT INTO `Topic`(`Clicks`, `Title`) 
//VALUES(@Clicks0, @Title0), (@Clicks1, @Title1), (@Clicks2, @Title2), 
//(@Clicks3, @Title3), (@Clicks4, @Title4), (@Clicks5, @Title5), 
//(@Clicks6, @Title6), (@Clicks7, @Title7), (@Clicks8, @Title8), 
//(@Clicks9, @Title9)

5. Ignore the specified columns

var t5 = fsql.Insert(items).IgnoreColumns(a => a.CreateTime).ExecuteAffrows();
//INSERT INTO `Topic`(`Clicks`, `Title`) 
//VALUES(@Clicks0, @Title0), (@Clicks1, @Title1), (@Clicks2, @Title2), 
//(@Clicks3, @Title3), (@Clicks4, @Title4), (@Clicks5, @Title5), 
//(@Clicks6, @Title6), (@Clicks7, @Title7), (@Clicks8, @Title8), 
//(@Clicks9, @Title9)

var t6 = fsql.Insert(items).IgnoreColumns(a => new { a.Title, a.CreateTime }).ExecuteAffrows();
//INSERT INTO `Topic`(`Clicks`) 
//VALUES(@Clicks0), (@Clicks1), (@Clicks2), (@Clicks3), (@Clicks4), 
//(@Clicks5), (@Clicks6), (@Clicks7), (@Clicks8), (@Clicks9)

6. Column Insertion Priority

All Columns < Specified columns (InsertColumns) < Ignored Columns (IgnoreColumns)
  • If neither InsertColumns nor IgnoreColumns is used, all columns of the entity are inserted into the database;
  • Otherwise, if InsertColumns is used (and IgnoreColumns is not), only the specified columns are inserted;
  • Otherwise, if IgnoreColumns is used, all columns except the ignored ones are inserted.

7. Dictionary Insert

var dic = new Dictionary<string, object>();
dic.Add("id", 1);
dic.Add("name", "xxxx");

fsql.InsertDict(dic).AsTable("table1").ExecuteAffrows();
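
InsertDict can also take a collection of dictionaries to insert several rows at once; a hedged sketch with made-up rows, assuming your FreeSql version provides the list overload:

var dics = new List<Dictionary<string, object>>
{
    new Dictionary<string, object> { ["id"] = 1, ["name"] = "xxxx" },
    new Dictionary<string, object> { ["id"] = 2, ["name"] = "yyyy" }
};
fsql.InsertDict(dics).AsTable("table1").ExecuteAffrows();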

8. Import table data

int affrows = fsql.Select<Topic>()
  .Limit(10)
  .InsertInto(null, a => new Topic2
  {
    Title = a.Title
  });
//INSERT INTO `Topic2`(`Title`, `Clicks`, `CreateTime`)
//SELECT a.`Title`, 0, '0001-01-01 00:00:00' 
//FROM `Topic` a 
//limit 10

Note: Since Clicks and CreateTime are not selected, the inserted values come from the target entity's [Column(InsertValueSql = xx)] setting, or otherwise from the C# default value of the target entity property.
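
For reference, a hedged sketch of how a target entity might declare such an InsertValueSql default (hypothetical; the Topic2 used in the SQL above relied on C# defaults, and getdate() is only an illustrative expression):

class Topic2 {
    [Column(IsIdentity = true, IsPrimary = true)]
    public int Id { get; set; }
    public string Title { get; set; }
    public int Clicks { get; set; }
    //A column not selected by InsertInto falls back to InsertValueSql (if set), otherwise to the C# default value
    [Column(InsertValueSql = "getdate()")]
    public DateTime CreateTime { get; set; }
}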

9. Insert Ignore Into (MySql only)

fsql.Insert<Topic>().MySqlIgnoreInto().AppendData(items).ExecuteAffrows();
//INSERT IGNORE INTO `Topic`(`Clicks`, `Title`, `CreateTime`) 
//VALUES(@Clicks0, @Title0, @CreateTime0), (@Clicks1, @Title1, @CreateTime1), 
//(@Clicks2, @Title2, @CreateTime2), (@Clicks3, @Title3, @CreateTime3), 
//(@Clicks4, @Title4, @CreateTime4), (@Clicks5, @Title5, @CreateTime5), 
//(@Clicks6, @Title6, @CreateTime6), (@Clicks7, @Title7, @CreateTime7), 
//(@Clicks8, @Title8, @CreateTime8), (@Clicks9, @Title9, @CreateTime9)

Reference

API

Methods               | Return                   | Parameters            | Description
AppendData            | <this>                   | T1 or IEnumerable<T1> | Append the entity (or entities) to be inserted
InsertIdentity        | <this>                   | None                  | Specify that the auto-increment column should be inserted
InsertColumns         | <this>                   | Lambda                | Specify the columns to insert
IgnoreColumns         | <this>                   | Lambda                | Specify the columns to ignore
CommandTimeout        | <this>                   | int                   | Command timeout (seconds)
WithTransaction       | <this>                   | DbTransaction         | Set the transaction object
WithConnection        | <this>                   | DbConnection          | Set the connection object
ToSql                 | string                   |                       | Return the SQL statement to be executed
OnDuplicateKeyUpdate  | OnDuplicateKeyUpdate<T1> | None                  | MySql only. On Duplicate Key Update
OnConflictDoUpdate    | OnConflictDoUpdate<T1>   | None                  | PostgreSQL only. On Conflict Do Update
ExecuteAffrows        | long                     |                       | Execute the SQL and return the number of rows affected
ExecuteIdentity       | long                     |                       | Execute the SQL and return the auto-increment value
ExecuteInserted       | List<T1>                 |                       | Execute the SQL and return the inserted records
ExecuteSqlBulkCopy    | void                     |                       | SqlServer only. BulkCopy batch import; requires FreeSql.Provider.SqlServer
ExecutePgCopy         | void                     |                       | PostgreSQL only. BulkCopy batch import; requires FreeSql.Provider.PostgreSQL
ExecuteMySqlBulkCopy  | void                     |                       | MySql only. BulkCopy batch import; requires FreeSql.Provider.MySqlConnector
ExecuteOracleBulkCopy | void                     |                       | Oracle only. BulkCopy batch import; requires FreeSql.Provider.Oracle
ExecuteDmBulkCopy     | void                     |                       | Dameng only. BulkCopy batch import; requires FreeSql.Provider.Dameng
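
For example, ToSql can be used to preview the generated statement without executing it:

var sql = fsql.Insert(items).IgnoreColumns(a => a.CreateTime).ToSql();
//Returns the INSERT statement text without touching the database, useful for debugging.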