Part of our business uses a PHP program to scrape data and then write it into the database, roughly 8 GB of data per day!
We tried three ways of inserting the data:
1. Insert one row at a time, looping many times. This is inefficient, because every INSERT is followed by a commit.
2. Insert many rows in a single statement and then commit once, along the lines of: INSERT INTO person VALUES (...), (...), (...), (...);
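The difference between the two approaches can be sketched with Python's built-in sqlite3 module, used here only as a stand-in for MySQL; the table name and column layout are illustrative, not the real production schema:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")

rows = [(i, "user%d" % i) for i in range(1, 10001)]

# Approach 1: one INSERT plus one commit per row.
t0 = time.time()
for r in rows:
    conn.execute("INSERT INTO person VALUES (?, ?)", r)
    conn.commit()                  # a commit after every single row
t1 = time.time() - t0

conn.execute("DELETE FROM person")
conn.commit()

# Approach 2: one prepared statement bound many times, one commit
# at the end (same idea as a single multi-row VALUES list).
t0 = time.time()
conn.executemany("INSERT INTO person VALUES (?, ?)", rows)
conn.commit()                      # a single commit for the whole batch
t2 = time.time() - t0

print("per-row commits: %.3fs, one batch commit: %.3fs" % (t1, t2))
```

The per-commit overhead is exactly what makes approach 1 slow; batching amortizes it across the whole row set.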
There is a subtle detail with this kind of insert.
My test environment is as follows:

I created a person table with a primary key, to simulate what happens when a multi-row INSERT hits a primary-key conflict.
First I manually inserted two records into person:

Then I tested by importing a file a.sql, whose content is as follows:

A record with id 2 already exists, so that row obviously cannot be inserted. What I wanted to test is whether (3,'will') and (4,'grace') can still be inserted, since those rows are fine on their own. The result: they cannot be inserted either, because of (2,'Tomnew'). All rows in one statement either execute together or are rolled back together! The import looks like this:

Check the result:

Clearly, not a single record was inserted!
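This all-or-nothing behavior of a single multi-row INSERT can be reproduced with Python's sqlite3, used only as a stand-in (MySQL behaves the same way for a single statement); the pre-existing names 'Tom' and 'Jack' are my invention, since the original two rows are only shown in an attachment:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO person VALUES (1, 'Tom'), (2, 'Jack')")
conn.commit()

# One statement carrying a duplicate key (id=2) plus two valid rows.
try:
    conn.execute(
        "INSERT INTO person VALUES (2, 'Tomnew'), (3, 'will'), (4, 'grace')"
    )
except sqlite3.IntegrityError as e:
    print("statement failed:", e)

# The valid rows (3, 'will') and (4, 'grace') were NOT inserted either:
count = conn.execute("SELECT COUNT(*) FROM person").fetchone()[0]
print(count)  # still 2
```

The duplicate key aborts the whole statement, so the valid rows that travel with it are discarded too.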
Next, retest with the -f option, which forces the mysql client to keep importing after an error, as shown below:

Check whether the data was imported:

Clearly the rows with id=5 and id=6 were imported, but because of the id=2 conflict, id=3 and id=4 were not! With -f, the client skips the whole failed statement and moves on to the next one.
There is a better import method:
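The attachment showing the better method is not reproduced here; a common choice (my assumption) is MySQL's INSERT IGNORE, which skips conflicting rows instead of aborting the statement. SQLite has the analogous INSERT OR IGNORE:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO person VALUES (1, 'Tom'), (2, 'Jack')")

# INSERT OR IGNORE (MySQL: INSERT IGNORE) silently drops the
# conflicting row but keeps the valid rows from the same statement.
conn.execute(
    "INSERT OR IGNORE INTO person VALUES (2, 'Tomnew'), (3, 'will'), (4, 'grace')"
)
conn.commit()

rows = conn.execute("SELECT id, name FROM person ORDER BY id").fetchall()
print(rows)  # (2, 'Jack') is kept; (3, 'will') and (4, 'grace') get inserted
```

Unlike the plain multi-row INSERT, the good rows survive even when one row in the batch conflicts.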

The result looks like this:
My understanding of how INSERT INTO table VALUES (),(),(),… executes:
It works through the value list from the start, inserting row by row; when it hits a problematic value, it rolls back. If 999 rows have already been inserted and the 1000th hits a problem, all 999 inserted rows must be rolled back, which makes the whole INSERT slow!
At first the thread state is "insert"; then the duplicate-key failure triggers a rollback. Screenshots attached:
The insert state:

The rollback state:
This approach does improve efficiency, but it still runs into problems under heavy write load.
3. Use MySQL's built-in LOAD DATA INFILE.
Test environment: 4 GB RAM + 1-core CPU.
Test data: 600,000 rows; see attachment:

The structure of the history table:

Size of the history.txt file:
Create the table in the test database, then import the data with LOAD DATA INFILE:
The test run:
mysql> use test;
Database changed
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (6.98 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> select count(*) from history;
+----------+
| count(*) |
+----------+
| 600000 |
+----------+
1 row in set (0.20 sec)
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.95 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.17 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.68 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.29 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.15 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.08 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (8.13 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.52 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (6.98 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (6.91 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (6.95 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.38 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.49 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (8.08 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (10.38 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> select count(*) from history ;
ERROR 2006 (HY000): MySQL server has gone away
No connection. Trying to reconnect...
Connection id: 29
Current database: test
+----------+
| count(*) |
+----------+
| 9600000 |
+----------+
1 row in set (3.98 sec)
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (9.90 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.56 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.67 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.63 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.78 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> select count(*) from history ;
+----------+
| count(*) |
+----------+
| 12600000 |
+----------+
1 row in set (5.18 sec)
mysql> select count(*) from history ;
+----------+
| count(*) |
+----------+
| 12600000 |
+----------+
1 row in set (0.00 sec)
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.75 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.88 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.99 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (8.07 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (7.94 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (9.29 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (12.62 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (10.08 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (11.86 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql> LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history fields Terminated By ',' LINES TERMINATED BY '\n';
Query OK, 600000 rows affected (8.33 sec)
Records: 600000 Deleted: 0 Skipped: 0 Warnings: 0
mysql>
Clearly, loading 600,000 rows takes only about 7 to 10 seconds!
##### The file name differs here because this was the first example I tested.
While using LOAD DATA INFILE I ran into one problem; see the attachment!

Cause: the file actually existed even though the error made it look missing; the directory permissions were insufficient (I had placed the file under /root, which the MySQL server process cannot read, since LOAD DATA INFILE is read server-side by the user mysqld runs as). Moving the file to /tmp was enough to fix it!
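To reproduce a test like this, the import file is just plain text with one row per line and comma-separated fields. A sketch of generating such a file (the (id, name, created) column layout is my guess, since the real history table definition is only in an attachment):

```python
import csv
import os
import tempfile

# Generate a comma-separated file that LOAD DATA INFILE can consume.
# The (id, name, created) layout is illustrative only.
path = os.path.join(tempfile.gettempdir(), "history.txt")
with open(path, "w", newline="") as f:
    w = csv.writer(f)
    for i in range(1, 11):          # the real test used 600,000 rows
        w.writerow([i, "event%d" % i, "2013-01-01 00:00:00"])

with open(path) as f:
    first = f.readline().strip()
print(first)  # 1,event1,2013-01-01 00:00:00
```

The file would then be imported with LOAD DATA INFILE '/tmp/history.txt' INTO TABLE history FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'; keeping the file somewhere the mysqld user can read, such as /tmp, avoids the permission problem above.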
Expert advice on speeding up inserts ## excerpted below:

DML tips for large tables:
Optimizations for bulk imports: