I have to insert 100k+ records into the database and I am running into memory issues. $_data is an array holding the arrays of row data. I even increased the memory limit, but I still ran into problems.
// VERSION 1
protected function save() {
$memory_limit = ini_get('memory_limit');
ini_set('memory_limit', '512M');
$sql = "
INSERT INTO table (
c1,
c2,
c3,
c4,
c5,
c6,
c7,
c8,
c9,
c10,
c11,
c12
) VALUES (?,?,?,?,?,?,?,?,?,?,?,?)
ON DUPLICATE KEY UPDATE
c10 = VALUES(c10),
c11 = VALUES(c11),
c12 = VALUES(c12)
";
$db = Zend_Registry::get('db');
$stmt = new Zend_Db_Statement_Pdo($db, $sql);
foreach($this->_data as $entry){
$stmt->execute($entry);
}
unset($this->_data, $stmt, $sql);
ini_set('memory_limit', $memory_limit);
}
The second version tries to insert all entries with a single multi-row INSERT, but it is no better.
// VERSION 2
protected function save2(){
// Build "(?,?,...),(?,?,...),...": one placeholder group per data row
$question_marks = str_repeat('?,', count($this->_data[0]));
$question_marks = trim($question_marks, ',');
$question_marks = str_repeat("($question_marks),", count($this->_data));
$question_marks = trim($question_marks, ',');
$sql = "
INSERT INTO table (
c1,
c2,
c3,
c4,
c5,
c6,
c7,
c8,
c9,
c10,
c11,
c12
) VALUES $question_marks
ON DUPLICATE KEY UPDATE
c10 = VALUES(c10),
c11 = VALUES(c11),
c12 = VALUES(c12)
;";
$db = Zend_Registry::get('db');
$stmt = new Zend_Db_Statement_Pdo($db, $sql);
// Flatten all rows into a single flat array of bind values
$insert_values = call_user_func_array('array_merge', $this->_data);
$stmt->execute($insert_values);
$affected_rows = $stmt->rowCount();
if ($affected_rows){
// @todo log
}
unset($this->_data);
unset($stmt, $sql, $insert_values, $affected_rows, $question_marks);
}
The column names are not the original ones. Any suggestions?
I will try splitting the data array into 5k-entry chunks and doing the inserts in batches. I am also looking at whether raising max_allowed_packet in the MySQL config helps. Meanwhile I would appreciate any suggestions. Thanks
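For what it's worth, here is roughly the batched variant I have in mind. It is only an untested sketch under the assumptions already made above (the same 12 anonymized columns, a chunk size of 5000, and the $db adapter from Zend_Registry):

// VERSION 3 (sketch, untested): multi-row inserts in 5k chunks
protected function save3() {
    $db = Zend_Registry::get('db');
    $columns = 'c1, c2, c3, c4, c5, c6, c7, c8, c9, c10, c11, c12';
    foreach (array_chunk($this->_data, 5000) as $chunk) {
        // One "(?,?,...,?)" group per row in this chunk
        $row_marks = '(' . rtrim(str_repeat('?,', count($chunk[0])), ',') . ')';
        $all_marks = rtrim(str_repeat($row_marks . ',', count($chunk)), ',');
        $sql = "INSERT INTO table ($columns) VALUES $all_marks
                ON DUPLICATE KEY UPDATE
                    c10 = VALUES(c10),
                    c11 = VALUES(c11),
                    c12 = VALUES(c12)";
        $stmt = new Zend_Db_Statement_Pdo($db, $sql);
        // Flatten only this chunk, so the bind array stays small
        $stmt->execute(call_user_func_array('array_merge', $chunk));
        unset($stmt, $sql);
    }
    unset($this->_data);
}

The idea is that only one chunk's worth of values is flattened and bound at a time, so peak memory (and packet size) is bounded by the chunk size rather than by all 100k+ rows.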
UPDATE
In my case, raising max_allowed_packet from 16M to 1024M helped, and I was able to do the insert without splitting the array.
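For reference, the setting lives in the [mysqld] section of the MySQL config file; the 1024M value is simply what worked for my data, not a general recommendation:

[mysqld]
max_allowed_packet = 1024M

It can also be raised at runtime with SET GLOBAL max_allowed_packet = 1073741824; new connections pick up the new value, whereas the config file change requires a server restart.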