Hello,
I use insertBatch with more than 20,000 rows read from a file. Sometimes there are duplicate entries for a unique key. I am trying to catch the duplicate-entry error instead of getting an exception and an error page, but it doesn't work the suggested way.
What I did:
Turn off DBDebug in Database.php (set it to false), and then:
PHP Code:
try {
    $Model->insertBatch($data);
} catch (Exception $e) {
    echo $e->getMessage();
}
Doesn't work!
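My guess (untested, and the exception class name is an assumption on my part): with DBDebug set to false, CodeIgniter never throws on a failed query, so there is nothing for the catch block to receive. Leaving DBDebug at true and catching the database exception might look like this:

```php
<?php

use CodeIgniter\Database\Exceptions\DatabaseException;

// Assumption: with DBDebug = true a failed query throws a DatabaseException,
// so the duplicate-entry error becomes catchable instead of silently ignored.
try {
    $Model->insertBatch($data);
} catch (DatabaseException $e) {
    echo $e->getMessage();
}
```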
Next:
PHP Code:
$Model->insertBatch($data);
if ($Model->error()['message'] !== '') {
    var_dump($Model->error()['message']);
}
This works with a small amount of data but not with a large amount! As far as I can see, it inserts the data up to the error, then skips some rows, then inserts the rest, but never returns the error.
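If insertBatch() really splits the data into chunks internally (I assume a default batch size of 100), that would explain the behaviour: only the failing chunk is skipped, the following chunks still insert, and the error only reflects the last statement. A workaround sketch, untested and assuming the connection is reachable as $Model->db, would check the error after each explicit chunk:

```php
<?php

// Sketch: insert in explicit chunks of 100 and record which chunk failed.
// $Model and $data are as in the snippets above; accessing the connection
// via $Model->db and the shape of error() are assumptions.
$failed = [];
foreach (array_chunk($data, 100) as $i => $chunk) {
    $Model->insertBatch($chunk);
    $error = $Model->db->error(); // ['code' => ..., 'message' => ...]
    if (!empty($error['message'])) {
        $failed[$i] = $error['message'];
    }
}
var_dump($failed);
```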
Testscript:
PHP Code:
$data = [];
$data[] = ['name' => 'Hans', 'wert' => 1];
for ($i = 0; $i < 5000; $i++) {
    $data[] = ['name' => 'Tom_1' . $i, 'wert' => 1];
}
$data[] = ['name' => 'Hans', 'wert' => 1];
for ($i = 0; $i < 5000; $i++) {
    $data[] = ['name' => 'Tom_2' . $i, 'wert' => 1];
}
name is the primary key!
Afterwards I have 9,901 entries and no error.
What am I missing?
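For what it's worth, the numbers roughly fit an internal batch size of 100 (again an assumption): rebuilding the test data standalone shows the duplicate 'Hans' lands in a single 100-row chunk, and dropping that whole chunk would leave 9,902 rows, close to the 9,901 I actually see.

```php
<?php

// Standalone check: rebuild the test data and locate the chunk (size 100,
// assumed) that contains the duplicate 'Hans' row.
$data = [['name' => 'Hans', 'wert' => 1]];
for ($i = 0; $i < 5000; $i++) {
    $data[] = ['name' => 'Tom_1' . $i, 'wert' => 1];
}
$data[] = ['name' => 'Hans', 'wert' => 1];
for ($i = 0; $i < 5000; $i++) {
    $data[] = ['name' => 'Tom_2' . $i, 'wert' => 1];
}

$chunks = array_chunk($data, 100);
foreach ($chunks as $n => $chunk) {
    if ($n > 0 && in_array('Hans', array_column($chunk, 'name'), true)) {
        echo "duplicate sits in chunk $n of " . count($chunks) . "\n"; // chunk 50 of 101
    }
}
echo count($data) - 100, " rows would remain if that chunk were dropped\n"; // 9902
```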
Best regards