I want to insert a large CSV file (2500 rows) into a SQLite database. On my local computer (XAMPP) the import takes about 3 minutes. The same project on a real Linux webserver takes endless time. What are the best practices for importing large CSV files? How can I speed up the insert procedure with CI?
Here is my code:
Database
PHP Code:
$db['default'] = array(
'dsn' => '',
'hostname' => '',
'username' => '',
'password' => '',
'database' => APPPATH.'database/myDB.db',
'dbdriver' => 'sqlite3',
'dbprefix' => '',
'pconnect' => FALSE,
'db_debug' => (ENVIRONMENT !== 'production'),
'cache_on' => FALSE,
'cachedir' => '',
'char_set' => 'utf8',
'dbcollat' => 'utf8_general_ci', // or utf8_unicode_ci
'swap_pre' => '',
'encrypt' => FALSE,
'compress' => FALSE,
'stricton' => FALSE,
'failover' => array(),
'save_queries' => TRUE
);
Controller
PHP Code:
$this->load->model('db_model');

$csv = read_file('./folder/import.csv');
$rows = explode(PHP_EOL, $csv);

$this->db_model->emptyTable();
foreach ($rows as $row) {
    // strip the quotes, then split the line into its fields
    $fields = explode(',', str_replace("\"", "", $row));
    $myData = array(
        'var1' => $fields[0],
        'var2' => $fields[1],
        'var3' => $fields[2],
        'var4' => $fields[3],
        'var5' => $fields[4],
        'var6' => $fields[5],
        'var7' => $fields[6],
        'var8' => $fields[7]
    );
    $this->db_model->setTimetable($myData);
}
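As a side note on the parsing itself: splitting on commas after stripping quotes breaks as soon as a field contains a comma. PHP's built-in fgetcsv() handles quoting correctly. A minimal sketch in plain PHP, outside CI, using throwaway sample data in place of import.csv:

```php
<?php
// Hypothetical sample data standing in for ./folder/import.csv.
// The first field is quoted and contains a comma.
$tmp = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($tmp, "\"a,1\",b,c\nd,e,f\n");

// fgetcsv() understands quoting, so "a,1" stays one field.
$rows = array();
$handle = fopen($tmp, 'r');
while (($fields = fgetcsv($handle)) !== false) {
    $rows[] = $fields;
}
fclose($handle);
unlink($tmp);

// $rows[0] is array('a,1', 'b', 'c')
```

With the naive explode() approach, the same line would split into four fields instead of three.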
Model
PHP Code:
public function emptyTable() {
    $this->db->empty_table('tableName');
}

public function setTimetable($data) {
    $this->db->insert('tableName', $data);
}
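Regarding the speed question: the usual culprit with SQLite is that each insert() runs in its own transaction, and every transaction forces a sync to disk, which is far slower on real server storage than on a local XAMPP setup. Wrapping all inserts in one transaction (in CI: $this->db->trans_start() / trans_complete(), or insert_batch()) typically cuts 2500 inserts from minutes to well under a second. A minimal sketch of the idea with the plain SQLite3 class, outside CI, using a throwaway in-memory database and made-up rows:

```php
<?php
// Demonstrates batching all inserts inside one transaction.
// In-memory database here; in the post it is APPPATH.'database/myDB.db'.
$db = new SQLite3(':memory:');
$db->exec('CREATE TABLE tableName (var1 TEXT, var2 TEXT)');

// Hypothetical rows standing in for the parsed CSV data.
$rows = array(array('a', 'b'), array('c', 'd'));

$db->exec('BEGIN');
$stmt = $db->prepare('INSERT INTO tableName (var1, var2) VALUES (:v1, :v2)');
foreach ($rows as $row) {
    $stmt->bindValue(':v1', $row[0], SQLITE3_TEXT);
    $stmt->bindValue(':v2', $row[1], SQLITE3_TEXT);
    $stmt->execute();
    $stmt->reset(); // reuse the prepared statement for the next row
}
$db->exec('COMMIT'); // one sync for the whole batch instead of one per row

$count = $db->querySingle('SELECT COUNT(*) FROM tableName');
echo $count; // 2
```

The prepared statement is a second, smaller win: the INSERT is compiled once and only the bound values change per row.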
Thanks for your help!