I’m using a PHP script to insert the unique values of an array into a database. Here’s the script:
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);

if (isset($_POST['submit'])) {
    // basename() prevents directory traversal via the posted file name
    $df = basename($_POST['df']);
    $arr = array();
    $file = fopen('files/' . $df . '.csv', 'r');
    while (($line = fgetcsv($file)) !== false) {
        $arr[] = $line[2];
    }
    fclose($file);
    $arruniq = array_unique($arr);

    // date() already uses the current time; calling strtotime() on
    // date_default_timezone_get() (a timezone name) returns false
    $ts = date("Y-m-d H:i:s");

    $mysqli = new mysqli("localhost", "root", "pass", "tab");
    // new mysqli never returns false; check connect_error instead
    if ($mysqli->connect_error) {
        die("ERROR: DB could not connect: " . $mysqli->connect_error);
    }

    $sqlins = "INSERT INTO tab (created_time, uniq_id) VALUES (?, ?)";
    // prepare once, then execute per row
    if ($stmt = $mysqli->prepare($sqlins)) {
        // iterate the unique array directly: array_unique() keeps the
        // original keys, so indexing it 0..n-1 would skip values
        foreach ($arruniq as $uniq) {
            $stmt->bind_param("ss", $ts, $uniq);
            $stmt->execute();
        }
        $stmt->close();
    } else {
        echo "ERROR: Could not prepare query: " . $mysqli->error;
    }
    $mysqli->close();
}
The script reads a CSV file, de-duplicates the values, and inserts them into the database. Here’s the problem: the script works fine with smaller files, but with a larger file I get a “500 Internal Server Error”.
No error is displayed or logged in the server’s PHP logs or the Apache logs.
I even increased the script’s memory limit using ini_set('memory_limit', '2048M'); and raised the max_execution_time directive, which also didn’t work.
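For reference, here is a minimal sketch of raising both limits at the top of a script. The specific values are illustrative, and some hosting setups ignore runtime overrides like these entirely, in which case the limits have to be changed in the server configuration instead:

```php
<?php
// Raise limits at runtime; some hosts disallow runtime overrides,
// so these calls may silently have no effect.
ini_set('memory_limit', '2048M');
ini_set('max_execution_time', '300'); // seconds
set_time_limit(300);                  // resets the execution-time counter as well
```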
2 Answers
This is an issue with the execution time of the script. Since I’m using Webmin/Virtualmin, setting max_execution_time in the script didn’t work, so I changed it in the Virtualmin server configuration instead.
Now the issue is solved, but the script takes a long time to execute.
It should be faster if you use prepared statements and transactions.
Prepared statements, when used correctly, give you a slight performance improvement because the server doesn’t need to parse the SQL every time, and transactions save you time because the commit is not called after each row.
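The pattern described above can be sketched like this (a minimal sketch reusing the credentials, table, and variable names from the question; $ts and $arruniq are assumed to be populated as in the original script):

```php
<?php
// Prepare the statement once and wrap all inserts in a single
// transaction, so COMMIT runs once instead of after every row.
$mysqli = new mysqli("localhost", "root", "pass", "tab");
if ($mysqli->connect_error) {
    die("ERROR: " . $mysqli->connect_error);
}

$stmt = $mysqli->prepare(
    "INSERT INTO tab (created_time, uniq_id) VALUES (?, ?)"
);

$mysqli->begin_transaction();
foreach ($arruniq as $uniq) {
    $stmt->bind_param("ss", $ts, $uniq);
    $stmt->execute();
}
$mysqli->commit();   // one commit for the whole batch

$stmt->close();
$mysqli->close();
```

If any row fails, you can call $mysqli->rollback() instead of commit() so the batch is applied all-or-nothing.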