I have a search function that works with a small amount of data, but after I filled the database with tens of thousands of rows, the site gets stuck loading whenever I try to search for something.
public function searchVideo(Request $request)
{
    $searchTerm = $request->search;
    $videos = Videos::with(['tags:id', 'stars:id'])
        ->where('title', 'LIKE', '%'.$searchTerm.'%')
        ->orWhereHas('tags', function ($query) use ($searchTerm) {
            $query->where('name', 'LIKE', '%'.$searchTerm.'%');
        })
        ->orWhereHas('stars', function ($query) use ($searchTerm) {
            $query->where('name', 'LIKE', '%'.$searchTerm.'%');
        })
        ->orderBy('created_at', 'desc')
        ->paginate(15);
    return $videos;
}
If I delete

->orWhereHas('tags', function ($query) use ($searchTerm) {
    $query->where('name', 'LIKE', '%'.$searchTerm.'%');
})
->orWhereHas('stars', function ($query) use ($searchTerm) {
    $query->where('name', 'LIKE', '%'.$searchTerm.'%');
})

it works fine, but then I won't get every result I wish to search for.
2 Answers
Try this:

public function searchVideo(Request $request)
{
    $searchTerm = $request->search;
    $videos = Videos::with(['tags:id', 'stars:id'])
        ->where('title', 'like', '%'.$searchTerm.'%')
        ->orWhereHas('tags', function ($query) use ($searchTerm) {
            $query->where('name', 'like', '%'.$searchTerm.'%');
        })
        ->orWhereHas('stars', function ($query) use ($searchTerm) {
            $query->where('name', 'like', '%'.$searchTerm.'%');
        })
        ->orderBy('created_at', 'desc')
        ->simplePaginate(15);
    return $videos;
}
The case of 'like' versus 'LIKE' makes no difference: Laravel passes the operator straight through to SQL, where it is case-insensitive. The simplePaginate method, unlike paginate, skips the extra total-count query, so it can be more efficient.
You can also use select to fetch only the specific columns you need from the videos table, instead of loading all columns.
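A sketch of that suggestion; the column list here (id, title, created_at) is an assumption — keep whatever columns your view actually uses:

```php
$videos = Videos::select('id', 'title', 'created_at') // fetch only what the result list needs
    ->with(['tags:id', 'stars:id'])
    ->where('title', 'like', '%'.$searchTerm.'%')
    ->orderBy('created_at', 'desc')
    ->simplePaginate(15);
```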
Also, you can use orWhere with a closure: instead of chaining multiple orWhere methods at the top level of the query, group them in a closure so the combined conditions stay correctly parenthesized.
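One way that grouping might look, keeping the relations from the original query:

```php
$videos = Videos::with(['tags:id', 'stars:id'])
    ->where(function ($query) use ($searchTerm) {
        // all OR conditions live inside one parenthesized group
        $query->where('title', 'like', '%'.$searchTerm.'%')
            ->orWhereHas('tags', function ($q) use ($searchTerm) {
                $q->where('name', 'like', '%'.$searchTerm.'%');
            })
            ->orWhereHas('stars', function ($q) use ($searchTerm) {
                $q->where('name', 'like', '%'.$searchTerm.'%');
            });
    })
    ->orderBy('created_at', 'desc')
    ->simplePaginate(15);
```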
Unfortunately, this kind of query

->where('title', 'LIKE', '%'.$searchTerm.'%')

which translates to SQL as

WHERE title LIKE '%searchterm%'

is a notorious query-performance antipattern. It can't random-access any ordinary BTREE index, but must instead scan through all the rows to find matching values. Fast on tiny testing databases. Slow on large production databases.
And your query has the other notorious performance antipattern in it too:

->orderBy('created_at', 'desc')->paginate(15)

which becomes ORDER BY created_at DESC LIMIT 15 (plus an OFFSET for later pages). This means your RDBMS software must accumulate all matching rows, then sort them, then discard all but a small number of them. Slow.
What can you do about this? Here are a few suggestions that might help.
If you're on MySQL or MariaDB, make sure your title column is defined as VARCHAR(767) or some smaller number of characters. Don't define it as any kind of TEXT: access to those sorts of columns is slower. Create a compound index on (title, created_at).
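A sketch of that compound index; the table name videos and the index name are assumptions based on the model:

```sql
ALTER TABLE videos ADD INDEX idx_title_created (title, created_at);
```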
This will put the title column to be scanned in an index. That may be faster than scanning the entire table. And the created_at column in the same index may make your ORDER BY ... LIMIT ... operation a bit faster.

Redesign your search operation to avoid leading % wildcards in LIKE clauses. Without the leading %, those operations can exploit indexes. I don't know enough about your application to give you advice about how to redesign your search, though.

Switch to using PostgreSQL and use its trigram indexes. They handle LIKE '%searchterm%' queries efficiently.
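If you do move to PostgreSQL, the trigram setup might look like this (the table name videos and the index name are assumptions):

```sql
-- enable the trigram extension once per database
CREATE EXTENSION IF NOT EXISTS pg_trgm;

-- a GIN trigram index lets LIKE '%term%' use the index
CREATE INDEX idx_videos_title_trgm ON videos USING gin (title gin_trgm_ops);
```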