
I have the following query to get the 15 most sold plates in a place:

This query takes 12 seconds to execute over 100,000 rows. That seems far too long, so I am looking for a way to optimize the query.

I ran the EXPLAIN command in phpMyAdmin and got this:
[screenshot of the EXPLAIN output]

According to this, the main problem is the p table, which is being scanned in full. How can I fix this? The id of the p table is a primary key; do I need to define a separate index on it as well? Is there anything else I can do to make the query run faster?
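For context, the indexes MySQL already has on p can be listed directly; a PRIMARY KEY counts as an index and shows up in this list. A minimal check (only the table name p comes from the EXPLAIN screenshot):

```sql
-- Lists every index on p; the primary key appears under the index name PRIMARY,
-- so id does not need an additional index of its own.
SHOW INDEX FROM p;
```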

2 Answers


  1. You did include the explain plan, but you did not give any information about your table structure, data distribution, cardinality, or volumes. Assuming your indices are accurate and the data is evenly distributed, the query has to process over 12 million rows – not 100,000. Even then, that is relatively poor performance, but you never told us what hardware this sits on or what the background load is.
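    One way to supply those numbers is to query MySQL's own metadata – a minimal sketch (only the table name p comes from the EXPLAIN screenshot; everything else is generic):

    ```sql
    -- Approximate row count per table in the current schema
    SELECT TABLE_NAME, TABLE_ROWS
    FROM information_schema.TABLES
    WHERE TABLE_SCHEMA = DATABASE();

    -- Index cardinality for the table that is being scanned in full
    SHOW INDEX FROM p;

    -- Refresh the statistics the optimizer bases its estimates on
    ANALYZE TABLE p;
    ```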

    A query with so many joins is always going to be slow – are they all needed?

    > the main problem is on the p table which is scanning the entire table

    Full table scans are not automatically bad. Dereferencing an index lookup costs roughly 20 times more than a streaming read. Since the only constraints you apply to this table are its joins to other tables, nothing in your question suggests there is much scope for improving this.
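    If you want to check that the full scan really is the cheaper choice, you can compare the plan with and without an index hint – a minimal sketch, using placeholder table and column names since the original query was not posted:

    ```sql
    -- Plan MySQL picks on its own: p is read with a full table scan
    EXPLAIN
    SELECT p.id
    FROM p
    JOIN orders o ON o.plate_id = p.id;

    -- Same join with the primary key forced on p, to compare the row estimates
    EXPLAIN
    SELECT p.id
    FROM p FORCE INDEX (PRIMARY)
    JOIN orders o ON o.plate_id = p.id;
    ```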

  2. You can create a relationship (foreign key) between the two tables:
    https://database.guide/how-to-create-a-relationship-in-mysql-workbench/

    Besides this, you can also use a LEFT JOIN so you won't load in the whole right table.
    ORDER BY is a slow operation in MySQL; if you are processing the results in code afterwards, you can just sort there, which is much faster than ORDER BY.
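    A minimal sketch of what adding such a relationship looks like in plain SQL (the linked article does the same thing through MySQL Workbench; all table and column names below are placeholders):

    ```sql
    -- Adding a foreign key also creates an index on the referencing column
    -- if one does not exist yet, which is what actually speeds up the join.
    ALTER TABLE order_items
      ADD CONSTRAINT fk_order_items_plate
      FOREIGN KEY (plate_id) REFERENCES p (id);
    ```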

    I hope I helped – community, feel free to edit 🙂
