I need to find a faster solution than my two working solutions below.
If I pass a comma-separated string into a function, I need the most efficient way to see if a field matches a value in that string. In the example below, comma_string is a text parameter of the function which could equal a string like 2,56,34,98,23.
So far my fastest solution is:
set _json_set = concat("[", comma_string, "]");
select * from `table` where `ID` in (select * from json_table(_json_set, "$[*]" columns(`c` int path "$")) as `jt`);
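For context, wrapped in an actual stored function this approach might look like the following sketch. Since a MySQL function cannot return a result set, this hypothetical version counts matching rows instead; my_table and count_matching_ids are placeholder names, not from the question:

```sql
-- Hypothetical function; `my_table` is a placeholder table name.
delimiter //
create function count_matching_ids(comma_string text) returns int
reads sql data
begin
  declare _json_set text;
  declare _cnt int;
  -- Turn "2,56,34" into the JSON array "[2,56,34]" for json_table.
  set _json_set = concat('[', comma_string, ']');
  select count(*) into _cnt
  from `my_table`
  where `ID` in (select `c`
                 from json_table(_json_set, '$[*]'
                                 columns(`c` int path '$')) as jt);
  return _cnt;
end //
delimiter ;
```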
A far slower solution would be to use find_in_set; however, find_in_set doesn't use indexes properly, so it is slower:
select * from `table` where find_in_set(`ID`, comma_string);
The above examples work fine. Of course it would be great to do the below, but as far as I am aware it is not possible, because a variable/parameter can't be a set/array. Using IN is fast because it uses indexes properly; the top solution is only slower than the concept below because of the cost of building the JSON table:
select * from `table` where `ID` in (comma_string_set);
Is there a more efficient way (efficiency being speed) than the top method? Please remember that this is inside a function.
2 Answers
The only way to do what you want and make it use an index is to build a dynamic SQL statement.
You are responsible for guaranteeing that the comma_string_set is safe and won't cause an SQL injection vulnerability; that is, it must contain only a comma-separated list of id values.

You can't use this method inside a trigger or a stored function, but you can use it inside a stored procedure (unless the procedure is called from a trigger or a stored function).
If you must do this in a stored function, then you need to take a step back and reconsider how you’re designing your application.
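As a sketch of the dynamic-SQL approach inside a stored procedure (my_table, ID, and select_by_ids are placeholder names; the regexp check is one way to enforce the "only id values" requirement before concatenating):

```sql
-- Hypothetical procedure demonstrating dynamic SQL with PREPARE/EXECUTE.
delimiter //
create procedure select_by_ids(in id_list text)
begin
  -- Reject anything that is not a plain comma-separated list of integers,
  -- so the concatenation below cannot be used for SQL injection.
  if id_list not regexp '^[0-9]+(,[0-9]+)*$' then
    signal sqlstate '45000' set message_text = 'invalid id list';
  end if;
  set @sql = concat('select * from `my_table` where `ID` in (', id_list, ')');
  prepare stmt from @sql;
  execute stmt;
  deallocate prepare stmt;
end //
delimiter ;
```

Because the id list is inlined into the statement text before it is prepared, the optimizer sees a literal IN list and can use the index on ID, which is what the dynamic-SQL route buys you.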
A JOIN instead of the IN-subquery might be faster.

Note: the query can return duplicates if there are any in the comma_string.
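The answer's query itself is not shown above; a JOIN rewrite of the question's json_table approach might look like this sketch, reusing the question's table and column names:

```sql
-- Same json_table expansion as the question, but joined instead of
-- used in an IN-subquery. Duplicates in comma_string yield duplicate rows.
set _json_set = concat('[', comma_string, ']');
select t.*
from json_table(_json_set, '$[*]' columns(`c` int path '$')) as jt
join `table` as t on t.`ID` = jt.`c`;
```

If duplicates in comma_string are possible and unwanted, a SELECT DISTINCT (or deduplicating the list first) would restore the IN-subquery's semantics.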