I have an array of data. The array contains over 500k items. I want to map the items in the array and then filter it. This is what I have done.
I created a function that filters the array to get unique values:
const getUniqueValues = (array: string[]): string[] => {
  return array.filter((item, index, _array) => _array.indexOf(item) === index);
};
Then I pass the mapped data into the function:
const uniqueValues = getUniqueValues(
  editedData.map((bodyItem: any) => bodyItem[index])
).filter(Boolean);
This worked well and was fast when the array contained fewer items. Now it sometimes takes five to ten minutes to complete, which isn't good for the user experience. uniqueValues currently returns approximately 210,000 items. Is there a better way to perform this in less time?
I have also tried array.reduce, but I'm not sure about my code because it doesn't seem to solve the problem. If someone could check it out, I'd appreciate it.
const uniqueValues = editedData.reduce(
  (accumulator, bodyItem) => {
    const item = bodyItem[index];
    if (!accumulator.includes(item)) {
      accumulator.push(item);
    }
    return accumulator;
  },
  []
);
2 Answers
With that high number of items, you should use the built-in Set class, which gets rid of duplicates efficiently. Just replace getUniqueValues with the code below:
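A minimal sketch of what that replacement could look like (the answer's actual snippet isn't reproduced here, so this is an assumption based on the Set approach it describes, keeping the same getUniqueValues signature as in the question):

const getUniqueValues = (array: string[]): string[] => {
  // A Set stores each value at most once, and inserting/looking up values is
  // O(1) on average, so this pass is roughly O(n) instead of the O(n^2)
  // filter/indexOf and reduce/includes versions above.
  // Spreading the Set back into an array preserves first-seen order.
  return [...new Set(array)];
};

The calling code from the question (the map plus .filter(Boolean)) can stay exactly as it is; only the deduplication step changes.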