// ES5: keep an element only if the index of its first occurrence equals its current position
var uniqueArray = function(arrArg) {
  return arrArg.filter(function(elem, pos, arr) {
    return arr.indexOf(elem) == pos;
  });
};

// Same idea with an ES6 arrow function
var uniqEs6 = (arrArg) => {
  return arrArg.filter((elem, pos, arr) => {
    return arr.indexOf(elem) == pos;
  });
};

var test = ['mike', 'james', 'james', 'alex'];
var testBis = ['alex', 'yuri', 'jabari'];
console.log(uniqueArray(test.concat(testBis)));
// ['mike', 'james', 'alex', 'yuri', 'jabari']
Set is shorter, but filter with indexOf is faster.
That's only because better asymptotic complexity pays off on larger inputs. Set will be dramatically faster on big arrays, not on a 10-element one; you have to benchmark it on at least hundreds of elements.
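A minimal benchmark sketch (my addition, not from the thread; the array size and contents are arbitrary):

const big = Array.from({length: 100000}, () => Math.floor(Math.random() * 1000));

console.time('filter + indexOf'); // O(n^2): indexOf rescans the array for every element
big.filter((e, i, a) => a.indexOf(e) === i);
console.timeEnd('filter + indexOf');

console.time('Set'); // O(n): a single pass through a hash-based set
[...new Set(big)];
console.timeEnd('Set');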
If you need to filter by an object property in a performant way:
// create the lookup object only once - the garbage collector will be glad ^^
let cachedObject = {};
// array to be filtered
let arr = [
  {id: 0, prop: 'blah'},
  {id: 1, prop: 'foo'},
  {id: 0, prop: 'bar'}
];
// "filter" into the object, keyed by id - the original array is untouched, so the GC is glad too ^^
arr.forEach((item) => cachedObject[item.id] = item);
// optional: turn the object back into an array
arr = Object.values(cachedObject);
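For reference (my note): with the sample data above, the later {id: 0} entry overwrites the earlier one, so

console.log(arr); // [{id: 0, prop: 'bar'}, {id: 1, prop: 'foo'}]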
What's wrong with
let arr1 = ["apples", "apples", "oranges", "bananas"]; arr1 = Array.from(new Set(arr1));
Surely this is a lot simpler if you're just trying to remove duplicates. Obviously this would work better as the return statement in a function. I'm just posting this as an example.
let arr = [1, 2, 3, 4, 3, 2];
arr.filter((e, i, arr) => arr.indexOf(e) == i);
let arr = [{x:10, y: 20}, {x:10, y: 30}, {x:10, y: 20}]; // each element has x and y props
arr.filter((e, i, arr) => arr.findIndex(e2 => Object.keys(e2).every(prop => e2[prop] == e[prop])) == i);
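With the sample above this yields [{x: 10, y: 20}, {x: 10, y: 30}]. Note (my addition): the comparison is shallow and only checks the candidate element's own keys against e, so nested objects are not compared structurally.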
I find this reads a little better if you stash your filter predicate in a named variable.
const duplicates = (e, i, arr) => arr.indexOf(e) === i
let arr = [1, 2, 3, 4, 3, 2];
arr.filter(duplicates);
// With no key, dedupe primitives; with a key, dedupe objects by that property
Array.prototype.uniq = function(key) {
  return key
    ? this.map(e => e[key]) // project to the key's values
        .map((e, i, final) => final.indexOf(e) === i && i) // index of each first occurrence, false for duplicates
        .filter(e => this[e]) // keep indices whose element is truthy, dropping the false markers
        .map(e => this[e]) // map the surviving indices back to the original elements
    : [...new Set(this)];
};
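A quick usage sketch (the sample data is mine, not from the comment):

const users = [{id: 1, name: 'ann'}, {id: 2, name: 'bob'}, {id: 1, name: 'ann'}];
users.uniq('id'); // [{id: 1, name: 'ann'}, {id: 2, name: 'bob'}]
[3, 1, 3, 2].uniq(); // [3, 1, 2]

Note that the keyed branch relies on the kept elements being truthy, which holds for arrays of objects.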
In TypeScript:
export const dedupeByProperty =
<T>(arr: ReadonlyArray<T>, objKey: keyof T) =>
arr.reduce<ReadonlyArray<T>>((acc, curr) =>
acc.some(a => a[objKey] === curr[objKey])
? acc
: [...acc, curr], [])
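A hedged usage example (the sample data is mine):

const rows = [{id: 1, v: 'a'}, {id: 2, v: 'b'}, {id: 1, v: 'c'}];
dedupeByProperty(rows, 'id'); // [{id: 1, v: 'a'}, {id: 2, v: 'b'}] - keeps the first occurrence per key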
const uniq = elements.reduce((acc, value) => acc.some(i => i.id === value.id) ? acc : acc.concat(value), []); // 'id' is your unique key
Excellent, thanks!
const deduplicateArrays = ( arr1, arr2 = [] ) => [
...new Set([
...arr1.map(i => JSON.stringify(i)),
...arr2.map(i => JSON.stringify(i))
])
].map(i => JSON.parse(i))
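For example (the sample input is mine), this dedupes across both arrays by structural equality:

deduplicateArrays([{a: 1}, {a: 2}], [{a: 1}]); // [{a: 1}, {a: 2}]

Note that JSON.stringify equality is sensitive to key order: {a: 1, b: 2} and {b: 2, a: 1} stringify differently.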
Hi @gevera
Beware of problems when using JSON.stringify and JSON.parse with:
- numbers (precision loss): https://github.com/josdejong/lossless-json
- dates (timezone lost in the round trip): https://stackoverflow.com/questions/31096130/how-to-json-stringify-a-javascript-date-and-preserve-timezone
I've been bitten more than once... 🤕
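A one-liner illustrating the date issue (my example):

const roundTripped = JSON.parse(JSON.stringify({when: new Date()}));
console.log(roundTripped.when instanceof Date); // false - dates come back as ISO-8601 strings, normalized to UTC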
As with @VSmain, I'm tossing Lodash into the ring:
Documentation on uniq.
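For completeness, a minimal sketch of the Lodash calls (assuming lodash is installed):

const _ = require('lodash');
_.uniq([2, 1, 2]); // [2, 1]
_.uniqBy([{x: 1}, {x: 2}, {x: 1}], 'x'); // [{x: 1}, {x: 2}]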