var uniqueArray = function(arrArg) {
  return arrArg.filter(function(elem, pos, arr) {
    return arr.indexOf(elem) == pos;
  });
};

var uniqEs6 = (arrArg) => {
  return arrArg.filter((elem, pos, arr) => {
    return arr.indexOf(elem) == pos;
  });
};

var test = ['mike', 'james', 'james', 'alex'];
var testBis = ['alex', 'yuri', 'jabari'];
console.log(uniqueArray(test.concat(testBis)));
ES6 Set & Unique keys
let left = ["research", "stockfast01", "news", "stockfast01"];
left = [...new Set(left)];
https://stackoverflow.com/questions/9229645/remove-duplicate-values-from-js-array
https://codereview.stackexchange.com/questions/60128/removing-duplicates-from-an-array-quickly
ES6 version
var names = ['mike', 'james', 'james', 'alex'];
// a regular function is needed here so that the second argument to filter
// (the thisArg, a fresh null-prototype object) is bound as `this`;
// an arrow function would ignore it
let output = names.filter(function (value) {
  return !this[value] && (this[value] = true);
}, Object.create(null));
// output is ['mike', 'james', 'alex']
@timrsmith +1
var arr = [1,2,4,13,1];
Array.from(new Set(arr))
So how does this work when the merge needs to deduplicate by a property of the objects in the array?
@seanmavley You can merge the arrays before passing them to Set.
const a = [1, 2, 3]
const b = [2, 3, 4, 5]
const uniqueMerged = [...new Set([...a, ...b])] // 1, 2, 3, 4, 5
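If the items are objects and the merge needs to deduplicate by a property (what @seanmavley asked), Set alone won't help, since two objects with the same values are still distinct references. A minimal sketch, assuming a hypothetical id field, is to key a Map by that property:

const listA = [{ id: 1, name: 'alpha' }, { id: 2, name: 'beta' }];
const listB = [{ id: 2, name: 'beta' }, { id: 3, name: 'gamma' }];
// the Map keeps one entry per id; later entries overwrite earlier ones
const mergedById = [...new Map([...listA, ...listB].map(item => [item.id, item])).values()];
// [{ id: 1, name: 'alpha' }, { id: 2, name: 'beta' }, { id: 3, name: 'gamma' }]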
Or do it with lodash:
const _ = require('lodash');
...
_.uniq([1,2,3,3,'a','a','x'])//1,2,3,'a','x'
_.uniqBy([{a:1,b:2},{a:1,b:2},{a:1,b:3}], v=>v.a.toString()+v.b.toString())//[{a:1,b:2},{a:1,b:3}]
@mordentware I was puzzled by your results, so I made a CodePen with my own data sample where I needed to remove duplicates.
The data array has 6,288 items, 5,284 of them unique. Results are nearly the same for both sorted and unsorted arrays.
My findings are that filter and reduce perform similarly, while Set was much faster. Reduce with spread was much slower due to the large number of array constructions and deconstructions.
See the Pen "Deduplicating speed test" by Mladen Đurić (@macmladen) on CodePen.
(Times may vary with system, browser, CPU, memory...)
Results are from a 2011 MacBook Pro i7, using Firefox (usually with a few hundred open tabs ;)
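For reference, a rough sketch of the four approaches being compared (my own reconstruction, not the exact CodePen code):

const dedupeFilter = arr => arr.filter((e, i, a) => a.indexOf(e) === i);
const dedupeReduce = arr => arr.reduce((acc, e) => {
  if (!acc.includes(e)) acc.push(e);
  return acc;
}, []);
// "reduce with spread" rebuilds the accumulator array on every iteration,
// which is why it was much slower in the test
const dedupeReduceSpread = arr => arr.reduce((acc, e) => acc.includes(e) ? acc : [...acc, e], []);
const dedupeSet = arr => [...new Set(arr)];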
Just discovered this thread and found that [0, 1, NaN, 3].indexOf(NaN) yields -1. :-( Probably because NaN != NaN.
Set dedupes NaN correctly, though.
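For example:

[0, 1, NaN, NaN, 3].indexOf(NaN);  // -1, so the filter/indexOf approach drops NaN entirely
[...new Set([0, 1, NaN, NaN, 3])]; // [0, 1, NaN, 3] - Set treats NaN as equal to NaN (SameValueZero)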
Set is so cool! Never knew it existed!
Set is shorter, but filter with indexOf is faster. example
That's only because the better complexity pays off on larger inputs, and it pays off there by a wide margin, not on a 10-element array...
You have to test it on at least hundreds of elements.
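A quick, unscientific way to check (timings vary by engine and machine):

const big = Array.from({ length: 100000 }, () => Math.floor(Math.random() * 1000));

console.time('filter+indexOf'); // O(n^2)
big.filter((e, i, a) => a.indexOf(e) === i);
console.timeEnd('filter+indexOf');

console.time('Set'); // roughly O(n)
[...new Set(big)];
console.timeEnd('Set');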
If you need to filter by an object property in a performant way:

// create the lookup object only once - garbage collector be glad ^^
let cachedObject = {};

// array to be filtered
let arr = [
  {id: 0, prop: 'blah'},
  {id: 1, prop: 'foo'},
  {id: 0, prop: 'bar'}
];

// "filter" into the object, keyed by id - keeps the original array intact, garbage collector be glad too ^^
arr.forEach((item) => cachedObject[item.id] = item);

// optional: turn the object back into an array (the last item wins for each id)
arr = Object.values(cachedObject);
What's wrong with
let arr1 = ["apples", "apples", "oranges", "bananas"]; arr1 = Array.from(new Set(arr1));
Surely this is a lot simpler if you're just trying to remove duplicates. Obviously this would work better as the return statement in a function. I'm just posting this as an example.
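As a function, that might look like:

const unique = arr => Array.from(new Set(arr));
unique(['apples', 'apples', 'oranges', 'bananas']); // ['apples', 'oranges', 'bananas']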
let arr = [1, 2, 3, 4, 3, 2];
arr.filter((e, i, arr) => arr.indexOf(e) == i);
let arr = [{x:10, y: 20}, {x:10, y: 30}, {x:10, y: 20}]; // each element has x and y props
arr.filter((e, i, arr) => arr.findIndex(e2 => Object.keys(e2).every(prop => e2[prop] == e[prop])) == i);
I find this reads a little bit better if you stash the filter predicate in a named variable.
const duplicates = (e, i, arr) => arr.indexOf(e) === i
let arr = [1, 2, 3, 4, 3, 2];
arr.filter(duplicates);
Array.prototype.uniq = function(key) {
  return key
    // with a key: dedupe objects by that property, keeping the first occurrence
    ? this.map(e => e[key])                                 // project to the key values
        .map((e, i, final) => final.indexOf(e) === i && i)  // index of the first occurrence, or false
        .filter(e => this[e])                               // drop the `false` entries
        .map(e => this[e])                                  // map the surviving indices back to the original items
    : [...new Set(this)];                                   // no key: dedupe primitives with Set
}
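Usage would look something like this (keeping in mind that extending Array.prototype affects every array on the page, so treat it as a sketch):

[{ id: 1 }, { id: 1 }, { id: 2 }].uniq('id'); // [{ id: 1 }, { id: 2 }]
[1, 2, 2, 3].uniq();                          // [1, 2, 3]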
In TypeScript:
export const dedupeByProperty =
<T>(arr: ReadonlyArray<T>, objKey: keyof T) =>
arr.reduce<ReadonlyArray<T>>((acc, curr) =>
acc.some(a => a[objKey] === curr[objKey])
? acc
: [...acc, curr], [])
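Usage, assuming a hypothetical list of objects with an id field:

const users = [
  { id: 1, name: 'mike' },
  { id: 2, name: 'james' },
  { id: 1, name: 'mike' },
];
dedupeByProperty(users, 'id'); // [{ id: 1, name: 'mike' }, { id: 2, name: 'james' }]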
const uniq = elements.reduce((acc, value) => acc.some(i => i.id === value.id) ? acc : acc.concat(value), []); // `id` is your unique key
Excellent, thanks!
const deduplicateArrays = ( arr1, arr2 = [] ) => [
...new Set([
...arr1.map(i => JSON.stringify(i)),
...arr2.map(i => JSON.stringify(i))
])
].map(i => JSON.parse(i))
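For example (note that this compares items by their serialized JSON, so property order matters):

deduplicateArrays([{ a: 1 }, { a: 2 }], [{ a: 1 }, { a: 3 }]);
// [{ a: 1 }, { a: 2 }, { a: 3 }]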
Hi @gevera
Beware of problems when using JSON.stringify and JSON.parse with:
- number: https://github.com/josdejong/lossless-json
- date: https://stackoverflow.com/questions/31096130/how-to-json-stringify-a-javascript-date-and-preserve-timezone
Been bitten more than once... 🤕
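A quick illustration of both pitfalls:

const events = [{ at: new Date('2020-01-01T00:00:00Z') }];
const roundTripped = JSON.parse(JSON.stringify(events));
roundTripped[0].at instanceof Date; // false - it came back as an ISO string
JSON.parse('9007199254740993');     // 9007199254740992 - beyond Number.MAX_SAFE_INTEGER, precision lost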
Check with includes before adding, like below:

var arr = [1, 2, 3];

function add(value) {
  if (!arr.includes(value)) {
    arr.push(value);
    console.log(arr);
  }
}

add(4);
// Output: [1, 2, 3, 4]