Remove duplicates from js array (ES5/ES6)
var uniqueArray = function(arrArg) {
  return arrArg.filter(function(elem, pos, arr) {
    return arr.indexOf(elem) == pos;
  });
};

var uniqEs6 = (arrArg) => {
  return arrArg.filter((elem, pos, arr) => {
    return arr.indexOf(elem) == pos;
  });
};
var test = ['mike','james','james','alex'];
var testBis = ['alex', 'yuri', 'jabari'];
console.log(uniqueArray(test.concat(testBis)));
@mbplautz

commented Apr 1, 2016

Rock on, this code is exactly what I was looking for. For an unnecessarily cryptic version of uniqEs6, you could instead use the one-liner:

var uniqEs6 = (arrArg) => arrArg.filter((elem, pos, arr) => arr.indexOf(elem) == pos);
(Of course there are those JavaScripters who believe that (...) => {} is not intended to be a replacement for function(...) {})

@VonD

commented Jul 15, 2016

You could also use the Set object :

Array.from(new Set([1, 2, 3, 1, 2, 3])) // [1, 2, 3]
@guidobouman

commented Jul 20, 2016

Or use the spread operator:

[ ...new Set([1, 2, 3, 1, 2, 3]) ] // [1, 2, 3]
@jedrichards

commented Oct 25, 2016

(+1 for const foo = () => {} over function foo () {}. It has clearer semantics)

@augnustin

commented Nov 23, 2016

👍 The filter strategy is clearly the more readable one!

@idangozlan

commented May 11, 2017

Thanks :)

@cup

commented Jun 28, 2017

[11, 22, 22, 33].reduce((x, y) => x.includes(y) ? x : [...x, y], [])
@glococo

commented Jul 2, 2017

Excellent, the shortest function, and without map (the one above).
This one filters by a property and returns an array of objects:

var some = [ {name:"Guille", last:"Foo"}, {name:"Jorge", last:"bar"}, {name:"Pedro", last:"Foo"}, {name:"Guille", last:"Ipsum"} ];
some.reduce((x, y) => x.findIndex(e => e.name == y.name) < 0 ? [...x, y] : x, []);

@domwashburn

commented Jul 2, 2017

@svnpenn this is excellent!

@tjcafferkey

commented Jul 11, 2017

Nice! The only thing really missing is safety checks. What if your function is given a string, or an integer? You're assuming it's always going to be an array.
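A guarded variant could look like this (just a sketch; the name uniqueArraySafe and throwing a TypeError are assumptions, not from the gist):

```javascript
// uniqueArraySafe is a hypothetical name; it validates the input before filtering.
function uniqueArraySafe(arrArg) {
  if (!Array.isArray(arrArg)) {
    throw new TypeError('uniqueArraySafe expects an array, got ' + typeof arrArg);
  }
  return arrArg.filter(function (elem, pos, arr) {
    return arr.indexOf(elem) === pos;
  });
}

console.log(uniqueArraySafe(['mike', 'james', 'james', 'alex'])); // ['mike', 'james', 'alex']
```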

@4031651

commented Jul 14, 2017

[1,2,2,3,1,4,1,4,2,3].filter((el, i, a) => i === a.indexOf(el))
@shellscape

commented Jul 27, 2017

+1 for [ ...new Set([ ]) ] beautiful

@FaiChou

commented Sep 25, 2017

const test = [{a:1,b:2}, {a:1,b:2}, {a:1,b:2,c:3}, {a:1,b:2,c:3,d:4}];

how to deal with this situation?

@LuisPaGarcia

commented Oct 5, 2017

Thanks a lot for this! 💯

@andr83

commented Oct 14, 2017

Filter with indexOf has O(n²) complexity, so I don't think it's the best choice. The option with Set looks better.

@bilousov94

commented Nov 7, 2017

Also +1 [ ...new Set([1, 2, 3, 1, 2, 3]) ]

@KarloZKvasin

commented Nov 14, 2017

Set is shorter but filter with indexof is faster. example

@mordentware

commented Nov 30, 2017

@KarloZKvasin: You can mutate the accumulator for the reducer if speed is your concern for the reduce. Building on your example, times for me were:

  • 137 for filter + indexof
  • 352 for Set
  • 357 for reduce
  • 103 for a reducer that pushes rather than creates a new array for each unique element

EDIT: Above was with Chrome (Firefox gave similar results, albeit a little higher). Hilariously, with IE11 on the same machine my times were:

  • 262 for filter + indexof
  • 2098 for Set
  • 6130 for reduce
  • 5052 for the mutating reduce

IE Edge was a little better:

  • 257 for filter + indexof
  • 1349 for Set
  • 709 for reduce
  • 308 for mutating reduce

Basically, filter + indexof is probably the way to go if speed is your concern.
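The "reducer that pushes" variant isn't shown above; a minimal sketch of that idea (uniqMutating is a hypothetical name, not from the benchmark):

```javascript
// Mutating reducer: push into the accumulator instead of spreading
// into a fresh array for every unique element.
const uniqMutating = (arr) =>
  arr.reduce((acc, el) => {
    if (acc.indexOf(el) === -1) acc.push(el);
    return acc;
  }, []);

console.log(uniqMutating([11, 22, 22, 33])); // [11, 22, 33]
```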

@spotsadmin

commented Dec 21, 2017

If the array elements are objects, indexOf won't help. Here is a solution for that (in ES6):

const uniqList = fullList.filter((s1, pos, arr) => arr.findIndex((s2) => s2._id === s1._id) === pos);
@lionxcr

commented Jan 6, 2018

This is what I came up with as I needed to filter results from an elastic search

let array1 = [{"id":1,"name":"apple"}, {"id":2,"name":"orange"}];
let array2 = [{"id":1,"name":"apple"}, {"id":4,"name":"purple"}];

let unique = array1.concat(array2).filter((obj, key, array) => array.findIndex((obj2) => obj2.id === obj.id) === key);

will return a unique array filtered on any key...

const uniqueArray = (arr, objKey) => arr.filter((obj, key, array) => array.findIndex((obj2) => obj2[objKey] === obj[objKey]) === key);
this.state = uniqueArray(array1, "id");

Hope this can save others some time!

@timrsmith

commented Feb 4, 2018

For generating a unique array of objects:

const uniqueArray = a => [...new Set(a.map(o => JSON.stringify(o)))].map(s => JSON.parse(s))
@guillaumegarcia13

commented Feb 20, 2018

@timrsmith Just a small word of caution with the JSON.parse(JSON.stringify(...)) approach. If you have properties with Date they will be replaced with their JSON representation (toISOString()).
uniqueArray([ {a: new Date()}, {b: new Date(1978,3,29)}, {a: new Date()}, {b: new Date(1978,3,29)} ]);

(2) [{…}, {…}]
0: {a: "2018-02-20T15:50:53.516Z"}
1: {b: "1978-04-28T22:00:00.000Z"}

@spacehunter

commented Feb 28, 2018

@timrsmith beautiful!

@wmhilton

commented Mar 16, 2018

I too feel compelled to say: +1 for @guidobouman's [...new Set(arr)]

@rravithejareddy

commented Mar 29, 2018

Check with includes before adding, like below:

var arr = [1, 2, 3];

function add() {
  if (!arr.includes(4)) {
    arr.push(4);
    console.log(arr);
  }
}

add();
// Output: [1, 2, 3, 4]

@xyzdata

commented Mar 30, 2018

@whitehorse0

commented Apr 11, 2018

ES6 version (note: the callback must be a regular function, not an arrow function, so that the thisArg passed to filter is respected):

var names = ['mike', 'james', 'james', 'alex'];

let output = names.filter(function (value) {
  return !this[value] && (this[value] = true);
}, Object.create(null));

// output is ['mike', 'james', 'alex'];
@pikislabis

commented Apr 24, 2018

@karosi12

commented Apr 25, 2018

var arr = [1,2,4,13,1];
Array.from(new Set(arr))

@seanmavley

commented Jun 6, 2018

So how does this work when merging should happen using a property of the array object?

@iamvanja

commented Jun 19, 2018

@seanmavley You can merge the arrays before passing them to Set.

const a = [1, 2, 3]
const b = [2, 3, 4, 5]
const uniqueMerged = [...new Set([...a, ...b])] // 1, 2, 3, 4, 5 
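If the merge should key on an object property, as asked above, one possible sketch (mergeByKey is a hypothetical helper, not from this thread):

```javascript
// mergeByKey: merge two arrays of objects, keeping only the first
// occurrence for each value of `key`.
const mergeByKey = (a, b, key) =>
  [...a, ...b].filter(
    (obj, pos, arr) => arr.findIndex((o) => o[key] === obj[key]) === pos
  );

console.log(mergeByKey([{ id: 1 }, { id: 2 }], [{ id: 2 }, { id: 3 }], 'id'));
// [{ id: 1 }, { id: 2 }, { id: 3 }]
```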
@VSmain

commented Jul 13, 2018

Do it with Lodash:

const _ = require('lodash');
...
_.uniq([1, 2, 3, 3, 'a', 'a', 'x']) // [1, 2, 3, 'a', 'x']
_.uniqBy([{a:1,b:2}, {a:1,b:2}, {a:1,b:3}], v => v.a.toString() + v.b.toString()) // [{a:1,b:2}, {a:1,b:3}]
@macmladen

commented Aug 15, 2018

@mordentware I was puzzled with your results so I made a CodePen with my own data sample where I needed to remove duplicates.

The data array has 6,288 items, 5,284 of them unique. Results are nearly the same for both sorted and unsorted arrays.

My findings are that filter and reduce are similar while Set was much faster. Reduce with spread was much slower due to a large number of deconstruction/construction.

See the Pen Deduplicating speed test by Mladen Đurić (@macmladen) on CodePen.

(times may vary due to system, browser, CPU, memory...)

Results on a MacBook Pro i7 from 2011, using Firefox (usually with a few hundred open tabs ;)


@joeyparrish

commented Sep 6, 2018

Just discovered this thread and found that [0, 1, NaN, 3].indexOf(NaN) yields -1. :-( That's because NaN !== NaN.

Set dedups NaN correctly, though.
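To see the difference concretely:

```javascript
// indexOf uses strict equality, and NaN !== NaN, so NaN is never found:
console.log([0, 1, NaN, 3].indexOf(NaN)); // -1

// ...which means filter + indexOf silently drops NaN altogether:
console.log([0, 1, NaN, NaN].filter((e, i, a) => a.indexOf(e) === i)); // [0, 1]

// Set uses SameValueZero, which treats NaN as equal to itself:
console.log([...new Set([0, 1, NaN, NaN])]); // [0, 1, NaN]
```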

@nabilfreeman

commented Sep 12, 2018

Set is so cool! Never knew it existed!

@joshuapinter

commented Oct 18, 2018

As with @VSmain, I'm tossing Lodash into the ring:

import _ from "lodash";

_.uniq([2, 1, 2]);
// => [2, 1]

Documentation on uniq.

@impfromliga

commented Nov 12, 2018

Set is shorter but filter with indexof is faster. example

That's only because better asymptotic complexity pays off on larger inputs, and the advantage grows with input size; a 10-element array proves nothing. You have to test it on at least hundreds of elements.

@brunoandradebr

commented Jan 19, 2019

If you need to filter by an object property in a performant way:

// creates an object only once - garbage be glad ^^
let cachedObject = {};

// array to be filtered
let arr = [
    {id : 0, prop : 'blah'},
    {id : 1, prop : 'foo'},
    {id : 0, prop : 'bar'}
]

// "filters" into the object, keyed by id - keeps the original array - garbage be glad too ^^
arr.forEach((item) => cachedObject[item.id] = item)

// optional, turns the object back into an array
arr = Object.values(cachedObject)
@Kr3m

commented Feb 8, 2019

What's wrong with

let arr1 = ["apples", "apples", "oranges", "bananas"];
arr1 = Array.from(new Set(arr1));

Surely this is a lot simpler if you're just trying to remove duplicates. Obviously this would work better as the return statement in a function. I'm just posting this as an example.

@little-brother

commented Feb 13, 2019

let arr = [1, 2, 3, 4, 3, 2];
arr.filter((e, i, arr) => arr.indexOf(e) == i);

let arr = [{x:10, y: 20}, {x:10, y: 30}, {x:10, y: 20}]; // each element has x and y props
arr.filter((e, i, arr) => arr.findIndex(e2 => Object.keys(e2).every(prop => e2[prop] == e[prop])) == i);
@philihp

commented Feb 26, 2019

I find this reads a little better if you stash the filter predicate in a named variable.

const unique = (e, i, arr) => arr.indexOf(e) === i

let arr = [1, 2, 3, 4, 3, 2];
arr.filter(unique);
@indatawetrust

commented Mar 15, 2019

Array.prototype.uniq = function(key) {
  return key
    ? this.map(e => e[key])
        .map((e, i, final) => final.indexOf(e) === i ? i : -1)
        .filter(i => i !== -1)
        .map(i => this[i])
    : [...new Set(this)];
}
@patrickmichalina

commented Mar 16, 2019

In Typescript

export const dedupeByProperty =
  <T>(arr: ReadonlyArray<T>, objKey: keyof T) =>
    arr.reduce<ReadonlyArray<T>>((acc, curr) =>
      acc.some(a => a[objKey] === curr[objKey])
        ? acc
        : [...acc, curr], [])