
@telekosmos
Last active November 15, 2022 17:13
Remove duplicates from js array (ES5/ES6)
var uniqueArray = function (arrArg) {
  return arrArg.filter(function (elem, pos, arr) {
    return arr.indexOf(elem) == pos;
  });
};

var uniqEs6 = (arrArg) => {
  return arrArg.filter((elem, pos, arr) => {
    return arr.indexOf(elem) == pos;
  });
};
var test = ['mike','james','james','alex'];
var testBis = ['alex', 'yuri', 'jabari'];
console.log(uniqueArray(test.concat(testBis)));
@spotsadmin

If the array elements are objects, indexOf won't help. Here is a solution for that (in ES6):

const uniqList = fullList.filter((s1, pos, arr) => arr.findIndex((s2) => s2._id === s1._id) === pos);

@lionxcr

lionxcr commented Jan 6, 2018

This is what I came up with, as I needed to filter results from an Elasticsearch query:

let array1 = [{ "id": 1, "name": "apple" }, { "id": 2, "name": "orange" }];
let array2 = [{ "id": 1, "name": "apple" }, { "id": 4, "name": "purple" }];

let unique = array1.concat(array2).filter((obj, key, array) => array.findIndex((obj2) => obj2.id === obj.id) === key);

The following will return a unique array filtered on any key...

const uniqueArray = (arr, objKey) => arr.filter((obj, key, array) => array.findIndex((obj2) => obj2[objKey] === obj[objKey]) === key);
this.state = uniqueArray(array1, "id");

Hope this can save others some time!

@timrsmith

timrsmith commented Feb 4, 2018

For generating a unique array of objects:

const uniqueArray = a => [...new Set(a.map(o => JSON.stringify(o)))].map(s => JSON.parse(s))

@guillaumegarcia13

@timrsmith Just a small word of caution with the JSON.parse(JSON.stringify(...)) approach: if you have properties holding Date objects, they will be replaced with their JSON representation (toISOString()).
uniqueArray([ {a: new Date()}, {b: new Date(1978,3,29)}, {a: new Date()}, {b: new Date(1978,3,29)} ]);

(2) [{…}, {…}]
0: {a: "2018-02-20T15:50:53.516Z"}
1: {b: "1978-04-28T22:00:00.000Z"}

@spacehunter

@timrsmith beautiful!

@billiegoose

I too feel compelled to say: +1 for @guidobouman's [...new Set(arr)]

@rravithejareddy

Check with includes before adding, like below:
var arr = [1, 2, 3];

function add() {
  if (!arr.includes(4)) {
    arr.push(4);
    console.log(arr);
  }
}

add();

Output: [1, 2, 3, 4]

@whitehorse0

whitehorse0 commented Apr 11, 2018

ES6 version

var names = ['mike', 'james', 'james', 'alex'];

// a regular function is needed here: arrow functions ignore the thisArg
// passed as filter's second argument
let output = names.filter(function (value) {
  return !this[value] && (this[value] = true)
}, Object.create(null))

// output is ['mike', 'james', 'alex']

@karosi12

karosi12 commented Apr 25, 2018

var arr = [1,2,4,13,1];
Array.from(new Set(arr))

@seanmavley

So how does this work when the merge should deduplicate by a property of the objects in the array?

@iamvanja

@seanmavley You can merge the arrays before passing them to Set.

const a = [1, 2, 3]
const b = [2, 3, 4, 5]
const uniqueMerged = [...new Set([...a, ...b])] // [1, 2, 3, 4, 5]
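
For @seanmavley's case of merging arrays of objects and deduplicating by a property, Set alone won't help, since objects are compared by reference. A minimal sketch keyed on a hypothetical id field (mergeById is not from this thread):

// Sketch: merge two object arrays, keeping the last entry seen for each id
const mergeById = (a, b) => [...new Map([...a, ...b].map(obj => [obj.id, obj])).values()];

const listA = [{ id: 1, name: 'apple' }, { id: 2, name: 'orange' }];
const listB = [{ id: 2, name: 'orange' }, { id: 4, name: 'purple' }];
console.log(mergeById(listA, listB)); // [{id: 1, ...}, {id: 2, ...}, {id: 4, ...}]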

@VSmain

VSmain commented Jul 13, 2018

Do it with lodash:

const _ = require('lodash');
...
_.uniq([1,2,3,3,'a','a','x'])//1,2,3,'a','x'
_.uniqBy([{a:1,b:2},{a:1,b:2},{a:1,b:3}], v=>v.a.toString()+v.b.toString())//[{a:1,b:2},{a:1,b:3}]

@macmladen

@mordentware I was puzzled by your results, so I made a CodePen with my own data sample where I needed to remove duplicates.

The data array has 6,288 items, 5,284 of them unique. Results are nearly the same for both sorted and unsorted arrays.

My findings are that filter and reduce are similar, while Set was much faster. Reduce with spread was much slower due to the large number of array constructions and deconstructions.

See the Pen Deduplicating speed test by Mladen Đurić (@macmladen) on CodePen.

(times may vary due to system, browser, CPU, memory...)

Results on a MacBook Pro i7 from 2011, using Firefox (usually with a few hundred open tabs ;)

[benchmark results image]
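
For reference, the three patterns such a benchmark typically compares look roughly like this (a sketch of the usual shapes, not the exact CodePen code):

const viaFilter = arr => arr.filter((e, i) => arr.indexOf(e) === i); // O(n²)
const viaReduce = arr => arr.reduce((acc, e) => acc.includes(e) ? acc : [...acc, e], []); // O(n²) plus array copies
const viaSet = arr => [...new Set(arr)]; // roughly O(n)

const sample = ['a', 'b', 'a', 'c', 'b'];
console.log(viaFilter(sample), viaReduce(sample), viaSet(sample)); // each: ['a', 'b', 'c']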

@joeyparrish

joeyparrish commented Sep 6, 2018

Just discovered this thread and found that [0, 1, NaN, 3].indexOf(NaN) yields -1. :-( Probably because NaN != NaN.

Set dedupes NaN correctly, though.
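
A quick illustration (indexOf uses strict equality, while Set membership uses SameValueZero, which treats NaN as equal to itself):

console.log([0, 1, NaN, 3].indexOf(NaN)); // -1, NaN is never found by indexOf
console.log([...new Set([0, 1, NaN, NaN, 3])]); // [0, 1, NaN, 3], Set dedupes NaN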

@nabilfreeman

Set is so cool! Never knew it existed!

@joshuapinter

As with @VSmain, I'm tossing Lodash into the ring:

import _ from "lodash";

_.uniq([2, 1, 2]);
// => [2, 1]

Documentation on uniq.

@impfromliga

Set is shorter, but filter with indexOf is faster in that example.

That's only because better complexity pays off at bigger counts, and then by a wide margin, not on a 10-element array...
You have to check it on at least hundreds of elements.
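
One rough way to check this yourself; a minimal sketch, with timings that will vary by engine and machine:

// generate a larger array with plenty of duplicates
const data = Array.from({ length: 20000 }, () => Math.floor(Math.random() * 1000));

console.time('filter + indexOf');
data.filter((e, i, arr) => arr.indexOf(e) === i);
console.timeEnd('filter + indexOf');

console.time('Set');
[...new Set(data)];
console.timeEnd('Set');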

@brunoandradebr

If you need to filter by an object property in a performant way:

// create the lookup object only once - garbage collector be glad ^^
let cachedObject = {};

// array to be filtered
let arr = [
    {id: 0, prop: 'blah'},
    {id: 1, prop: 'foo'},
    {id: 0, prop: 'bar'}
];

// "filter" into the object, keyed by id - keeps the original array - garbage be glad too ^^
arr.forEach((item) => cachedObject[item.id] = item);

// optional, turn the object back into an array
arr = Object.values(cachedObject);

@Kr3m

Kr3m commented Feb 8, 2019

What's wrong with

let arr1 = ["apples", "apples", "oranges", "bananas"];
arr1 = Array.from(new Set(arr1));

Surely this is a lot simpler if you're just trying to remove duplicates. Obviously this would work better as the return statement in a function. I'm just posting this as an example.

@little-brother

let arr = [1, 2, 3, 4, 3, 2];
arr.filter((e, i, arr) => arr.indexOf(e) == i);

let arr = [{x:10, y: 20}, {x:10, y: 30}, {x:10, y: 20}]; // each element has x and y props
arr.filter((e, i, arr) => arr.findIndex(e2 => Object.keys(e2).every(prop => e2[prop] == e[prop])) == i);

@philihp

philihp commented Feb 26, 2019

I find this reads a little bit better if you stash your filter callback in a named variable.

const duplicates = (e, i, arr) => arr.indexOf(e) === i

let arr = [1, 2, 3, 4, 3, 2];
arr.filter(duplicates);

@indatawetrust

indatawetrust commented Mar 15, 2019

Array.prototype.uniq = function (key) {
  return key
    ? this.filter((e, i) => this.findIndex(e2 => e2[key] === e[key]) === i)
    : [...new Set(this)];
};

@patrickmichalina

patrickmichalina commented Mar 16, 2019

In TypeScript:

export const dedupeByProperty =
  <T>(arr: ReadonlyArray<T>, objKey: keyof T) =>
    arr.reduce<ReadonlyArray<T>>((acc, curr) =>
      acc.some(a => a[objKey] === curr[objKey])
        ? acc
        : [...acc, curr], [])

@pankajkrr

Find a brief article here: Click here to view

@fosteev

fosteev commented Mar 10, 2020

const uniq = elements.reduce((acc, value) => acc.some(i => i.id === value.id) ? acc : acc.concat(value), []); // id is your unique key

@ioness

ioness commented Apr 27, 2020

Excellent, thanks!

@gevera

gevera commented Nov 15, 2022

const deduplicateArrays = (arr1, arr2 = []) => [
  ...new Set([
    ...arr1.map(i => JSON.stringify(i)),
    ...arr2.map(i => JSON.stringify(i))
  ])
].map(i => JSON.parse(i));

@guillaumegarcia13

Hi @gevera
Beware of problems when using JSON.stringify and JSON.parse with values that don't round-trip cleanly through JSON (such as the Date issue noted above).

I've been bitten more than once... 🤕
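
For example, a quick sketch of the usual offenders (not from the original comment):

const before = { when: new Date(), maybe: undefined, fn: () => 1, nan: NaN };
const after = JSON.parse(JSON.stringify(before));
console.log(after); // Date becomes an ISO string, undefined and the function are dropped, NaN becomes null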
