929

I have an object that contains an array of objects.

obj = {};

obj.arr = new Array();

obj.arr.push({place:"here",name:"stuff"});
obj.arr.push({place:"there",name:"morestuff"});
obj.arr.push({place:"there",name:"morestuff"});

I'm wondering what is the best method to remove duplicate objects from an array. So for example, obj.arr would become...

{place:"here",name:"stuff"},
{place:"there",name:"morestuff"}
7
  • Do you mean how do you stop a hashtable/object with all the same parameters being added to an array? Commented Feb 8, 2010 at 0:46
  • 11
    Mathew -> If it is simpler to prevent a duplicate object from being added to the array in the first place, instead of filtering it out later, yes, that would be fine too.
    – Travis
    Commented Feb 8, 2010 at 1:01
  • 3
    Suuuper long answers and yet MDN has possibly the shortest: arrayWithNoDuplicates = Array.from(new Set(myArray))
    – tonkatata
    Commented Dec 6, 2021 at 21:47
  • 12
    @tonkatata This doesn't work with array of objects. Commented Dec 14, 2021 at 7:50
  • 4
    thanks to @tonkatata for inspiration. array of objects can be done with Array.from(new Set(myArray.map(e => JSON.stringify(e))))
    – qyb2zm302
    Commented Oct 18, 2023 at 2:31

78 Answers

970

How about with some ES6 magic?

obj.arr = obj.arr.filter((value, index, self) =>
  index === self.findIndex((t) => (
    t.place === value.place && t.name === value.name
  ))
)

Reference URL

A more generic solution would be:

const uniqueArray = obj.arr.filter((value, index) => {
  const _value = JSON.stringify(value);
  return index === obj.arr.findIndex(obj => {
    return JSON.stringify(obj) === _value;
  });
});

Using the property-comparison strategy from the first snippet instead of JSON.stringify:

const isPropValuesEqual = (subject, target, propNames) =>
  propNames.every(propName => subject[propName] === target[propName]);

const getUniqueItemsByProperties = (items, propNames) => 
  items.filter((item, index, array) =>
    index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNames))
  );

You can add a wrapper if you want the propNames parameter to be either an array or a single value:

const getUniqueItemsByProperties = (items, propNames) => {
  const propNamesArray = Array.isArray(propNames) ? propNames : [propNames];

  return items.filter((item, index, array) =>
    index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNamesArray))
  );
};

allowing both getUniqueItemsByProperties(items, 'a') and getUniqueItemsByProperties(items, ['a']);

Stackblitz Example

Explanation

  • Start by understanding the two methods used: filter and findIndex.
  • Next, take your idea of what makes two of your objects equal and keep that in mind.
  • An element is a duplicate if it satisfies that equality criterion but its index is not the index of the first element that satisfies it.
  • Therefore we can use that criterion to determine whether something is a duplicate (see the quick check below).
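
As a quick check, applying the helpers above to the question's data might look like this (a minimal sketch):

const obj = { arr: [
  { place: "here",  name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" }
] };

// keep only the first occurrence of each { place, name } combination
const deduped = getUniqueItemsByProperties(obj.arr, ['place', 'name']);

console.log(deduped);
// [ { place: "here", name: "stuff" }, { place: "there", name: "morestuff" } ]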
18
  • 115
    This can be shortened to: things.thing = things.thing.filter((thing, index, self) => self.findIndex(t => t.place === thing.place && t.name === thing.name) === index)
    – Josh Cole
    Commented Mar 13, 2017 at 12:12
  • 10
    @vsync just take @BKM's answer and put it together, a generic solution would be: const uniqueArray = arrayOfObjects.filter((object,index) => index === arrayOfObjects.findIndex(obj => JSON.stringify(obj) === JSON.stringify(object))); jsfiddle.net/x9ku0p7L/28
    – Eydrian
    Commented Jul 18, 2018 at 11:33
  • 36
    The key here is that the findIndex() method returns the index of the first element, so if there is a second element that matches, it will never be found and added during the filter. I was staring at it for a minute :)
    – JBaczuk
    Commented Sep 13, 2019 at 23:21
  • 4
    One question, wouldn't this be an O(n^2) approach. In case I'm working with 30 records, I'd be doing 900 iterations, right? (Worst case scenario, no duplicates)
    – Jose A
    Commented Apr 23, 2020 at 10:29
  • 11
    If you have an array with 200,000 entries then this will take 40 BILLION iterations. This should never be used with a large array. Always use a map instead.
    – JP_
    Commented Feb 23, 2021 at 22:54
519

One liners with filter() (Preserves order)

If you have some identifier in the objects which signifies uniqueness (e.g. id), then we can use filter() with findIndex() to work through the list and verify that the index of each object with that id value matches only itself. This means that there's only one such object in the list, i.e. no duplicates.

myArr.filter((obj1, i, arr) => 
  arr.findIndex(obj2 => (obj2.id === obj1.id)) === i
)

(Note that this solution keeps the first instance of detected duplicates in the result. You can instead take the last instance by replacing findIndex with findLastIndex in the above.)

If the order is not important, then map solutions will be faster: Solution with map


The above format can be applied to other cases by altering how we check for duplicates (i.e. replacing obj2.id === obj1.id with something else).

Unique by multiple properties (e.g. place and name, as in the question)

myArr.filter((obj1, i, arr) => 
  arr.findIndex(obj2 => 
    ['place', 'name'].every(key => obj2[key] === obj1[key])
  ) === i
)

Unique by all properties

myArr.filter((obj1, i, arr) => 
  arr.findIndex(obj2 => 
    JSON.stringify(obj2) === JSON.stringify(obj1)
  ) === i
)

Caveats:

  • This may get slow, depending on object & array sizes
  • JSON.stringify() follows property insertion order for string keys (the enumeration order is only guaranteed since ES2015), so two objects with the same properties inserted in a different order will not produce the same string
    • This means that your mileage may vary, and you may want to prefer something more robust (like comparing specific keys); the short illustration below shows the key-order pitfall
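
A short illustration of the key-order caveat:

// Same data, different insertion order: the stringified forms differ,
// so the "unique by all properties" filter above would treat these as distinct
JSON.stringify({ place: "here", name: "stuff" });  // '{"place":"here","name":"stuff"}'
JSON.stringify({ name: "stuff", place: "here" });  // '{"name":"stuff","place":"here"}'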
11
  • 54
    v,i,a == value, index, array
    – James B
    Commented Oct 9, 2020 at 18:23
  • 8
    arr.filter((v,i,a)=>a.findIndex(t=>(JSON.stringify(t) === JSON.stringify(v)))===i) this will not work if the keys are not in the same order Commented Mar 19, 2021 at 14:11
  • 2
    simply BEAUTIFUL
    – avalanche1
    Commented Nov 30, 2021 at 15:15
  • 6
    This would be better if it had an explanation of what these did. And if they used legible naming conventions instead of trying to pre-minify the code. Commented Jan 26, 2022 at 18:26
  • 1
    Here is with an arbitrary predicate that decides duplicateness: function uniqueByPredicate(arr, predicate) { return arr.filter((v1, i, a) => a.findIndex(v2 => predicate(v1, v2)) === i); } where predicate must be of type (a: T, b: T) => boolean. Commented Oct 29, 2022 at 20:51
320

Using ES6+ in a single line you can get a unique list of objects by key:

const key = 'place';
const unique = [...new Map(arr.map(item => [item[key], item])).values()]

It can be put into a function:

function getUniqueListBy(arr, key) {
    return [...new Map(arr.map(item => [item[key], item])).values()]
}

Here is a working example:

const arr = [
    {place: "here",  name: "x", other: "other stuff1" },
    {place: "there", name: "x", other: "other stuff2" },
    {place: "here",  name: "y", other: "other stuff4" },
    {place: "here",  name: "z", other: "other stuff5" }
]

function getUniqueListBy(arr, key) {
    return [...new Map(arr.map(item => [item[key], item])).values()]
}

const arr1 = getUniqueListBy(arr, 'place')

console.log("Unique by place")
console.log(JSON.stringify(arr1))

console.log("\nUnique by name")
const arr2 = getUniqueListBy(arr, 'name')

console.log(JSON.stringify(arr2))

How does it work

First the array is remapped in a way that it can be used as an input for a Map.

arr.map(item => [item[key], item]);

which means each item of the array will be transformed into another array with 2 elements: the selected key as the first element and the entire initial item as the second element. This is called an entry (e.g. array entries, map entries). And here is the official doc with an example showing how to add array entries in the Map constructor.

Example when key is place:

[["here", {place: "here",  name: "x", other: "other stuff1" }], ...]

Secondly, we pass this modified array to the Map constructor and here is where the magic happens. Map will eliminate duplicate keys, keeping only the last inserted value for each key. Note: Map keeps the order of insertion. (check the difference between Map and object)

new Map(entry array just mapped above)

Third, we use the Map's values to retrieve the original items, but this time without duplicates.

new Map(mappedArr).values()

And the last step is to spread those values into a fresh new array, so that it looks like the initial structure, and return that:

return [...new Map(mappedArr).values()]

9
  • This does not answer the original question, as this searches for an id. The question needs the entire object to be unique across all fields such as place and name
    – L. Holanda
    Commented Dec 10, 2019 at 18:54
  • Your ES6 function seems very concise and practical. Can you explain it a bit more? What is happening exactly? Are first or last duplicates removed? Or is it random, which duplicate gets removed? That would be helpful, thanks. Commented Mar 25, 2020 at 17:15
  • As far as I can tell, a Map with the property value as key is created. But it is not 100% clear how or if the order of the array is preserved. Commented Mar 25, 2020 at 18:16
  • 3
    Hi @DavidSchumann, I will update the answer and explain how it works. But for a short answer: the order is preserved and the first ones are removed... Just think about how it is inserted in the map... it checks if the key already exists and updates it, therefore the last one will remain
    – V. Sambor
    Commented Mar 25, 2020 at 18:21
  • 9
    TS version, incase anyone is looking: export const unique = <T extends { [key: string]: unknown }>(arr: T[], key: string): T[] => [ ...new Map(arr.map((item: T) => [item[key], item])).values() ];
    – readikus
    Commented Aug 16, 2022 at 7:55
259

Simple and performant solution with a better runtime than the 70+ answers that already exist:

const ids = arr.map(({ id }) => id);
const filtered = arr.filter(({ id }, index) => !ids.includes(id, index + 1));

Example:

const arr = [{
  id: 1,
  name: 'one'
}, {
  id: 2,
  name: 'two'
}, {
  id: 1,
  name: 'one'
}];

const ids = arr.map(({ id }) => id);
const filtered = arr.filter(({ id }, index) => !ids.includes(id, index + 1));

console.log(filtered);

How it works:

Array.filter() removes all duplicate objects by checking if the previously mapped id-array includes the current id ({ id } destructures the object into only its id). To only filter out actual duplicates, it uses Array.includes()'s second parameter fromIndex with index + 1, which makes the search skip the current object and everything before it.

Since every iteration of the filter callback only searches the array beginning at the current index + 1, this also reduces the runtime, because each object is only compared against the objects that come after it.
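
To trace the fromIndex check against the example above (an illustration; note that with this approach the last occurrence of each id is the one that survives):

const ids = [1, 2, 1];

ids.includes(1, 0 + 1); // true  -> the object at index 0 has a later duplicate and is filtered out
ids.includes(2, 1 + 1); // false -> the object at index 1 is kept
ids.includes(1, 2 + 1); // false -> the object at index 2 (the last occurrence of id 1) is kept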

What if you don't have a single unique identifier like id?

Just create a temporary one:

const objToId = ({ name, city, birthyear }) => `${name}-${city}-${birthyear}`;


const ids = arr.map(objToId);
const filtered = arr.filter((item, index) => !ids.includes(objToId(item), index + 1));
11
  • 2
    @user239558 Good question but not really, it would be orders of magnitude slower and for objects with a different order like {id: 1, name: 'one'} and {name: 'one', id: 1} it would fail to detect the duplicate.
    – leonheess
    Commented Feb 21, 2021 at 23:52
  • 1
    what is this magic with { id } you're pulling here? I'm following everything else. Was about to implement a Set for my own purposes but found this Commented Mar 4, 2021 at 17:34
  • 4
    Good question, @Timotronadon. { id } is destructuring the object into only its id-key. To illustrate, let's look at these two loops: 1. arr.forEach(object => console.log(object.id)) and 2. arr.forEach({id} => console.log(id)). They are both doing exactly the same thing: printing the id-key of all objects in arr. However, one is using destructuring and the other one is using a more conventional key access via the dot notation.
    – leonheess
    Commented Mar 4, 2021 at 23:54
  • 4
    def the best response here. Simple clean and elegant and works like a charm thank you!
    – d0rf47
    Commented Nov 19, 2021 at 0:54
  • 2
    Amazing answer. This worked perfectly without using any external library.
    – SatelBill
    Commented Mar 18, 2022 at 3:37
201

A primitive method would be:

const obj = {};

for (let i = 0, len = things.thing.length; i < len; i++) {
  obj[things.thing[i]['place']] = things.thing[i];
}

things.thing = new Array();

for (const key in obj) {
  things.thing.push(obj[key]);
}
7
  • 78
    You should never use the length in the for loop, because it will slow everything down calculating it on every iteration. Assign it to a variable outside the loop and pass the variable instead of the things.thing.length.
    – Nosebleed
    Commented Aug 26, 2014 at 12:56
  • 16
    @aefxx I do not quite understand this function. How do you handle the situation where the "place" is the same but the name is different? Should that be considered a dup or not?
    – Kuan
    Commented Jun 23, 2015 at 21:48
  • 2
    Though this works, it does not take care of a sorted array since fetching keys is never order guaranteed. So, you end up sorting it again. Now, suppose the array was not sorted but yet its order is important, there is no way you can make sure that order stays intact
    – Deepak G M
    Commented Apr 17, 2019 at 6:31
  • 3
    @DeepakGM You're absolutely right. The answer won't (necessarily) preserve a given order. If that is a requirement, one should look for another solution.
    – aefxx
    Commented Apr 17, 2019 at 17:03
  • How could I modify the above to remove objects from an array that contain X as well as de-duped?
    – Ryan H
    Commented Feb 9, 2020 at 12:36
158

If you can use Javascript libraries such as underscore or lodash, I recommend having a look at the _.uniq function in those libraries. From lodash:

_.uniq(array, [isSorted=false], [callback=_.identity], [thisArg])

Basically, you pass in the array (here, an array of object literals) and the attribute whose duplicates you want removed from the original data array, like this:

var data = [{'name': 'Amir', 'surname': 'Rahnama'}, {'name': 'Amir', 'surname': 'Stevens'}];
var non_duplicated_data = _.uniq(data, 'name'); 

UPDATE: Lodash now has introduced a .uniqBy as well.
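
For example, with Lodash 4 the same call becomes (a sketch against the data array above):

// Lodash 4.x: the iteratee-based form moved to _.uniqBy
var non_duplicated_data = _.uniqBy(data, 'name');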

4
  • 4
    @Praveen Pds: Did I say anything about underscore in the code example? I said 'lodash' has this function and underscore has similar ones. Before voting down, please read answers carefully.
    – ambodi
    Commented Jan 25, 2015 at 11:08
  • //Lists unique objects using _underscore.js holdingObject = _.uniq(holdingObject , function(item, key, name) { return item.name; });
    – praveenpds
    Commented Jan 26, 2015 at 8:31
  • 39
    Note: you now need to use uniqBy instead of uniq, e.g. _.uniqBy(data, 'name')... documentation: lodash.com/docs#uniqBy
    – drmrbrewer
    Commented Jun 14, 2017 at 7:46
  • If you have a deep collection: let data = [{'v': {'t':1, 'name':"foo"}}, {'v': {'t':1, 'name':"bar"}}]; do: let uniq = _.uniqBy(data, 'v.t'); Commented Jan 2, 2022 at 9:27
98

I had this exact same requirement, to remove duplicate objects in an array, based on duplicates in a single field. I found the code here: Javascript: Remove Duplicates from Array of Objects

So in my example, I'm removing any object from the array that has a duplicate licenseNum string value.

var arrayWithDuplicates = [
    {"type":"LICENSE", "licenseNum": "12345", state:"NV"},
    {"type":"LICENSE", "licenseNum": "A7846", state:"CA"},
    {"type":"LICENSE", "licenseNum": "12345", state:"OR"},
    {"type":"LICENSE", "licenseNum": "10849", state:"CA"},
    {"type":"LICENSE", "licenseNum": "B7037", state:"WA"},
    {"type":"LICENSE", "licenseNum": "12345", state:"NM"}
];

function removeDuplicates(originalArray, prop) {
    var newArray = [];
    var lookupObject = {};

    for (var i in originalArray) {
        lookupObject[originalArray[i][prop]] = originalArray[i];
    }

    for (i in lookupObject) {
        newArray.push(lookupObject[i]);
    }
    return newArray;
}

var uniqueArray = removeDuplicates(arrayWithDuplicates, "licenseNum");
console.log("uniqueArray is: " + JSON.stringify(uniqueArray));

The results:

uniqueArray is:

[{"type":"LICENSE","licenseNum":"10849","state":"CA"},
{"type":"LICENSE","licenseNum":"12345","state":"NM"},
{"type":"LICENSE","licenseNum":"A7846","state":"CA"},
{"type":"LICENSE","licenseNum":"B7037","state":"WA"}]
3
  • 1
    This would be more useful if the function could filter the 'falsy' objects too. for(var i in array) { if(array[i][prop]){ //valid lookupObject[array[i][prop]] = array[i]; } else { console.log('falsy object'); } } Commented Nov 6, 2017 at 17:49
  • 1
    Why not bring down the complexity to O(n) by using: for (let i in originalArray) { if (lookupObject[originalArray[i]['id']] === undefined) { newArray.push(originalArray[i]); } lookupObject[originalArray[i]['id']] = originalArray[i]; }
    – Tudor B.
    Commented Feb 17, 2019 at 22:17
  • this is the best way because it is important to know what it is that you want to not be duplicated. Now can this be done through a reducer for ES6 standards? Commented Sep 16, 2019 at 19:09
73

One liner using Set

var things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

// assign things.thing to myData for brevity
var myData = things.thing;

things.thing = Array.from(new Set(myData.map(JSON.stringify))).map(JSON.parse);

console.log(things.thing)

Explanation:

  1. new Set(myData.map(JSON.stringify)) creates a Set object using the stringified myData elements.
  2. Set object will ensure that every element is unique.
  3. Then I create an array based on the elements of the created set using Array.from.
  4. Finally, I use JSON.parse to convert each stringified element back to an object.
4
  • 27
    the problem being {a: 1, b:2} wont be equal to {b:2,a:1}
    – PirateApp
    Commented Oct 2, 2017 at 10:00
  • 4
    keep in mind that there would be problems with Date properties Commented Mar 5, 2018 at 6:31
  • This line creates random null values with a row object that do not exist in the original array of objects. Can you please help?
    – B1K
    Commented Oct 16, 2018 at 16:26
  • To address the issue @PirateApp pointed out in the comments the answer provided by @Mu can be modified as follows to handle objects with rearranged properties: const distinct = (data, elements = []) => [...new Set(data.map(o => JSON.stringify(o, elements)))].map(o => JSON.parse(o)); Then when calling distinct just pass in the property names for the elements array. For the original post that would be ['place', 'name']. For @PirateApp's example that would be ['a', 'b'].
    – knot22
    Commented May 25, 2022 at 13:31
62

Here is an ES6 one-liner:

let arr = [
  {id:1,name:"sravan ganji"},
  {id:2,name:"pinky"},
  {id:4,name:"mammu"},
  {id:3,name:"avy"},
  {id:3,name:"rashni"},
];

console.log(Object.values(arr.reduce((acc,cur)=>Object.assign(acc,{[cur.id]:cur}),{})))

7
  • 6
    Nice and clean if you only want to remove objects with a single duplicate value, not so clean for fully duplicated objects. Commented May 29, 2019 at 23:56
  • @DavidBarker you mean multiple duplicate values with an object ? Commented Sep 1, 2020 at 20:08
  • yes, but more specifically objects that have all identical values. Commented Sep 2, 2020 at 7:42
  • 1
    What is the functionality of :cur in cur.id]:cur? I dont understand this piece of the code. Commented Feb 17, 2021 at 16:12
  • 1
    As is always the case, explanation of code is good. Commented Jan 26, 2022 at 18:29
46

To remove all duplicates from an array of objects, the simplest way is to use filter:

var uniq = {};
var arr  = [{"id":"1"},{"id":"1"},{"id":"2"}];
var arrFiltered = arr.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
console.log('arrFiltered', arrFiltered);

2
  • 12
    It's good practice on Stack Overflow to add an explanation as to why your solution should work, especially how yours is better than the other answers. For more information read How To Answer. Commented Sep 15, 2018 at 5:14
  • 1
    This does not answer the original question, as this searches for an id. The question needs the entire object to be unique across all fields such as place and name
    – L. Holanda
    Commented Dec 10, 2019 at 18:47
38

One liners with Map ( High performance, Does not preserve order )

Find unique id's in array arr.

const arrUniq = [...new Map(arr.map(v => [v.id, v])).values()]

If the order is important check out the solution with filter: Solution with filter


Unique by multiple properties ( place and name ) in array arr

const arrUniq = [...new Map(arr.map(v => [JSON.stringify([v.place,v.name]), v])).values()]

Unique by all properties in array arr

const arrUniq = [...new Map(arr.map(v => [JSON.stringify(v), v])).values()]

Keep the first occurrence in array arr

const arrUniq = [...new Map(arr.slice().reverse().map(v => [v.id, v])).values()].reverse()
2
  • The multiple properties solution worked perfectly. Thanks a lot! Commented Jul 19, 2022 at 6:06
  • "high performance" must be a joke because this is anything but fast.
    – leonheess
    Commented Aug 11, 2023 at 11:16
31

Here's another option using Array iterating methods, if you only need to compare objects by one field:

    function uniq(a, param){
        return a.filter(function(item, pos, array){
            return array.map(function(mapItem){ return mapItem[param]; }).indexOf(item[param]) === pos;
        })
    }

    uniq(things.thing, 'place');
1
  • Although this has an order greater than O(n²), this fits my use case because my array size will always be less than 30. Thanks!
    – Sterex
    Commented Jul 13, 2016 at 11:44
26

This is a generic way of doing this: you pass in a function that tests whether two elements of an array are considered equal. In this case, it compares the values of the name and place properties of the two objects being compared.

ES5 answer

function removeDuplicates(arr, equals) {
    var originalArr = arr.slice(0);
    var i, len, val;
    arr.length = 0;

    for (i = 0, len = originalArr.length; i < len; ++i) {
        val = originalArr[i];
        if (!arr.some(function(item) { return equals(item, val); })) {
            arr.push(val);
        }
    }
}

function thingsEqual(thing1, thing2) {
    return thing1.place === thing2.place
        && thing1.name === thing2.name;
}

var things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];

removeDuplicates(things, thingsEqual);
console.log(things);

Original ES3 answer

function arrayContains(arr, val, equals) {
    var i = arr.length;
    while (i--) {
        if ( equals(arr[i], val) ) {
            return true;
        }
    }
    return false;
}

function removeDuplicates(arr, equals) {
    var originalArr = arr.slice(0);
    var i, len, j, val;
    arr.length = 0;

    for (i = 0, len = originalArr.length; i < len; ++i) {
        val = originalArr[i];
        if (!arrayContains(arr, val, equals)) {
            arr.push(val);
        }
    }
}

function thingsEqual(thing1, thing2) {
    return thing1.place === thing2.place
        && thing1.name === thing2.name;
}

removeDuplicates(things.thing, thingsEqual);
3
  • 1
    Two objects won't evaluate equal, even if they share the same properties and values.
    – kennebec
    Commented Feb 8, 2010 at 4:06
  • Yes, I know. But fair point, I've failed to read the question correctly: I hadn't spotted that it was objects with identical properties he needed to weed out. I'll edit my answer.
    – Tim Down
    Commented Feb 8, 2010 at 9:14
  • 1
    instead of the while loop inside arrayContains, use the Array.prototype.some method, which returns true if one of the array members matches the condition Commented Mar 5, 2018 at 6:29
25

If you can wait to eliminate the duplicates until after all the additions, the typical approach is to first sort the array and then eliminate duplicates. The sorting avoids the N * N approach of scanning the array for each element as you walk through them.

The "eliminate duplicates" function is usually called unique or uniq. Some existing implementations may combine the two steps, e.g., prototype's uniq

This post has a few ideas to try (and some to avoid :-) ) if your library doesn't already have one! Personally I find this one the most straightforward:

    function unique(a){
        a.sort();
        for(var i = 1; i < a.length; ){
            if(a[i-1] == a[i]){
                a.splice(i, 1);
            } else {
                i++;
            }
        }
        return a;
    }  

    // Provide your own comparison
    function unique(a, compareFunc){
        a.sort( compareFunc );
        for(var i = 1; i < a.length; ){
            if( compareFunc(a[i-1], a[i]) === 0){
                a.splice(i, 1);
            } else {
                i++;
            }
        }
        return a;
    }
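
For the objects in the question, a comparator could be sketched like this (an assumption about how you define equality; it must return -1/0/1, as discussed in the comments below):

    // Order by (place, name) so that equal objects end up adjacent after sorting
    function compareThings(a, b) {
        var keyA = a.place + '|' + a.name;
        var keyB = b.place + '|' + b.name;
        if (keyA < keyB) return -1;
        if (keyA > keyB) return 1;
        return 0;
    }

    unique(obj.arr, compareThings);
    // -> [{place:"here",name:"stuff"}, {place:"there",name:"morestuff"}]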
4
  • That won't work for generic objects without a natural sort order.
    – Tim Down
    Commented Feb 8, 2010 at 9:28
  • True, I added a user-supplied comparison version.
    – maccullt
    Commented Feb 8, 2010 at 10:57
  • Your user-supplied comparison version won't work because if your comparison function is function(_a,_b){return _a.a===_b.a && _a.b===_b.b;} then the array won't be sorted. Commented Mar 25, 2010 at 6:02
  • 1
    That is an invalid compare function. From developer.mozilla.org/en/Core_JavaScript_1.5_Reference/… ... function compare(a, b) { if (a is less than b by some ordering criterion) return -1; if (a is greater than b by the ordering criterion) return 1; // a must be equal to b return 0; } ...
    – maccullt
    Commented Mar 25, 2010 at 17:01
22

I think the best approach is using reduce and the Map object. This is a single-line solution.

const data = [
  {id: 1, name: 'David'},
  {id: 2, name: 'Mark'},
  {id: 2, name: 'Lora'},
  {id: 4, name: 'Tyler'},
  {id: 4, name: 'Donald'},
  {id: 5, name: 'Adrian'},
  {id: 6, name: 'Michael'}
]

const uniqueData = [...data.reduce((map, obj) => map.set(obj.id, obj), new Map()).values()];

console.log(uniqueData)

/*
  in `map.set(obj.id, obj)`
  
  'obj.id' is key. (don't worry. we'll get only values using the .values() method)
  'obj' is whole object.
*/

1
  • 3
    Anything can be made a "single line solution" by deleting the carriage returns and/or line feeds between lines :P. Commented Jan 26, 2022 at 18:32
17

Considering lodash.uniqWith

const objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];
 
_.uniqWith(objects, _.isEqual);
// => [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }]
1
  • 1
    Neither lodash's uniq nor uniqBy did the trick, but your solution did. Thanks! Please give the source of your code however, if it's a direct copy. lodash.com/docs/4.17.10#uniqWith
    – Manu CJ
    Commented Jun 18, 2018 at 8:21
16

To add one more to the list: using ES6 and Array.reduce with Array.find.
In this example, objects are filtered based on a guid property.

let filtered = array.reduce((accumulator, current) => {
  if (! accumulator.find(({guid}) => guid === current.guid)) {
    accumulator.push(current);
  }
  return accumulator;
}, []);

Extending this one to allow selection of a property and compressing it into a one-liner:

const uniqify = (array, key) => array.reduce((prev, curr) => prev.find(a => a[key] === curr[key]) ? prev : prev.push(curr) && prev, []);

To use it pass an array of objects and the name of the key you wish to de-dupe on as a string value:

const result = uniqify(myArrayOfObjects, 'guid')
15

You could also use a Map:

const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());

Full sample:

const things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());

console.log(JSON.stringify(dedupThings, null, 4));

Result:

[
    {
        "place": "here",
        "name": "stuff"
    },
    {
        "place": "there",
        "name": "morestuff"
    }
]
2
  • +1, nice tho explaining a bit more the inner working of dedupThings would be good - on the bright side I now understand reduce :D
    – MimiEAM
    Commented Apr 6, 2017 at 12:53
  • 1
    Great one line answer, finally I saw usage of Map :D
    – Farhad
    Commented Oct 20, 2021 at 8:21
15

Dang, kids, let's crush this thing down, why don't we?

let uniqIds = {}, source = [{id:'a'},{id:'b'},{id:'c'},{id:'b'},{id:'a'},{id:'d'}];
let filtered = source.filter(obj => !uniqIds[obj.id] && (uniqIds[obj.id] = true));
console.log(filtered);
// EXPECTED: [{id:'a'},{id:'b'},{id:'c'},{id:'d'}];

2
  • 1
    This does not answer the original question, as this searches for an id. The question needs the entire object to be unique across all fields such as place and name
    – L. Holanda
    Commented Dec 10, 2019 at 18:55
  • 2
    This is a refinement of an above generalization of the problem. The original question was posted 9 years ago, so the original poster probably isn't worried about place and name today. Anyone reading this thread is looking for an optimal way to dedup a list of objects, and this is a compact way of doing so.
    – Cliff Hall
    Commented Dec 11, 2019 at 19:25
15

let myData = [{place:"here",name:"stuff"}, 
 {place:"there",name:"morestuff"},
 {place:"there",name:"morestuff"}];


let q = [...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];

console.log(q)

One-liner using ES6 and new Map().

// assign things.thing to myData
let myData = things.thing;

[...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];

Details:

  1. Doing .map() on the data list converts each individual object into a [key, value] pair array (length 2): the first element (key) is the stringified version of the object and the second (value) is the object itself.
  2. Passing the array created above to new Map() uses the stringified object as the key, and adding the same key again overrides the already existing entry.
  3. Using .values() gives a MapIterator with all the values in the Map (the objects, in our case).
  4. Finally, the spread ... operator puts those values into a new Array.
15

A TypeScript solution

This will remove duplicate objects and also preserve the types of the objects.

function removeDuplicateObjects(array: any[]) {
  return [...new Set(array.map(s => JSON.stringify(s)))]
    .map(s => JSON.parse(s));
}
4
  • 7
    Using type any entirely defeats the purpose of TypeScript
    – leonheess
    Commented Dec 11, 2020 at 10:14
  • Definitely I think this removes any inferred checking that the TS compiler will do.
    – Neil
    Commented Jan 25, 2021 at 1:49
  • 5
    Stop using stringify when you have objects! Jesus, this is why I dislike JS, it lets people do all sorts of ugly things.
    – MattSom
    Commented Nov 8, 2021 at 10:30
  • 1
    If the array contains any objects with circular references, this code will fall flat on its face. Commented Jan 26, 2022 at 18:34
13

 const things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];
const filteredArr = things.reduce((thing, current) => {
  const x = thing.find(item => item.place === current.place);
  if (!x) {
    return thing.concat([current]);
  } else {
    return thing;
  }
}, []);
console.log(filteredArr)

Solution Via Set Object | According to the data type

const seen = new Set();
 const things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];

const filteredArr = things.filter(el => {
  const duplicate = seen.has(el.place);
  seen.add(el.place);
  return !duplicate;
});
console.log(filteredArr)

Set Object Features

Each value in the Set object has to be unique; value equality is checked when values are added.

The purpose of the Set object is to store unique values according to their data type, whether primitive values or object references. It has four very useful instance methods: add, clear, has and delete.

Unique & data type features:

add method

Pushes a value into the collection only if it is not already present, so duplicate items are never added; values of different data types (for example the number 1 and the string "1") are kept as separate entries.

has method

Checks whether a given item already exists in the collection; it is a handy way to check for a unique id or item (and its data type).

delete method

Removes a specific item from the collection.

clear method

Removes all items from the collection, leaving it empty.

The Set object also has iteration methods and more features.

Better read from here: Set - JavaScript | MDN
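
A quick illustration of those methods (a minimal sketch; note that object values are compared by reference, which is why the filter above tracks el.place rather than whole objects):

const seen = new Set();

seen.add("there");               // added
seen.add("there");               // ignored, the value is already present
console.log(seen.has("there"));  // true
console.log(seen.has("here"));   // false

seen.delete("there");            // removes that one value
seen.clear();                    // empties the whole collection
console.log(seen.size);          // 0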

13

If the array contains objects, then you can use this to remove duplicates:

const persons= [
      { id: 1, name: 'John',phone:'23' },
      { id: 2, name: 'Jane',phone:'23'},
      { id: 1, name: 'Johnny',phone:'56' },
      { id: 4, name: 'Alice',phone:'67' },
    ];
const unique = [...new Map(persons.map((m) => [m.id, m])).values()];

If you want to remove duplicates on the basis of phone, just replace m.id with m.phone:

const unique = [...new Map(persons.map((m) => [m.phone, m])).values()];
12

removeDuplicates() takes in an array of objects and returns a new array without any duplicate objects (based on the id property).

const allTests = [
  {name: 'Test1', id: '1'}, 
  {name: 'Test3', id: '3'},
  {name: 'Test2', id: '2'},
  {name: 'Test2', id: '2'},
  {name: 'Test3', id: '3'}
];

function removeDuplicates(array) {
  let uniq = {};
  return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true))
}

removeDuplicates(allTests);

Expected outcome:

[
  {name: 'Test1', id: '1'}, 
  {name: 'Test3', id: '3'},
  {name: 'Test2', id: '2'}
];

First, we set the value of variable uniq to an empty object.

Next, we filter through the array of objects. Filter creates a new array with all elements that pass the test implemented by the provided function.

return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));

Above, we use the short-circuiting functionality of &&. If the left side of the && evaluates to true, then it returns the value on the right of the &&. If the left side is false, it returns what is on the left side of the &&.

For each object(obj) we check uniq for a property named the value of obj.id (In this case, on the first iteration it would check for the property '1'.) We want the opposite of what it returns (either true or false) which is why we use the ! in !uniq[obj.id]. If uniq has the id property already, it returns true which evaluates to false (!) telling the filter function NOT to add that obj. However, if it does not find the obj.id property, it returns false which then evaluates to true (!) and returns everything to the right of the &&, or (uniq[obj.id] = true). This is a truthy value, telling the filter method to add that obj to the returned array, and it also adds the property {1: true} to uniq. This ensures that any other obj instance with that same id will not be added again.
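
Traced against allTests above, the filter decisions look like this (illustration only):

removeDuplicates(allTests);
// id '1': uniq is {}                -> !uniq['1'] is true -> keep, uniq becomes { '1': true }
// id '3': uniq is { '1': true }     -> !uniq['3'] is true -> keep, uniq becomes { '1': true, '3': true }
// id '2': uniq has no '2'           -> keep, uniq becomes { '1': true, '3': true, '2': true }
// id '2': uniq['2'] is already true -> !true is false     -> filtered out
// id '3': uniq['3'] is already true -> !true is false     -> filtered out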

11

Fast (less runtime) and type-safe answer for lazy Typescript developers:

export const uniqueBy = <T>( uniqueKey: keyof T, objects: T[]): T[] => {
  const ids = objects.map(object => object[uniqueKey]);
  return objects.filter((object, index) => !ids.includes(object[uniqueKey], index + 1));
} 
1
  • 1
    uniqueKey should be keyof T instead of string to make it more precise. Commented Sep 14, 2021 at 13:45
10

This way works well for me:

function arrayUnique(arr, uniqueKey) {
  const flagList = new Set()
  return arr.filter(function(item) {
    if (!flagList.has(item[uniqueKey])) {
      flagList.add(item[uniqueKey])
      return true
    }
  })
}
const data = [
  {
    name: 'Kyle',
    occupation: 'Fashion Designer'
  },
  {
    name: 'Kyle',
    occupation: 'Fashion Designer'
  },
  {
    name: 'Emily',
    occupation: 'Web Designer'
  },
  {
    name: 'Melissa',
    occupation: 'Fashion Designer'
  },
  {
    name: 'Tom',
    occupation: 'Web Developer'
  },
  {
    name: 'Tom',
    occupation: 'Web Developer'
  }
]
console.table(arrayUnique(data, 'name')) // works well

printout

┌─────────┬───────────┬────────────────────┐
│ (index) │   name    │     occupation     │
├─────────┼───────────┼────────────────────┤
│    0    │  'Kyle'   │ 'Fashion Designer' │
│    1    │  'Emily'  │   'Web Designer'   │
│    2    │ 'Melissa' │ 'Fashion Designer' │
│    3    │   'Tom'   │  'Web Developer'   │
└─────────┴───────────┴────────────────────┘

ES5:

function arrayUnique(arr, uniqueKey) {
  const flagList = []
  return arr.filter(function(item) {
    if (flagList.indexOf(item[uniqueKey]) === -1) {
      flagList.push(item[uniqueKey])
      return true
    }
  })
}

These two ways are simpler and more understandable.

9

Here is a solution for ES6 where you only want to keep the last item. This solution is functional and Airbnb style compliant.

const things = {
  thing: [
    { place: 'here', name: 'stuff' },
    { place: 'there', name: 'morestuff1' },
    { place: 'there', name: 'morestuff2' }, 
  ],
};

const removeDuplicates = (array, key) => {
  return array.reduce((arr, item) => {
    const removed = arr.filter(i => i[key] !== item[key]);
    return [...removed, item];
  }, []);
};

console.log(removeDuplicates(things.thing, 'place'));
// > [{ place: 'here', name: 'stuff' }, { place: 'there', name: 'morestuff2' }]
1
  • You can remove the duplicate and you can also remove all the duplicate with this code. Nice
    – sg28
    Commented May 23, 2019 at 18:02
6

I know there are a ton of answers to this question already, but bear with me...

Some of the objects in your array may have additional properties that you are not interested in, or you simply want to find the unique objects considering only a subset of the properties.

Consider the array below. Say you want to find the unique objects in this array considering only propOne and propTwo, and ignore any other properties that may be there.

The expected result should include only the first and last objects. So here goes the code:

const array = [{
    propOne: 'a',
    propTwo: 'b',
    propThree: 'I have no part in this...'
},
{
    propOne: 'a',
    propTwo: 'b',
    someOtherProperty: 'no one cares about this...'
},
{
    propOne: 'x',
    propTwo: 'y',
    yetAnotherJunk: 'I am valueless really',
    noOneHasThis: 'I have something no one has'
}];

const uniques = [...new Set(
    array.map(x => JSON.stringify(((o) => ({
        propOne: o.propOne,
        propTwo: o.propTwo
    }))(x))))
].map(JSON.parse);

console.log(uniques);

4
  • It works but the other properties will be cleared, is it possible to keep the rest properties of the selected object?
    – Thanwa Ch.
    Commented Sep 19, 2020 at 11:43
  • @ThanwaCh. That's doable, and it is a matter of preference really - just need to determine which object the rest of the properties should be taken from in case of duplicates. Using my example, first and second objects in the array become one in the uniques. Now should that object contain propThree from array[0], or someOtherProperty from array[1], or both, or something else? As long as we know exactly what to do in such case, what you asked for is doable for sure. Commented Sep 19, 2020 at 12:05
  • This solution worked beautifully for the use case I was coding. Can you explain what this part is/does (({ propOne, propTwo }) => ({ propOne, propTwo }))(x)?
    – knot22
    Commented May 26, 2021 at 14:12
  • 1
    @knot22 the part before (x) is an arrow function which is unpacking the argument object into properties propOne and propTwo. Learn about object destructuring here. Now that I have read the code again, I think it should have been written a little more clearly. I have updated the code. Commented May 26, 2021 at 15:12
5

Another option would be to create a custom indexOf function, which compares the values of your chosen property for each object, and to wrap this in a reduce function.

var uniq = redundant_array.reduce(function (a, b) {
    function indexOfProperty(a, b) {
        for (var i = 0; i < a.length; i++) {
            if (a[i].property == b.property) {
                return i;
            }
        }
        return -1;
    }

    if (indexOfProperty(a, b) < 0) a.push(b);
    return a;
}, []);
1
  • this worked out great for me - I paired this with lodash.isequal npm package as a lightweight object comparator to perform unique array filtering ...e.g. distinct array of objects. Just swapped in if (_.isEqual(a[i], b)) { instead of looking @ a single property Commented Nov 29, 2017 at 17:40
4

This solution worked best for me, utilising the Array.from method. It is also shorter and more readable.

let person = [
{name: "john"}, 
{name: "jane"}, 
{name: "imelda"}, 
{name: "john"},
{name: "jane"}
];

const data = Array.from(new Set(person.map(JSON.stringify))).map(JSON.parse);
console.log(data);
1
  • Alternatively you can use the spread operator to make a list like so: const data = [...(new Set(person.map(JSON.stringify)))].map(JSON.parse); Commented Jul 3 at 17:58
