1996

I have two JavaScript arrays:

var array1 = ["Vijendra","Singh"];
var array2 = ["Singh", "Shakya"];

I want the output to be:

var array3 = ["Vijendra","Singh","Shakya"];

The output array should have repeated words removed.

How do I merge two arrays in JavaScript so that I get only the unique items from each array in the same order they were inserted into the original arrays?

2
  • 33
    Before you post a new answer, consider there are already 75+ answers for this question. Please, make sure that your answer contributes information that is not among existing answers.
    – janniks
    Commented Feb 3, 2020 at 12:05
  • If you want a more generic solution that also covers deep-merging, take a look at this question, instead. Some answers cover arrays as well. Commented May 6, 2020 at 13:06

91 Answers

4

A functional approach with ES2015

Following the functional approach, a union of two Arrays is just the composition of concat and filter. To get optimal performance we rely on the native Set data type, which is optimized for membership lookups.

Anyway, the key question for a union function is how to treat duplicates. The following permutations are possible:

Array A      + Array B

[unique]     + [unique]
[duplicated] + [unique]
[unique]     + [duplicated]
[duplicated] + [duplicated]

The first two permutations are easy to handle with a single function. However, the last two are more complicated, since you can't process them as long as you rely only on Set lookups. Since switching to plain old object property lookups would entail a serious performance hit, the following implementation simply ignores the third and fourth permutations. You would have to build a separate version of union to support them.


// small, reusable auxiliary functions

const comp = f => g => x => f(g(x));
const apply = f => a => f(a);
const flip = f => b => a => f(a) (b);
const concat = xs => y => xs.concat(y);
const afrom = apply(Array.from);
const createSet = xs => new Set(xs);
const filter = f => xs => xs.filter(apply(f));


// de-duplication

const dedupe = comp(afrom) (createSet);


// the actual union function

const union = xs => ys => {
  const zs = createSet(xs);  
  return concat(xs) (
    filter(x => zs.has(x)
     ? false
     : zs.add(x)
  ) (ys));
}


// mock data

const xs = [1,2,2,3,4,5];
const ys = [0,1,2,3,3,4,5,6,6];


// here we go

console.log( "unique/unique", union(dedupe(xs)) (ys) );
console.log( "duplicated/unique", union(xs) (ys) );

From here on it gets trivial to implement a unionn function, which accepts any number of arrays (inspired by naomik's comments):

// small, reusable auxiliary functions

const uncurry = f => (a, b) => f(a) (b);
const foldl = f => acc => xs => xs.reduce(uncurry(f), acc);

const apply = f => a => f(a);
const flip = f => b => a => f(a) (b);
const concat = xs => y => xs.concat(y);
const createSet = xs => new Set(xs);
const filter = f => xs => xs.filter(apply(f));


// union and unionn

const union = xs => ys => {
  const zs = createSet(xs);  
  return concat(xs) (
    filter(x => zs.has(x)
     ? false
     : zs.add(x)
  ) (ys));
}

const unionn = (head, ...tail) => foldl(union) (head) (tail);


// mock data

const xs = [1,2,2,3,4,5];
const ys = [0,1,2,3,3,4,5,6,6];
const zs = [0,1,2,3,4,5,6,7,8,9];


// here we go

console.log( unionn(xs, ys, zs) );

It turns out unionn is just foldl (aka Array.prototype.reduce), which takes union as its reducer. Note: Since the implementation doesn't use an additional accumulator, it will throw an error when you apply it without arguments.
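
For completeness, here is a hedged sketch of one such guarded variant (the uncurried style and the names dedupeArr/unionArr/unionnSafe are only illustrative, not part of the answer above): it seeds the fold with a de-duplicated copy of the first array, so a duplicated first array is tolerated, and a call without arguments yields [] instead of throwing.

// a sketch, not a drop-in replacement for the curried version above

const dedupeArr = xs => Array.from(new Set(xs));

const unionArr = (xs, ys) => {
  const seen = new Set(xs);
  return xs.concat(ys.filter(y => seen.has(y) ? false : seen.add(y)));
};

const unionnSafe = (...arrays) => {
  const [head = [], ...tail] = arrays;
  return tail.reduce((acc, xs) => unionArr(acc, xs), dedupeArr(head));
};

console.log( unionnSafe([1,2,2,3,4,5], [0,1,2,3,3,4,5,6,6], [7]) ); // [1,2,3,4,5,0,6,7]
console.log( unionnSafe() );                                        // []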

5
  • 1
    a couple feedbacks: I noticed that flip and notf are unused. Also unionBy predicate leaks implementation details (requires implicit knowledge of Set type). It might be nice if you could just do something like this: union = unionBy (apply) and unionci = unionBy (p => x => p(x.toLowerCase())). That way the user just sends whatever the grouping value is to p – just an idea ^_^
    – Mulan
    Commented Sep 8, 2016 at 16:12
  • zs variable declaration also lacks var/let keyword
    – Mulan
    Commented Sep 8, 2016 at 16:48
  • 1
    here's a code snippet to clarify [gist: unionBy.js]
    – Mulan
    Commented Sep 8, 2016 at 17:35
  • @naomik After rethinking my solution for a while I am not so sure anymore if it is the right way to pass the predicate. All you gain is a transformation of each element of the second array. I wonder if this approach solves more than just toy problems.
    – user6445533
    Commented Sep 9, 2016 at 6:48
  • what are the benefits of the functional approach in this case? Commented Sep 28, 2020 at 16:46
4

If you do not want duplicates of a specific property (for example, the id):

let noDuplicate = array1.filter ( i => array2.findIndex(a => i.id==a.id)==-1 );
let result = [...noDuplicate, ...array2];
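
For example, with hypothetical arrays of objects keyed by id (the sample data below is made up purely for illustration):

const array1 = [{ id: 1, name: "Vijendra" }, { id: 2, name: "Singh" }];
const array2 = [{ id: 2, name: "Singh" }, { id: 3, name: "Shakya" }];

let noDuplicate = array1.filter(i => array2.findIndex(a => i.id == a.id) == -1);
let result = [...noDuplicate, ...array2];

console.log(result); // ids 1, 2, 3 (the id 2 entry comes from array2)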
4

var array1 = ["one","two"];
var array2 = ["two", "three"];
var collectionOfTwoArrays = [...array1, ...array2];    
var uniqueList = array => [...new Set(array)];
console.log('Collection :');
console.log(collectionOfTwoArrays);    
console.log('Collection without duplicates :');
console.log(uniqueList(collectionOfTwoArrays));

4

Here is an option for arrays of objects:

const a = [{param1: "1", param2: 1},{param1: "2", param2: 2},{param1: "4", param2: 4}]
const b = [{param1: "1", param2: 1},{param1: "4", param2: 5}]


var result = a.concat(b.filter(item =>
         !JSON.stringify(a).includes(JSON.stringify(item))
    ));

console.log(result);
//Result [{param1: "1", param2: 1},{param1: "2", param2: 2},{param1: "4", param2: 4},{param1: "4", param2: 5}]
3

For the sake of it... here is a single-line solution:

const x = [...new Set([['C', 'B'],['B', 'A']].reduce( (a, e) => a.concat(e), []))].sort()
// ['A', 'B', 'C']

Not particularly readable, but it may help someone (the steps are spelled out in code right after this list):

  1. Applies a reduce function with the initial accumulator value set to an empty array.
  2. The reduce function uses concat to append each sub-array onto the accumulator array.
  3. The result of this is passed as a constructor parameter to create a new Set.
  4. The spread operator is used to convert the Set to an array.
  5. The sort() function is applied to the new array.
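
Spelled out step by step with intermediate variables (the variable names here are only illustrative):

const input = [['C', 'B'], ['B', 'A']];

const flattened = input.reduce((a, e) => a.concat(e), []); // ['C', 'B', 'B', 'A']
const deduplicated = new Set(flattened);                   // Set {'C', 'B', 'A'}
const x = [...deduplicated].sort();                        // ['A', 'B', 'C']

console.log(x);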
1
  • 2
    Also instead of reduce() you can use Array.from(set) Commented Dec 28, 2017 at 16:31
3
var arr1 = [1, 3, 5, 6];
var arr2 = [3, 6, 10, 11, 12];
var merged = arr1.concat(arr2.filter(ele => !arr1.includes(ele)));
console.log(merged);

output :- [1, 3, 5, 6, 10, 11, 12]
3

You can try this:

const union = (a, b) => Array.from(new Set([...a, ...b]));

console.log(union(["neymar","messi"], ["ronaldo","neymar"]));

3

The simplest solution with filter:

var array1 = ["Vijendra","Singh"];
var array2 = ["Singh", "Shakya"];

var mergedArrayWithoutDuplicates = array1.concat(
  array2.filter(secondArrayItem => !array1.includes(secondArrayItem))
);
3

Modular, General

This could be achieved by composing two essential functions.

const getUniqueMerge = (...arrs) => getUniqueArr(mergeArrs(...arrs))
const getUniqueArr = (array) => Array.from(new Set(array))  
const mergeArrs = (...arrs) => [].concat(...arrs)

It can handle any number of arrays, or even plain values:

console.log(getUniqueMerge(["Vijendra","Singh"],["Singh", "Shakya"]))
// ["Vijendra", "Singh", "Shakya"]

console.log(getUniqueMerge(["Sheldon", "Cooper"], ["and", "Cooper", "Amy", "and"], "Farrah", "Amy", "Fowler"))
// ["Sheldon", "Cooper", "and", "Amy", "Farrah", "Fowler"]
3

If you are merging arrays of objects, consider using lodash's unionBy function; it lets you supply a custom iteratee for comparing objects:

import { unionBy } from 'lodash';

const a = [{a: 1, b: 2}];
const b = [{a: 1, b: 3}];
const c = [{a: 2, b: 4}];

const result = unionBy(a, b, c, x => x.a);

Result is: [{ a: 1, b: 2 }, { a: 2, b: 4 }]

The first match found across the passed arrays is used in the result.

1
  • Thank you. It is the best solution, I am sure.
    – Sandokan
    Commented Feb 15, 2023 at 5:52
3
  1. Using array.concat() and array.filter()
  2. Using new Set object and Spread Operator
  3. Using array.concat and new Set object

let array1 = [1, 2, 3, 4, 5]
let array2 = [1, 4, 6, 9]

// Using array.concat and array.filter
const array3 = array1.concat(array2.filter((item)=> array1.indexOf(item) == -1 ))
console.log('array3 : ', array3);

// Using new Set and Spread Operator
const array4 = [...new Set([...array1 ,...array2])];
console.log('array4 : ', array4);

// Using array.concat and new Set
const array5 = [...new Set(array1.concat(array2))];
console.log('array5 : ', array5);

2

In Dojo 1.6+

var unique = []; 
var array1 = ["Vijendra","Singh"];
var array2 = ["Singh", "Shakya"];
var array3 = array1.concat(array2); // Merged both arrays

dojo.forEach(array3, function(item) {
    if (dojo.indexOf(unique, item) > -1) return;
    unique.push(item); 
});

Update

See working code.

http://jsfiddle.net/UAxJa/1/

3
  • 1
    Why use dojo just for the forEach function? Commented Jul 26, 2013 at 11:19
  • Also, you don't need to merge the two arrays. Just loop through the second array and add its values if they don't exist in the first array. Commented Jul 26, 2013 at 11:23
  • 1
    @MészárosLajos No, I would never load Dojo just for the forEach function. I posted this in case someone was already using Dojo. As for the optimization, it's not possible unless you know that the first array contains unique values. Commented Jul 26, 2013 at 16:08
2

Merge any number of arrays or non-array values and keep the result unique:

function flatMerge() {
    return Array.prototype.reduce.call(arguments, function (result, current) {
        if (!(current instanceof Array)) {
            if (result.indexOf(current) === -1) {
                result.push(current);
            }
        } else {
            current.forEach(function (value) {
                if (result.indexOf(value) === -1) {
                    result.push(value);
                }
            });
        }
        return result;
    }, []);
}

flatMerge([1,2,3], 4, 4, [3, 2, 1, 5], [7, 6, 8, 9], 5, [4], 2, [3, 2, 5]);
// [1, 2, 3, 4, 5, 7, 6, 8, 9]

flatMerge([1,2,3], [3, 2, 1, 5], [7, 6, 8, 9]);
// [1, 2, 3, 5, 7, 6, 8, 9]

flatMerge(1, 3, 5, 7);
// [1, 3, 5, 7]
2

Assuming the original arrays don't need de-duplication, this should be pretty fast, retains the original order, and does not modify the original arrays...

function arrayMerge(base, addendum){
    var out = [].concat(base);
    for(var i=0,len=addendum.length;i<len;i++){
        if(base.indexOf(addendum[i])<0){
            out.push(addendum[i]);
        }
    }
    return out;
}

usage:

var array1 = ["Vijendra","Singh"];
var array2 = ["Singh", "Shakya"];
var array3 = arrayMerge(array1, array2);

console.log(array3);
//-> [ 'Vijendra', 'Singh', 'Shakya' ]
2

The easiest way to do this is either to use concat() to merge the arrays and then use filter() to remove the duplicates, or to use concat() and then put the merged array inside a Set().

First way:

const firstArray = [1,2, 2];
const secondArray = [3,4];
// now lets merge them
const mergedArray = firstArray.concat(secondArray); // [1,2,2,3,4]
//now use filter to remove dups
const removeDuplicates = mergedArray.filter((elem, index) =>  mergedArray.indexOf(elem) === index); // [1,2,3, 4]

Second way (but with performance implications on the UI):

const firstArray = [1,2, 2];
const secondArray = [3,4];
// now lets merge them
const mergedArray = firstArray.concat(secondArray); // [1,2,2,3,4]
const removeDuplicates = new Set(mergedArray);
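// note: removeDuplicates is a Set here, not an array; spread it ([...removeDuplicates]) if you need an array again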
3
  • Really tempted to use the second way but creating a new Set could be costly when doing this in UI update loop.
    – newguy
    Commented Apr 10, 2017 at 8:18
  • I wasn't aware of that. Thanks for pointing out the issue - I'll update my answer. Could you please provide a link in regards to that btw? Commented Apr 10, 2017 at 20:10
  • Uh I dont think there is a link. I am just saying that based on my own experience working with arrays in the rendering loop in HTML5 canvas.
    – newguy
    Commented Apr 11, 2017 at 17:50
2

Using Lodash

I found @GijsjanB's answer useful but my arrays contained objects that had many attributes, so I had to de-duplicate them using one of the attributes.

Here's my solution using lodash

userList1 = [{ id: 1 }, { id: 2 }, { id: 3 }]
userList2 = [{ id: 3 }, { id: 4 }, { id: 5 }]
// id 3 is repeated in both arrays

users = _.unionWith(userList1, userList2, function(a, b){ return a.id == b.id });

// users = [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }]

The function you pass as the third parameter has two arguments (two elements), and it must return true if they are equal.
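
When the comparison boils down to a single property like this, lodash's unionBy with a property-name iteratee is an even terser option (assuming lodash 4.x):

users = _.unionBy(userList1, userList2, 'id');

// users = [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }]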

2

This is one of the more efficient options in terms of computation time. It also keeps the initial order of the elements.

First filter all duplicates from second array, then concatenate what is left to the first one.

var a = [1,2,3];
var b = [5,4,3];
var c = a.concat(b.filter(function(i){
    return a.indexOf(i) == -1;
}));
console.log(c); // [1, 2, 3, 5, 4]

Here is a slightly improved (faster) version of it, with the downside that the arrays must not contain missing values (holes):

var i, c = a.slice(), ci = c.length;
for(i = 0; i < b.length; i++){
    if(c.indexOf(b[i]) == -1)
        c[ci++] = b[i];
}
2
  • I don't see why this should be more time efficient than LiraNuna's solution. You both use the concat function, and to find unique elements you both have a time complexity of O(n^2). Your solution is, however, fewer lines of code and easier to read. Commented Jun 12, 2014 at 8:28
  • @AntonGildebrand More useless iterations there. You can check out the comparison here (open the browser console to see): jsbin.com/wabahuha/1/edit?js,output
    – AlexTR
    Commented Jun 12, 2014 at 9:05
2

It looks like the accepted answer is the slowest in my tests.

Note that I am merging two arrays of objects by key.

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width">
  <title>JS Bin</title>
</head>
<body>
<button type='button' onclick='doit()'>do it</button>
<script>
function doit(){
    var items = [];
    var items2 = [];
    var itemskeys = {};
    for(var i = 0; i < 10000; i++){
        items.push({K:i, C:"123"});
        itemskeys[i] = i;
    }

    for(var i = 9000; i < 11000; i++){
        items2.push({K:i, C:"123"});
    }

    console.time('merge');
    var res = items.slice(0);

    //method1();
    method0();
    //method2();

    console.log(res.length);
    console.timeEnd('merge');

    function method0(){
        for(var i = 0; i < items2.length; i++){
            var k = items2[i].K;
            if(itemskeys[k] == null){
                itemskeys[k] = res.length; // mark this key as already merged
                res.push(items2[i]);
            }
        }
    }

    function method1(){
        for(var i = 0; i < items2.length; i++){
            var isok = 1;
            var k = items2[i].K;

            for(var j = 0; j < items.length; j++){
                if(items[j].K == k){
                    isok = 0;
                    break;
                }
            }

            if(isok) res.push(items2[i]);
        }  
    }

    function method2(){
        res = res.concat(items2);
        for(var i = 0; i < res.length; ++i) {
            for(var j = i+1; j < res.length; ++j) {
                if(res[i].K === res[j].K)
                    res.splice(j--, 1);
            }
        }
    }
}
</script>
</body>
</html>
2

You can use lodash's unionWith: _.unionWith([arrays], [comparator])

This method is like _.union except that it accepts comparator which is invoked to compare elements of arrays. Result values are chosen from the first array in which the value occurs. The comparator is invoked with two arguments: (arrVal, othVal).

var array1 = ["Vijendra","Singh"];
var array2 = ["Singh", "Shakya"];
 
var array3 = _.unionWith(array1, array2, _.isEqual);
console.log(array3);
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.11/lodash.min.js"></script>

2

Reduce them !!!

Instead of merging and then deduplicating explicitly, this alternative reduces one array onto the result of reducing the other: each value is checked against the accumulator and appended only if it is not already included.

array2.reduce(reducer, array1.reduce(reducer, []))

Test Example:

var array1 = ["Vijendra","Singh","Singh"];
var array2 = ["Singh", "Shakya", "Shakya"];
const reducer = (accumulator, currentValue) => accumulator.includes(currentValue) ? accumulator : [...accumulator, currentValue];

console.log(
  array2.reduce(reducer, array1.reduce(reducer, []))
);

// a reduce on first array is needed to ensure a deduplicated array used as initial value on the second array being reduced

Conclusion

More elegant and useful when you want to avoid the boring for-each approach (not that for-each isn't useful).

It works around concat()'s lack of deduplication.

No need for external libraries like Underscore.js, jQuery, or Lodash, nor the trouble of writing a helper function to get the merged, deduplicated result.

Oh, and HEY!, it can be done as a one-liner!!!
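
And, generalized to any number of arrays with the same reducer (the unionAll name is only illustrative):

const reducer = (accumulator, currentValue) => accumulator.includes(currentValue) ? accumulator : [...accumulator, currentValue];
const unionAll = (...arrays) => arrays.reduce((acc, arr) => arr.reduce(reducer, acc), []);

console.log( unionAll(["Vijendra","Singh","Singh"], ["Singh", "Shakya", "Shakya"]) );
// ["Vijendra", "Singh", "Shakya"]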


This answer was made possible by the spread syntax (ES2015), the beautiful includes() (ES2016), and the trusty reduce() (ES5).

2

Here is another neat solution using Set:

const o1 = {a: 1};
const arr1 = ['!@#$%^&*()', 'gh', 123, o1, 1, true, undefined, null];
const arr2 = ['!@#$%^&*()', 123, 'abc', o1, 0x001, true, void 0, 0];

const mergeUnique = (...args) => [ ...new Set([].concat(...args)) ];

console.log(mergeUnique(arr1, arr2));

2

Care about efficiency, yet want to do it inline?

const s = new Set(array1);
array2.forEach(a => s.add(a));
const merged_array = [...s]; // optional: convert back to an array
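
Wrapped up as a reusable helper (the function name is only illustrative), this reads:

const mergeUniqueInline = (array1, array2) => {
  const s = new Set(array1);
  array2.forEach(a => s.add(a));
  return [...s];
};

console.log( mergeUniqueInline(["Vijendra", "Singh"], ["Singh", "Shakya"]) );
// ["Vijendra", "Singh", "Shakya"]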
1

This is my second answer, but I believe it is the fastest; I'd like someone to check for me and reply in the comments.

My first attempt hit about 99k ops/sec and this go around is saying 390k ops/sec vs the other leading jsperf test of 140k (for me).

http://jsperf.com/merge-two-arrays-keeping-only-unique-values/26

I tried to minimize as much array interaction as possible this time around and it looked like I netted some performance.

function findMerge(a1, a2) {
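    // note: this mutates both inputs (values not already in a1 are spliced out of a2 and pushed onto a1),
    // and duplicates that occur within a single input array are not removed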
    var len1 = a1.length;

    for (var x = 0; x < a2.length; x++) {
        var found = false;

        for (var y = 0; y < len1; y++) {
            if (a2[x] === a1[y]) {
                found = true;
                break;
            }
        }

        if(!found){
            a1.push(a2.splice(x--, 1)[0]);
        }
    }

    return a1;
}

Edit: I made some changes to my function, and the performance difference is drastic compared to the others on the jsperf site.

1
var MergeArrays=function(arrayOne, arrayTwo, equalityField) {
    var mergeDictionary = {};

    for (var i = 0; i < arrayOne.length; i++) {
        mergeDictionary[arrayOne[i][equalityField]] = arrayOne[i];
    }

    for (var i = 0; i < arrayTwo.length; i++) {
        mergeDictionary[arrayTwo[i][equalityField]] = arrayTwo[i];
    }

    return $.map(mergeDictionary, function (value, key) { return value });
}

Leveraging a dictionary and jQuery, you can merge the two arrays without duplicates. In my example I'm keying on a given field of the object, but it could just as well be the object itself.
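
If you'd rather not depend on jQuery, Object.values (ES2017) gives an equivalent result using the same dictionary idea (a sketch; the function name is only illustrative):

var MergeArraysNoJq = function (arrayOne, arrayTwo, equalityField) {
    var mergeDictionary = {};

    arrayOne.concat(arrayTwo).forEach(function (item) {
        mergeDictionary[item[equalityField]] = item;
    });

    return Object.values(mergeDictionary);
};

console.log(MergeArraysNoJq([{ id: 1 }, { id: 2 }], [{ id: 2 }, { id: 3 }], "id"));
// [{ id: 1 }, { id: 2 }, { id: 3 }]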

1

Another approach for your review, using a reduce function:

function mergeDistinct(arResult, candidate){
  if (-1 == arResult.indexOf(candidate)) {
    arResult.push(candidate);
  }
  return arResult;
}

var array1 = ["Vijendra","Singh"];
var array2 = ["Singh", "Shakya"];

var arMerge = [];
arMerge = array1.reduce(mergeDistinct, arMerge);
arMerge = array2.reduce(mergeDistinct, arMerge);//["Vijendra","Singh","Shakya"];
1

If you want to check for unique objects, then use JSON.stringify in your comparison.

function arrayUnique(array) {
    var a = array.concat();
    for(var i=0; i<a.length; ++i) {
        for(var j=i+1; j<a.length; ++j) {
            if(JSON.stringify(a[i]) === JSON.stringify(a[j]))
                a.splice(j--, 1);
        }
    }

    return a;
}
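
Usage with two merged arrays of objects, for example (the sample data is made up); keep in mind that JSON.stringify is sensitive to property order, so { a: 1, b: 2 } and { b: 2, a: 1 } would not be treated as duplicates:

var merged = arrayUnique([{ n: 1 }, { n: 2 }].concat([{ n: 2 }, { n: 3 }]));
console.log(merged); // [{ n: 1 }, { n: 2 }, { n: 3 }]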
1
Array.prototype.union = function (other_array) {
/* you can include a test to check whether other_array really is an array */
  other_array.forEach(function(v) { if(this.indexOf(v) === -1) {this.push(v);}}, this);    
}
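
Usage note: as written, this mutates the receiver in place and returns undefined, so call it for its side effect:

var array1 = ["Vijendra", "Singh"];
array1.union(["Singh", "Shakya"]);
console.log(array1); // ["Vijendra", "Singh", "Shakya"]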
1

One line solution as a segue to LiraNuna's:

let array1 = ["Vijendra","Singh"];
let array2 = ["Singh", "Shakya"];

// Merges both arrays
let array3 = array1.concat(array2); 

//REMOVE DUPLICATE
let removeDuplicate = [...new Set(array3)];
console.log(removeDuplicate);
1

Here is a simple example:

var unique = function(array) {
    var unique = []
    for (var i = 0; i < array.length; i += 1) {
        if (unique.indexOf(array[i]) == -1) {
            unique.push(array[i])
        }
    }
    return unique
}

var uniqueList = unique(["AAPL", "MSFT"].concat(["MSFT", "BBEP", "GE"]));

We define unique(array) to remove redundant elements and use the concat function to combine two arrays.

1

ES2019

You can use it like union(array1, array2, array3, ...)

/**
 * Merges two or more arrays keeping unique items. This method does
 * not change the existing arrays, but instead returns a new array.
 */
function union<T>(...arrays: T[]) {
  return [...new Set([...arrays].flat())];
}

It is ES2019 because of the flat() function, but you can use core-js to get it as a polyfill. The T here is a TypeScript generic type parameter, which you can remove if you are not using TypeScript. If you are using TypeScript, make sure to add "lib": ["es2019.array"] to compilerOptions in tsconfig.json.
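
In plain JavaScript (no TypeScript annotations), the same function reduces to:

function union(...arrays) {
  return [...new Set(arrays.flat())];
}

console.log(union(["Vijendra", "Singh"], ["Singh", "Shakya"]));
// ["Vijendra", "Singh", "Shakya"]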

or...

just use lodash _.union

2
  • regarding union<T>(...arrays: T[]) - is this JavaScript/ES2019?
    – Abdull
    Commented Sep 21, 2023 at 7:55
  • @Abdull no, but flat() is. Please read the description in the post.
    – orad
    Commented Sep 29, 2023 at 19:55
