By kramden88

2012-02-10 14:53:22 8 Comments

I have a very simple JavaScript array that may or may not contain duplicates.

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

I need to remove the duplicates and put the unique values in a new array.

I could show all the code that I've tried, but I think it's pointless because none of it works. jQuery solutions are also welcome.


@georg 2012-02-10 15:05:41


Using the Set constructor and the spread syntax:

uniq = [...new Set(array)];

"Smart" but naïve way

uniqueArray = a.filter(function(item, pos) {
    return a.indexOf(item) == pos;
});

Basically, we iterate over the array and, for each element, check if the first position of this element in the array is equal to the current position. Obviously, these two positions are different for duplicate elements.

Using the 3rd ("this array") parameter of the filter callback we can avoid a closure of the array variable:

uniqueArray = a.filter(function(item, pos, self) {
    return self.indexOf(item) == pos;
});

Although concise, this algorithm is not particularly efficient for large arrays (quadratic time).

Hashtables to the rescue

function uniq(a) {
    var seen = {};
    return a.filter(function(item) {
        return seen.hasOwnProperty(item) ? false : (seen[item] = true);
    });
}

This is how it's usually done. The idea is to place each element in a hashtable and then check for its presence instantly. This gives us linear time, but has at least two drawbacks:

  • since hash keys can only be strings or symbols in JavaScript, this code doesn't distinguish numbers and "numeric strings". That is, uniq([1,"1"]) will return just [1]
  • for the same reason, all objects will be considered equal: uniq([{foo:1},{foo:2}]) will return just [{foo:1}].
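Both pitfalls are easy to reproduce. A quick sketch, repeating the hashtable-based `uniq` from above:

```javascript
// Hashtable-based uniq from above.
function uniq(a) {
    var seen = {};
    return a.filter(function (item) {
        return seen.hasOwnProperty(item) ? false : (seen[item] = true);
    });
}

// Pitfall 1: numbers and "numeric strings" share the same hash key.
console.log(uniq([1, "1"]));             // [1]

// Pitfall 2: every object stringifies to "[object Object]".
console.log(uniq([{foo: 1}, {foo: 2}])); // [{foo: 1}]
```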

That said, if your arrays contain only primitives and you don't care about types (e.g. it's always numbers), this solution is optimal.

The best from two worlds

A universal solution combines both approaches: it uses hash lookups for primitives and linear search for objects.

function uniq(a) {
    var prims = {"boolean":{}, "number":{}, "string":{}}, objs = [];

    return a.filter(function(item) {
        var type = typeof item;
        if(type in prims)
            return prims[type].hasOwnProperty(item) ? false : (prims[type][item] = true);
        else
            return objs.indexOf(item) >= 0 ? false : objs.push(item);
    });
}

sort | uniq

Another option is to sort the array first, and then remove each element equal to the preceding one:

function uniq(a) {
    return a.sort().filter(function(item, pos, ary) {
        return !pos || item != ary[pos - 1];
    });
}

Again, this doesn't work with objects (because all objects are equal for sort). Additionally, we silently change the original array as a side effect - not good! However, if your input is already sorted, this is the way to go (just remove sort from the above).
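If the mutation is a concern, a non-mutating sketch is to copy the array with slice() before sorting (the copy is shallow, which is fine here since sorting only reorders):

```javascript
function uniqSorted(a) {
    // slice() makes a shallow copy, so the caller's array stays untouched.
    return a.slice().sort().filter(function (item, pos, ary) {
        return !pos || item != ary[pos - 1];
    });
}

var input = [3, 1, 3, 2, 1];
console.log(uniqSorted(input)); // [1, 2, 3]
console.log(input);             // still [3, 1, 3, 2, 1]
```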

Unique by...

Sometimes it's desired to uniquify a list based on some criteria other than just equality, for example, to filter out objects that are different, but share some property. This can be done elegantly by passing a callback. This "key" callback is applied to each element, and elements with equal "keys" are removed. Since key is expected to return a primitive, a hash table will work fine here:

function uniqBy(a, key) {
    var seen = {};
    return a.filter(function(item) {
        var k = key(item);
        return seen.hasOwnProperty(k) ? false : (seen[k] = true);
    });
}

A particularly useful key() is JSON.stringify which will remove objects that are physically different, but "look" the same:

a = [[1,2,3], [4,5,6], [1,2,3]]
b = uniqBy(a, JSON.stringify)
console.log(b) // [[1,2,3], [4,5,6]]

If the key is not primitive, you have to resort to the linear search:

function uniqBy(a, key) {
    var index = [];
    return a.filter(function (item) {
        var k = key(item);
        return index.indexOf(k) >= 0 ? false : index.push(k);
    });
}

In ES6 you can use a Set:

function uniqBy(a, key) {
    let seen = new Set();
    return a.filter(item => {
        let k = key(item);
        return seen.has(k) ? false : seen.add(k);
    });
}

or a Map:

function uniqBy(a, key) {
    return [...new Map(a.map(x => [key(x), x])).values()];
}

which both also work with non-primitive keys.
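One caveat worth showing: Set and Map compare non-primitive keys by reference (SameValueZero), so deduplication only happens when the key callback returns the very same object, not a structurally equal copy. A sketch, with made-up data:

```javascript
function uniqBy(a, key) {
    return [...new Map(a.map(x => [key(x), x])).values()];
}

// Two items share the *same* group object; the third has a look-alike copy.
let g1 = {name: "a"};
let g2 = {name: "a"};  // equal contents, but a different reference
let items = [{id: 1, group: g1}, {id: 2, group: g1}, {id: 3, group: g2}];

// g1-keyed items collapse (last one wins); the g2-keyed item stays separate.
console.log(uniqBy(items, x => x.group).map(x => x.id)); // [2, 3]
```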

First or last?

When removing objects by a key, you might want to keep either the first of "equal" objects or the last one.

Use the Set variant above to keep the first, and the Map to keep the last:

function uniqByKeepFirst(a, key) {
    let seen = new Set();
    return a.filter(item => {
        let k = key(item);
        return seen.has(k) ? false : seen.add(k);
    });
}

function uniqByKeepLast(a, key) {
    return [...new Map(a.map(x => [key(x), x])).values()];
}


data = [
    {a:1, u:1},
    {a:2, u:2},
    {a:3, u:3},
    {a:4, u:1},
    {a:5, u:2},
    {a:6, u:3},
];

console.log(uniqByKeepFirst(data, it => it.u))
console.log(uniqByKeepLast(data, it => it.u))


Both underscore and Lo-Dash provide uniq methods. Their algorithms are basically similar to the first snippet above and boil down to this:

var result = [];
a.forEach(function(item) {
    if(result.indexOf(item) < 0) {
        result.push(item);
    }
});
This is quadratic, but there are nice additional goodies, like wrapping native indexOf, ability to uniqify by a key (iteratee in their parlance), and optimizations for already sorted arrays.

If you're using jQuery and can't stand anything without a dollar before it, it goes like this:

$.uniqArray = function(a) {
    return $.grep(a, function(item, pos) {
        return $.inArray(item, a) === pos;
    });
};
which is, again, a variation of the first snippet.


Function calls are expensive in JavaScript, therefore the above solutions, as concise as they are, are not particularly efficient. For maximal performance, replace filter with a loop and get rid of other function calls:

function uniq_fast(a) {
    var seen = {};
    var out = [];
    var len = a.length;
    var j = 0;
    for(var i = 0; i < len; i++) {
        var item = a[i];
        if(seen[item] !== 1) {
            seen[item] = 1;
            out[j++] = item;
        }
    }
    return out;
}

This chunk of ugly code does the same as the snippet #3 above, but an order of magnitude faster (as of 2017 it's only twice as fast - JS core folks are doing a great job!)

function uniq(a) {
    var seen = {};
    return a.filter(function(item) {
        return seen.hasOwnProperty(item) ? false : (seen[item] = true);
    });
}

function uniq_fast(a) {
    var seen = {};
    var out = [];
    var len = a.length;
    var j = 0;
    for(var i = 0; i < len; i++) {
        var item = a[i];
        if(seen[item] !== 1) {
            seen[item] = 1;
            out[j++] = item;
        }
    }
    return out;
}


var r = [0,1,2,3,4,5,6,7,8,9],
    a = [],
    LEN = 1000,
    LOOPS = 1000;

while(LEN--)
    a = a.concat(r);

var d = new Date();
for(var i = 0; i < LOOPS; i++)
    uniq(a);
document.write('<br>uniq, ms/loop: ' + (new Date() - d)/LOOPS)

var d = new Date();
for(var i = 0; i < LOOPS; i++)
    uniq_fast(a);
document.write('<br>uniq_fast, ms/loop: ' + (new Date() - d)/LOOPS)


ES6 provides the Set object, which makes things a whole lot easier:

function uniq(a) {
   return Array.from(new Set(a));
}

let uniq = a => [...new Set(a)];

Note that, unlike in Python, ES6 sets are iterated in insertion order, so this code preserves the order of the original array.

However, if you need an array with unique elements, why not use sets right from the beginning?
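For instance, collecting unique values in a Set from the start, converting to an array only if you eventually need one:

```javascript
// Collect into a Set from the beginning; duplicates are simply ignored on add.
let seen = new Set();
for (let name of ["Mike", "Matt", "Nancy", "Matt", "Nancy"]) {
    seen.add(name);
}

console.log(seen.has("Nancy")); // true
console.log([...seen]);         // ["Mike", "Matt", "Nancy"] - insertion order
```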


A "lazy", generator-based version of uniq can be built on the same basis:

  • take the next value from the argument
  • if it's been seen already, skip it
  • otherwise, yield it and add it to the set of already seen values

function* uniqIter(a) {
    let seen = new Set();

    for (let x of a) {
        if (!seen.has(x)) {
            seen.add(x);
            yield x;
        }
    }
}

// example:

function* randomsBelow(limit) {
    while (1)
        yield Math.floor(Math.random() * limit);
}

// note that randomsBelow is endless

count = 20;
limit = 30;

for (let r of uniqIter(randomsBelow(limit))) {
    console.log(r);
    if (--count === 0)
        break;
}

// exercise for the reader: what happens if we set `limit` less than `count` and why

@Roman Bataev 2012-02-10 15:26:19

filter and indexOf have been introduced in ECMAScript 5, so this will not work in old IE versions (<9). If you care about those browsers, you will have to use libraries with similar functions (jQuery, underscore.js etc.)

@Michael Robinson 2012-12-17 02:25:00

@RoderickObrist you might if you want your page to work in older browsers

@qw3n 2013-01-07 16:31:08

Why not make use of the array reference provided by the filter method to make the answer more general?

@georg 2013-01-07 20:07:19

@qw3n: readability. The 3rd parameter of filter() is rarely used, one has to look up the docs to figure out what it does. That said, your suggestion still deserves to be included in the answer, see edits.

@qw3n 2013-01-07 23:44:40

@thg435 well the reason I wondered is I was looking for this answer which was what I wanted, but in copying it over I didn't like the fact that it was so specific. So, I checked the docs myself to see if there was another parameter.

@seriyPS 2013-02-03 00:47:56

This is an O(n^2) solution, which can run very slowly on large arrays...

@Bruno 2013-06-06 12:55:00

Your first two functions will return [] for the array [ NaN, NaN ]. This is because NaN === NaN // false so as far as indexOf is concerned it fails to find it in the array. Although unlikely this is possible if the array element is the result of calculations that might return NaN
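For what it's worth, the Set-based version doesn't share this problem, since Set uses the SameValueZero comparison, which treats NaN as equal to itself:

```javascript
// indexOf uses ===, and NaN === NaN is false, so the filter version drops both:
console.log([NaN, NaN].filter((x, i, a) => a.indexOf(x) === i)); // []

// Set uses SameValueZero, which treats NaN as equal to itself:
console.log([...new Set([NaN, NaN])]); // [NaN]
```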

@gabeno 2013-08-01 13:14:42

Does any body have an idea how we can apply this solution to a use case like [[1,2,3],[4,5,6],[1,2,3]] to give [[1,2,3],[4,5,6]]?

@Charles Beattie 2014-06-16 14:02:57

Try this array: ["toString", "valueOf", "failed"]. toString and valueOf are stripped completely. Use Object.create(null) instead of {}.
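To make the comment concrete: lookup tables that test plain truthiness inherit Object.prototype members, so keys like "toString" appear "already seen" on their first occurrence. A prototype-free object avoids this. A sketch (the hasOwnProperty variants above are safe; the truthiness pattern is the vulnerable one):

```javascript
// Truthiness-based dedupe, parameterized on the lookup object.
function uniqTruthy(a, seen) {
    return a.filter(function (item) {
        return seen[item] ? false : (seen[item] = true);
    });
}

var input = ["toString", "valueOf", "failed"];
console.log(uniqTruthy(input, {}));                  // ["failed"] - inherited members look "seen"
console.log(uniqTruthy(input, Object.create(null))); // ["toString", "valueOf", "failed"]
```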

@Lawrence 2014-11-04 09:04:07

I was facing the same issue, but instead I figured maybe it was best to avoid duplicates in the first place. Since there is nothing like a unique List in JS, I decided to just add properties to an object; if you add the same key twice, it will just be overwritten, and the problem of duplicates is solved. No need to filter anymore, and all I need is just one for loop.

@georg 2014-11-04 09:06:21

@Lawrence: sure, but see the part 2 (Hashtables) above. You can only add strings this way, no objects.

@dashmug 2014-11-23 23:23:03

Not enough jQuery! This answer should do it. :-)

@elad.chen 2014-12-13 13:57:11

@georg in the last example, (uniq_fast), wouldn't it be easier to use out.push(item) instead of out[j++] = item ?. Update: never mind, you opted to avoid any function call altogether.. +1

@Dan 2015-04-08 20:51:40

@georg Need your help please, I've written code here to avoid duplicate insertion of values into an array, but it's not working. Can you please help me?

@quillbreaker 2015-05-14 19:50:38

minus one for shortage of dollar signs. grin

@Denis 2015-09-15 18:59:25

btw, the "Smart" but naïve way supports a third argument with the full array: uniqueArray = a.filter(function(item, pos, arr) { return arr.indexOf(item) == pos; })

@georg 2016-02-04 12:21:13

@vasa: thanks for your edit, it was rejected for some reason, but I restored it. ES6 code looks much better now!

@superluminary 2016-08-04 09:00:25

Convert to a Set, and then back again. Lovely.

@Eric Nguyen 2016-08-12 19:34:21

Anyone know how fast the Set conversion solution is, compared to the others?

@Washington Guedes 2016-09-29 14:47:39

Since the default sort order is according to string Unicode code points ... ([10, 2, 1].sort() results [1, 10, 2]), you'll need to use .sort((x,y)=>x-y) instead.

@georg 2016-09-29 18:04:43

@Guedes: this doesn't matter for this case, because whatever sorting logic you use, equal elements are always next to each other.

@mesqueeb 2017-02-25 06:43:19

I tried the uniqBy(a, key) function, but it said 'key is not a function' and does not exist... Don't I get to fill in a property name of the objects which are in my array?

@worc 2017-03-14 21:38:18

trying the code snippet, i don't get an order of magnitude improvement with the ugly solution any more. it's about twice as fast.

@georg 2017-03-14 22:59:50

@worc: right, things improve greatly over time.

@elboletaire 2017-06-18 22:05:35

Really nice post. Thanks for it, was really useful (and a really interesting reading). BTW, functions should be declared as const, unless you want to leave them open to be overwritten (which is not the case, and not a typical one).

@georg 2017-06-19 14:00:06

@elboletaire: thanks. My personal preference is to only use const for actual constant values (as in const pi = 3.14). I guess this is a bad naming decision in ES6 to use the const keyword for "immutable bindings". Obviously, a function is not a "constant", in any reasonable sense.

@elboletaire 2017-06-19 17:43:20

Yeah, but it makes sense in JavaScript, where you can easily override a function's code. I'm with you that it's strange, but I think it is something about JavaScript, not about ES6.

@Nicholas 2018-01-18 10:18:09

The one line solution <3

@Sudarshan Kalebere 2018-01-24 09:49:12

@georg hi can you please check this question… thank you...

@Hidayt Rahman 2018-02-22 10:48:10

Will ES6 effort support older browsers?

@Lug 2018-03-30 04:27:03

Keep in mind that if you have an array of objects, each one of them are different references, so in ES6 SET will not filter those

@Fred Gandt 2018-04-05 21:07:02

Suggestion: Reorder the possible methods, "best" first.

@Prashant Tapase 2018-07-04 12:41:00

How does (seen[item] = true) work? Check the hashtables code, I am not understanding it. Can you elaborate?

@A Jar of Clay 2018-10-03 16:49:46

Note: The Set version uses shallow comparison, so won't work if the elements in the array are objects.

@Ashfaque Rifaye 2018-10-23 06:47:08

Thanks Sir. Explanation is really good.

@Tugrul Emre Atalay 2019-06-02 11:55:00

if the elements of the array are objects, we should change the smart solution to compare via a unique key like "name": a.filter(function (item, index, self) { return self.findIndex(child => child.name === item.name) === index; })

@Soni Kumari 2019-07-27 03:30:14

Found exact answer I wanted here:…

@georg 2019-07-27 07:40:06

@SoniKumari: yep, looks like the author forgot to cite their sources. ;)

@gilly3 2013-02-06 22:32:59

The top answers have complexity of O(n²), but this can be done with just O(n) by using an object as a hash:

function getDistinctArray(arr) {
    var dups = {};
    return arr.filter(function(el) {
        var hash = el.valueOf();
        var isDup = dups[hash];
        dups[hash] = true;
        return !isDup;
    });
}

This will work for strings, numbers, and dates. If your array contains objects, the above solution won't work because when coerced to a string, they will all have a value of "[object Object]" (or something similar) and that isn't suitable as a lookup value. You can get an O(n) implementation for objects by setting a flag on the object itself:

function getDistinctObjArray(arr) {
    var distinctArr = arr.filter(function(el) {
        var isDup = el.inArray;
        el.inArray = true;
        return !isDup;
    });
    distinctArr.forEach(function(el) {
        delete el.inArray;
    });
    return distinctArr;
}

2019 edit: Modern versions of JavaScript make this a much easier problem to solve. Using Set will work, regardless of whether your array contains objects, strings, numbers, or any other type.

function getDistinctArray(arr) {
    return [...new Set(arr)];
}

The implementation is so simple, defining a function is no longer warranted.

@tusharmath 2013-09-06 11:11:01

Did you consider the performance hit in your method?

@gilly3 2013-09-06 15:57:01

@Tushar - Where do you see a performance issue?

@tusharmath 2013-09-07 06:17:39

This answer would not even work in the first place. check this

@tusharmath 2013-09-07 06:18:26

The best way would be to filter while sorting.

@gilly3 2013-09-09 18:41:46

@Tushar - Your gist gives a 404. No sorting algorithm has O(n) complexity. Sorting would not be faster.

@tusharmath 2013-09-10 20:05:29

@gilly3 2013-09-10 21:58:18

@Tushar - there are no actual duplicates in that array. If you want to remove objects from an array that have exactly the same properties and values as other objects in the array, you would need to write a custom equality checking function to support it.

@gilly3 2013-09-10 22:12:03

@Tushar - None of the answers on this page would remove any duplicates from such an array as is in your gist.

@Charles Beattie 2014-06-16 13:28:17

Consider using Object.create(null) instead of {}.

@Akin Hwan 2019-12-27 16:55:05

just note that IE is late to the party for Set

@bodich 2016-11-11 17:35:23

Here is code that is very simple to understand, and it works anywhere (even in PhotoshopScript). Check it out!

var peoplenames = new Array("Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl");

peoplenames = unique(peoplenames);

function unique(array){
    var len = array.length;
    for(var i = 0; i < len; i++) for(var j = i + 1; j < len; j++)
        if(array[j] == array[i]){
            array.splice(j--, 1);
            len--;
        }
    return array;
}

//*result* peoplenames == ["Mike","Matt","Nancy","Adam","Jenny","Carl"]

@ShAkKiR 2017-06-05 19:56:03

Solution 1

Array.prototype.unique = function() {
    var a = [];
    for (var i = 0; i < this.length; i++) {
        var current = this[i];
        if (a.indexOf(current) < 0) a.push(current);
    }
    return a;
}

Solution 2 (using Set)

Array.prototype.unique = function() {
    return Array.from(new Set(this));
}


var x=[1,2,3,3,2,1];
x.unique() //[1,2,3]


When I tested both implementations (with and without Set) for performance in Chrome, I found that the one with Set is much, much faster!

Array.prototype.unique1 = function() {
    var a = [];
    for (var i = 0; i < this.length; i++) {
        var current = this[i];
        if (a.indexOf(current) < 0) a.push(current);
    }
    return a;
}

Array.prototype.unique2 = function() {
    return Array.from(new Set(this));
}

var x=[];
for(var i=0;i<10000;i++){
    x.push(i % 1000); // sample data with duplicates (fill reconstructed)
}



@ken 2018-05-24 02:03:40

Upvote for the use of Set. I don't know the performance comparison though

@ShAkKiR 2018-05-25 07:04:14

I have read somewhere that an Array is faster than a Set (overall performance), But when I tested in chrome, the implementation with Set was much much faster! see the edited answer :)

@ShAkKiR 2018-06-06 10:11:59

better practice is to use Object.defineProperty(Array.prototype,"unique".. instead of Array.prototype.unique = ... See more info here…

@Demonblack 2018-06-18 13:43:50

the Set approach doesn't seem to work for me in Node. new Set([5,5]) seems to return [5,5] in some cases. I'm as baffled as you are. Edit: I found out what's happening. new Set([new Number(5), new Number(5)]) returns [5,5]. Apparently Node thinks the two number 5s are different if I instantiate them with new... which is honestly the stupidest thing I've ever seen.

@ShAkKiR 2018-08-28 13:47:51

@Demonblack This is a valid concern. x=new Number(5) and another y=new Number(5) will be two different objects, as opposed to just var x=5 and var y=5. The new keyword will create a new object. I know this explanation is obvious but that's all I know :)

@Demonblack 2018-08-28 15:38:23

@ShAkKiR After dabbling with JS for a while now I think I get it. 5 is a primitive for JS, and Number("5") is a constructor which gives you that primitive 5. JS being JS, you're allowed to call new even on classes that never meant you to do so, but it usually produces undesired behavior. Since Number wasn't meant to be used with new, it defaults to creating two different Number objects, as you said. And since they don't redefine the equality operator (because in js you can't) and they aren't special cases with autoboxing like in Java either, Set doesn't know they're supposed to be equal.
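A short sketch of the distinction being discussed: 5 is a primitive, while each new Number(5) is a distinct wrapper object, and Set compares objects by reference:

```javascript
console.log(new Set([5, 5]).size);                         // 1 - primitives compare equal
console.log(new Set([new Number(5), new Number(5)]).size); // 2 - two distinct wrapper objects

// Unwrapping back to primitives (e.g. with Number()) restores equality:
let wrapped = [new Number(5), new Number(5)];
console.log(new Set(wrapped.map(Number)).size);            // 1
```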

@Levi 2013-01-25 17:52:04

The following is more than 80% faster than the jQuery method listed (see tests below). It is an answer from a similar question a few years ago. If I come across the person who originally proposed it I will post credit. Pure JS.

var temp = {};
for (var i = 0; i < array.length; i++)
  temp[array[i]] = true;
var r = [];
for (var k in temp)
  r.push(k);
return r;

My test case comparison:

@seriyPS 2013-02-03 00:46:06

I added a faster version in revision 4. Please review!

@imma 2013-08-09 12:58:30

the test didn't seem to be using arrays??? i've added (yet another) one that seems to be consistently fast over different browsers (see : for (var n = array.length, result = [array[n--]], i; n--;) { i = array[n]; if (!(i in result)) result.push(i); } return result;

@Sumit Joshi 2017-09-14 06:46:21

use Array.filter() like this

var actualArr = ['Apple', 'Apple', 'Banana', 'Mango', 'Strawberry', 'Banana'];

console.log('Actual Array: ' + actualArr);

var filteredArr = actualArr.filter(function(item, index) {
  if (actualArr.indexOf(item) == index)
    return item;
});

console.log('Filtered Array: ' + filteredArr);

this can be made shorter in ES6 to

actualArr.filter((item,index,self) => self.indexOf(item)==index);

Here is nice explanation of Array.filter()

@Sketchy Coder 2017-09-14 07:03:58

Can you elaborate what you've done here? :-)

@Sketchy Coder 2017-09-14 08:56:01

Great! It helps users if you add that to your answer.

@DCR 2018-03-12 19:04:54

doesn't work when the array is an array of arrays

@Soumya Gangamwar 2019-08-08 06:24:14

does not work for case sensitive array

@MBJH 2017-10-25 13:54:15

for (i=0; i<originalArray.length; i++) {
    if (!newArray.includes(originalArray[i])) {
        newArray.push(originalArray[i]);
    }
}
@Teja 2017-11-26 14:19:14

love vanilla js. Thanks

@Ashutosh Jha 2017-06-15 11:05:40

You can simply do it in JavaScript, with the help of the second - index - parameter of the filter method:

var a = [2,3,4,5,5,4];
a.filter(function(value, index){ return a.indexOf(value) == index });

or in short hand

a.filter((v,i) => a.indexOf(v) == i)

@frozen 2017-07-16 03:36:58

this only works for an array containing primitives?

@Ashutosh Jha 2017-07-16 13:58:26

Yes , you are correct @frozen

@Hitmands 2017-07-17 14:43:18

this a.indexOf(v)==i should be a.indexOf(v) === a.lastIndexOf(v)

@Ashutosh Jha 2017-07-18 05:54:29

@Hitmands You are comparing from the right, I am comparing from the left, nothing else.

@Xenos 2017-12-20 10:36:33

Works also without requiring the a variable, as the array is the 3rd parameter of filter: [1/0, 2,1/0,2,3].filter((v,i,a) => a.indexOf(v) === i) (note that it also works nice with Infinity ☺ )

@cjjenkinson 2017-10-14 14:38:54

For anyone looking to flatten arrays with duplicate elements into one unique array:

function flattenUniq(arrays) {
  var args = Array.prototype.slice.call(arguments);

  var array = [].concat.apply([], args)

  var result = array.reduce(function(prev, curr){
    if (prev.indexOf(curr) < 0) prev.push(curr);
    return prev;
  }, []);

  return result;
}

@user1429980 2017-09-15 18:05:45

ES2015, 1-liner, which chains well with map, but only works for integers:

[1, 4, 1].sort().filter((current, next) => current !== next)

[1, 4]

@Kroltan 2017-10-05 16:18:34

That works with anything, but only removes sequential duplicates. e.g. [1,1,2,2,3,3] -> [1,2,3] but [1,2,3,1,2,3] -> [1,2,3,1,2,3]

@Xenos 2017-12-20 10:34:24

@Kroltan It's actually not a matter of sequential duplicates, but it's a big issue about understanding what's passed to filter: it's (value, index) not (current, next), so it would work for [1,4,1] but not for [2,4,2]...
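To make the mix-up concrete: filter passes (value, index), not (current, next), so the comparison only works by accident when values never collide with their indices:

```javascript
// Looks like "drop items equal to the next one", but actually compares value to index.
const dedupe = arr => arr.sort().filter((current, next) => current !== next);

console.log(dedupe([1, 4, 1])); // [1, 4]    - works by luck
console.log(dedupe([2, 4, 2])); // [2, 2, 4] - duplicates survive
```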

@Kroltan 2017-12-20 20:17:43

@Xenos You're right! I skimmed over it too fast xD

@shubham kucheria 2017-03-22 13:43:27

var uniqueCompnies = function(companyArray) {
    var arrayUniqueCompnies = [],
        found, x, y;

    for (x = 0; x < companyArray.length; x++) {
        found = undefined;
        for (y = 0; y < arrayUniqueCompnies.length; y++) {
            if (companyArray[x] === arrayUniqueCompnies[y]) {
                found = true;
                break;
            }
        }
        if ( ! found) {
            arrayUniqueCompnies.push(companyArray[x]);
        }
    }
    return arrayUniqueCompnies;
};

var arr = [
    "Adobe Systems Incorporated",
    "BlackRock, Inc.",
    "BlackRock, Inc.",
];

@Alexey Subach 2017-03-22 14:04:08

Please format the whole post

@Mohideen bin Mohammed 2017-08-03 14:18:30

Here is a simple method without any special libraries or special functions:

name_list = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];
get_uniq = name_list.filter(function(val,ind) { return name_list.indexOf(val) == ind; })

console.log("Original name list:"+name_list.length, name_list)
console.log("\n Unique name list:"+get_uniq.length, get_uniq)


@Jonca33 2017-08-01 01:39:01

One line:

let names = ['Mike','Matt','Nancy','Adam','Jenny','Nancy','Carl', 'Nancy'];
let dup = [...new Set(names)];

@chetan92 2018-04-14 21:03:31

Best answer, if you're using ES6

@Sancarn 2017-07-06 21:01:51

Although ES6 Solution is the best, I'm baffled as to how nobody has shown the following solution:

function removeDuplicates(arr){
    var o = {};
    arr.forEach(function(e){ o[e] = true; });
    return Object.keys(o);
}

The thing to remember here is that objects MUST have unique keys. We are exploiting this to remove all the duplicates. I would have thought this would be the fastest solution (before ES6).

Bear in mind though that this also sorts the array.
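Two caveats of the Object.keys approach are worth spelling out: object keys are always strings, and integer-like keys are enumerated in ascending numeric order (which is the "sorting" mentioned above). A sketch using a completed version of the function:

```javascript
// Exploit object key uniqueness; keys come back as strings, integer-like
// keys in ascending numeric order.
function removeDuplicates(arr) {
    var o = {};
    arr.forEach(function (e) { o[e] = true; });
    return Object.keys(o);
}

console.log(removeDuplicates([3, 1, 2, 3])); // ["1", "2", "3"] - strings, reordered
```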

@gnujoow 2017-06-16 07:30:46

const numbers = [1, 1, 2, 3, 4, 4];

function unique(array) {
  return array.reduce((a,b) => {
    let isIn = a.find(element => {
        return element === b;
    });
    if (!isIn) {
      a.push(b);
    }
    return a;
  }, []);
}

let ret = unique(numbers); // [1, 2, 3, 4]

This is the way using reduce and find.

@Dan Zuzevich 2017-06-09 11:43:33

This solution uses a new array and an object map inside the function. All it does is loop through the original array and add each integer into the object map. If, while looping through the original array, it comes across a repeat, the

`if (!unique[int])`

catches this because there is already a key property on the object with the same number. Thus, skipping over that number and not allowing it to be pushed into the new array.

    function removeRepeats(ints) {
      var unique = {}
      var newInts = []

      for (var i = 0; i < ints.length; i++) {
        var int = ints[i]

        if (!unique[int]) {
          unique[int] = 1
          newInts.push(int)
        }
      }
      return newInts
    }

    var example = [100, 100, 100, 100, 500]
    console.log(removeRepeats(example)) // prints [100, 500]

@John Slegers 2017-03-29 09:31:39

A simple but effective technique, is to use the filter method in combination with the filter function(value, index){ return this.indexOf(value) == index }.

Code example :

var data = [2,3,4,5,5,4];
var filter = function(value, index){ return this.indexOf(value) == index };
var filteredData = data.filter(filter, data );

document.body.innerHTML = '<pre>' + JSON.stringify(filteredData, null, '\t') +  '</pre>';

See also this Fiddle.

@Pedro Ferreira 2017-04-03 13:17:20

Genious! And, for instances, if you want to have the repeated ones, (instead of removing them) all you have to do is replace this.indexOf(value) == index by this.indexOf(value, index+1) > 0 Thanks!

@Pedro Ferreira 2017-04-04 22:03:47

You could even resume it to a single "filter" line: filterData = data.filter((v, i) => (data.indexOf(v) == i) );

@Pedro Ferreira 2017-04-05 00:12:10

Last time I bother! Sorry... picking up my 1st answer, in 2 lines you could get a JSON var JSON_dupCounter = {}; with the repeated ones and how many times they were repeated: data.filter((testItem, index) => (data.indexOf(testItem, index + 1) > 0)).forEach((found_duplicated) => (JSON_dupCounter[found_duplicated] = (JSON_dupCounter [found_duplicated] || 1) + 1));

@frozen 2017-07-16 03:37:48

this only works for arrays of primitives?

@John Slegers 2017-07-16 09:07:58

@frozen : If works with everything where == can be used to determine equality. So, if you're dealing with eg. arrays, objects or functions, the filter will work only for different entries that are references to the same array, object or function (see demo). If you want to determine equality based on different criteria, you'll need to include those criteria in your filter.

@stanislavs 2017-03-24 13:32:23

aLinks is a simple JavaScript array. If an element already exists earlier in the array than the position its index shows, it is a duplicate record and gets deleted. I repeat until all duplicates are cancelled; one pass over the array can cancel several records.

var srt_ = 0;
var pos_ = 0;
do {
    srt_ = 0;
    for (var i in aLinks) {
        pos_ = aLinks.indexOf(aLinks[i].valueOf(), 0);
        if (pos_ < i) {
            delete aLinks[i];
            srt_++;
        }
    }
} while (srt_ != 0);

@ofir_aghai 2017-03-15 14:26:39

So the options are:

let a = [11,22,11,22];
let b = [];

b = [...new Set(a)];
// b = [11, 22]

b = Array.from(new Set(a));
// b = [11, 22]

b = a.filter((val, i) => {
  return a.indexOf(val) == i;
});
// b = [11, 22]

@Anand Somani 2017-02-22 15:17:49

Quick and Easy using lodash - var array = ["12346","12347","12348","12349","12349"]; console.log(_.uniqWith(array,_.isEqual));

@Vishnu 2017-01-20 06:37:51

var lines = ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Nancy", "Carl"];
var uniqueNames = [];

for(var i = 0; i < lines.length; i++) {
    if(uniqueNames.indexOf(lines[i]) == -1)
        uniqueNames.push(lines[i]);
}
if(uniqueNames.indexOf(uniqueNames[uniqueNames.length-1]) != -1)
    uniqueNames.pop();
for(var i = 0; i < uniqueNames.length; i++)
    console.log(uniqueNames[i]);

@Santosh 2017-03-16 17:51:24

Your code works great, but the 'uniqueNames.pop()' line is removing the last array element for no reason. It makes 'Carl' not get listed from the array.

@Roman Bataev 2012-02-10 15:13:22

Quick and dirty using jQuery:

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];
var uniqueNames = [];
$.each(names, function(i, el){
    if($.inArray(el, uniqueNames) === -1) uniqueNames.push(el);
});

@Oscar Pérez 2016-09-15 17:18:00

A little overkill, but beautiful code.

@Casey Kuball 2016-10-25 03:10:55

Isn't this O(n^2)?

@Aus 2016-11-08 15:08:07

Low performance, but I cannot come up with code as short as this.

@iblamefish 2016-12-25 07:52:35

@Darthfett hash table lookup is O(1)

@iblamefish 2016-12-25 07:55:10

This can now be done without jQuery using Array.prototype.filter

@Casey Kuball 2016-12-26 07:52:53

@iblamefish comment is now out-of-date, as the original solution was edited (it previously used $.inArray).

@iblamefish 2016-12-26 13:44:58

@Darthfett ah yes, I see it's been changed now. The $.inArray solution is indeed O(n^2)

@Aditya 2017-01-02 10:56:10

really great inside angular js controller as well...Thanks :)

@Matej Voboril 2017-04-11 18:08:59

wouldn't mind a non-jquery answer for those who don't use it

@sg28 2017-08-29 08:01:53

can anyOne share a O(n) or O(n log n) close to a scalable solution.Thank you

@Casey Kuball 2017-09-21 16:35:36

As this was reverted back to the original inArray solution by a reputable person, I am going to again mention: this solution is O(n^2), making it inefficient.

@EyoelD 2017-10-30 02:43:41

I think he said javascript not jquery. Thanks.

@Jorge Fuentes González 2017-11-14 11:26:55

@EyoelD The question says jQuery solutions are accepted too. Btw, I don't know why you'd use a jQuery solution when a non-jQuery one can be done.

@robertmain 2018-08-13 17:48:29

@JorgeFuentesGonzález because Stackoverflow has this weird thing for jQuery

@Soni Kumari 2019-07-27 03:28:43

Find more about the answer of this question brief:…

@Deke 2016-12-25 05:18:30

Simplest one I've run into so far, in ES6.

 var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl", "Mike", "Nancy"]

 var noDupe = Array.from(new Set(names))

@JMichaelTX 2017-04-06 20:08:44

For Mac users, even though this is an ES6 function, it works in macOS 10.11.6 El Capitan, using the Script Editor.

@Matt MacPherson 2016-12-02 21:13:13

function removeDuplicates (array) {
  var sorted = array.slice().sort()
  var result = []

  sorted.forEach((item, index) => {
    if (sorted[index + 1] !== item) {
      result.push(item)
    }
  })
  return result
}

@vin_schumi 2013-04-09 14:33:09

A slight modification of thg435's excellent answer to use a custom comparator:

function contains(array, obj) {
    for (var i = 0; i < array.length; i++) {
        if (isEqual(array[i], obj)) return true;
    }
    return false;
}

function isEqual(obj1, obj2) {
    // replace with whatever equality test fits your data
    if (obj1 === obj2) return true;
    return false;
}

function removeDuplicates(ary) {
    var arr = [];
    return ary.filter(function(x) {
        return !contains(arr, x) && arr.push(x);
    });
}
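A usage sketch: with a comparator that checks an `id` field (a hypothetical shape, assumed here for illustration), the same pattern de-duplicates objects, which plain indexOf/Set approaches would treat as all distinct:

```javascript
// hypothetical comparator on an `id` field -- swap in whatever equality you need
function isEqual(obj1, obj2) {
    return obj1.id === obj2.id;
}

function contains(array, obj) {
    for (var i = 0; i < array.length; i++) {
        if (isEqual(array[i], obj)) return true;
    }
    return false;
}

function removeDuplicates(ary) {
    var arr = [];
    return ary.filter(function(x) {
        // arr.push returns the new length, which is always truthy here
        return !contains(arr, x) && arr.push(x);
    });
}

var users = [{id: 1, name: "Mike"}, {id: 2, name: "Nancy"}, {id: 1, name: "Mike"}];
console.log(removeDuplicates(users));
// → [{id: 1, name: "Mike"}, {id: 2, name: "Nancy"}]
```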

@Juhan 2015-02-27 09:53:04

Go for this one:

var uniqueArray = duplicateArray.filter(function(elem, pos) {
    return duplicateArray.indexOf(elem) == pos;
});

Now uniqueArray contains no duplicates.

@RegarBoy 2019-07-07 21:23:23

I think this is the best one!

@Casey Kuball 2012-02-10 15:03:50

Vanilla JS: Remove duplicates using an Object like a Set

You can always try putting it into an object, and then iterating through its keys:

function remove_duplicates(arr) {
    var obj = {};
    var ret_arr = [];
    for (var i = 0; i < arr.length; i++) {
        obj[arr[i]] = true;
    }
    for (var key in obj) {
        ret_arr.push(key);
    }
    return ret_arr;
}

Vanilla JS: Remove duplicates by tracking already seen values (order-safe)

Or, for an order-safe version, use an object to store all previously seen values, and check values against it before adding them to the array.

function remove_duplicates_safe(arr) {
    var seen = {};
    var ret_arr = [];
    for (var i = 0; i < arr.length; i++) {
        if (!(arr[i] in seen)) {
            seen[arr[i]] = true;
            ret_arr.push(arr[i]);
        }
    }
    return ret_arr;
}


ECMAScript 6: Use the new Set data structure (order-safe)

ECMAScript 6 adds the new Set Data-Structure, which lets you store values of any type. Set.values returns elements in insertion order.

function remove_duplicates_es6(arr) {
    let s = new Set(arr);
    let it = s.values();
    return Array.from(it);
}

Example usage:

a = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

b = remove_duplicates(a);
// b:
// ["Adam", "Carl", "Jenny", "Matt", "Mike", "Nancy"]

c = remove_duplicates_safe(a);
// c:
// ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Carl"]

d = remove_duplicates_es6(a);
// d:
// ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Carl"]

@amenthes 2014-08-26 08:33:55

In more recent browsers, you could even do var c = Object.keys(b). It should be noted that this approach will only work for strings, but it's alright, that's what the original question was asking for.

@Juan Mendes 2015-05-27 19:38:27

It should also be noted that you may lose the order of the array because objects don't keep their properties in order.

@Casey Kuball 2015-06-23 21:36:23

@JuanMendes I have created an order-safe version, which simply copies to the new array if the value has not been seen before.

@kittu 2018-10-11 16:46:18

What is happening on this line obj[arr[i]] = true; ??

@Casey Kuball 2018-10-12 02:54:48

@kittu, that is getting the ith element of the array, and putting it into the object (being used as a set). The key is the element, and the value is true, which is entirely arbitrary, as we only care about the keys of the object.

@kittu 2018-10-12 04:55:34

@Darthfett So the element will become key here and value could be anything/arbitrary?

@Casey Kuball 2018-10-12 05:12:30

@kittu precisely. If the element already exists in the object, it doesn't matter, as there can only be one entry for each unique value, thus eliminating the duplicate.
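A small standalone check of the caveat that comes with this object-as-set trick: object keys are always strings, so a number and its string form collapse into one entry, and every result comes back as a string:

```javascript
function remove_duplicates(arr) {
    var obj = {};
    var ret_arr = [];
    for (var i = 0; i < arr.length; i++) {
        obj[arr[i]] = true;   // the value is arbitrary; only the key matters
    }
    for (var key in obj) {
        ret_arr.push(key);
    }
    return ret_arr;
}

// 1 and "1" coerce to the same key "1"
console.log(remove_duplicates([1, "1", 2]));
// → ["1", "2"]
```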

@user6445533 2016-09-04 12:41:30

Generic Functional Approach

Here is a generic and strictly functional approach with ES2015:

// small, reusable auxiliary functions

const apply = f => a => f(a);

const flip = f => b => a => f(a) (b);

const uncurry = f => (a, b) => f(a) (b);

const push = x => xs => (xs.push(x), xs);

const foldl = f => acc => xs => xs.reduce(uncurry(f), acc);

const some = f => xs => xs.some(apply(f));

// the actual de-duplicate function

const uniqueBy = f => foldl(
   acc => x => some(f(x)) (acc)
    ? acc
    : push(x) (acc)
 ) ([]);

// comparators

const eq = y => x => x === y;

// string equality case insensitive :D
const seqCI = y => x => x.toLowerCase() === y.toLowerCase();

// mock data

const xs = [1,2,3,1,2,3,4];

const ys = ["a", "b", "c", "A", "B", "C", "D"];

console.log( uniqueBy(eq) (xs) );

console.log( uniqueBy(seqCI) (ys) );

We can easily derive unique from uniqueBy, or use the faster implementation utilizing Sets:

const unique = uniqueBy(eq);

// const unique = xs => Array.from(new Set(xs));

Benefits of this approach:

  • generic solution by using a separate comparator function
  • declarative and succinct implementation
  • reuse of other small, generic functions

Performance Considerations

uniqueBy isn't as fast as an imperative implementation with loops, but it is way more expressive due to its genericity.

If you identify uniqueBy as the cause of a concrete performance penalty in your app, replace it with optimized code. That is, write your code first in a functional, declarative way. Afterwards, if you encounter performance issues, optimize the code at the locations that cause the problem.

Memory Consumption and Garbage Collection

uniqueBy utilizes mutations (push(x) (acc)) hidden inside its body. It reuses the accumulator instead of throwing it away after each iteration. This reduces memory consumption and GC pressure. Since this side effect is wrapped inside the function, everything outside remains pure.

@HBP 2013-02-11 21:18:54

A single line version using array filter and indexOf functions:

arr = arr.filter (function (value, index, array) {
    return array.indexOf (value) == index;
});

@neelmeg 2016-06-22 04:59:48

care to explain how it eliminates dupes?

@HBP 2016-06-22 07:06:17

@web_dev: it doesn't!! I have corrected a previous edit which broke the code. Hope it makes more sense now. Thanks for asking!

@Casey Kuball 2016-10-18 16:59:24

This unfortunately has poor performance if this is a large array -- arr.indexOf is O(n), which makes this algorithm O(n^2)

@Ivo 2015-09-11 23:44:35

The most concise way to remove duplicates from an array using native javascript functions is to use a sequence like below:

vals.sort().reduce(function(a, b){ if (b != a[0]) a.unshift(b); return a }, [])
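One detail to be aware of: because unshift places each new value at the front of the accumulator, this one-liner returns its result in reverse-sorted order. A standalone check:

```javascript
var vals = ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Nancy", "Carl"];

// sort groups duplicates together; unshift prepends, so the output is reversed
var uniq = vals.sort().reduce(function (a, b) {
    if (b != a[0]) a.unshift(b);
    return a;
}, []);

console.log(uniq);
// → ["Nancy", "Mike", "Matt", "Jenny", "Carl", "Adam"]
```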

There's no need for slice or indexOf within the reduce function, like I've seen in other examples! It makes sense to use indexOf along with a filter function though:

vals.filter(function(v, i, a){ return i == a.indexOf(v) })

Yet another ES6(2015) way of doing this that already works on a few browsers is:

Array.from(new Set(vals))

or even using the spread operator:

[...new Set(vals)]
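Unlike the object-key approaches, a Set distinguishes value types, so a number and its string form both survive, in insertion order. A quick standalone check:

```javascript
// Set uses SameValueZero comparison, so 1 and "1" stay distinct
var mixed = [1, "1", 1, "1", 2];
console.log([...new Set(mixed)]);
// → [1, "1", 2]
```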


@caiohamamura 2015-10-28 01:38:40

Set is great and very intuitive for those used to python. Too bad they do not have those great (union, intersect, difference) methods.

@Alexander Dixon 2016-06-10 14:07:37

I went with the simplistic one line of code that utilizes the set mechanic. This was for a custom automation task so I was not leery of using it in the latest version of Chrome (within jsfiddle). However, I would still like to know the shortest all browser compliant way to de-dupe an array.

@Ivo 2016-06-10 14:17:07

sets are part of the new specification, you should use the sort/reduce combo to assure cross-browser compatibility @AlexanderDixon

@Alexander Dixon 2016-06-10 14:56:58

.reduce() is not cross-browser compatible as I would have to apply a poly-fill. I appreciate your response though.…
