[SOLVED] Get all unique values in a JavaScript array (remove duplicates)

By Mottie

I have an array of numbers that I need to make sure are unique. I found the code snippet below on the internet and it works great until the array has a zero in it. I found this other script here on SO that looks almost exactly like it, but it doesn't fail.

So for the sake of helping me learn, can someone help me determine where the prototype script is going wrong?

``````Array.prototype.getUnique = function() {
  var o = {}, a = [], i, e;
  for (i = 0; e = this[i]; i++) { o[e] = 1; }
  for (e in o) { a.push(e); }
  return a;
};
``````
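A note on the failure mode: the first loop's condition is the assignment `e = this[i]`, so iteration stops at the first falsy element (0, '', null, false, NaN), and a zero therefore never makes it into `o`. A minimal fixed sketch (illustrative only, iterating by length instead):

```javascript
// The assignment-as-condition is the bug; index by length instead.
Array.prototype.getUnique = function () {
  var o = {}, a = [], i, e;
  for (i = 0; i < this.length; i++) {
    e = this[i];
    o[e] = e; // keys are stringified; keep the original value
  }
  for (e in o) { a.push(o[e]); }
  return a;
};

console.log([1, 0, 2, 0, 1].getUnique()); // [0, 1, 2]
```

Note the caveat shared by all object-key approaches: keys are stringified, so mixed types can collide.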

To filter out undefined and null values, because most of the time you do not need them:

``````const uniques = myArray.filter(e => e).filter((e, i, a) => a.indexOf(e) === i);
``````

or

``````const uniques = [...new Set(myArray.filter(e => e))];
``````
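Worth noting: `filter(e => e)` drops every falsy value, so 0, '' and false disappear along with null and undefined. A quick demo with sample input (assumed for illustration):

```javascript
// filter(e => e) removes ALL falsy values before deduplication.
const myArray = [0, 1, null, 1, undefined, 2, 2, ''];
const uniques = [...new Set(myArray.filter(e => e))];
console.log(uniques); // [1, 2] (0 and '' are dropped as falsy too)
```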

ES6-based solution:

``````var arr = [2,3,4,2,3,4,2];
const result = [...new Set(arr)];
console.log(result);
``````

Magic

``````a.filter(( t={}, e=>!(t[e]=e in t) ))
``````

O(n) performance; we assume your array is in `a`. (Incorporates the +Jeppe improvement from the comments.)

``````var a1 = [5,6,0,4,9,2,3,5,0,3,4,1,5,4,9];
var a2 = [[2, 17], [2, 17], [2, 17], [1, 12], [5, 9], [1, 12], [6, 2], [1, 12]];

let nd = (a) => a.filter((t={},e=>!(t[e]=e in t)))

// print
let c = x => console.log(JSON.stringify(x));
c( nd(a1) );
c( nd(a2) );``````
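A rough long-form deconstruction of the one-liner (my reading of it; the comments below ask for exactly this): the comma expression `(t={}, callback)` evaluates `t = {}` once and hands `filter` the callback, which records each element's stringified key and keeps only first occurrences.

```javascript
// Long-form equivalent of a.filter((t={}, e=>!(t[e]=e in t))) (sketch):
function noDuplicates(a) {
  const t = {};            // the comma expression creates this map once
  return a.filter(e => {
    const seen = e in t;   // O(1): was e's (stringified) key recorded?
    t[e] = seen;           // record the key (the stored value is irrelevant)
    return !seen;          // keep only the first occurrence
  });
}

console.log(noDuplicates([5, 6, 0, 5, 6])); // [5, 6, 0]
```

Because keys are stringified, inner arrays like `[2, 17]` compare via their string form `"2,17"`, which is why the `a2` example above also works.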

@Ondřej Želazko 2019-01-08 14:16:10

This looks so super cool that, without a solid explanation, I feel you're gonna mine bitcoins when I run this.

@Ondřej Želazko 2019-01-09 09:49:28

What I meant is that you should expand your answer with some explanation and a commented deconstruction of it. Don't expect people to find answers like this useful. (Though it really looks cool and probably works.)

@Jeppe 2019-01-13 20:21:15

Not magic, but much like the "Set" answers, using O(1) key lookups in the dictionary. Do you need to increment the counters though? How about "e=>!(t[e]=e in t)". Nice answer though.

@Kamil Kiełczewski 2019-01-14 03:32:29

@Jeppe when I run your improvement I experience an aha effect (before, I didn't know that I could use the `in` operator outside of a `for` loop :P). Thank you. I appreciate it and will give +2 to your other good answers.

@Junaid Khan 2018-12-12 12:06:21

``````var numbers = [1,1,2,3,4,4];

function unique(dupArray) {
  return dupArray.reduce(function (previous, num) {
    if (previous.find(function (item) {
      return item == num;
    })) {
      return previous;
    } else {
      previous.push(num);
      return previous;
    }
  }, []);
}

var check = unique(numbers);
console.log(check);
``````

@user3591464 2018-11-29 15:03:44

The Object answer above does not seem to work for me in my use case with Objects.

I have modified it as follows:

``````Array.prototype.getUnique = function () {
  var j = {};

  this.forEach(function (v) {
    var typ = typeof v;
    v = (typ === 'object') ? JSON.stringify(v) : v;
    j[v + '::' + typ] = v;
  });

  return Object.keys(j).map(function (v) {
    if (v.indexOf('::object') > -1) {
      return JSON.parse(j[v]);
    }
    return j[v];
  });
};
``````

This seems to now work correctly for objects, arrays, arrays with mixed values, booleans, etc.

@shunryu111 2018-10-11 13:30:32

I had a slightly different problem, where I needed to remove objects with duplicate id properties from an array. This worked:

``````let objArr = [ { id: '123' }, { id: '123' }, { id: '456' } ];

objArr = objArr.reduce((acc, cur) => [
  ...acc.filter((obj) => obj.id !== cur.id), cur
], []);
``````

@tjacks3 2018-08-23 14:45:53

I have a solution that uses the ES6 `reduce` and `find` array helper methods to remove duplicates.

``````let numbers = [2,2,3,3,5,6,6];

const removeDups = array => {
  return array.reduce((acc, inc) => {
    if (!acc.find(i => i === inc)) {
      acc.push(inc);
    }
    return acc;
  }, []);
};

removeDups(numbers); // [2, 3, 5, 6]
``````

@BazSTR 2018-07-13 09:43:42

Do it with Lodash and an identity lambda function; just define it before using your object:

``````const _ = require('lodash');
...
_.uniqBy([{a:1,b:2},{a:1,b:2},{a:1,b:3}], v=>v.a.toString()+v.b.toString())
_.uniq([1,2,3,3,'a','a','x'])
``````

and will have:

``````[{a:1,b:2},{a:1,b:3}]
[1,2,3,'a','x']
``````

(This is the simplest way.)

@TLindig 2013-01-21 12:46:24

With JavaScript 1.6 / ECMAScript 5 you can use the native `filter` method of an Array in the following way to get an array with unique values:

``````function onlyUnique(value, index, self) {
  return self.indexOf(value) === index;
}

// usage example:
var a = ['a', 1, 'a', 2, '1'];
var unique = a.filter(onlyUnique); // returns ['a', 1, 2, '1']
``````

The native method `filter` will loop through the array and leave only those entries that pass the given callback function `onlyUnique`.

`onlyUnique` checks whether the given value is the first occurrence. If not, it must be a duplicate and will not be copied.

This solution works without any extra library like jQuery or prototype.js.

It works for arrays with mixed value types too.

For old browsers (< IE9) that do not support the native methods `filter` and `indexOf`, you can find workarounds in the MDN documentation for filter and indexOf.

If you want to keep the last occurrence of a value, simply replace `indexOf` with `lastIndexOf`.
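For instance, a small sketch of that `lastIndexOf` variant, using the same example array:

```javascript
// Keep the LAST occurrence of each value instead of the first.
function onlyLastUnique(value, index, self) {
  return self.lastIndexOf(value) === index;
}

var a = ['a', 1, 'a', 2, '1'];
console.log(a.filter(onlyLastUnique)); // [1, 'a', 2, '1']
```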

With ES6 this can be shortened to:

``````// usage example:
var myArray = ['a', 1, 'a', 2, '1'];
var unique = myArray.filter((v, i, a) => a.indexOf(v) === i);

// unique is ['a', 1, 2, '1']
``````

Thanks to Camilo Martin for the hint in the comments.

ES6 has a native object `Set` to store unique values. To get an array with unique values, you can now do this:

``````var myArray = ['a', 1, 'a', 2, '1'];

let unique = [...new Set(myArray)];

// unique is ['a', 1, 2, '1']
``````

The constructor of `Set` takes an iterable object, like an Array, and the spread operator `...` transforms the set back into an Array. Thanks to Lukas Liese for the hint in the comments.

@Mottie 2013-01-21 14:02:57

+1 Thanks for sharing! The only issue I see is that IE versions < 9 don't have an array `indexOf` function (obviously), which is why the other answers are using loops.

@TLindig 2013-02-13 09:42:37

@Mottie Exactly, because <= IE8 does not have JavaScript 1.6 support. If you follow the filter link you will find a browser-compatibility table at the bottom.

@Jack Franzen 2013-11-23 10:11:12

This solution will run much slower, unfortunately. You're looping twice: once with filter and once with indexOf.

@TLindig 2013-11-23 17:40:45

@JackFranzen Slower than what? The solution from Rafael? Rafael's solution does not work for mixed-type arrays. For my example `['a', 1, 'a', 2, '1']` you would get `['a', 1, 2]`. But this is not what I expected. BTW, "much slower" is very relative.

@Jack Franzen 2013-11-26 17:47:27

I see. You need to hash out the number one to be ###1 or something unique so that the hash key doesn't mess up. Afterwards, you need to convert back. It's unbelievable that you'd have a data set where it'd be impossible to make one of these keys; that's just disorganization.

@L S 2016-02-17 14:41:09

While I'm aware the asker specified a solution for a list of numbers, I want to point out that this may not work with other data types, e.g. strings of differing case, or objects.

@Camilo Martin 2016-07-24 08:43:19

In modern JS: `.filter((v,i,a)=>a.indexOf(v)==i)` (fat arrow notation).

@TLindig 2016-09-19 09:49:18

@Nico: show me one that does it better! You will not find one. In an unsorted array each entry must be compared with all other entries. So every solution will get exponentially slower.

@Nico 2016-09-24 11:25:07

@TLindig: try to get O(n) solution here @ codility.com/programmers/task/odd_occurrences_in_array

@Dan Green-Leipciger 2016-11-10 21:33:39

Probably shouldn't put personally identifiable info on Stack Overflow

@Lukas 2016-11-19 15:07:06

`let unique_values = [...new Set(random_array)];` developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/…

@Dan Nissenbaum 2017-01-22 21:01:53

How about sorting the array first, then finding unique values? Seems this should be `O(n log n)` - from stackoverflow.com/a/8306539/368896

@Dan Nissenbaum 2017-01-22 21:08:11

For a much more detailed answer, including many possibilities - such as sorting first, and dealing with varying data types - see stackoverflow.com/a/9229821/368896

@Jakub Synowiec 2017-02-26 22:36:38

While this might be the shortest solution, using a Set as a cache (and checking for values with the Set.has() method) in a loop is ~2 times faster than spreading a Set. This might be important for very large arrays.
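The cached-`Set` loop the comment describes might look like this (a sketch, not the commenter's code):

```javascript
// A loop with a Set used purely as an O(1) seen-cache (sketch):
function uniqueWithSetCache(arr) {
  const seen = new Set();
  const out = [];
  for (const v of arr) {
    if (!seen.has(v)) { // O(1) membership check
      seen.add(v);
      out.push(v);
    }
  }
  return out;
}

console.log(uniqueWithSetCache([1, 1, 2, 3, 3])); // [1, 2, 3]
```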

@Andre 2017-09-01 18:32:07

Does not work with objects.

`let unique = [...new Set(myArray)]` threw a type error for me in TypeScript - anyone have an example for that?

@monzonj 2017-10-26 06:57:49

@BradGreens. Because TypeScript is strongly typed, you cannot do this kind of hack. Try `Array.from(new Set(myArray));`

@Manuel 2017-10-31 10:40:11

How would `Set` be used with an array of dictionaries where a specific key should be unique?

@Arthur Tacca 2017-12-09 18:52:47

@Nico This answer is O(n²), which is polynomial time, not exponential time. Exponential growth is e.g. O(2^n).

@Nico 2017-12-12 10:04:43

I stand corrected, my dudes: it's polynomial, not exponential, and there is a big difference.

@Jakub 2018-02-15 23:44:15

A similar function for finding duplicated values is also useful sometimes: `function onlyDuplicates(value, index, self) { return self.indexOf(value) !== index; }` Note how each duplicated value is returned as many times as it is duplicated.

@Cristi Mihai 2018-08-21 13:24:16

Interestingly enough, the arrow function method is the fastest (but varies on your dataset): jsben.ch/remd9

@Coder of Code 2018-10-05 04:39:31

Just a heads up: `[...new Set(myArray)]` fails on IE 11.

@Morris S 2018-06-26 18:46:35

You can use Ramda.js, a functional javascript library to do this:

``````var unique = R.uniq([1, 2, 1, 3, 1, 4])
console.log(unique)``````
``<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.25.0/ramda.js"></script>``

@NikeshPathania 2018-06-11 11:34:39

If you're okay with extra dependencies, or you already have one of the libraries in your codebase, you can remove duplicates from an array in place using LoDash (or Underscore).

Usage

If you don't have it in your codebase already, install it using npm:

``````npm install lodash
``````

Then use it as follows:

``````import _ from 'lodash';
let idArray = _.uniq([
  1,
  2,
  3,
  3,
  3
]);
console.dir(idArray);
``````

Out:

``````[ 1, 2, 3 ]
``````

@chinmayan 2018-04-09 03:49:15

We can do this using ES6 sets:

``````var duplicatedArray = [1,2,3,4,5,1,1,1,2,3,4];
var uniqueArray = Array.from(new Set(duplicatedArray));
``````

The output will be:

``````uniqueArray = [1,2,3,4,5];
``````

@daviestar 2018-02-01 16:36:31

Strange this hasn't been suggested before. To remove duplicates by object key (`id` below) in an array, you can do something like this:

``````const uniqArray = array.filter((obj, idx, arr) => (
arr.findIndex((o) => o.id === obj.id) === idx
))
``````

@Joginder Pawan Kumar 2018-01-21 12:06:27

You can make use of arrays' helper functions reduce() and some() to achieve your result. Check my code snippet:

``````var arrayWithDuplicates = [0, 0, 1, 2, 3, 3, 4, 4, 'a', 'a', '', '', null, null];

var arrayWithUniqueValues = arrayWithDuplicates
  .reduce((previous, item) => {
    if (!previous.some(element => element === item)) {
      previous.push(item);
    }
    return previous;
  }, []);

console.log('arrayWithUniqueValues', arrayWithUniqueValues)``````

@John Smith 2017-12-19 20:54:44

Similar to @sergeyz's solution, but more compact, using shorthand formats such as arrow functions and `array.includes`. Warning: JSLint will complain due to the use of the logical OR and the comma operator. (Still perfectly valid JavaScript, though.)

``````my_array.reduce((a,k)=>(a.includes(k)||a.push(k),a),[])
``````

@A.T. 2015-10-14 09:42:34

Updated answer for ES6/ES2015: Using the Set, the single line solution is:

``````var items = [4,5,4,6,3,4,5,2,23,1,4,4,4]
var uniqueItems = Array.from(new Set(items))
``````

Which returns

``````[4, 5, 6, 3, 2, 23, 1]
``````

As le_m suggested, this can also be shortened using the spread operator, like:

``````var uniqueItems = [...new Set(items)]
``````

@Alexander Goncharov 2016-10-24 13:49:59

Notice that inner arrays wouldn't be deduplicated: `Array.from(new Set([[1,2],[1,2],[1,2,3]]))`

@Simoyw 2017-03-09 10:08:40

Performance of this solution compared to `myArray.filter((v, i, a) => a.indexOf(v) === i);`?

@mortb 2017-04-05 09:14:11

Please note that if you use the `Set` and add objects instead of primitive values it will contain unique references to the objects. Thus the set `s` in `let s = new Set([{Foo:"Bar"}, {Foo:"Bar"}]);` will return this: `Set { { Foo: 'Bar' }, { Foo: 'Bar' } }` which is a `Set` with unique object references to objects that contain the same values. If you write `let o = {Foo:"Bar"};` and then create a set with two references like so: `let s2 = new Set([o,o]);`, then s2 will be `Set { { Foo: 'Bar' } }`

@mortb 2017-04-05 09:22:11

So it is a bit cumbersome to use `Set`s to get unique objects
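One common workaround (an illustrative sketch, not from this thread): key the objects by a chosen property in a `Map` and take the map's values; later entries overwrite earlier ones with the same key.

```javascript
// Keying objects by a chosen property in a Map (hypothetical data):
const arr = [{ foo: 'Bar' }, { foo: 'Bar' }, { foo: 'Baz' }];
const uniq = [...new Map(arr.map(o => [o.foo, o])).values()];
console.log(uniq.length); // 2
```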

@highmaintenance 2017-11-14 10:37:29

LOL Set is not an operator. It's an Object

@Thymine 2017-11-28 23:45:29

Both of these options have issues on IE10 even with a polyfill.io layer being present

@Matthew Herbst 2018-01-10 18:56:58

On Chrome 63, spread seems to outperform `Array.from` by about 35%. The difference on Firefox 57 is about 13%, and it's only about 10% difference on Safari 11.0.2. jsperf.com/set-conversion-using-array-from-vs-spread/1

@fiddur 2018-09-17 09:09:00

I really like the simplicity of spread version, but in my chrome at least, the filter/indexOf is still a lot faster. jsperf.com/get-a-unique-array2

@Alex K 2018-09-19 09:32:15

the speed of `[...new Set(items)]` is amazing compared to those indexOf solutions

@fiddur 2018-11-01 08:23:55

@1valdis: Hmm, seems most links I had to jsperf are broken now. I hope they fix that on their end...

@Omer Shukar 2019-01-13 18:55:18

New test case: jsperf.com/array-filter-unique-vs-new-set/1. Seems like the trophy goes to `new Set`.

@Khateeb321 2019-01-16 14:50:35

Ahh! That's hot! That's hot!

@Andrei 2017-10-27 10:10:30

This is an ES6 function which removes duplicates from an array of objects, filtering by the specified object property

``````function dedupe(arr = [], fnCheck = _ => _) {
  const set = new Set();
  let len = arr.length;

  for (let i = 0; i < len; i++) {
    const primitive = fnCheck(arr[i]);
    if (set.has(primitive)) {
      // duplicate, cut it
      arr.splice(i, 1);
      i--;
      len--;
    } else {
      // first occurrence, remember its key
      set.add(primitive);
    }
  }

  return arr;
}

const test = [
  { video: { slug: "a" } },
  { video: { slug: "a" } },
  { video: { slug: "b" } },
  { video: { slug: "c" } },
  { video: { slug: "c" } }
];
console.log(dedupe(test, x => x.video.slug));

// [{video:{slug: "a"}}, {video:{slug: "b"}}, {video:{slug: "c"}}]
``````

@Ali 2017-09-03 09:18:54

Making an array of unique arrays, using field[2] as an Id:

``````[ [ '497', 'Q0', 'WTX091-B06-138', '0', '1.000000000', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B09-92', '1', '0.866899288', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B09-92', '2', '0.846036819', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B09-57', '3', '0.835025326', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B43-79', '4', '0.765068215', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B43-56', '5', '0.764211464', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B44-448', '6', '0.761701704', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B44-12', '7', '0.761701704', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B49-128', '8', '0.747434800', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B18-17', '9', '0.746724770', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B19-374', '10', '0.733379549', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B19-344', '11', '0.731421782', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B09-92', '12', '0.726450470', 'GROUP001' ],
[ '497', 'Q0', 'WTX091-B19-174', '13', '0.712757036', 'GROUP001' ] ]
.filter((val1, idx1, arr) => !!~val1.indexOf(val1[2]) &&
!(arr.filter((val2, idx2) => !!~val2.indexOf(val1[2]) &&
idx2 < idx1).length));
``````

@Max Makhrov 2017-03-27 12:24:52

I split all the answers into 4 possible solutions:

1. Use new ES6 feature: `[...new Set( [1, 1, 2] )];`
2. Use object `{ }` to prevent duplicates
3. Use helper array `[ ]`
4. Use `filter + indexOf`

Here's sample codes found in answers:

Use new ES6 feature: `[...new Set( [1, 1, 2] )];`

``````function uniqueArray0(array) {
  var result = Array.from(new Set(array));
  return result;
}
``````

Use object `{ }` to prevent duplicates

``````function uniqueArray1(ar) {
  var j = {};

  ar.forEach(function(v) {
    j[v + '::' + typeof v] = v;
  });

  return Object.keys(j).map(function(v) {
    return j[v];
  });
}
``````

Use helper array `[ ]`

``````function uniqueArray2(arr) {
  var a = [];
  for (var i = 0, l = arr.length; i < l; i++)
    if (a.indexOf(arr[i]) === -1 && arr[i] !== '')
      a.push(arr[i]);
  return a;
}
``````

Use `filter + indexOf`

``````function uniqueArray3(a) {
  function onlyUnique(value, index, self) {
    return self.indexOf(value) === index;
  }

  // usage
  var unique = a.filter(onlyUnique); // returns ['a', 1, 2, '1']

  return unique;
}
``````

And I wondered which one is faster. I made a sample Google Sheet to test the functions. Note: ECMAScript 6 is not available in Google Sheets, so I can't test it.

Here's the result of the tests (the original benchmark screenshot is not reproduced here):

I expected to see that the code using an object `{ }` would win, because it uses a hash. So I'm glad the tests showed the best results for this algorithm in Chrome and IE. Thanks to @rab for the code.

@xcatliu 2017-06-30 09:21:52

In `uniqueArray2`, what is `&& arr[i] !== ''` for?

@liberborn 2017-09-14 13:18:31

The "filter + indexOf" option is extremely slow on arrays over 100,000 items. I had to use the "object map" approach; however, it breaks the original ordering.

@Timothy Zorn 2018-08-20 11:17:37

The ES6 version is the fastest: jsperf.com/zorn-unique-array/1

@kornfridge 2012-07-11 16:25:53

You can also use underscore.js.

``console.log(_.uniq([1, 2, 1, 3, 1, 4]));``
``<script src="http://underscorejs.org/underscore-min.js"></script>``

which will return:

``````[1, 2, 3, 4]
``````

@Jacob Dalton 2016-04-26 20:06:11

Please do this, folks. Don't jack something onto the Array prototype. Please.

@superluminary 2016-06-23 15:37:01

@JacobDalton - This isn't extending the Array prototype. It's namespaced in the _ object.

@Jacob Dalton 2016-06-24 00:40:39

@superluminary I know that's why I said please do this. The accepted solution suggests modifying the Array prototype. DON'T do that.

@Camilo Martin 2016-07-24 08:53:18

@JacobDalton There's no problem with modifying prototypes in your own code. If any code uses `for (var i in array)`, that's what you should be throwing away. That said, libraries probably shouldn't mess with prototypes, because they might be used in legacy/broken environments where a drunk intern at 4AM wrote code that might possibly break when used alongside clean JS that modifies the array prototype.

@Jacob Dalton 2016-07-25 18:03:18

"There's no problem with modifying prototypes in your own code." I would only agree with that if the objects you're modifying are wholly your own. Don't modify the prototype of primitive objects in your own code. Don't touch Array.prototype. If you use a library, you'll get a namespace that you can track down, and you are guaranteed not to cause side effects for the code used by all your included libraries.

@K48 2018-07-06 07:02:44

@JacobDalton Please don't do this. There's no need to add an extra library just for a small job that can be done with `array = [...new Set(array)]`

@cocco 2014-08-01 14:49:02

PERFORMANCE ONLY! This code is probably 10x faster than all the code here. It works on all browsers and also has the lowest memory impact... and more.

If you don't need to reuse the old array (btw, do the necessary other operations before you convert it to unique), this is probably the fastest way to do this, and it's also very short.

``````var array=[1,2,3,4,5,6,7,8,9,0,1,2,1];
``````

then you can try this

``````var array = [1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 1];

function toUnique(a, b, c) { // array, placeholder, placeholder
  b = a.length;
  while (c = --b)
    while (c--) a[b] !== a[c] || a.splice(c, 1);
  return a; // not needed ;)
}
console.log(toUnique(array));
//[3, 4, 5, 6, 7, 8, 9, 0, 2, 1]``````

http://www.shamasis.net/2009/09/fast-algorithm-to-find-unique-items-in-javascript-array/

I don't like the for loop; it has too many parameters. I like the while-- loop. while is the fastest loop in all browsers except the one we all like so much... Chrome.

Anyway, I wrote the first function that uses while. And yep, it's a little faster than the function found in the article, but not enough: `unique2()`.

Next step: use modern JS. I replaced the other for loop with JS 1.7's `Object.keys`... a little faster and shorter (in Chrome 2x faster) ;). Not enough! `unique3()`.

At this point I was thinking about what I really need in MY unique function. I don't need the old array; I want a fast function. So I used 2 while loops + splice: `unique4()`.

Needless to say, I was impressed.

chrome: the usual 150,000 operations per second jumped to 1,800,000 operations per second.

ie: 80,000 op/s vs 3,500,000 op/s

ios: 18,000 op/s vs 170,000 op/s

safari: 80,000 op/s vs 6,000,000 op/s

Proof: http://jsperf.com/wgu. Or better, use console.time... microtime... whatever.

`unique5()` is just to show you what happens if you want to keep the old array.

Don't use `Array.prototype` if you don't know what you're doing. I just did a lot of copy and paste. Use `Object.defineProperty(Array.prototype, ..., {writable: false, enumerable: false})` if you want to create a native prototype. Example: https://stackoverflow.com/a/20463021/2450730

NOTE: your old array is destroyed / becomes the unique one after this operation.

Some are using `indexOf`... don't... http://jsperf.com/dgfgghfghfghghgfhgfhfghfhgfh

For empty arrays:

``````!array.length||toUnique(array);
``````

seems good to me

@xShirase 2014-08-13 00:06:43

tested on node.js, with a 100k array of Urls (strings). The result was 2x slower than underscore.js _.uniq... although a separate jsperf agrees with you (jsperf.com/uniq-performance/5), I'm disappointed :(

@cocco 2014-08-13 11:30:51

Theoretically my toUnique should be faster, especially on big arrays, as I don't use indexOf and the array gets smaller every iteration because I remove the duplicates. jsperf.com/dgfgghfghfghghgfhgfhfghfhgfh tests only the indexOf; Chrome, though, also likes indexOf & lastIndexOf. In any case, looping through objects with `in` could not be faster than while or for... I hope everyone knows that. Anyway, thanks for the tests; I see they removed my test case from jsperf... BTW, you should round/ceil/floor the numbers in your test, else you don't have many duplicates. filter, map or other new stuff-

@cocco 2014-08-13 11:35:03

like forEach are as slow as using for-in... or even slower. The slow part in most functions here is keeping the original array. I tested that here: jsperf.com/fastclone

@cocco 2014-08-13 11:41:21

I tested that here: jsperf.com/fastclone. All browsers except Chrome prefer Object.create(), which is new... and Chrome prefers my while loop. Bah... strange.

@cocco 2014-08-13 15:54:33

You are not testing correctly in jsperf... in your example you define the function every time, but the underscore.js functions are already defined; this penalizes my function. Also test 3 & 4. Another thing I should say: if you use mixed variables (strings & numbers) you should replace a[b]!==a[c] with a[b]!=a[c].

@Iain Ballard 2014-10-06 14:17:22

Note this hangs on empty arrays!

@cocco 2014-10-06 16:17:07

!array.length||toUnique(array);

Definitely hangs (on empty arrays), you should add a check there.

@Victor Ivens 2017-02-03 19:44:03

I'm not sure if I did the jsPerf correctly, but it seems that the reduce alternative (which, for me, is easier to understand) is some 94% faster than your solution. jsperf.com/reduce-for-distinct Edit: Yours is slower on Chrome, the same on Edge and faster on Firefox.

Like Victor, I tested this and answers here. This is the best code for FF, but Rafael's getUnique() (the accepted answer) is only slightly slower on FF and is second-fastest on Chrome as well, making it a decent compromise. getUnique() is faster than this answer's toUnique() in Chrome (by 1.6x) as well as FF (1.3x). Joeytje50's Array.unique() is the fastest on Chrome (2.5x faster than this code).

Ah, it turns out the above test is too small to be worthwhile. The revised test shows Rafael's getUnique() as the winner. I created another revision to test this code and found it to be intolerably slow (Firefox offered to kill it several times), at 0.04 Ops/sec (FF) and 0.01 Ops/sec (Chrome). No other solution was anywhere near this slow.

@ankhzet 2017-05-31 12:08:04

Usable only with small arrays / a low duplicates percentage. Each time a non-unique object is found, the engine is forced to shift all of the remaining elements to the left by: 1) iterating over them, 2) reading them, 3) writing them to a new location. And then it also creates a new array with that one spliced element and returns it as the result... The algorithm quickly degrades with bigger arrays / more frequent duplicates. For arrays of size 1000 -> 10000 -> 50000 and ~40% duplicates, the average time taken is like 2 -> 775 -> 19113 ms (size changes as 1 -> x10 -> x5, time as 1 -> x10 -> x25) in Chrome.

@liberborn 2017-09-14 13:22:17

Thanks. Nice approach. But what about correct items order?

@Pedro L. 2016-03-14 22:01:17

Simplest solution:

``````var arr = [1, 3, 4, 1, 2, 1, 3, 3, 4, 1];
console.log([...new Set(arr)]);``````

Or:

``````var arr = [1, 3, 4, 1, 2, 1, 3, 3, 4, 1];
console.log(Array.from(new Set(arr)));``````

Note that this is not supported on IE10 and below, and that IE11+ and Safari offer limited support (not even sure that the above example works): developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/…

@pgee70 2016-11-10 22:10:02

I know this has been answered to death already... but... no one has mentioned the JavaScript implementation of LINQ. Then the `.distinct()` method can be used, and it makes the code super easy to read.

``````var Linq = require('linq-es2015');

var distinctValues = Linq.asEnumerable(testValues)
  .Select(x => x) // identity projection
  .distinct()
  .toArray();
``````

``````var testValues = [1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 1];

var distinctValues = Enumerable.asEnumerable(testValues)
.distinct()
.toArray();

console.log(distinctValues);``````
``<script src="https://npmcdn.com/linq-es5/dist/linq.js"></script>``

@fubar 2017-02-23 22:07:30

When you say unique values, to me, that means values that appear once and only once in the data set.

The following filters the values array, checking that the first and last index of a given value are equal. If the index is equal, it means the value must appear only once.

``````var values = [1, 2, 3, 4, 5, 2, 4, 6, 2, 1, 5];

var unique = values.filter(function(value) {
  return values.indexOf(value) === values.lastIndexOf(value);
});

console.log(unique); // [3, 6]``````

Based on feedback that I misunderstood the question, here's an alternative approach that will return unique, non-duplicate values from the values array.

``````var values = [1, 2, 3, 4, 5, 2, 4, 6, 2, 1, 5];
var unique = values.reduce(function(unique, value) {
  return unique.indexOf(value) === -1 ? unique.concat([value]) : unique;
}, []);

console.log(unique); // [1, 2, 3, 4, 5, 6]``````

@fubar 2017-02-24 01:21:04

Downvoted without any feedback?

@Thomas Leduc 2017-02-27 09:12:50

I did not downvote you, but I understand why a guy did. You don't return a duplicated value even once. The question is not very clear, but Mottie wants an array with unique values in it, not to remove all the non-unique values. Output should be [1, 2, 3, 4, 5, 6], not [3, 6].

@fubar 2017-02-27 09:20:06

@ThomasLeduc - thanks for the feedback. I've revised my answer.

@Yi Feng Xie 2017-01-15 06:19:11

I used Array#reduce as a way to create Array#unique

``````Array.prototype.unique = function() {
  var object = this.reduce(function(h, v) {
    h[v] = true;
    return h;
  }, {});
  return Object.keys(object);
};

console.log(["a", "b", "c", "b", "c", "a", "b"].unique()); // => ["a", "b", "c"]``````

@BrassApparatus 2017-06-21 00:23:33

I think most would consider it poor form to modify Array.prototype.

@Yi Feng Xie 2017-06-21 04:08:13

If it's a good way, why not? And I didn't override any method.

@BrassApparatus 2017-06-21 22:21:10

Here are a few thoughts on why it could be negative stackoverflow.com/questions/948358/… Another reason is that users have a very concrete expectation of what the core js classes will act like. If unexpected behavior is added it could be extremely hard to debug. I'm not saying it's never something to consider but as a general rule I won't do it.

@bvmCoder 2017-01-08 17:33:19

``````(function() {
  "use strict";

  Array.prototype.unique = function unique() {
    var self = this;
    return self.filter(function(a) {
      var that = this; // the {} passed as filter's thisArg, used as a seen-map
      return !that[a] ? (that[a] = true) : false;
    }, {});
  };

  var sampleArray = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9];
  var distinctArray = sampleArray.unique();
  console.log(distinctArray);
})();``````

Here is a simple way to solve this problem.

@bvmCoder 2017-11-20 00:08:15

Can I know what is wrong here?

@Ben 2017-02-23 22:16:15

If you have an array of objects, and you want a `uniqueBy` function, say, by an id field:

``````function uniqueBy(field, arr) {
  return arr.reduce((acc, curr) => {
    const exists = acc.find(v => v[field] === curr[field]);
    return exists ? acc : acc.concat(curr);
  }, []);
}
``````

@p1100i 2016-11-15 11:04:47

If you want to change it in place (not creating a new array) you can just:

``````function uniq(array) {
  var len = array.length;

  while (len--) {
    if (array.indexOf(array[len]) !== len) {
      array.splice(len, 1);
    }
  }

  return array;
}

var myArray = [1, 2, 2, 4, 2];

console.log(uniq(myArray));
// [1, 2, 4]
``````

@Leonardo 2017-01-05 22:30:04

For an array of strings:

``````function removeDuplicatesFromArray(arr) {
  const unique = {};
  arr.forEach((word) => {
    unique[word] = 1; // it doesn't really matter what goes here
  });
  return Object.keys(unique);
}
``````

Many of the answers here may not be useful to beginners. If de-duping an array is difficult, will they really know about the prototype chain, or even jQuery?

In modern browsers, a clean and simple solution is to store data in a Set, which is designed to be a list of unique values.

``````const cars = ['Volvo', 'Jeep', 'Volvo', 'Lincoln', 'Lincoln', 'Ford'];
const uniqueCars = Array.from(new Set(cars));
``````

The `Array.from` is useful to convert the Set back to an Array so that you have easy access to all of the awesome methods (features) that arrays have. There are also other ways of doing the same thing. But you may not need `Array.from` at all, as Sets have plenty of useful features like forEach.
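For example (a small sketch): the `Set` can be iterated directly, no conversion back to an array needed.

```javascript
// Iterating the Set directly, without converting back to an array:
const carSet = new Set(['Volvo', 'Jeep', 'Volvo', 'Lincoln', 'Lincoln', 'Ford']);
carSet.forEach(function (car) {
  console.log(car); // Volvo, Jeep, Lincoln, Ford (each exactly once)
});
console.log(carSet.size); // 4
```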

If you need to support old Internet Explorer, and thus cannot use Set, then a simple technique is to copy items over to a new array while checking beforehand if they are already in the new array.

``````// Create a list of cars, with duplicates.
var cars = ['Volvo', 'Jeep', 'Volvo', 'Lincoln', 'Lincoln', 'Ford'];
// Create a list of unique cars, to put a car in if we haven't already.
var uniqueCars = [];

// Go through each car, one at a time.
cars.forEach(function (car) {
  // The code within the following block runs only if the
  // current car does NOT exist in the uniqueCars list
  // - a.k.a. prevent duplicates
  if (uniqueCars.indexOf(car) === -1) {
    // Since we now know we haven't seen this car before,
    // copy it to the end of the uniqueCars list.
    uniqueCars.push(car);
  }
});
``````

To make this instantly reusable, let's put it in a function.

``````function deduplicate(data) {
  if (data.length > 0) {
    var result = [];

    data.forEach(function (elem) {
      if (result.indexOf(elem) === -1) {
        result.push(elem);
      }
    });

    return result;
  }
}
``````

So to get rid of the duplicates, we would now do this.

``````var uniqueCars = deduplicate(cars);
``````

The `deduplicate(cars)` call becomes the array we named `result` when the function completes.

Just pass it the name of any array you like.

By the way, I used an array full of strings to show that my technique is flexible. It will work properly for numbers too.
