#### [SOLVED] How to make a flat list out of list of lists?

By Emma

I wonder whether there is a shortcut for making a simple list out of a list of lists in Python.

I can do it in a `for` loop, but maybe there is some cool "one-liner"? I tried it with `reduce()`, but I got an error.

Code

``````
l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
reduce(lambda x, y: x.extend(y), l)
``````

Error message

``````
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 1, in <lambda>
AttributeError: 'NoneType' object has no attribute 'extend'
``````

#### @kederrac 2020-01-25 21:08:38

you can use the `list.extend` method; it turns out to be the fastest:

``````
flat_list = []
for sublist in l:
    flat_list.extend(sublist)
``````

performance:

``````
import functools
import itertools
import numpy
import operator
import perfplot

def functools_reduce_iconcat(a):
    return functools.reduce(operator.iconcat, a, [])

def itertools_chain(a):
    return list(itertools.chain.from_iterable(a))

def numpy_flat(a):
    return list(numpy.array(a).flat)

def extend(a):
    n = []
    list(map(n.extend, a))
    return n

perfplot.show(
    setup=lambda n: [list(range(10))] * n,
    kernels=[
        functools_reduce_iconcat, extend, itertools_chain, numpy_flat
    ],
    n_range=[2**k for k in range(16)],
    xlabel='num lists',
)
``````

output: (perfplot chart not reproduced here)

#### @Nico Schlömer 2017-07-26 09:38:16

I tested most suggested solutions with perfplot (a pet project of mine, essentially a wrapper around `timeit`), and found

``````
functools.reduce(operator.iconcat, a, [])
``````

to be the fastest solution. (`operator.iadd` is equally fast.)

Code to reproduce the plot:

``````
import functools
import itertools
import numpy
import operator
import perfplot

def forfor(a):
    return [item for sublist in a for item in sublist]

def sum_brackets(a):
    return sum(a, [])

def functools_reduce(a):
    return functools.reduce(operator.concat, a)

def functools_reduce_iconcat(a):
    return functools.reduce(operator.iconcat, a, [])

def itertools_chain(a):
    return list(itertools.chain.from_iterable(a))

def numpy_flat(a):
    return list(numpy.array(a).flat)

def numpy_concatenate(a):
    return list(numpy.concatenate(a))

perfplot.show(
    setup=lambda n: [list(range(10))] * n,
    kernels=[
        forfor, sum_brackets, functools_reduce, functools_reduce_iconcat,
        itertools_chain, numpy_flat, numpy_concatenate
    ],
    n_range=[2**k for k in range(16)],
    xlabel='num lists'
)
``````

#### @Sara 2019-01-20 13:57:20

For huge nested lists, `list(numpy.array(a).flat)` is the fastest of all the functions above.

#### @Justas 2019-10-14 08:05:22

Tried using a regex: `list(map(int, re.findall(r"[\w]+", str(a))))`. Speed is a bit slower than `numpy_concatenate`.

#### @Shawn Chin 2009-06-04 21:06:17

You can use `itertools.chain()`:

``````
>>> import itertools
>>> list2d = [[1,2,3], [4,5,6], [7], [8,9]]
>>> merged = list(itertools.chain(*list2d))
``````

Or you can use `itertools.chain.from_iterable()` which doesn't require unpacking the list with the `*` operator:

``````
>>> import itertools
>>> list2d = [[1,2,3], [4,5,6], [7], [8,9]]
>>> merged = list(itertools.chain.from_iterable(list2d))
``````

This approach is arguably more readable than `[item for sublist in l for item in sublist]` and appears to be faster too:

``````
$ python3 -mtimeit -s'l=[[1,2,3],[4,5,6], [7], [8,9]]*99;import itertools' 'list(itertools.chain.from_iterable(l))'
20000 loops, best of 5: 10.8 usec per loop
$ python3 -mtimeit -s'l=[[1,2,3],[4,5,6], [7], [8,9]]*99' '[item for sublist in l for item in sublist]'
10000 loops, best of 5: 21.7 usec per loop
$ python3 -mtimeit -s'l=[[1,2,3],[4,5,6], [7], [8,9]]*99' 'sum(l, [])'
1000 loops, best of 5: 258 usec per loop
$ python3 -mtimeit -s'l=[[1,2,3],[4,5,6], [7], [8,9]]*99;from functools import reduce' 'reduce(lambda x,y: x+y,l)'
1000 loops, best of 5: 292 usec per loop
$ python3 --version
Python 3.7.5rc1
``````

#### @Tim Dierks 2014-09-03 14:13:45

The `*` is the tricky thing that makes `chain` less straightforward than the list comprehension. You have to know that chain only joins together the iterables passed as parameters, and the * causes the top-level list to be expanded into parameters, so `chain` joins together all those iterables, but doesn't descend further. I think this makes the comprehension more readable than the use of chain in this case.
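A tiny sketch of that one-level behavior (illustrative input, not from the thread):

```python
from itertools import chain

nested = [[1, 2], [3, [4, 5]]]
# chain(*nested) is chain([1, 2], [3, [4, 5]]): the splat unpacks only
# the top level, so the inner [4, 5] survives un-flattened.
flat = list(chain(*nested))
print(flat)  # [1, 2, 3, [4, 5]]
```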

@TimDierks: I'm not sure "this requires you to understand Python syntax" is an argument against using a given technique in Python. Sure, complex usage could confuse, but the "splat" operator is generally useful in many circumstances, and this isn't using it in a particularly obscure way; rejecting all language features that aren't necessarily obvious to beginning users means you're tying one hand behind your back. May as well throw out list comprehensions too while you're at it; users from other backgrounds would find a `for` loop that repeatedly `append`s more obvious.

#### @Cristian Ciupitu 2018-04-13 18:42:31

`reduce(lambda x,y: x+y,l)` could be replaced with `reduce(operator.add, l)`

@PM2Ring: Not sure why you introduced `product` here... The equivalent code example to the answer would just be `[*chain.from_iterable(list2d)]`

#### @Mitch McMabers 2019-11-06 14:32:25

Warning: `chain.from_iterable` is not recursive and does not handle deeply nested containers! Check my answer for a better solution.

#### @gouravkr 2020-01-01 10:39:50

This answer, and other answers here, give an incorrect result if the top level also contains a bare value. For instance, `l = [["abc","bcd"],["cde","def"],"efg"]` will result in an output of `["abc", "bcd", "cde", "def", "e", "f", "g"]`.

#### @Igor Krivokon 2009-06-04 20:47:13

The reason your function didn't work is that `extend` extends a list in place and returns `None`. You can still return `x` from the lambda, using something like this:

``````
reduce(lambda x,y: x.extend(y) or x, l)
``````

Note: extend is more efficient than + on lists.
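A runnable variant of the same trick; the third argument (my addition, not part of the original answer) gives `reduce` a fresh accumulator so the first sublist is not mutated:

```python
from functools import reduce

l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
# x.extend(y) returns None, so "or x" evaluates to the mutated accumulator
flat = reduce(lambda x, y: x.extend(y) or x, l, [])
print(flat)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(l[0])  # [1, 2, 3] -- untouched, thanks to the [] initializer
```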

#### @agf 2011-09-24 10:12:35

`extend` is better used as `newlist = []`, `extend = newlist.extend`, `for sublist in l: extend(sublist)`, as it avoids the (rather large) overhead of the `lambda`, the attribute lookup on `x`, and the `or`.
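Spelled out as a runnable snippet:

```python
l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
newlist = []
extend = newlist.extend  # bind the method once, skipping repeated attribute lookups
for sublist in l:
    extend(sublist)
print(newlist)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```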

#### @Markus Dutschke 2019-07-02 12:24:14

for Python 3, add `from functools import reduce`

One possibility is to treat the list as a string:

``````
elements = [[180.0, 1, 2, 3], [173.8], [164.2], [156.5], [147.2], [138.2]]
list(map(float, str(elements).replace("[", "").replace("]", "").split(",")))
``````

#### @Georgy 2019-11-15 11:56:01

The result will be a list of strings, though. Also, this code may produce a wrong result if the inner elements are strings containing commas or square brackets.

This method is only valid for lists of integers or floats.

#### @Xero Smith 2018-11-28 01:00:34

This works with arbitrarily nested lists. It can easily be extended to work with other kinds of iterables.

``````
def flatten(seq):
    """list -> list

    Return a flattened list from an arbitrarily nested list.
    """
    if not seq:
        return []
    if not isinstance(seq[0], list):
        return [seq[0]] + flatten(seq[1:])
    return flatten(seq[0]) + flatten(seq[1:])
``````

Sample run

``````
>>> flatten([1, [2, 3], [[[4, 5, 6], 7], [[8]]], 9])
[1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

#### @Aran-Fey 2019-10-23 05:42:41

`return seq` <- No no no no no. Never write a function that sometimes returns a new object and sometimes returns the input object. Always return a new list.

#### @Xero Smith 2019-10-23 06:50:16

@Aran-Fey I see how that may be problematic with functions such as `def identity(x): return x`, which cause trouble when applied to mutable objects passed by reference (such as lists), because what is returned is essentially an alias to the original. However, in this case, provided the original list passed in is not the empty list, there should be no such problem. I will update it accordingly to address that case.
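A minimal demonstration of the aliasing hazard under discussion, using the `identity` example from the comment:

```python
def identity(x):
    return x  # returns the caller's object, not a copy

a = [1, 2]
b = identity(a)
b.append(3)
print(a)  # [1, 2, 3] -- mutating b mutates a through the alias
```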

#### @Alijy 2019-09-30 10:49:25

``````
from nltk import flatten

l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
flatten(l)
``````

The advantage of this solution over most others here is that with a list like:

``````
l = [1, [2, 3], [4, 5, 6], [7], [8, 9]]
``````

most other solutions throw an error, while this one handles it.

#### @sudeepgupta90 2019-08-28 08:09:59

another fun way to do this:

``````
from functools import reduce

li = [[1, 2], [3, 4]]
reduce(lambda x, y: x + y, li)  # [1, 2, 3, 4]
``````

#### @Mitch McMabers 2019-11-07 06:54:42

If by "fun" you mean slow and not properly handling input like `li=[ [1,2,3], 4, [5, 6], "foo" ]`. ;-)

#### @sudeepgupta90 2019-11-07 09:39:34

@MitchMcMabers by "fun" as in another snippet of code which runs on the sample input as shared by the OP. The nerd me never claimed speed as a value add. Happy to receive your "down vote" :)

#### @Abhishek Bhatia 2019-08-11 11:00:13

Here is a function using recursion which will work on any arbitrary nested list.

``````
def flatten(nested_lst):
    """Return a list after transforming the inner lists
    so that it's a 1-D list.

    >>> flatten([[[],["a"],"a"],[["ab"],[],"abc"]])
    ['a', 'a', 'ab', 'abc']
    """
    if not isinstance(nested_lst, list):
        return nested_lst

    res = []
    for l in nested_lst:
        if not isinstance(l, list):
            res += [l]
        else:
            res += flatten(l)

    return res
``````

``````
>>> flatten([[[],["a"],"a"],[["ab"],[],"abc"]])
['a', 'a', 'ab', 'abc']
``````

**Don't reinvent the wheel.** If you are using Django:

``````
>>> from django.contrib.admin.utils import flatten
>>> l = [[1,2,3], [4,5], [6]]
>>> flatten(l)
[1, 2, 3, 4, 5, 6]
``````

...Pandas:

``````
>>> from pandas.core.common import flatten
>>> list(flatten(l))
``````

...Itertools:

``````
>>> import itertools
>>> flatten = itertools.chain.from_iterable
>>> list(flatten(l))
``````

...Matplotlib

``````
>>> from matplotlib.cbook import flatten
>>> list(flatten(l))
``````

...Unipath:

``````
>>> from unipath.path import flatten
>>> list(flatten(l))
``````

...Setuptools:

``````
>>> from setuptools.namespaces import flatten
>>> list(flatten(l))
``````

#### @geckos 2019-08-26 00:57:00

`flatten = itertools.chain.from_iterable` should be the right answer

#### @Markus Dutschke 2019-09-11 08:28:40

great answer! works also for `l = [[[1, 2, 3], [4, 5]], 5]` in the case of pandas

#### @imjoseangel 2020-02-20 10:14:04

I like the Pandas solution. If you have something like `list_of_menuitems = [1, 2, [3, [4, 5, [6]]]]`, it will result in `[1, 2, 3, 4, 5, 6]`. What I miss is control over the flattening level.

#### @pylang 2016-11-29 04:14:45

Here is a general approach that applies to numbers, strings, nested lists and mixed containers.

Code

``````
from collections.abc import Iterable   # `from collections import Iterable` was removed in Python 3.10

def flatten(items):
    """Yield items from any nested iterable; see Reference."""
    for x in items:
        if isinstance(x, Iterable) and not isinstance(x, (str, bytes)):
            for sub_x in flatten(x):
                yield sub_x
        else:
            yield x
``````

Notes:

• In Python 3, `yield from flatten(x)` can replace `for sub_x in flatten(x): yield sub_x`
• In Python 3.3, the abstract base classes were moved from `collections` to `collections.abc`; the deprecated aliases in `collections` were removed in Python 3.10.
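Putting both notes together, a sketch of the Python 3 form (importing from `collections.abc`, which is required as of Python 3.10):

```python
from collections.abc import Iterable

def flatten(items):
    """Yield items from any nested iterable; strings/bytes stay whole."""
    for x in items:
        if isinstance(x, Iterable) and not isinstance(x, (str, bytes)):
            yield from flatten(x)
        else:
            yield x

print(list(flatten([[1, [2]], (3, 4), "five"])))  # [1, 2, 3, 4, 'five']
```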

Demo

``````
lst = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
list(flatten(lst))                                         # nested lists
# [1, 2, 3, 4, 5, 6, 7, 8, 9]

mixed = [[1, [2]], (3, 4, {5, 6}, 7), 8, "9"]              # numbers, strs, nested & mixed
list(flatten(mixed))
# [1, 2, 3, 4, 5, 6, 7, 8, '9']
``````

Reference

• This solution is modified from a recipe in Beazley, D. and B. Jones. Recipe 4.14, Python Cookbook 3rd Ed., O'Reilly Media Inc. Sebastopol, CA: 2013.
• Found an earlier SO post, possibly the original demonstration.

#### @Martin Thoma 2017-03-25 15:32:05

I just wrote pretty much the same, because I didn't see your solution ... here is what I looked for "recursively flatten complete multiple lists" ... (+1)

#### @pylang 2017-03-25 17:51:51

@MartinThoma Much appreciated. FYI, if flattening nested iterables is a common practice for you, there are some third-party packages that handle this well. This may save from reinventing the wheel. I've mentioned `more_itertools` among others discussed in this post. Cheers.

#### @Wolf 2017-06-15 10:22:27

Maybe `traverse` could also be a good name for this way of walking a tree, whereas I'd keep it less universal for this answer by sticking to nested lists.

#### @Ryan Allen 2018-04-30 16:46:07

You can check `if hasattr(x, '__iter__')` instead of importing/checking against `Iterable` and that will exclude strings as well.

#### @sunnyX 2019-06-12 21:35:28

the above code doesn't seem to work if one of the nested lists contains a list of strings. Input: `[1, 2, [3, 4], [4], [], 9, 9.5, 'ssssss', ['str', 'sss', 'ss'], [3, 4, 5]]`; output: `[1, 2, 3, 4, 4, 9, 9.5, 'ssssss', 3, 4, 5]`

#### @pylang 2019-06-12 22:52:36

@sunnyX It seems to work when I try your input, even with a deeply nested list of strings, e.g. `list(flatten([["a", "b", ["c", "d", ["e", "f", ["g"]]]]]))` -> `['a', 'b', 'c', 'd', 'e', 'f', 'g']`. What version of Python are you using?

#### @sunnyX 2019-06-13 13:53:52

@pylang Python 3.6.2

#### @Meitham 2016-09-14 15:09:16

There seems to be some confusion with `operator.add`! When you add two lists together, the correct term for that is `concat`, not add. `operator.concat` is what you need to use.

If you're thinking functional, it is as easy as this:

``````
>>> import operator
>>> from functools import reduce
>>> list2d = ((1, 2, 3), (4, 5, 6), (7,), (8, 9))
>>> reduce(operator.concat, list2d)
(1, 2, 3, 4, 5, 6, 7, 8, 9)
``````

You see, reduce respects the sequence type: when you supply a tuple, you get back a tuple. Let's try with a list:

``````
>>> list2d = [[1, 2, 3],[4, 5, 6], [7], [8, 9]]
>>> reduce(operator.concat, list2d)
[1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

Aha, you get back a list.

``````
>>> list2d = [[1, 2, 3],[4, 5, 6], [7], [8, 9]]
>>> %timeit list(itertools.chain.from_iterable(list2d))
1000000 loops, best of 3: 1.36 µs per loop
``````

`from_iterable` is pretty fast! But it's no match for reduce with `concat`.

``````
>>> list2d = ((1, 2, 3),(4, 5, 6), (7,), (8, 9))
>>> %timeit reduce(operator.concat, list2d)
1000000 loops, best of 3: 492 ns per loop
``````

#### @Mr_and_Mrs_D 2017-05-28 13:20:00

Hmm to be fair second example should be list also (or first tuple ?)

#### @kaya3 2019-12-18 20:38:31

Using such small inputs isn't much of a fair comparison. For 1000 sequences of length 1000, I get 0.037 seconds for `list(chain.from_iterable(...))` and 2.5 seconds for `reduce(concat, ...)`. The problem is that `reduce(concat, ...)` has quadratic runtime, whereas `chain` is linear.

#### @donlan 2019-04-15 19:28:46

Throwing my hat in the ring...

``````
B = [ [...], [...], ... ]
A = []
for i in B:
    A.extend(i)
``````

#### @Mitch McMabers 2019-11-07 06:52:13

Constantly calling `A.extend()` to append and rearrange that temporary list is gonna be slow. Also doesn't handle recursion or non-list elements inside the list. Good luck with input: `B = [ [1,2,3], 4, [5, 6], "foo" ]`... Not gonna work.

#### @donlan 2019-11-07 15:29:48

@MitchMcMabers that doesn't make sense. Is Python a programming language or a child's toy? appending to a vector is appending to a vector. Or, at least, it should be. You shouldn't be operating in python lists if performance is your issue.

#### @donlan 2019-11-07 16:01:11

@MitchMcMabers no regular language incurs an `O(n*(l1) + (n-1)*l2 + ... + ln))` performance penalty for an `O(number_of_lists)` operation. Python does in order to foster semantic simplicity. This answer is semantically simple, therefore it is in the spirit of the language. Like I said: if performance is a requirement, you should not be using python to begin with.

#### @Mitch McMabers 2019-11-08 12:16:57

It's possible to write this with a recursive generator which visits each element exactly once and does not generate any temporary lists. I think such a solution is `O(1)`.

#### @donlan 2019-11-08 17:58:42

@MitchMcMabers That is not correct. Even if you merely add the pointers to the end of the root list, the solution is at least O(n), where n is the length of the entire list, unless you have a doubly linked list, in which case it is O(m), where m is the number of lists. it may be "possible" using recursion, but why on earth would you want to write such a complex semantic permutation to do something so incredibly simple? In any compiled language, I can write the above more or less equivalently for zero performance impact. The entire point of python is semantic simplicity.

#### @Alex Martelli 2009-06-04 20:37:01

Given a list of lists `l`,

`flat_list = [item for sublist in l for item in sublist]`

which means:

``````
flat_list = []
for sublist in l:
    for item in sublist:
        flat_list.append(item)
``````

is faster than the shortcuts posted so far. (`l` is the list to flatten.)

Here is the corresponding function:

``````
flatten = lambda l: [item for sublist in l for item in sublist]
``````

As evidence, you can use the `timeit` module in the standard library:

``````
$ python -mtimeit -s'l=[[1,2,3],[4,5,6], [7], [8,9]]*99' '[item for sublist in l for item in sublist]'
10000 loops, best of 3: 143 usec per loop
$ python -mtimeit -s'l=[[1,2,3],[4,5,6], [7], [8,9]]*99' 'sum(l, [])'
1000 loops, best of 3: 969 usec per loop
$ python -mtimeit -s'l=[[1,2,3],[4,5,6], [7], [8,9]]*99' 'reduce(lambda x,y: x+y,l)'
1000 loops, best of 3: 1.1 msec per loop
``````

Explanation: the shortcuts based on `+` (including the implied use in `sum`) are, of necessity, `O(L**2)` when there are L sublists -- as the intermediate result list keeps getting longer, at each step a new intermediate result list object gets allocated, and all the items in the previous intermediate result must be copied over (as well as a few new ones added at the end). So, for simplicity and without actual loss of generality, say you have L sublists of I items each: the first I items are copied back and forth L-1 times, the second I items L-2 times, and so on; total number of copies is I times the sum of x for x from 1 to L excluded, i.e., `I * (L**2)/2`.

The list comprehension just generates one list, once, and copies each item over (from its original place of residence to the result list) also exactly once.
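The copy count in the explanation can be sketched numerically (a hypothetical helper, just to illustrate the asymptotics):

```python
def copies_via_plus(L, I):
    """Element copies made when concatenating L sublists of I items with +."""
    total, size = 0, 0
    for _ in range(L):
        size += I       # the new intermediate list has `size` items...
        total += size   # ...and every one of them is copied
    return total

# Quadratic in L, matching the I * (L**2)/2 estimate above:
print(copies_via_plus(100, 10))   # 50500
print(copies_via_plus(200, 10))   # 201000 -- ~4x the copies for 2x the sublists
```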

#### @intuited 2010-10-15 01:21:33

I tried a test with the same data, using `itertools.chain.from_iterable` : `\$ python -mtimeit -s'from itertools import chain; l=[[1,2,3],[4,5,6], [7], [8,9]]*99' 'list(chain.from_iterable(l))'`. It runs a bit more than twice as fast as the nested list comprehension that's the fastest of the alternatives shown here.

#### @Rob Crowell 2011-07-27 16:43:18

I found the syntax hard to understand until I realized you can think of it exactly like nested for loops. for sublist in l: for item in sublist: yield item

#### @Boris Chervenkov 2012-05-19 21:59:24

@intuited your solution actually returns an iterator, not an actual list - that's why it runs twice as fast. see `type(itertools.chain([ [1,2], [3,4] ]))`. but if @emma needs the nested list just to iterate over it - that's a fine and optimal solution :o)

#### @intuited 2012-05-20 22:56:40

@BorisChervenkov: Notice that I wrapped the call in `list()` to realize the iterator into a list.

#### @Makoto 2012-07-19 08:04:55

`numpy.concatenate` seems a bit faster than any of the methods here, if you are willing to accept an array.

#### @Sven 2013-03-27 14:00:29

Doesn't universally work! With `l = [1, 2, [3, 4]]`, `[item for sublist in l for item in sublist]` raises `TypeError: 'int' object is not iterable`.

#### @Mark E. Haase 2013-05-23 22:29:51

@Noio It makes sense if you re-order it: `[item for item in sublist for sublist in l ]`. Of course, if you re-order it, then it won't make sense to Python, because you're using `sublist` before you defined what it is.

#### @John Mee 2013-08-29 01:38:20

`[leaf for tree in forest for leaf in tree]` might be easier to comprehend and apply.

#### @Air 2013-10-03 17:51:46

@Sven It works for any list of lists; `[1,2,[3,4]]` is not a list of lists. You could hack together a one-line solution for that particular case with something like: `[item for sublist in [sublist if isinstance(sublist, list) else [sublist] for sublist in l] for item in sublist]`; but that's not concise enough to be worth fitting into one line. Alternatively, it's simple enough to write yourself a function for flattening arbitrarily nested sequences; see also stackoverflow.com/a/2158532/2359271

#### @Cruncher 2013-10-08 13:18:31

@wim I really just write the for loop structure out in full if i'm not sure what a list comprehension is doing. Though I will grant that at face value `item for sublist in l for item in sublist` sounds like nonsense.

#### @LBarret 2014-05-14 19:21:21

the reduce example in the main text use the + operator, and therefore create N list. `reduce(lambda x, y : x.extend(y) or x, ll, [])` is much faster than list comprehension in my test. The tricky part is the expression : x.extend(y) return None, so we use or to get the accumulator list extended : a or b return b if bool(a) evaluate to false.

#### @Joel 2015-01-04 12:55:36

@AlexMartelli Can you suggest whether this would still be the best option if all sublists are the same length (in my case length 2)?

#### @Alex Martelli 2015-01-04 15:40:56

@Joel, actually nowadays `list(itertools.chain.from_iterable(l))` is best -- as noticed in other comments and Shawn's answer.

#### @Oren 2015-02-12 11:57:15

My attempt to understand it: read `[item for sublist in l for item in sublist]` as "for sublist in l, then for item in sublist, yield item".

#### @Dan Lenski 2015-07-11 05:19:01

Really nice illustration! The versions based on `sum`/`reduce` fall victim to Schlemiel the Painter's algorithm, an antipattern named as such by Stack Overflow's founder Joel Spolsky :)

#### @JuanXarg 2015-08-26 13:06:14

Sorry to revive this very old thread, but I was curious why this solution is the best. If I'm reading the `timeit` output correctly, each loop of this solution runs roughly 10 times faster, but 10 times more loops are required.

#### @Alex Martelli 2015-08-28 21:50:27

@JuanXarg, nope, you're totally misunderstanding `timeit`'s output: it iterates 10 more times because it can (within the rough constraint of taking about the same amount of elapsed time), not because the iterations are in any way, shape, or form, "required".

Any `sublist` that is not actually a list but a `str` would get split too; any way to dodge that in a pythonic way?

#### @Kenneth Jiang 2016-09-28 18:38:30

I don't care about how fast it runs as long as it's hard to understand!!! Can anyone tell me he/she can remember the syntax 3 months later without referring to this same post again? Can anyone tell me why you need to repeat `item for sublist`? Ugly Ugly Ugly!!

#### @ChaimG 2016-10-30 02:29:56

Why all the upvotes? `reduce(operator.concat, list2d)` is faster AND easier to understand! See this answer.

#### @Johannes Schaub - litb 2017-01-09 18:02:29

I have tried to put parentheses to clarify the associativity of these (nested?) generators.. but I fail so far. None of `list(i for a in ([[1, 2], [3, 4]] for i in a))` and `list((i for a in [[1, 2], [3, 4]]) for i in a)` compile, but give syntax errors. I understand that the latest `a` refers to the first `a` introduced there. So I assumed that the expression must be left-associative. But the latter fails to parse aswell. Any hint to this?

#### @Arco Bast 2017-03-10 14:25:08

@KennethJiang I find it is extremely easy to memorize. Just use: `[x for x in x for x in x]`. It became one of my favorites :D

#### @Eric Duminil 2017-04-15 16:06:12

@JohnMee: `[leaf for leaf in tree for tree in forest]` would have been even better and much easier to read but sadly, Python devs didn't think so.

#### @MichaelChirico 2017-06-30 13:41:50

@intuited evaluating efficiency on a list of 99 items hardly amounts to a true benchmark

#### @Davos 2017-09-12 03:42:25

At first the `[leaf for tree in forest for leaf in tree]` syntax seems confusing and in the wrong order, but it's the same as a nested for loop, except the `yield leaf` is moved to the front so it fits the list comprehension standard, e.g. `for tree in forest: for leaf in tree: yield leaf`. The other suggested syntax order, `[leaf for leaf in tree for tree in forest]`, seems much more like piping or composition rather than nesting, so yeah, from that perspective it's easier to reason about than deep nesting.

#### @Davos 2017-09-12 03:51:50

I wish there were something like R's magrittr package for piping functions together in Python. It's generally easier to read from left to right rather than from inside out, although perhaps that's a very biased perspective from someone who natively writes a left-to-right language like English.

#### @Neal Gokli 2018-08-01 20:26:23

@AlexMartelli You said "nowadays `list(itertools.chain.from_iterable(l))` is best". Can you add that to the top of your answer, with a link to a better answer? Your answer has so many upvotes that it seems authoritative!

#### @Akshay Sehgal 2019-04-07 20:05:08

Just to add, the reason why we write the list comprehension in such a way is because it solves the 'undefined variable" problem. So, when working with list comprehension you can think of writing this as [ item for item in sublist for sublist in list] which might make more sense, but sadly will result in error. Since ''sublist'' hasn't been defined before, u cant really iterate. So you take the same list comprehension, and swap the for loops (bring the second one before the first one) resulting in - [ item for sublist in list for item in sublist ]

#### @Samuel Muldoon 2019-11-06 20:32:41

This solution does not work if you have a list of lists of lists of lists of lists...

#### @Triptych 2009-06-04 20:35:53

Note from the author: This is inefficient. But fun, because monoids are awesome. It's not appropriate for production Python code.

``````
>>> sum(l, [])
[1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

This just sums the elements of the iterable passed as the first argument, treating the second argument as the initial value of the sum (if not given, `0` is used instead, and that case will give you an error).

Because you are summing nested lists, you actually get `[1,3]+[2,4]` as a result of `sum([[1,3],[2,4]],[])`, which is equal to `[1,3,2,4]`.

Note that this only works on lists of lists. For lists of lists of lists, you'll need another solution.
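For example (small literals of my own, to show the one-level limit):

```python
l = [[1, 2], [3, 4]]
print(sum(l, []))  # [1, 2, 3, 4]

deep = [[[1], [2]], [[3]]]
print(sum(deep, []))  # [[1], [2], [3]] -- only one level is removed
```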

#### @Emma 2009-06-04 20:59:36

See Nadia's reply below, too. This appears to be the fastest solution.

#### @andrewrk 2010-06-15 18:55:14

that's pretty neat and clever but I wouldn't use it because it's confusing to read.

#### @Thomas Ahle 2011-05-20 17:31:07

@Nick: It's not immediately clear that you can do this. Even though + is overloaded, the sum function is implemented in C. I believe it has only recently been changed to support non number objects.

#### @Mike Graham 2012-04-25 18:24:57

This is a Shlemiel the painter's algorithm joelonsoftware.com/articles/fog0000000319.html -- unnecessarily inefficient as well as unnecessarily ugly.

#### @Mike Graham 2012-04-25 18:25:34

@ThomasAhle, I think you might be confused.

#### @ulidtko 2014-12-03 10:35:23

The append operation on lists forms a `Monoid`, which is one of the most convenient abstractions for thinking of a `+` operation in a general sense (not limited to numbers only). So this answer deserves a +1 from me for (correct) treatment of lists as a monoid. The performance is concerning though...

#### @jhegedus 2015-10-05 08:51:42

@andrewrk Well, some people think that this is the cleanest way of doing it : youtube.com/watch?v=IOiZatlZtGU the ones who do not get why this is cool just need to wait a few decades until everybody does it this way :) let's use programming languages (and abstractions) that are discovered and not invented, Monoid is discovered.

#### @Jean-François Fabre 2017-07-31 18:04:59

this is a very inefficient way because of the quadratic aspect of the sum.

#### @isaaclw 2018-02-08 20:04:51

This doesn't seem to work in python 2.7, unless I'm doing something wrong. When I'm dealing with strings: `TypeError: can only concatenate list (not "str") to list` When I change it to ints, I get the same error but `s/str/int/`. I was assuming 'sum' is a standard operator... ?

#### @naught101 2018-02-20 01:28:46

So.. this is perfectly fine for short cases, then?

#### @J. Doe 2018-07-04 01:49:15

what is the use of this

@isaaclw then put a `''` instead of a `[]`

#### @Jean-François Fabre 2018-08-25 13:44:03

@ᴡʜᴀᴄᴋᴀᴍᴀᴅᴏᴏᴅʟᴇ3000 no, `sum` is protected against string summing. You have to use `str.join`
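A quick check of that guard (the exact wording of the error message may vary by version):

```python
words = ["a", "b", "c"]
print("".join(words))  # abc

try:
    sum(words, "")
except TypeError:
    print("sum() rejects a str start value")  # use ''.join instead
```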

#### @mortonjt 2018-10-29 22:22:35

Note that this doesn't work in every situation. You may have to do sum(l+[[]]) instead

#### @Saurabh Singh 2018-12-14 10:51:18

Recursive version

``````
x = [1, 2, [3, 4], [5, [6, [7]]], 8, 9, [10]]

def flatten_list(k):
    result = list()
    for i in k:
        if isinstance(i, list):
            # isinstance() checks whether the object (first argument) is an
            # instance or subclass of classinfo (second argument)
            result.extend(flatten_list(i))  # recursive call
        else:
            result.append(i)
    return result

flatten_list(x)
# result = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
``````

#### @Goran B. 2019-06-25 08:46:07

nice, no imports needed and it's clear as to what it's doing ... flattening a list, period :)

#### @Sachin Sharma 2019-08-19 18:25:22

simply brilliant!

#### @user9074332 2018-09-24 02:08:16

The accepted answer did not work for me when dealing with text-based lists of variable lengths. Here is an alternate approach that did work for me.

``````
l = ['aaa', 'bb', 'cccccc', ['xx', 'yyyyyyy']]
``````

### Accepted answer that did not work:

``````
flat_list = [item for sublist in l for item in sublist]
print(flat_list)
['a', 'a', 'a', 'b', 'b', 'c', 'c', 'c', 'c', 'c', 'c', 'xx', 'yyyyyyy']
``````

### New proposed solution that did work for me:

``````
flat_list = []
_ = [flat_list.extend(item) if isinstance(item, list) else flat_list.append(item) for item in l if item]
print(flat_list)
['aaa', 'bb', 'cccccc', 'xx', 'yyyyyyy']
``````

#### @hash_purple 2018-09-20 04:55:19

A simple recursive method using `reduce` from `functools` and the `add` operator on lists:

``````
>>> from functools import reduce
>>> from operator import add
>>> flatten = lambda lst: [lst] if type(lst) is int else reduce(add, [flatten(ele) for ele in lst])
>>> flatten(l)
[1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

The function `flatten` takes `lst` as a parameter. It recurses through all the elements of `lst` until reaching integers (you can also change `int` to `float`, `str`, etc. for other data types), which are added to the return value of the outermost recursion.

Unlike approaches such as `for` loops, recursion gives a general solution that is not limited by the list depth. For example, a list with a depth of 5 can be flattened the same way as `l`:

``````
>>> l2 = [[3, [1, 2], [[[6], 5], 4, 0], 7, [[8]], [9, 10]]]
>>> flatten(l2)
[3, 1, 2, 6, 5, 4, 0, 7, 8, 9, 10]
``````

#### @Mitch McMabers 2019-11-07 07:03:12

Sad to say it (I don't like downvoting answers), but don't use this answer in your projects. It suffers from too deep recursion: `RecursionError: maximum recursion depth exceeded while calling a Python object`, and it's slow when it actually works. Example input that fails: `flatten([ [1,2,3], 4, [5, 6], "foo" ])`. Should use Generators instead.

#### @tharndt 2018-01-09 14:34:15

Another unusual approach that works for hetero- and homogeneous lists of integers:

``````
from typing import List

def flatten(l: list) -> List[int]:
    """Flatten an arbitrarily deep nested list of lists of integers.

    Examples:
        >>> flatten([1, 2, [1, [10]]])
        [1, 2, 1, 10]

    Args:
        l: an int or an arbitrarily nested list of ints

    Returns:
        Flattened list of integers
    """
    return [int(i.strip('[ ]')) for i in str(l).split(',')]
``````

#### @Darkonaut 2018-01-10 22:03:32

That's just a more complicated and a bit slower way of what ᴡʜᴀᴄᴋᴀᴍᴀᴅᴏᴏᴅʟᴇ3000 already posted before. I reinvented his proposal yesterday, so this approach seems quite popular these days ;)

#### @tharndt 2018-01-11 08:17:10

Not quite: `wierd_list = [[1, 2, 3], [4, 5, 6], [7], [8, 9], 10]` >> `nice_list=[1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 0]`

#### @tharndt 2018-01-11 08:32:18

my code as one liner would be : `flat_list = [int(e.replace('[','').replace(']','')) for e in str(deep_list).split(',')]`

#### @Darkonaut 2018-01-11 16:31:09

You are indeed right +1, ᴡʜᴀᴄᴋᴀᴍᴀᴅᴏᴏᴅʟᴇ3000's proposal won't work with multiple digit numbers, I also didn't test this before although it should be obvious. You could simplify your code and write `[int(e.strip('[ ]')) for e in str(deep_list).split(',')]`. But I'd suggest to stick with Deleet's proposal for real use cases. It doesn't contain hacky type transformations, it's faster and more versatile because it naturally also handles lists with mixed types.

#### @tharndt 2018-01-13 08:02:22

Thanks! This was of course supposed to be funny. I've seen Deleet's proposal in a python book before.

#### @Darkonaut 2018-01-13 16:04:15

Can you tell us which book? I contemplated a lot about this because it's so effective and beautiful. Will hit recursion limit inevitably in general but for cases like this with few recursions it seems perfect.

#### @tharndt 2018-01-15 08:18:01

Unfortunately no. But I saw this code recently here: Python Practice Book 6.1.2

#### @A. Attia 2018-07-24 09:11:26

You can use numpy :
`flat_list = list(np.concatenate(list_of_list))`

#### @Nitin 2018-09-19 07:53:11

This works for numerical, strings and mixed lists also

#### @EL_DON 2019-04-22 21:32:02

Fails for unevenly nested data, like `[1, 2, [3], [[4]], [5, [6]]]`

#### @EL_DON 2018-02-01 18:22:49

`matplotlib.cbook.flatten()` will work for nested lists even if they nest more deeply than the example.

``````import matplotlib.cbook

l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
print(list(matplotlib.cbook.flatten(l)))
l2 = [[1, 2, 3], [4, 5, 6], [7], [8, [9, 10, [11, 12, [13]]]]]
print(list(matplotlib.cbook.flatten(l2)))
``````

Result:

``````[1, 2, 3, 4, 5, 6, 7, 8, 9]
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]
``````

This is 18x faster than underscore._.flatten:

``````Average time over 1000 trials of matplotlib.cbook.flatten: 2.55e-05 sec
Average time over 1000 trials of underscore._.flatten: 4.63e-04 sec
(time for underscore._)/(time for matplotlib.cbook) = 18.1233394636
``````

#### @pylang 2016-12-02 18:35:17

Consider installing the `more_itertools` package.

``````> pip install more_itertools
``````

It ships with an implementation for `flatten` (source, from the itertools recipes):

``````import more_itertools

lst = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
list(more_itertools.flatten(lst))
# [1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

As of version 2.4, you can flatten more complicated, nested iterables with `more_itertools.collapse` (source, contributed by abarnet).

``````lst = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
list(more_itertools.collapse(lst))
# [1, 2, 3, 4, 5, 6, 7, 8, 9]

lst = [[1, 2, 3], [[4, 5, 6]], [[[7]]], 8, 9]              # complex nesting
list(more_itertools.collapse(lst))
# [1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

#### @brunetton 2019-12-17 14:30:44

Indeed. This should be the accepted answer

``````flat_list = []
for i in list_of_list:
    flat_list += i
``````

This code also works fine, as it just extends the list all the way. It is very similar, but has only one `for` loop, so it has less complexity than nesting two `for` loops.
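For reference, `+=` on a list is in-place concatenation, so the loop above is equivalent to calling `extend` (a quick sketch; the variable names are illustrative):

```python
list_of_list = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]

flat_a = []
for i in list_of_list:
    flat_a += i            # in-place concatenation (list.__iadd__)

flat_b = []
for i in list_of_list:
    flat_b.extend(i)       # explicit extend, same effect

print(flat_a == flat_b)    # True
```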

Note: Below applies to Python 3.3+ because it uses `yield from`. `six` is also a third-party package, though it is stable. Alternatively, you could check `sys.version` yourself.

In the case of `obj = [[1, 2,], [3, 4], [5, 6]]`, all of the solutions here are good, including list comprehension and `itertools.chain.from_iterable`.

However, consider this slightly more complex case:

``````>>> obj = [[1, 2, 3], [4, 5], 6, 'abc', [7], [8, [9, 10]]]
``````

There are several problems here:

• One element, `6`, is just a scalar; it's not iterable, so the above routes will fail here.
• One element, `'abc'`, is technically iterable (all `str`s are). However, reading between the lines a bit, you don't want to treat it as such--you want to treat it as a single element.
• The final element, `[8, [9, 10]]` is itself a nested iterable. Basic list comprehension and `chain.from_iterable` only extract "1 level down."

You can remedy this as follows:

``````>>> from collections.abc import Iterable  # `from collections import Iterable` is deprecated
>>> from six import string_types

>>> def flatten(obj):
...     for i in obj:
...         if isinstance(i, Iterable) and not isinstance(i, string_types):
...             yield from flatten(i)
...         else:
...             yield i

>>> list(flatten(obj))
[1, 2, 3, 4, 5, 6, 'abc', 7, 8, 9, 10]
``````

Here, you check that the sub-element (1) is iterable with `Iterable`, an ABC from `collections.abc`, but also want to ensure that (2) the element is not "string-like."

#### @pylang 2018-06-19 19:06:43

If you are still interested in Python 2 compatibility, change `yield from` to a `for` loop, e.g. `for x in flatten(i): yield x`

#### @phoxis 2018-05-16 09:41:57

This may not be the most efficient way, but I thought I'd put a one-liner (actually a two-liner) here. Both versions will work on arbitrarily nested lists, and exploit language features (Python 3.5) and recursion.

``````def make_list_flat(l):
    flist = []
    flist.extend([l]) if type(l) is not list else [flist.extend(make_list_flat(e)) for e in l]
    return flist

a = [[1, 2], [[[[3, 4, 5], 6]]], 7, [8, [9, [10, 11], 12, [13, 14, [15, [[16, 17], 18]]]]]]
flist = make_list_flat(a)
print (flist)
``````

The output is

``````[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18]
``````

This works in a depth-first manner. The recursion goes down until it finds a non-list element, extends the local variable `flist`, and then rolls it back to the parent. Whenever `flist` is returned, it is used to extend the parent's `flist` in the list comprehension. Therefore, at the root, a flat list is returned.

The version above creates several local lists and returns them, which are used to extend the parent's list. I think a way around this may be creating a global `flist`, as below.

``````a = [[1, 2], [[[[3, 4, 5], 6]]], 7, [8, [9, [10, 11], 12, [13, 14, [15, [[16, 17], 18]]]]]]
flist = []

def make_list_flat(l):
    flist.extend([l]) if type(l) is not list else [make_list_flat(e) for e in l]

make_list_flat(a)
print (flist)
``````

The output is again

``````[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18]
``````

Although I am not sure at this time about the efficiency.
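One way to settle the efficiency question is a quick `timeit` run over the first version (a sketch; absolute numbers will vary by machine, so no figures are claimed here):

```python
import timeit

def make_list_flat(l):
    # Same logic as the answer's first version, reformatted.
    flist = []
    flist.extend([l]) if type(l) is not list else [flist.extend(make_list_flat(e)) for e in l]
    return flist

a = [[1, 2], [[[[3, 4, 5], 6]]], 7, [8, [9, [10, 11], 12, [13, 14, [15, [[16, 17], 18]]]]]]

# Time 10,000 flattening runs of the sample list.
elapsed = timeit.timeit(lambda: make_list_flat(a), number=10_000)
print(f"{elapsed:.3f} s for 10,000 runs")
```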

#### @MSeifert 2016-11-26 00:20:37

If you want to flatten a data structure where you don't know how deep it's nested, you could use `iteration_utilities.deepflatten`1

``````>>> from iteration_utilities import deepflatten

>>> l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
>>> list(deepflatten(l, depth=1))
[1, 2, 3, 4, 5, 6, 7, 8, 9]

>>> l = [[1, 2, 3], [4, [5, 6]], 7, [8, 9]]
>>> list(deepflatten(l))
[1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

It's a generator so you need to cast the result to a `list` or explicitly iterate over it.

To flatten only one level and if each of the items is itself iterable you can also use `iteration_utilities.flatten` which itself is just a thin wrapper around `itertools.chain.from_iterable`:

``````>>> from iteration_utilities import flatten
>>> l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
>>> list(flatten(l))
[1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

Just to add some timings (based on Nico Schlömer's answer, which didn't include the function presented in this answer):

It's a log-log plot to accommodate the huge range of values spanned. For qualitative reasoning: lower is better.

The results show that if the iterable contains only a few inner iterables then `sum` will be fastest, however for long iterables only the `itertools.chain.from_iterable`, `iteration_utilities.deepflatten` or the nested comprehension have reasonable performance with `itertools.chain.from_iterable` being the fastest (as already noticed by Nico Schlömer).

``````from itertools import chain
from functools import reduce
from collections import Iterable  # or from collections.abc import Iterable
import operator
from iteration_utilities import deepflatten

def nested_list_comprehension(lsts):
    return [item for sublist in lsts for item in sublist]

def itertools_chain_from_iterable(lsts):
    return list(chain.from_iterable(lsts))

def pythons_sum(lsts):
    return sum(lsts, [])

def functools_reduce(lsts):
    return reduce(lambda x, y: x + y, lsts)

def pylangs_flatten(lsts):
    return list(flatten(lsts))

def flatten(items):
    """Yield items from any nested iterable; see REF."""
    for x in items:
        if isinstance(x, Iterable) and not isinstance(x, (str, bytes)):
            yield from flatten(x)
        else:
            yield x

def reduce_concat(lsts):
    return reduce(operator.concat, lsts)

def iteration_utilities_deepflatten(lsts):
    return list(deepflatten(lsts, depth=1))

from simple_benchmark import benchmark

b = benchmark(
    [nested_list_comprehension, itertools_chain_from_iterable, pythons_sum,
     functools_reduce, pylangs_flatten, reduce_concat, iteration_utilities_deepflatten],
    arguments={2**i: [[0]*5]*(2**i) for i in range(1, 13)},
    argument_name='number of inner lists'
)

b.plot()
``````

1 Disclaimer: I'm the author of that library

#### @Yann Vernier 2018-05-14 06:29:00

`sum` no longer works on arbitrary sequences as it starts with `0`, making `functools.reduce(operator.add, sequences)` the replacement (aren't we glad they removed `reduce` from builtins?). When the types are known it might be faster to use `type.__add__`.

#### @MSeifert 2018-05-15 09:24:44

@YannVernier Thanks for the information. I thought I ran these benchmarks on Python 3.6 and it worked with `sum`. Do you happen to know on which Python versions it stopped working?

#### @Yann Vernier 2018-05-15 09:31:10

I was somewhat mistaken. `0` is just the default starting value, so it works if one uses the start argument to start with an empty list... but it still special cases strings and tells me to use join. It's implementing `foldl` instead of `foldl1`. The same issue pops up in 2.7.
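The start-value point can be illustrated directly (a quick sketch):

```python
l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]

# With the default start value of 0, sum() tries 0 + [1, 2, 3] and fails:
# sum(l)  ->  TypeError: unsupported operand type(s) for +: 'int' and 'list'

# Passing an empty list as the explicit start value makes it work,
# although repeated concatenation is still quadratic in total size:
flat = sum(l, [])
print(flat)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```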

#### @bli 2018-01-31 09:52:50

This can be done using `toolz.concat` or `cytoolz.concat` (cythonized version, that could be faster in some cases):

``````from cytoolz import concat
l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
list(concat(l)) # or just `concat(l)` if one only wants to iterate over the items
``````

On my computer, in python 3.6, this seems to time almost as fast as `[item for sublist in l for item in sublist]` (not counting the import time):

``````In [611]: %timeit L = [item for sublist in l for item in sublist]
695 ns ± 2.75 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)

In [612]: %timeit L = [item for sublist in l for item in sublist]
701 ns ± 5.5 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)

In [613]: %timeit L = list(concat(l))
719 ns ± 12 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)

In [614]: %timeit L = list(concat(l))
719 ns ± 22.9 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)
``````

The `toolz` version is indeed slower:

``````In [618]: from toolz import concat

In [619]: %timeit L = list(concat(l))
845 ns ± 29 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)

In [620]: %timeit L = list(concat(l))
833 ns ± 8.73 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)
``````

#### @Greg Hewgill 2009-06-04 20:35:30

``````from functools import reduce #python 3

>>> l = [[1,2,3],[4,5,6], [7], [8,9]]
>>> reduce(lambda x,y: x+y,l)
[1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

The `extend()` method in your example modifies `x` instead of returning a useful value (which `reduce()` expects).
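As an aside, the questioner's `extend`-based lambda can be salvaged by returning `x` explicitly: `x.extend(y)` returns `None`, and `None or x` evaluates to `x` (a sketch; the `[]` start value also avoids mutating the first sublist):

```python
from functools import reduce

l = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]

# extend() mutates x in place and returns None, so return x via `or`:
flat = reduce(lambda x, y: x.extend(y) or x, l, [])
print(flat)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(l[0])  # [1, 2, 3] -- the input is untouched thanks to the [] start
```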

A faster way to do the `reduce` version would be

``````>>> import operator
>>> l = [[1,2,3],[4,5,6], [7], [8,9]]
>>> reduce(operator.concat, l)
[1, 2, 3, 4, 5, 6, 7, 8, 9]
``````

#### @agf 2011-09-24 10:04:39

`reduce(operator.add, l)` would be the correct way to do the `reduce` version. Built-ins are faster than lambdas.

#### @lukmdo 2012-03-20 22:13:11

@agf here is how: * `timeit.timeit('reduce(operator.add, l)', 'import operator; l=[[1, 2, 3], [4, 5, 6, 7, 8], [1, 2, 3, 4, 5, 6, 7]]', number=10000)` 0.017956018447875977 * `timeit.timeit('reduce(lambda x, y: x+y, l)', 'import operator; l=[[1, 2, 3], [4, 5, 6, 7, 8], [1, 2, 3, 4, 5, 6, 7]]', number=10000)` 0.025218963623046875

#### @Mike Graham 2012-04-25 18:26:07

This is a Shlemiel the painter's algorithm joelonsoftware.com/articles/fog0000000319.html

#### @Freddy 2015-09-11 07:16:54

this can use only for `integers`. But what if list contains `string`?

#### @Greg Hewgill 2015-09-11 07:38:59

@Freddy: The `operator.add` function works equally well for both lists of integers and lists of strings.

#### @Rupen B 2019-02-10 02:31:49

I tried reduce with operator.concat on a list of 1,000,000 integer sublists of lengths 1-10, and it never returned (after 1 minute). The list comprehension, `itertools.chain`, and a simple `list.extend` loop all returned in sub-seconds. The last method was the fastest!

#### @FredMan 2017-10-19 05:46:09

You can avoid recursive calls to the stack using an actual stack data structure pretty simply.

``````alist = [1, [1, 2], [1, 2, [4, 5, 6], 3, "33"]]
newlist = []

while len(alist) > 0:
    templist = alist.pop()
    if type(templist) == type(list()):
        while len(templist) > 0:
            temp = templist.pop()
            if type(temp) == type(list()):
                for x in temp:
                    templist.append(x)
            else:
                newlist.append(temp)
    else:
        newlist.append(templist)

print(list(reversed(newlist)))
``````

#### @Some Java Programmer 2018-04-18 14:05:44

This doesn't support iterable collections other than lists. You might want to consider using isinstance(temp, Iterable) like some of the other examples. I think you can also simplify this a bit, if you add alist to templist at the beginning, you should only need the nested while loop. You could also use a queue data structure in order to avoid reversing the entire list at the end.

#### @FredMan 2019-12-23 14:31:23

This is old but I'll address the comment. The question asks for list not collections. While there may be some version of this that could avoid the outer loop, it wouldn't be a simple change here as this version requires templist to be overwritten once sub arrays are flattened. And while you might be able to use a queue to do this, there's zero performance loss for using reversed because this is python, reversed is just a backwards iterator.

#### @Jon 2017-09-21 18:53:02

I recently came across a situation where I had a mix of strings and numeric data in sublists such as

``````test = ['591212948',
['special', 'assoc', 'of', 'Chicago', 'Jon', 'Doe'],
['Jon'],
['Doe'],
['fl'],
92001,
555555555,
'hello',
['hello2', 'a'],
'b',
['hello33', ['z', 'w'], 'b']]
``````

where methods like `flat_list = [item for sublist in test for item in sublist]` did not work. So, I came up with the following solution for 1+ levels of sublists

``````def concatList(data):
    results = []
    for rec in data:
        if type(rec) == list:
            results += rec
            results = concatList(results)
        else:
            results.append(rec)
    return results
``````

And the result

``````In [38]: concatList(test)
Out[38]:
['591212948',
'special',
'assoc',
'of',
'Chicago',
'Jon',
'Doe',
'Jon',
'Doe',
'fl',
92001,
555555555,
'hello',
'hello2',
'a',
'b',
'hello33',
'z',
'w',
'b']
``````

#### @Mitch McMabers 2019-11-07 06:56:44

Yup that works but the constant concatenation to a temporary list, and then returning temporary lists, and appending those temporary lists to other temporary lists, of other return-values etc etc, is slow... Should be using Generators instead.
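A minimal sketch of the generator rewrite this comment suggests (the name `concat_gen` is illustrative, not from the original answer):

```python
def concat_gen(data):
    # Yields leaves lazily instead of building and re-flattening
    # intermediate result lists on every level of recursion.
    for rec in data:
        if type(rec) == list:
            yield from concat_gen(rec)
        else:
            yield rec

test = ['591212948', ['special', 'assoc'], 92001, ['hello2', 'a'], ['hello33', ['z', 'w'], 'b']]
print(list(concat_gen(test)))
# ['591212948', 'special', 'assoc', 92001, 'hello2', 'a', 'hello33', 'z', 'w', 'b']
```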

#### @Jon 2019-11-07 18:13:34

@MitchMcMabers you can provide an edit if you have a better solution. I wasn’t aiming at building something efficient at the time. Just something that would work. If your data has many (~1k+) nested lists, then I think the problem is something else entirely.
