By AnonymousCoward


2010-11-23 02:17:14 8 Comments

Today I was thinking about composition of functions. It has nice properties: it's always associative, there is an identity, and if we restrict to bijective functions we also have inverses.

But then I thought about commutativity. My first intuition was that bijective self-maps of a space should commute, but then I saw some counter-examples. The symmetric group $S_n$ is only abelian for $n \le 2$, so clearly functions need more restrictions than bijectivity in order to commute.

The only examples I could think of were boring things like multiplying by a constant or maximal tori of groups like $O(n)$ (maybe less boring).

My question: In a Euclidean space, what are some nice characterizations of sets of functions that commute? What about in a more general space?

Bonus: Is this notion of commutativity important anywhere in analysis?

3 comments

@Daniel Korenblum 2014-01-17 18:21:19

This question may also be related to how certain functions interact with operations applied to their arguments. In this context, the property of commuting with binary operators, such as addition and multiplication, can be used to define classes of functions:

  1. additive commutation: if $g(x, y) = x + y$, then $f\big(g(x, y)\big) = g\big(f(x),\ f(y)\big)$ if and only if $f(x + y) = f(x) + f(y)$; for continuous $f$ this forces $f$ to be a homogeneous linear function of the form $f(x; a) \equiv ax$

  2. multiplicative commutation: if $g(x, y) = xy$, then $f\big( g(x, y) \big) = g\big(f(x),\ f(y)\big)$ if and only if $f(xy) = f(x)f(y)$; for continuous $f$ on the positive reals this forces $f$ to be "scale invariant", i.e. a power law of the form $f(x; a) \equiv x^a$

  3. log-additive commutation: if $g(x, y) = x + y$, then $\log f\big( g(x, y) \big) = g\big( \log f(x),\ \log f(y) \big)$ if and only if $f(x + y) = f(x)f(y)$; for continuous $f$ this forces $f$ to be an exponential function of the form $f(x; a) \equiv \exp(ax)$

The last item (3) involves a third function (the logarithm), which, when denoted by $h$, gives

$h\big(f[g(x, y)]\big) = g\big(h[f(x)],\ h[f(y)]\big)$

or

$h \circ f \circ g(x, y) = g\big(h \circ f(x),\ h \circ f(y)\big).$

Since $h \circ f$ occurs on both sides, we can denote this as $\tilde f$ to get

$\tilde f \big( g(x, y) \big) = g \big( \tilde f(x), \tilde f(y) \big)$

which has the same form as item (1) above. From this perspective, items (1) and (3) above can be seen as being isomorphic under the $\exp$ and $\log$ pair of invertible mappings.
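
A quick numerical sanity check of the three classes and of the $\exp$/$\log$ conjugation (a minimal sketch with NumPy; the constant $a = 1.7$ and the positive sample points are arbitrary illustrative choices, not anything from the post above):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.1, 2.0, 100)   # positive samples so x**a and log are defined
y = rng.uniform(0.1, 2.0, 100)
a = 1.7                          # arbitrary illustrative constant

# 1. additive commutation: f(x) = a*x satisfies f(x + y) = f(x) + f(y)
f1 = lambda t: a * t
print(np.allclose(f1(x + y), f1(x) + f1(y)))     # True

# 2. multiplicative commutation: f(x) = x**a satisfies f(x*y) = f(x)*f(y)
f2 = lambda t: t ** a
print(np.allclose(f2(x * y), f2(x) * f2(y)))     # True

# 3. log-additive commutation: f(x) = exp(a*x) satisfies f(x + y) = f(x)*f(y)
f3 = lambda t: np.exp(a * t)
print(np.allclose(f3(x + y), f3(x) * f3(y)))     # True

# The conjugation described above: log(f3(x)) is exactly the additive map f1(x),
# so item (3) is item (1) transported through the exp/log pair.
print(np.allclose(np.log(f3(x)), f1(x)))         # True
```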

@Bill Dubuque 2010-11-23 03:32:23

A classic result of Ritt shows that polynomials that commute under composition must be, up to a linear homeomorphism, either both powers of $x$, both iterates of the same polynomial, or both Chebyshev polynomials. Actually Ritt proved a more general rational-function case - follow the link. His work was motivated by Julia's and Fatou's work on Julia sets of rational functions, e.g. see here for a modern presentation.
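
For the two "easy" families in this classification, commutativity is quick to check numerically (a sketch with NumPy; the polynomial $p(x) = x^2 + 1$ and the exponents are arbitrary illustrative choices):

```python
import numpy as np
from numpy.polynomial import Polynomial

x = np.linspace(-2.0, 2.0, 201)

# Powers of x commute under composition: (x**m)**n = (x**n)**m = x**(m*n).
m, n = 3, 4
print(np.allclose((x**m)**n, (x**n)**m))          # True

# Iterates of the same polynomial commute: composing the second and third
# iterates of p in either order gives the fifth iterate of p.
p = Polynomial([1.0, 0.0, 1.0])                   # p(x) = x^2 + 1
p2 = lambda t: p(p(t))                            # second iterate of p
p3 = lambda t: p(p(p(t)))                         # third iterate of p
print(np.allclose(p2(p3(x)), p3(p2(x))))          # True
```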

@J. M. is a poor mathematician 2010-11-23 03:39:42

Yes, the identity $T_m(T_n(x))=T_n(T_m(x))=T_{mn}(x)$, even after looking at it in terms of cosines, is pretty deep, I think.
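
The identity is also easy to check numerically (a sketch using NumPy's Chebyshev class; the indices $m = 3$, $n = 5$ are arbitrary):

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

m, n = 3, 5
Tm, Tn, Tmn = Chebyshev.basis(m), Chebyshev.basis(n), Chebyshev.basis(m * n)

x = np.linspace(-1.0, 1.0, 101)
print(np.allclose(Tm(Tn(x)), Tn(Tm(x))))          # True: T_m(T_n(x)) = T_n(T_m(x))
print(np.allclose(Tm(Tn(x)), Tmn(x)))             # True: both equal T_{mn}(x)

# The cosine picture: T_k(cos(theta)) = cos(k*theta), so composing T_m with T_n
# multiplies the angle by m*n, which is exactly what T_{mn} does.
theta = np.linspace(0.0, np.pi, 101)
print(np.allclose(Tm(Tn(np.cos(theta))), np.cos(m * n * theta)))  # True
```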

@AnonymousCoward 2010-11-23 03:57:26

Thanks, this is a very nice characterization. I will certainly read the links.

@T.. 2010-11-23 09:52:27

Ritt's theorem is for polynomials with complex coefficients (which implies the same result for polynomials over a field of characteristic 0). The problem is unsolved in characteristic $p > 0$.

@PrimeRibeyeDeal 2013-08-03 12:30:40

What is meant by "up to linear homeomorphism"? Not even scalar multiples of powers of $x$ will commute in general unless the scalars and the powers are the same.

@Yuval Filmus 2010-11-23 03:37:47

According to Wikipedia, a set of diagonalizable matrices commutes if and only if the matrices are simultaneously diagonalizable. There is a far-reaching generalization, namely the Gelfand representation theorem.

The Gelfand representation theorem for commutative $C^*$ algebras represents every commutative $C^*$ algebra as an algebra of functions with pointwise multiplication; the domain of the latter algebra is the spectrum of the former algebra.

@Jonas Meyer 2010-11-23 03:58:07

The Gelfand-Neumark theorem you cite generalizes the case where the matrices are normal, and therefore simultaneously unitarily diagonalizable, if you want to think of this as realizing the matrices as complex-valued functions on a finite discrete space.

@Jonas Meyer 2010-11-23 04:12:55

One direction of your first statement is clear. In the other direction, after changing bases so that one of the matrices, $T$, is diagonal, the others will have to be block diagonal (up to conjugating by a permutation matrix) with blocks corresponding to the distinct eigenvalues of $T$. If they're not yet all diagonal, take a matrix in the set that is not diagonal, and change bases to diagonalize its blocks (which will leave $T$ diagonal because its blocks are scalar). Because the size of the blocks will decrease, this process terminates with the matrices all diagonalized.
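
Here is a minimal numerical illustration of the easiest case of this argument, where $T$ (called A below) has distinct eigenvalues, so every block is $1 \times 1$ and a single change of basis diagonalizes the whole commuting set (a sketch with NumPy; the shared eigenbasis P and the eigenvalue lists are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two commuting diagonalizable matrices built from a shared eigenbasis P.
P = rng.standard_normal((4, 4))
Pinv = np.linalg.inv(P)
A = P @ np.diag([1.0, 2.0, 3.0, 4.0]) @ Pinv      # distinct eigenvalues
B = P @ np.diag([5.0, 5.0, 7.0, 8.0]) @ Pinv      # a repeated eigenvalue is fine here
print(np.allclose(A @ B, B @ A))                   # True: they commute

# Because A has distinct eigenvalues, each of its eigenvectors is also an
# eigenvector of B (every block in the argument above is 1x1), so the
# eigenvector matrix of A diagonalizes both matrices at once.
w, V = np.linalg.eig(A)
Vinv = np.linalg.inv(V)
off = lambda M: M - np.diag(np.diag(M))            # off-diagonal part
print(np.allclose(off(Vinv @ A @ V), 0))           # True
print(np.allclose(off(Vinv @ B @ V), 0))           # True
```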
