By lab bhattacharjee


2013-07-06 15:28:01 8 Comments

How can the following identities be established without the help of calculus?

For positive integer $n$, $$\sum_{1\le r\le n}\frac{(-1)^{r-1}\binom nr}r=\sum_{1\le r\le n}\frac1r $$

and $$\sum_{0\le r\le n}\frac{(-1)^r\binom nr}{4r+1}=\frac{4^n\cdot n!}{1\cdot5\cdot9\cdots(4n+1)}$$
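A quick way to gain confidence in both statements before proving them is to check them in exact arithmetic for small $n$. The short Python sketch below is purely illustrative (not part of the question) and uses `fractions.Fraction` so that no rounding is involved.

```python
# Exact check of both identities for small n; illustrative only.
from fractions import Fraction
from math import comb, factorial

def lhs1(n):
    return sum(Fraction((-1) ** (r - 1) * comb(n, r), r) for r in range(1, n + 1))

def rhs1(n):                                   # harmonic number H_n
    return sum(Fraction(1, r) for r in range(1, n + 1))

def lhs2(n):
    return sum(Fraction((-1) ** r * comb(n, r), 4 * r + 1) for r in range(n + 1))

def rhs2(n):                                   # 4^n n! / (1*5*9*...*(4n+1))
    denom = 1
    for r in range(n + 1):
        denom *= 4 * r + 1
    return Fraction(4 ** n * factorial(n), denom)

for n in range(1, 12):
    assert lhs1(n) == rhs1(n) and lhs2(n) == rhs2(n)
print("both identities hold for n = 1..11")
```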

6 Answers

@Felix Marin 2017-06-13 04:25:56

$\newcommand{\bbx}[1]{\,\bbox[15px,border:1px groove navy]{\displaystyle{#1}}\,} \newcommand{\braces}[1]{\left\lbrace\,{#1}\,\right\rbrace} \newcommand{\bracks}[1]{\left\lbrack\,{#1}\,\right\rbrack} \newcommand{\dd}{\mathrm{d}} \newcommand{\ds}[1]{\displaystyle{#1}} \newcommand{\expo}[1]{\,\mathrm{e}^{#1}\,} \newcommand{\ic}{\mathrm{i}} \newcommand{\mc}[1]{\mathcal{#1}} \newcommand{\mrm}[1]{\mathrm{#1}} \newcommand{\pars}[1]{\left(\,{#1}\,\right)} \newcommand{\partiald}[3][]{\frac{\partial^{#1} #2}{\partial #3^{#1}}} \newcommand{\root}[2][]{\,\sqrt[#1]{\,{#2}\,}\,} \newcommand{\totald}[3][]{\frac{\mathrm{d}^{#1} #2}{\mathrm{d} #3^{#1}}} \newcommand{\verts}[1]{\left\vert\,{#1}\,\right\vert}$ $\ds{\Huge\left.1\right):} \bbox[15px,#ffd]{\ds{\sum_{1\ \leq\ r\ \leq\ n}{\pars{-1}^{\,r - 1}{n \choose r} \over r} = \sum_{1\ \leq\ r\ \leq\ n}{1 \over r}:\ {\large ?}}}$. \begin{align} \sum_{1\ \leq\ r\ \leq\ n}{\pars{-1}^{\,r-1}{n \choose r} \over r} & = \sum_{r = 1}^{n}\pars{-1}^{\,r - 1}{n \choose r}\int_{0}^{1}t^{r - 1}\,\dd t = -\int_{0}^{1}\sum_{r = 1}^{n}{n \choose r}\pars{-t}^{\,r}\,{\dd t \over t} \\[5mm] & = -\int_{0}^{1}{\pars{1 - t}^{n} - 1 \over t}\,\dd t = \int_{0}^{1}{t^{n} - 1 \over t - 1}\,\dd t = \int_{0}^{1}\sum_{r = 1}^{n}t^{r - 1}\,\dd t \\[5mm] & = \sum_{r = 1}^{n}\int_{0}^{1}t^{r - 1}\,\dd t = \bbx{\sum_{1\ \leq\ r\ \leq\ n}{1 \over r}} \end{align}
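As an illustrative cross-check of the chain of integrals above (not part of the original answer, and assuming SymPy is available), the sketch below verifies for small $n$ that $-\int_0^1\frac{(1-t)^n-1}{t}\,dt$ and $\int_0^1\frac{t^n-1}{t-1}\,dt$ both equal the harmonic number $\sum_{r=1}^n\frac1r$.

```python
# Symbolic check of the two integral forms against the harmonic number.
import sympy as sp

t = sp.symbols('t')
for n in range(1, 6):
    left = -sp.integrate(((1 - t) ** n - 1) / t, (t, 0, 1))
    right = sp.integrate((t ** n - 1) / (t - 1), (t, 0, 1))
    harmonic = sum(sp.Rational(1, r) for r in range(1, n + 1))
    assert sp.simplify(left - harmonic) == 0
    assert sp.simplify(right - harmonic) == 0
print("integral steps agree with H_n for n = 1..5")
```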


$\ds{\Huge\left.2\right):} \bbox[15px,#ffd]{\ds{\sum_{0\ \leq\ r\ \leq\ n}{\pars{-1}^{r}{n \choose r} \over 4r + 1} = {4^{n}\,n! \over 1 \cdot 5 \cdot 9 \cdots \pars{4n + 1}}:\ {\large ?}}}$. \begin{align} \sum_{0\ \leq\ r\ \leq\ n}{\pars{-1}^{\,r}\,{n \choose r} \over 4r + 1} & = \sum_{r = 0}^{n}\pars{-1}^{\,r}{n \choose r}\int_{0}^{1}t^{4r}\,\dd t = \int_{0}^{1}\sum_{r = 0}^{n}{n \choose r}\pars{-t^{4}}^{r}\,\dd t = \int_{0}^{1}\pars{1 - t^{4}}^{n}\,\dd t \\[5mm] & \stackrel{\large t^{4}\ \mapsto\ t}{=}\,\,\, {1 \over 4}\int_{0}^{1}t^{-3/4}\,\pars{1 - t}^{n}\,\dd t = {1 \over 4}\,{\Gamma\pars{1/4}\Gamma\pars{n + 1} \over \Gamma\pars{n + 5/4}} \\[5mm] & = {1 \over 4}\,n!\,{1 \over \Gamma\pars{1/4 + \bracks{n + 1}}/\Gamma\pars{1/4}} = {1 \over 4}\,n!\,{1 \over \pars{1/4}^{\,\overline{n + 1}}} \\[5mm] & = {1 \over 4}\,n!\,{1 \over \prod_{r = 0}^{n}\pars{1/4 + r}} = {1 \over 4}\,n!\,{1 \over \prod_{r = 0}^{n}\bracks{\pars{4r + 1}/4}} \\[5mm] & = {1 \over 4}\,n!\,{1 \over \bracks{\prod_{r = 0}^{n}\pars{4r + 1}}/4^{n + 1}} = \bbx{\ds{{4^{n}\,n! \over 1 \cdot 5 \cdot 9 \cdots \pars{4n + 1}}}} \end{align}
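Similarly, here is a short sketch (again an illustrative addition, assuming SymPy) that checks the key evaluation $\int_0^1(1-t^4)^n\,dt = \frac{4^n\,n!}{1\cdot5\cdot9\cdots(4n+1)}$ for small $n$.

```python
# Check the closed form of the integral for n = 0..5.
import sympy as sp
from math import factorial, prod

t = sp.symbols('t')
for n in range(6):
    integral = sp.integrate((1 - t ** 4) ** n, (t, 0, 1))
    closed = sp.Rational(4 ** n * factorial(n), prod(4 * r + 1 for r in range(n + 1)))
    assert sp.simplify(integral - closed) == 0
print("closed form confirmed for n = 0..5")
```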

@marty cohen 2018-04-15 02:51:35

Your answer to 2 generalizes readily to $\sum_{r=0}^n \dfrac{(-1)^r\binom{n}{r}}{ar+b} = \dfrac{n!a^{n}}{\prod_{r=0}^n(ar+b)}$.
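For what it's worth, the stated generalization can be spot-checked in exact arithmetic; the sketch below is not from the comment, and the values of $a$ and $b$ are chosen arbitrarily for illustration.

```python
# Spot-check of sum_{r=0}^n (-1)^r C(n,r)/(a r + b) = n! a^n / prod_{r=0}^n (a r + b).
from fractions import Fraction
from math import comb, factorial, prod

def lhs(n, a, b):
    return sum(Fraction((-1) ** r * comb(n, r), a * r + b) for r in range(n + 1))

def rhs(n, a, b):
    return Fraction(factorial(n) * a ** n, prod(a * r + b for r in range(n + 1)))

for a in (1, 2, 3, 5):
    for b in (1, 2, 3):
        for n in range(8):
            assert lhs(n, a, b) == rhs(n, a, b)
print("generalization verified on the sampled grid")
```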

@Felix Marin 2018-04-18 02:42:28

@martycohen I guess so. Thanks.

@G Cab 2019-10-17 22:37:19

The OP's notation and yours made me wonder for quite a while before I understood that $1 \cdot 5 \cdot 9 \cdots \pars{4n + 1}$ means $1 \cdot 5 \cdot 9\cdot \; \ldots \; \cdot \pars{4n + 1}$.

@G Cab 2019-10-16 22:38:23

A different way to solve the first question, following an algebraic approach, goes as follows.

We have $$ S_{\,1} (n) = \sum\limits_{1\, \le \,\,r\, \le \,n} {\left( { - 1} \right)^{\,r - 1} \;{1 \over r}\left( \matrix{ n \cr r \cr} \right)} \quad \quad S_{\,2} (n) = \sum\limits_{1\, \le \,\,r\, \le \,n} {{1 \over r}} $$

Let's take the finite difference of $S_1$ $$ \eqalign{ & \Delta S_{\,1} (n) = S_{\,1} (n + 1) - S_{\,1} (n) = \cr & = \sum\limits_{1\, \le \,\,r\, \le \,n + 1} {\left( { - 1} \right)^{\,r - 1} \;{1 \over r}\left( \matrix{ n + 1 \cr r \cr} \right)} - \sum\limits_{1\, \le \,\,r\, \le \,n} {\left( { - 1} \right)^{\,r - 1} \;{1 \over r}\left( \matrix{ n \cr r \cr} \right)} = \quad\quad (0) \cr & = \sum\limits_{1\, \le \,\,r\, \le \,n + 1} {\left( { - 1} \right)^{\,r - 1} \; {1 \over r}\left( {\left( \matrix{ n + 1 \cr r \cr} \right) - \left( \matrix{ n \cr r \cr} \right)} \right)} = \quad \quad (1) \cr & = \sum\limits_{1\, \le \,\,r\, \le \,n + 1} {\left( { - 1} \right)^{\,r - 1} \;{1 \over r}\left( \matrix{ n \cr r - 1 \cr} \right)} = \quad \quad (2) \cr & = {1 \over {n + 1}}\sum\limits_{1\, \le \,\,r\, \le \,n + 1} {\left( { - 1} \right)^{\,r - 1} \;\left( \matrix{ n + 1 \cr r \cr} \right)} = \quad \quad (3) \cr & = {1 \over {n + 1}}\left( { - \left( { - 1} \right) - \sum\limits_{0\, \le \,\,r\, \le \,n + 1} {\left( { - 1} \right)^{\,r} \;\left( \matrix{ n + 1 \cr r \cr} \right)} } \right) = \quad \quad (4) \cr & = {1 \over {n + 1}}\quad \quad (5) \cr} $$ where:
- (0) the second sum may be extended up to $r = n+1$, since $\binom{n}{n+1} = 0$;
- (1) join the two sums;
- (2) Pascal's recursion $\binom{n+1}{r}-\binom{n}{r}=\binom{n}{r-1}$;
- (3) the absorption identity $\frac{1}{r}\binom{n}{r-1}=\frac{1}{n+1}\binom{n+1}{r}$;
- (4) add and subtract the $r=0$ term;
- (5) the alternating sum $\sum_{0\le r\le n+1}(-1)^{r}\binom{n+1}{r}=(1-1)^{n+1}$ vanishes.

Since $$ \left\{ \matrix{ S_{\,1} (1) = S_{\,2} (1) = 1 \hfill \cr \Delta S_{\,1} (n) = \Delta S_{\,2} (n) = {1 \over {n + 1}} \hfill \cr} \right. $$ the two sums satisfy the same first-order recurrence with the same initial value, and are therefore equal.
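The telescoping step $\Delta S_1(n) = \frac1{n+1}$ is also easy to confirm numerically; here is a small illustrative sketch in exact arithmetic (not part of the answer itself).

```python
# Check that S1(n+1) - S1(n) = 1/(n+1) in exact arithmetic.
from fractions import Fraction
from math import comb

def S1(n):
    return sum(Fraction((-1) ** (r - 1) * comb(n, r), r) for r in range(1, n + 1))

for n in range(1, 15):
    assert S1(n + 1) - S1(n) == Fraction(1, n + 1)
print("finite difference equals 1/(n+1) for n = 1..14")
```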

Concerning the second question, we rewrite the sum as an iterated Finite Difference $$ \eqalign{ & S_{\,3} (n) = \sum\limits_{\left( {0\, \le } \right)\,\,r\,\left( { \le \,n} \right)} {\left( { - 1} \right)^{\,r} \;{1 \over {1 + 4r}}\left( \matrix{ n \cr r \cr} \right)} = \cr & = {{\left( { - 1} \right)^{\,n} } \over 4}\sum\limits_{\left( {0\, \le } \right)\,\,r\,\left( { \le \,n} \right)} {\left( { - 1} \right)^{\,n - r} \;{1 \over {1/4 + r}}\left( \matrix{ n \cr r \cr} \right)} = \cr & = {{\left( { - 1} \right)^{\,n} } \over 4}\left. {\Delta ^{\,n} \left( {{1 \over {1/4 + x}}} \right)\,} \right|_{\,x\, = \,0} \cr} $$

Next we turn to the Falling and Rising Factorials, for which the forward difference has the simple expression $$ \Delta _{\,x} ^m \;x^{\,\underline {\,n\,} } = n^{\,\underline {\,m\,} } x^{\,\underline {\,n - m\,} } $$ and for which the following identity holds $$ x^{\underline {\, - m\,} } = {1 \over {\left( {x + m} \right)^{\underline {\,m\,} } }} = {1 \over {\left( {x + 1} \right)^{\overline {\,m\,} } }} $$ Therefore we can write $$ {1 \over {1/4 + x}} = {1 \over {\left( {1/4 + x} \right)^{\overline {\,1\,} } }} = \left( { - 3/4 + x} \right)^{\underline {\, - 1\,} } $$ and $$ \eqalign{ & \Delta ^{\,n} \left( {{1 \over {1/4 + x}}} \right) = \Delta ^{\,n} \left( { - 3/4 + x} \right)^{\underline {\, - 1\,} } = \left( { - 1} \right)^{\underline {\,n\,} } \left( { - 3/4 + x} \right)^{\underline {\, - 1 - n\,} } = \cr & = \left( { - 1} \right)^{\,n} n!\left( { - 3/4 + x} \right)^{\underline {\, - 1 - n\,} } \cr} $$ So the sum becomes $$ \eqalign{ & S_{\,3} (n) = {{\left( { - 1} \right)^{\,n} } \over 4}\left. {\Delta ^{\,n} \left( {{1 \over {1/4 + x}}} \right)\,} \right|_{\,x\, = \,0} = \cr & = {{\left( { - 1} \right)^{\,n} } \over 4}\left( { - 1} \right)^{\,n} n!\left( { - 3/4} \right)^{\underline {\, - 1 - n\,} } = {{\left( { - 1} \right)^{\,n + 1} n!} \over 4}\left( {3/4} \right)^{\overline {\, - n - 1\,} } = \cr & = {{\left( { - 1} \right)^{\,n + 1} } \over 4}{{\Gamma \left( {n + 1} \right)\Gamma \left( { - n - 1/4} \right)} \over {\Gamma \left( {3/4} \right)}} = {{\left( { - 1} \right)^{\,n + 1} } \over 4}{\rm B}\left( {n + 1,\; - n - 1/4} \right) = \cr & = {1 \over 4}{{n!} \over {\left( {1/4} \right)^{\overline {\,n + 1\,} } }} = {1 \over 4}{{\prod\limits_{k = 0}^{n - 1} {\left( {1 + k} \right)} } \over {\prod\limits_{k = 0}^n {\left( {1/4 + k} \right)} }} = \cr & = {{4^{\,n} } \over {\left( {1 + 4n} \right)}}\prod\limits_{k = 0}^{n - 1} {{{\left( {1 + k} \right)} \over {\left( {1 + 4k} \right)}}} = \cr & = {{4^{\,n} n!} \over {1 \cdot 5 \cdot 9 \cdot \; \ldots \; \cdot \left( {1 + 4n} \right)}} \cr} $$
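The intermediate closed form $S_3(n) = \frac{n!}{4\,(1/4)^{\overline{n+1}}}$ can be checked directly against the defining sum; the sketch below (an illustrative addition, not from the answer) uses SymPy's `rf` (rising factorial).

```python
# Check S3(n) = n! / (4 * rising_factorial(1/4, n+1)) against the defining sum.
import sympy as sp

for n in range(8):
    s3 = sum(sp.Rational((-1) ** r * sp.binomial(n, r), 4 * r + 1) for r in range(n + 1))
    closed = sp.factorial(n) / (4 * sp.rf(sp.Rational(1, 4), n + 1))
    assert sp.simplify(s3 - closed) == 0
print("rising-factorial form confirmed for n = 0..7")
```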

@Thomas Andrews 2019-10-16 17:19:51

Here is a probabilistic proof of the first equality.

Given $n$ urns, in each round you place a ball into one of the $n$ urns uniformly at random. What is the expected number of rounds until every urn contains a ball?

There are two approaches to solving this.

One approach uses inclusion-exclusion, which gives $$\sum_{k=1}^n(-1)^{k-1}\binom{n}{k}\frac{n}{k}.$$

The other approach lets $X_k$ be the number of rounds needed to get balls into $k$ distinct urns after you've gotten balls into $k-1$ distinct urns. Then it is easy to see that $E(X_k)=\frac{n}{n-k+1}$, so the expected time is $\sum_{k=1}^{n}\frac{n}{n-k+1}$, which can be rewritten as $$\sum_{k=1}^{n}\frac{n}{k}.$$ Equating the two expressions for the expectation and dividing by $n$ gives the first identity.

See this answer for details.
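For the curious, here is a small sketch (an illustrative addition, not from the answer) that checks the two expected-value expressions agree exactly and compares them with a rough simulation of the urn process.

```python
# Coupon-collector check: inclusion-exclusion form vs. stage-by-stage form, plus a simulation.
import random
from fractions import Fraction
from math import comb

def expected_incl_excl(n):
    return sum(Fraction((-1) ** (k - 1) * comb(n, k) * n, k) for k in range(1, n + 1))

def expected_stages(n):
    return sum(Fraction(n, n - k + 1) for k in range(1, n + 1))

def simulate(n, trials=20000):
    total = 0
    for _ in range(trials):
        seen, rounds = set(), 0
        while len(seen) < n:
            seen.add(random.randrange(n))
            rounds += 1
        total += rounds
    return total / trials

n = 6
assert expected_incl_excl(n) == expected_stages(n)
print(float(expected_stages(n)), simulate(n))   # the two printed values should be close
```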

@Sameer Kailasa 2018-04-15 01:34:08

Here is a purely combinatorial proof of the first identity:

Let us count the number $C$ of disjoint cycles in a given permutation $\sigma \in S_n$. On the one hand, we have simply $$C = \sum_{k=1}^{n} C_k$$ where $C_k$ is the number of $k$-cycles in $\sigma$. On the other hand, we can count by "inclusion-exclusion," as follows.

Denote the set $[n]:= \{1,2, 3, \cdots, n\}$. For each subset $S \subset [n]$, let $\chi(S) = 1$ if $S$ is contained wholly in some cycle of $\sigma$ and $0$ otherwise. Then, I claim we have $$C = \sum_{\emptyset \neq S \subset [n]} (-1)^{|S|-1} \chi(S)$$ To prove this, note that the only subsets of $[n]$ contributing to the sum are those contained in some cycle. For each cycle in $\sigma$ of size $k$, the contribution to the sum is $$\binom{k}{1} - \binom{k}{2} + \binom{k}{3} -\cdots + (-1)^{k-1} \binom{k}{k}= 1-(1-1)^k = 1$$ by the binomial theorem; hence, the sum ultimately evaluates to just the number of cycles of $\sigma$.

Thus, we have the identity $$\sum_{k=1}^{n} C_k = \sum_{\emptyset \neq S \subset [n]} (-1)^{|S|-1} \chi(S) = \sum_{k=1}^{n} (-1)^{k-1} \sum_{\substack{S \subset [n] \\ |S| = k}} \chi(S)$$ Now, we suppose $\sigma$ is chosen uniformly at random from $S_n$ and compute the expected number of cycles in $\sigma$. By linearity of expectation, we have $$\sum_{k=1}^{n} \mathbb{E}[C_k] = \sum_{k=1}^{n} (-1)^{k-1}\mathbb{E} \left[ \sum_{\substack{S \subset [n] \\ |S| = k}} \chi(S) \right]$$

Now, we have to compute these expectations. Let us first calculate $\mathbb{E}[C_k]$. We have $$C_k = \frac{1}{k} \sum_{j=1}^{n} \mathbb{1}_{\text{contained in a } k-\text{cycle}} (j) \implies \mathbb{E}[C_k] = \frac{n}{k} \text{Prob}(\text{fixed element contained in } k-\text{cycle})$$ and $$\mathbb{E} \left[ \sum_{\substack{S \subset [n] \\ |S| = k}} \chi(S) \right] = \binom{n}{k} \text{Prob}(\text{fixed set of size } k \text{ contained in some cycle})$$

The evaluation of these probabilities is classical (see Terry Tao's post or this Wikipedia article), and we have $$\text{Prob}(\text{fixed element contained in } k-\text{cycle}) = \frac{1}{n}$$ $$\text{Prob}(\text{fixed set of size } k \text{ contained in some cycle}) = \frac{1}{k}$$ hence the final conclusion follows.
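A brute-force check of this double count is straightforward for small $n$: averaging the number of cycles over all permutations of $[n]$ should reproduce both $\sum_{k=1}^n\frac1k$ and the alternating binomial sum. The sketch below (an illustrative addition, not from the answer) does exactly that.

```python
# Average cycle count over all permutations of [n] vs. the two sums in the identity.
from fractions import Fraction
from itertools import permutations
from math import comb, factorial

def cycle_count(perm):
    seen, cycles = set(), 0
    for start in range(len(perm)):
        if start not in seen:
            cycles += 1
            j = start
            while j not in seen:
                seen.add(j)
                j = perm[j]
    return cycles

for n in range(1, 7):
    avg = Fraction(sum(cycle_count(p) for p in permutations(range(n))), factorial(n))
    harmonic = sum(Fraction(1, k) for k in range(1, n + 1))
    alternating = sum(Fraction((-1) ** (k - 1) * comb(n, k), k) for k in range(1, n + 1))
    assert avg == harmonic == alternating
print("expected cycle count matches both sums for n = 1..6")
```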

@marty cohen 2018-04-15 02:49:49

To me, this proof is much harder than the calculus proof, as well as being much harder to discover. The calculus proof of Felix Marin is straightforward and natural, while this is magic.

@Thomas Andrews 2019-10-16 21:04:29

@martycohen It's neat to see the different approaches, though. I've also got a probability-based solution, where both sides count the expected number of rounds needed to get a ball into each of $n$ urns, given that in each round you put a ball into one of the $n$ urns uniformly at random.

@marty cohen 2019-10-17 00:12:01

I've always found it harder to get probabilistic and combinatorial proofs than analytic ones. But that's my problem.

@lab bhattacharjee 2013-12-25 12:15:35

For the first Question let $\displaystyle S_n=\sum_{1\le r\le n}\frac{(-1)^{r-1}\binom nr}r$

$\displaystyle\implies S_{m+1}-S_m=\sum_{1\le r\le m+1}(-1)^{r-1}\frac{\binom{m+1}r-\binom mr}r+\binom m{m+1}\frac{(-1)^m}{m+1}$

$\displaystyle=\sum_{1\le r\le m+1}(-1)^{r-1}\frac{\binom{m+1}r-\binom mr}r$ as $\binom mr=0$ for $r>m$ or $r<0$

Now, using Pascal's rule, $\displaystyle\binom{m+1}r=\binom mr+\binom m{r-1}\iff \binom{m+1}r-\binom mr=\binom m{r-1}$

Again, $\displaystyle\frac{\binom m{r-1}}r=\frac{m!}{\{m-(r-1)\}!(r-1)!\cdot r}=\frac1{m+1}\cdot\frac{(m+1)!}{(m+1-r)!\cdot r!}$ $\displaystyle=\frac1{m+1}\cdot\binom{m+1}r$

$\displaystyle\implies S_{m+1}-S_m=\sum_{1\le r\le m+1}(-1)^{r-1}\cdot\frac1{m+1}\cdot\binom{m+1}r$ $\displaystyle=\frac1{m+1}\sum_{1\le r\le m+1}(-1)^{r-1}\binom{m+1}r$ $\displaystyle=\frac{1-(1-1)^{m+1}}{m+1}$

$\displaystyle\implies S_{m+1}-S_m=\frac1{m+1}$

Now, $\displaystyle S_1=\frac11$

As $\displaystyle S_2-S_1=\frac12\implies S_2=1+\frac12$, and so on: by induction, $\displaystyle S_n=\sum_{1\le r\le n}\frac1r$.
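The induction can also be replayed mechanically: starting from $S_1 = 1$ and repeatedly adding $\frac1{m+1}$ reproduces the alternating sum exactly. The short sketch below (an illustrative addition, exact arithmetic) does this.

```python
# Build S_n from the recurrence S_{m+1} = S_m + 1/(m+1) and compare with the direct sum.
from fractions import Fraction
from math import comb

def S_direct(n):
    return sum(Fraction((-1) ** (r - 1) * comb(n, r), r) for r in range(1, n + 1))

S = Fraction(1)                      # S_1 = 1
for m in range(1, 12):
    assert S == S_direct(m)
    S += Fraction(1, m + 1)          # S_{m+1} = S_m + 1/(m+1)
print("recurrence reproduces the alternating sum for n = 1..11")
```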

@OR. 2013-07-06 15:42:22

You can always use Petkovsek's algorithm. It requires only some algebra to settle this and similar problems.

You can read about it in the book $A=B$ (available free online).

Another point is that differentiation of polynomials is a completely algebraic operation.

Instead of $(P(x))'|_{x=1}$ you can always write $[P(x+1)-P(1)]/x|_{x=0}$ or, what is equivalent, rewrite $P$ in powers of $(x-1)$, which involves only iterated division by $(x-1)$. Little by little you can hide the calculus from the proof that you have.
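To illustrate the point (my own example with an arbitrary polynomial $P$, assuming SymPy), the value $P'(1)$ can be computed purely algebraically as the constant term of $[P(1+x)-P(1)]/x$, with no limits involved.

```python
# P'(1) computed algebraically vs. with symbolic differentiation.
import sympy as sp

x = sp.symbols('x')
P = sum(sp.binomial(7, r) * x ** r for r in range(8))      # P(x) = (1 + x)^7, expanded

derivative_at_1 = sp.diff(P, x).subs(x, 1)
algebraic = sp.cancel((P.subs(x, 1 + x) - P.subs(x, 1)) / x).subs(x, 0)
assert derivative_at_1 == algebraic
print("P'(1) with and without calculus:", derivative_at_1)
```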
