By Peđa Terzić

2017-11-17 13:49:28 8 Comments

Can you provide a proof or a counterexample for the claim given below?

Inspired by Agrawal's conjecture in this paper, and by Theorem 4 in this paper, I have formulated the following claim:

Let $n$ be a natural number greater than two. Let $r$ be the smallest odd prime number such that $r \nmid n$ and $n^2 \not\equiv 1 \pmod r$. Let $T_n(x)$ be the Chebyshev polynomial of the first kind; then $n$ is a prime number if and only if $T_n(x) \equiv x^n \pmod {x^r-1,n}$.

You can run this test here.

I have tested this claim up to $5 \cdot 10^4$ and there were no counterexamples .


Algorithm implementation in PARI/GP without directly computing $T_n(x)$.
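For readers without PARI/GP, the claimed test can also be sketched by brute force in Python (a hypothetical cross-check, not from the original post: it computes $T_n$ modulo $(x^r-1, n)$ directly via the recurrence $T_k = 2xT_{k-1} - T_{k-2}$, so it is deliberately simple rather than efficient):

```python
# Brute-force sketch of the claimed test (hypothetical; O(n*r) work per n).
# A polynomial mod (x^r - 1, n) is a length-r list of coefficients.

def is_prime_td(m):
    """Trial-division primality check, used only to step through odd primes."""
    return m > 1 and all(m % d for d in range(2, int(m**0.5) + 1))

def smallest_r(n):
    """Smallest odd prime r with r not dividing n and n^2 != 1 (mod r).
    Returns 0 when a nontrivial divisor of n is found along the way."""
    if n == 1 or n % 2 == 0:
        return 0
    r = 3
    while n % r in (0, 1, r - 1):
        if r < n and n % r == 0:
            return 0                      # r is a nontrivial divisor of n
        r += 2
        while not is_prime_td(r):
            r += 2
    return r

def cheb_test(n):
    """True iff T_n(x) == x^n mod (x^r - 1, n), per the claim."""
    r = smallest_r(n)
    if r == 0:
        return n == 2
    t_prev = [1] + [0] * (r - 1)          # T_0 = 1
    t_cur = [0, 1] + [0] * (r - 2)        # T_1 = x
    for _ in range(n - 1):                # T_k = 2x*T_{k-1} - T_{k-2}
        t_next = [(2 * t_cur[(i - 1) % r] - t_prev[i]) % n for i in range(r)]
        t_prev, t_cur = t_cur, t_next
    target = [0] * r
    target[n % r] = 1                     # x^n mod (x^r - 1)
    return t_cur == target
```

In the range checked by hand, `cheb_test(n)` agrees with primality; for example it accepts 101 and rejects 91.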


@მამუკა ჯიბლაძე 2017-11-21 18:47:47

Wow. This deserves a separate answer.

As I mentioned in a comment, motivated by Igor Rivin's earlier comment asking whether an efficient primality test could be obtained if the statement in the question is true, I asked a separate question about whether one can efficiently compute $T_n(x)$ modulo $(x^r-1, n)$. That question received a brilliant answer by Lucia, which makes it possible to demonstrate that, if the statement in the question is true, one indeed obtains a very efficient primality test based on it.

I made this quick-and-dirty Mathematica code

polmul[f_, g_, r_, n_] := Mod[f.NestList[RotateRight, g, r - 1], n]

matmul[a_, b_, r_, n_] :=  Mod[
 {{polmul[a[[1, 1]], b[[1, 1]], r, n] + polmul[a[[1, 2]], b[[2, 1]], r, n], 
   polmul[a[[1, 1]], b[[1, 2]], r, n] + polmul[a[[1, 2]], b[[2, 2]], r, n]}, 
  {polmul[a[[2, 1]], b[[1, 1]], r, n] + polmul[a[[2, 2]], b[[2, 1]], r, n], 
   polmul[a[[2, 1]], b[[1, 2]], r, n] + polmul[a[[2, 2]], b[[2, 2]], r, n]}}, n]

matsq[a_, r_, n_] := matmul[a, a, r, n]

matpow[a_, k_, r_, n_] := Which[
  k == 1, a,
  EvenQ[k], matpow[matsq[a, r, n], k/2, r, n],
  True, matmul[a, matpow[matsq[a, r, n], (k - 1)/2, r, n], r, n]]

xmat[r_, n_] :=
 {{PadRight[{0, 2}, r], PadRight[{n - 1}, r]},
  {PadRight[{1}, r], ConstantArray[0, r]}}

isprime[n_] := With[{r = smallestr[n]},
  If[r == 0, n == 2,
   With[{xp = matpow[xmat[r, n], n - 1, r, n]},
    Mod[RotateRight[xp[[1, 1]]] + xp[[1, 2]], n]
     === PadRight[Append[ConstantArray[0, Mod[n, r]], 1], r]]]]

where smallestr is as in my other answer.
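The scheme is perhaps easier to follow in a plain-Python transliteration (hypothetical, with a naive $O(r^2)$ cyclic convolution in place of polmul's rotation trick):

```python
# Hypothetical Python transliteration of the matrix-power scheme above.
# Polynomials mod (x^r - 1, n) are length-r coefficient lists (index = degree).

def polmul(f, g, r, n):
    """Product of f and g modulo (x^r - 1, n): naive cyclic convolution."""
    h = [0] * r
    for i, fi in enumerate(f):
        if fi:
            for j, gj in enumerate(g):
                h[(i + j) % r] = (h[(i + j) % r] + fi * gj) % n
    return h

def poladd(f, g, n):
    return [(a + b) % n for a, b in zip(f, g)]

def matmul(a, b, r, n):
    """Product of 2x2 matrices with polynomial entries."""
    return [[poladd(polmul(a[i][0], b[0][j], r, n),
                    polmul(a[i][1], b[1][j], r, n), n)
             for j in range(2)] for i in range(2)]

def matpow(a, k, r, n):
    """a^k for k >= 1 by binary powering (O(log k) matrix products)."""
    if k == 1:
        return a
    half = matpow(matmul(a, a, r, n), k // 2, r, n)   # a^(k - k mod 2)
    return half if k % 2 == 0 else matmul(a, half, r, n)

def chebyshev_mod(n, r):
    """T_n(x) mod (x^r - 1, n) via (T_k, T_{k-1}) = M^(k-1) (T_1, T_0)
    with M = [[2x, -1], [1, 0]]; then T_n = x*p[0][0] + p[0][1]."""
    two_x = [0, 2] + [0] * (r - 2)
    minus_one = [n - 1] + [0] * (r - 1)       # -1 mod n
    one = [1] + [0] * (r - 1)
    zero = [0] * r
    p = matpow([[two_x, minus_one], [one, zero]], n - 1, r, n)
    x_p00 = p[0][0][-1:] + p[0][0][:-1]       # multiply by x: cyclic shift
    return poladd(x_p00, p[0][1], n)
```

For prime $n$ this returns the coefficient vector of $x^{n \bmod r}$; e.g. `chebyshev_mod(7, 5)` gives `[0, 0, 1, 0, 0]`, while for a composite such as $n=9$ the result differs from $x^{9 \bmod 7}$.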

Running this on $n$ up to 100000 (all answers correct) gives the following timing:

[plot: running time as a function of $n$, $n$ up to 100000]

It seems the running time is at worst polylogarithmic (as that answer by Lucia suggests); indeed, the graph of $\log(\operatorname{time}(n))/\log\log(n)$ looks almost as if it is bounded above:

[plot: $\log(\operatorname{time}(n))/\log\log(n)$ as a function of $n$]

Since the algorithm involves a recursive procedure for matrix powers, in principle one also has to check memory use. Here I must confess the results are strange; maybe it is something hardware-specific. $$ \begin{array}{r|l} \text{amount of memory}&\text{number of cases (out of 100000)}\\ \hline 32&1407\\ 64&94408\\ 80&1\\ 288&1\\ 320&42\\ 352&3\\ 392&8\\ 424&2316\\ 456&1812\\ 3452552&1 \end{array} $$

The last amount, 3452552, corresponds to $n=65969$; I have no idea why it needed so much memory. As I said, this might be machine-specific: maybe some garbage collection occurred at that point, or something like that. In any case, as opposed to the memory measurements, the timing data seem to be very accurate: I used the Mathematica command AbsoluteTiming, and the documentation says it gives the actual processor time used for the calculation with quite high precision.

@Igor Rivin 2017-11-21 19:39:14

That IS amazing, thanks for investigating!

@მამუკა ჯიბლაძე 2017-11-21 22:23:08

@IgorRivin Pleasure is mine indeed :)

@Ryan O'Donnell 2017-11-22 03:24:54

Looks like using the FFT method to multiply polynomials, and assuming r = O(log n), the whole algorithm takes O(log^3 n) time (times some polyloglogs, depending on your exact model of computation).

@Vít Tuček 2017-11-22 13:48:56

@RyanO'Donnell Which is the same complexity as in the Agrawal et al. paper. I don't understand what, if anything, is gained by using Čebyšev polynomials.

@მამუკა ჯიბლაძე 2017-11-23 08:13:26

@VítTuček You are right, that one is in fact simpler: one has to find the $(n-1)$st power of $x+a$ modulo $x^r-1$ instead of the $(n-1)$st power of a $2\times2$ matrix. On the other hand, in their case $r$ is larger and one has to test several values of $a$.

@Vít Tuček 2017-11-23 15:06:27

@მამუკაჯიბლაძე Look at their conjecture in section six -- the one mentioned by the OP. I mean, it's great that this conjecture seems plausible also for $T_n$ and not only for $(x-1)^n$. Maybe it would be worthwhile to try to find some large class of polynomials with this property.

@მამუკა ჯიბლაძე 2017-11-23 17:44:40

@VítTuček Acknowledged - after all this one is also a conjecture (so far). I don't think it will turn out easier to prove than theirs...

@Vít Tuček 2017-11-23 18:28:31

@მამუკაჯიბლაძე Right. So far we have two families of polynomials. Maybe we can identify more and then try to find out what they have in common which could perhaps give us some information on why such a conjecture might be true.

@Gene S. Kopp 2017-11-26 02:39:05

@RyanO'Donnell While the algorithm is $\tilde{O}(\log^3 n)$ in the worst case, it is $\tilde{O}(\log^2 n)$ in the average case and in the average prime case, because $r$ may be taken to be $O_{\varepsilon}(1)$ except on a set of density $\varepsilon$ defined by congruence conditions.

@მამუკა ჯიბლაძე 2017-11-19 18:32:56

Decided to make a cw answer with an illustration of time growth.

I've tried on Mathematica this:

isprime[n_] := 
 With[{r = smallestr[n]}, 
  If[r == 0, n == 2, 
   PolynomialMod[PolynomialRemainder[ChebyshevT[n, t] - t^n, t^r - 1, t], n] === 0]]


smallestr[n_] := Module[{r},
  If[n == 1 \[Or] EvenQ[n], Return[0]];
  For[r = 3, MemberQ[{0, 1, r - 1}, Mod[n, r]], r = NextPrime[r + 1],
   If[r < n \[And] Mod[n, r] == 0, Return[0]]];
  r]

I've run it on $n$ up to about 31000 (all answers are correct); here is the graph of the time needed as a function of $n$:

[plot: running time as a function of $n$]

The growth looks faster than polynomial: the graph of $\log(\text{time}(n))/\log(n)$ does not seem to stabilize:

[plot: $\log(\text{time}(n))/\log(n)$ as a function of $n$]

On the other hand, a rough upper bound on the growth can be deduced from the fact that $\log(\text{time}(n))/(n\log(n))$ seems to go down:

[plot: $\log(\text{time}(n))/(n\log(n))$ as a function of $n$]

Some additional remarks.


Datapoints are shown only for those $n$ which have a positive value of smallestr, i.e. such that the corresponding $r$ is smaller than any nontrivial divisor of $n$. Understandably, for the other $n$ the calculation is qualitatively quicker.


Finding $r$ is very efficient: $$ \begin{array}{c|c} r&\text{smallest $n$ that requires this $r$}\\ \hline 11&29\\ 13&419\\ 19&1429\\ 23&315589\\ 29&734161\\ 31&1456729 \end{array} $$
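The first rows of this table are easy to reproduce with a throwaway Python analogue of smallestr (a hypothetical sketch, not part of the answer):

```python
# Hypothetical re-check of the table: for each r, the smallest n requiring it.

def is_prime_td(m):
    """Trial-division primality check, used only to step through odd primes."""
    return m > 1 and all(m % d for d in range(2, int(m**0.5) + 1))

def smallestr(n):
    """Python analogue of the Mathematica smallestr: smallest odd prime r
    with n mod r not in {0, 1, r-1}; 0 if a nontrivial divisor shows up."""
    if n == 1 or n % 2 == 0:
        return 0
    r = 3
    while n % r in (0, 1, r - 1):
        if r < n and n % r == 0:
            return 0
        r += 2
        while not is_prime_td(r):
            r += 2
    return r

# Record, for each value of r, the first n that needs it.
first_n = {}
for n in range(3, 1500):
    r = smallestr(n)
    if r and r not in first_n:
        first_n[r] = n
```

Within this range it reproduces the rows $11\mapsto 29$, $13\mapsto 419$, and $19\mapsto 1429$ of the table.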


It seems that $n$ is prime iff all coefficients of $T_n(x)-x^n$ are divisible by $n$. If true, this must of course be well known, but I don't know a reference. It should be provable from the explicit form of the coefficients of $T_n$.


Given the previous remark, it is obvious that for prime $n$ the algorithm gives the correct answer. To prove that it also detects composite $n$, one has to show the following. Denote by $a_0,\dots,a_n$ the coefficients of $T_n(x)-x^n$. Then, if some $a_i$ is not divisible by $n$, one of the sums $s_j:=a_j+a_{j+r}+a_{j+2r}+\cdots$, $j=0,\dots,r-1$, is also not divisible by $n$. Seemingly, if $a_j$ is not divisible by $n$ then $j$ is not coprime to $n$; maybe this can help.
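As a numeric illustration of the sums $s_j$ (a hypothetical Python check, taking the composite $n=77$ with $r=5$, since $77\equiv 2 \pmod 5$):

```python
# Compute the coefficients a_i of T_n(x) - x^n mod n and the sums
# s_j = a_j + a_{j+r} + a_{j+2r} + ...; for a composite n passing to
# smallestr, some s_j should remain nonzero mod n.

def cheb_coeffs_mod(n):
    """Coefficient list of T_n(x) with coefficients reduced mod n,
    via the recurrence T_k = 2x*T_{k-1} - T_{k-2}."""
    a, b = [1], [0, 1]                    # T_0, T_1
    for _ in range(n - 1):
        nxt = [0] + [2 * c % n for c in b]
        for i, c in enumerate(a):
            nxt[i] = (nxt[i] - c) % n
        a, b = b, nxt
    return b

n, r = 77, 5                              # 77 = 7 * 11 is composite
coeffs = cheb_coeffs_mod(n)
coeffs[n] = (coeffs[n] - 1) % n           # subtract x^n
s = [sum(coeffs[j::r]) % n for j in range(r)]
```

Here some $s_j$ is nonzero mod 77, so the test correctly rejects 77, whereas for a prime such as 79 all coefficients of $T_{79}-x^{79}$ already vanish mod 79.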

@Gerhard Paseman 2017-11-20 05:41:37

It is clear that $n$ prime implies the desired mod $n$ congruence. What results do you get for composite $n$? In particular, are there any composite $n$ for which the mod $n$ congruence holds and the mod $x^r-1$ congruence fails? Gerhard "Bidirectionals Are Not Always Symmetric" Paseman, 2017.11.19.

@მამუკა ჯიბლაძე 2017-11-20 05:49:18

@GerhardPaseman Experimentally, the mod $n$ congruence holds iff $n$ is prime. I believe it is not difficult to show using the explicit expressions for the coefficients of $T_n$: $$a_{n-2k}=(-1)^k\,2^{n-2k}\,\frac n2\,\frac{(n-k-1)!}{k!\,(n-2k)!},\qquad k=0,1,\dots$$ (all others are zero).
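This closed form is quick to check against the recurrence (a hypothetical Python snippet; `formula_coeff` gives the stated coefficient of $x^{n-2k}$):

```python
# Sanity check of the coefficient formula for T_n against the recurrence.
from math import factorial

def cheb_coeffs(n):
    """Exact integer coefficients of T_n(x) via T_k = 2x*T_{k-1} - T_{k-2}."""
    a, b = [1], [0, 1]                    # T_0, T_1
    for _ in range(n - 1):
        nxt = [0] + [2 * c for c in b]
        for i, c in enumerate(a):
            nxt[i] -= c
        a, b = b, nxt
    return b

def formula_coeff(n, k):
    """The stated a_{n-2k} = (-1)^k 2^(n-2k) (n/2) (n-k-1)!/(k!(n-2k)!)."""
    return (-1) ** k * (2 ** (n - 2 * k) * n * factorial(n - k - 1)
                        // (2 * factorial(k) * factorial(n - 2 * k)))
```

For instance, `formula_coeff(n, 0)` is the leading coefficient $2^{n-1}$, and for $n=4$ the formula reproduces $T_4 = 8x^4 - 8x^2 + 1$.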

@მამუკა ჯიბლაძე 2017-11-20 06:03:38

But you probably wanted to ask the opposite, right? Because if the mod $n$ congruence holds, it will also hold mod $x^r-1$. The nontrivial fact to prove is that if it fails mod $n$, then it also fails mod $(x^r-1, n)$.

@Gerhard Paseman 2017-11-20 06:19:39

I probably did mean the opposite. Yes, if n is composite, then both congruences should fail. Gerhard "Doesn't Work With Ideals Ideally" Paseman, 2017.11.19.
