
ADVANCED DETERMINANT CALCULUS

C. KRATTENTHALER

math.CO/9902004 v3 31 May 1999

Institut für Mathematik der Universität Wien, Strudlhofgasse 4, A-1090 Wien, Austria. E-mail: kratt@pap.univie.ac.at WWW: http://radon.mat.univie.ac.at/People/kratt

Dedicated to the pioneer of determinant evaluations (among many other things), George Andrews

Abstract. The purpose of this article is threefold. First, it provides the reader with a few useful and efficient tools which should enable her/him to evaluate nontrivial determinants for the case such a determinant should appear in her/his research. Second, it lists a number of such determinants that have already been evaluated, together with explanations which tell in which contexts they have appeared. Third, it points out references where further such determinant evaluations can be found.

1. Introduction

Imagine you are working on a problem. As things develop it turns out that, in order to solve your problem, you need to evaluate a certain determinant. Maybe your determinant is

$$\det_{1\le i,j\le n}\left(\frac{1}{i+j}\right), \qquad (1.1)$$

or

$$\det_{1\le i,j\le n}\left(\binom{a+b}{a-i+j}\right), \qquad (1.2)$$

or it is possibly

$$\det_{0\le i,j\le n-1}\left(\binom{\mu+i+j}{2i-j}\right), \qquad (1.3)$$

1991 Mathematics Subject Classification. Primary 05A19; Secondary 05A10 05A15 05A17 05A18 05A30 05E10 05E15 11B68 11B73 11C20 15A15 33C45 33D45.
Key words and phrases. Determinants, Vandermonde determinant, Cauchy's double alternant, Pfaffian, discrete Wronskian, Hankel determinants, orthogonal polynomials, Chebyshev polynomials, Meixner polynomials, Meixner-Pollaczek polynomials, Hermite polynomials, Charlier polynomials, Laguerre polynomials, Legendre polynomials, ultraspherical polynomials, continuous Hahn polynomials, continued fractions, binomial coefficient, Genocchi numbers, Bernoulli numbers, Stirling numbers, Bell numbers, Euler numbers, divided difference, interpolation, plane partitions, tableaux, rhombus tilings, lozenge tilings, alternating sign matrices, noncrossing partitions, perfect matchings, permutations, inversion number, major index, descent algebra, noncommutative symmetric functions.
Research partially supported by the Austrian Science Foundation FWF, grants P12094-MAT and P13190-MAT.


or maybe

$$\det_{1\le i,j\le n}\left(\binom{x+y+j}{x-i+2j} - \binom{x+y+j}{x+i+2j}\right). \qquad (1.4)$$

description) required a good "guesser" and an excellent "hypergeometer" (both of which he was and is). While at that time being the latter, especially, was quite a task, in the meantime both guessing and evaluating binomial and hypergeometric sums have been largely trivialized, as both can be done (most of the time) completely automatically. For guessing (see Appendix A)

1 Turnbull's book [178] does in fact contain a lot of very general identities satisfied by determinants, rather than determinant "evaluations" in the strict sense of the word. However, suitable specializations of these general identities do also yield "genuine" evaluations, see for example Appendix B. Since the value of this book may not be easy to appreciate because of heavy notation, we refer the reader to [102] for a clarification of the notation and a clear presentation of many such identities.


this is due to tools like Superseeker2, gfun and Mgfun3 [152, 24], and Rate4 (which is by far the most primitive of the three, but it is the most effective in this context). For "hypergeometrics" this is due to the "WZ-machinery"5 (see [130, 190, 194, 195, 196]). And even if you should meet a case where the WZ-machinery should exhaust your computer's capacity, then there are still computer algebra packages like HYP and HYPQ6, or HYPERG7, which make you an expert hypergeometer, as these packages comprise large parts of the present hypergeometric knowledge, and, thus, enable you to conveniently manipulate binomial and hypergeometric series (which George Andrews did largely by hand) on the computer. Moreover, as of today, there are a few new (perhaps just overlooked) insights which make life easier in many cases. It is these which form large parts of Section 2.

So, if you see a determinant, don't be frightened, evaluate it yourself!

2. Methods for the evaluation of determinants

In this section I describe a few useful methods and theorems which (may) help you to evaluate a determinant. As was mentioned already in the Introduction, it is always possible that simple-minded things like doing some row and/or column operations, or applying Laplace expansion, may produce a (usually inductive) evaluation of a determinant. Therefore, you are of course advised to try such things first. What I am mainly addressing here, though, is the case where that first, "simple-minded" attempt failed. (Clearly, there is then no point in addressing row and column operations, or Laplace expansion.) Yet, we must of course start (in Section 2.1) with some standard determinants, such as the Vandermonde determinant or Cauchy's double alternant. These are of course well-known. In Section 2.2 we continue with some general determinant evaluations that generalize the evaluation of the

Vandermonde determinant, which are however apparently not equally well-known, although they should be. In fact, I claim that about 80% of the determinants that you meet in "real life," and which can apparently be evaluated, are a special case of just the very first of these (Lemma 3; see in particular Theorem 26 and the subsequent remarks). Moreover, as is demonstrated in Section 2.2, it is pure routine to check whether a determinant is a special case of one of these general determinants. Thus, it can really be considered a "method" to see if a determinant can be evaluated by one of the theorems in Section 2.2.
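The kind of automated guessing performed by tools such as Rate, GUESS and gfun can be illustrated by a toy Python version (the function names below are ours, chosen for illustration; this is a sketch of the basic idea, not the actual programs' algorithms): repeatedly pass from a sequence to the sequence of successive quotients $a(n+1)/a(n)$, trying at each stage to fit a polynomial by Newton's forward differences.

```python
from fractions import Fraction
from math import factorial

def fit_polynomial(values):
    """Newton forward differences: return coefficients c_k such that
    f(n) = sum_k c_k * binomial(n, k) interpolates the data, or None if
    the difference table does not terminate within the available data."""
    coeffs, row = [], [Fraction(v) for v in values]
    while any(row):
        coeffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
        if not row:
            return None  # ran out of data before the differences vanished
    return coeffs

def rate_like_guess(seq, max_depth=3):
    """Toy version of Rate's basic idea: take quotients a(n+1)/a(n)
    repeatedly until some level is fitted by a polynomial.
    Returns (depth, Newton coefficients) or None."""
    level = [Fraction(x) for x in seq]
    for depth in range(max_depth + 1):
        fit = fit_polynomial(level)
        if fit is not None:
            return depth, fit
        level = [b / a for a, b in zip(level, level[1:])]
    return None

# n! itself is not polynomial, but its quotient a(n+1)/a(n) = n + 1 is:
print(rate_like_guess([factorial(n) for n in range(8)]))
```

For $n!$ the sketch reports depth 1 with Newton coefficients $[1, 1]$, i.e. the quotient is $1 + \binom{n}{1} = n + 1$; for $2^n$ it reports the constant quotient $2$. The real programs go much further (algebraic and q-analogue ansätze), but the iterate-quotients-and-fit loop is the common core.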

2 the electronic version of the "Encyclopedia of Integer Sequences" [162, 161], written and developed by Neil Sloane and Simon Plouffe; see http://www.research.att.com/~njas/sequences/ol.html
3 written by Bruno Salvy and Paul Zimmermann, respectively Frederic Chyzak; available from http://pauillac.inria.fr/algo/libraries/libraries.html
4 written in Mathematica by the author; available from http://radon.mat.univie.ac.at/People/kratt; the Maple equivalent GUESS by François Béraud and Bruno Gauthier is available from http://www-igm.univ-mlv.fr/~gauthier
5 Maple implementations written by Doron Zeilberger are available from http://www.math.temple.edu/~zeilberg, Mathematica implementations written by Peter Paule, Axel Riese, Markus Schorn, Kurt Wegschaider are available from http://www.risc.uni-linz.ac.at/research/combinat/risc/software
6 written in Mathematica by the author; available from http://radon.mat.univie.ac.at/People/kratt
7 written in Maple by Bruno Gauthier; available from http://www-igm.univ-mlv.fr/~gauthier


The next method which I describe is the so-called "condensation method" (see Section 2.3), a method which allows one to evaluate a determinant inductively (if the method works). In Section 2.4, a method which I call the "identification of factors" method is described. This method has been extremely successful recently. It is based on a very simple idea, which comes from one of the standard proofs of the Vandermonde determinant evaluation (which is therefore described in Section 2.1). The subject of Section 2.5 is a method which is based on finding one or more differential or difference equations for the matrix of which the determinant is to be evaluated. Section 2.6 contains a short description of George Andrews' favourite method, which basically consists of explicitly doing the LU-factorization of the matrix of which the determinant is to be evaluated. The remaining subsections in this section are conceived as a complement to the preceding. In Section 2.7 a special type of determinants is addressed, Hankel determinants. (These are determinants of the form $\det_{1\le i,j\le n}(a_{i+j})$, and are sometimes also called persymmetric or Turánian determinants.) As is explained there, you should expect that a Hankel determinant evaluation is to be found in the domain of orthogonal polynomials and continued fractions. Eventually, in Section 2.8 a few further, possibly useful results are exhibited.

Before we finally move into the subject, it must be pointed out that the methods of determinant evaluation as presented here are ordered according to the conditions a determinant must satisfy so that the method can be applied to it, from "stringent" to "less stringent". I.e., first come the methods which require that the matrix of which the determinant is to be taken satisfies a lot of conditions (usually: it contains a lot of parameters, at least implicitly), and in the end comes the method (LU-factorization) which requires nothing. In fact, this order (of methods) is also the order in which I recommend that you try them on your determinant. That is, what I suggest is (and this is the rule I follow):

(0) First try some simple-minded things (row and column operations, Laplace expansion). Do not waste too much time. If you encounter a Hankel determinant then see Section 2.7.
(1) If that fails, check whether your determinant is a special case of one of the general determinants in Sections 2.2 (and 2.1).
(2) If that fails, see if the condensation method (see Section 2.3) works. (If necessary, try to introduce more parameters into your determinant.)
(3) If that fails, try the "identification of factors" method (see Section 2.4). Alternatively, and in particular if the matrix of which you want to find the determinant is the matrix defining a system of differential or difference equations, try the differential/difference equation method of Section 2.5. (If necessary, try to introduce a parameter into your determinant.)
(4) If that fails, try to work out the LU-factorization of your determinant (see Section 2.6).
(5) If all that fails, then we are really in trouble. Perhaps you have to put more effort into determinant manipulations (see suggestion (0))? Sometimes it is worthwhile to interpret the matrix whose determinant you want to know as a linear map and try to find a basis on which this map acts triangularly, or even diagonally (this requires that the eigenvalues of the matrix are "nice"; see [47, 48, 84, 93, 192] for examples where that worked). Otherwise, maybe something from Sections 2.8 or 3 helps?

A final remark: It was indicated that some of the methods require that your determinant contains (more or less) parameters. Therefore it is always a good idea to: Introduce more parameters into your determinant! (We address this in more detail in the last paragraph of Section 2.1.) The more parameters you can play with, the more likely you will be able to carry out the determinant evaluation. (Just to mention a few examples: The condensation method needs, at least, two parameters. The "identification of factors" method needs, at least, one parameter, as does the differential/difference equation method in Section 2.5.)

2.1. A few standard determinants. Let us begin with a short proof of the Vandermonde determinant evaluation

$$\det_{1\le i,j\le n}\big(X_i^{j-1}\big) = \prod_{1\le i<j\le n}(X_j - X_i). \qquad (2.1)$$
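Identities like (2.1) are easy to test at concrete values before (or instead of) proving them. A minimal exact check in Python (the function names are ours, chosen for illustration), using rational arithmetic so that no rounding can obscure the comparison:

```python
from fractions import Fraction

def det(m):
    """Exact determinant by fraction-arithmetic Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in m]
    n, sign = len(m), 1
    for c in range(n):
        pivot = next((r for r in range(c, n) if m[r][c] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != c:
            m[c], m[pivot] = m[pivot], m[c]
            sign = -sign
        for r in range(c + 1, n):
            factor = m[r][c] / m[c][c]
            m[r] = [a - factor * b for a, b in zip(m[r], m[c])]
    result = Fraction(sign)
    for i in range(n):
        result *= m[i][i]
    return result

def vandermonde_sides(xs):
    """Left- and right-hand side of (2.1) for concrete values X_i."""
    n = len(xs)
    lhs = det([[Fraction(x) ** j for j in range(n)] for x in xs])
    rhs = Fraction(1)
    for j in range(n):
        for i in range(j):
            rhs *= xs[j] - xs[i]
    return lhs, rhs

print(vandermonde_sides([2, 3, 5, 7]))  # both sides equal 240
```

For the nodes 2, 3, 5, 7 both sides come out as $(3-2)(5-2)(7-2)(5-3)(7-3)(7-5) = 240$.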

Although the following proof is well-known, it still makes sense to quickly go through it because, by extracting its essence, we will be able to build a very powerful method out of it (see Section 2.4).

If $X_{i_1} = X_{i_2}$ with $i_1 \ne i_2$, then the Vandermonde determinant (2.1) certainly vanishes, because in that case two rows of the determinant are identical. Hence, $(X_{i_1} - X_{i_2})$ divides the determinant as a polynomial in the $X_i$'s. But that means that the complete product $\prod_{1\le i<j\le n}(X_j - X_i)$ (which is exactly the right-hand side of (2.1)) must divide the determinant. On the other hand, the determinant is a polynomial in the $X_i$'s of degree at most $\binom{n}{2}$. Combined with the previous observation, this implies that the determinant equals the right-hand side product times, possibly, some constant. To compute the constant, compare coefficients of $X_1^0 X_2^1 \cdots X_n^{n-1}$ on both sides of (2.1). This completes the proof of (2.1).

At this point, let us extract the essence of this proof, as we will come back to it in Section 2.4. The basic steps are:
1. Identification of factors
2. Determination of degree bound
3. Computation of the multiplicative constant.

An immediate generalization of the Vandermonde determinant evaluation is given by the proposition below. It can be proved in just the same way as the above proof of the Vandermonde determinant evaluation itself.

Proposition 1. Let $X_1, X_2, \ldots, X_n$ be indeterminates. If $p_1, p_2, \ldots, p_n$ are polynomials of the form $p_j(x) = a_j x^{j-1} + \text{lower terms}$, then

$$\det_{1\le i,j\le n}\big(p_j(X_i)\big) = a_1 a_2 \cdots a_n \prod_{1\le i<j\le n}(X_j - X_i). \qquad (2.2)$$
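Looking ahead, the "condensation method" of Section 2.3 rests on the Desnanot-Jacobi identity, which also yields a tiny recursive determinant evaluator. Here is a sketch (our own illustrative code, not the method as applied in the paper); it fails with a division by zero whenever an interior connected minor vanishes, so we test it on a Vandermonde matrix with increasing positive nodes, which is totally positive and hence has all connected minors nonzero:

```python
from fractions import Fraction

def dodgson(m):
    """Determinant by Dodgson condensation (Desnanot-Jacobi identity).
    Raises ZeroDivisionError if an interior connected minor vanishes."""
    a = [[Fraction(x) for x in row] for row in m]
    n = len(a)
    if n == 1:
        return a[0][0]
    # level k holds all k-by-k connected minors; start with k = 1 and k = 2
    prev = a
    cur = [[a[i][j] * a[i + 1][j + 1] - a[i][j + 1] * a[i + 1][j]
            for j in range(n - 1)] for i in range(n - 1)]
    while len(cur) > 1:
        k = len(cur)
        nxt = [[(cur[i][j] * cur[i + 1][j + 1]
                 - cur[i][j + 1] * cur[i + 1][j]) / prev[i + 1][j + 1]
                for j in range(k - 1)] for i in range(k - 1)]
        prev, cur = cur, nxt
    return cur[0][0]

# Vandermonde matrix with nodes 1 < 2 < 3 < 4; by (2.1) its determinant is
# (2-1)(3-1)(4-1)(3-2)(4-2)(4-3) = 12
vandermonde = [[x ** j for j in range(4)] for x in (1, 2, 3, 4)]
print(dodgson(vandermonde))  # 12
```

Each pass shrinks the array of connected minors by one in each direction, so the whole evaluation uses only 2-by-2 determinants and exact divisions.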


The following variations of the Vandermonde determinant evaluation are equally easy to prove.

Lemma 2. The following identities hold true:

$$\det_{1\le i,j\le n}\big(X_i^{j} - X_i^{-j}\big) = (X_1 \cdots X_n)^{-n} \prod_{1\le i<j\le n}(X_i - X_j)(1 - X_i X_j) \prod_{i=1}^{n}(X_i^2 - 1), \qquad (2.3)$$

$$\det_{1\le i,j\le n}\big(X_i^{j-1/2} - X_i^{-(j-1/2)}\big) = (X_1 \cdots X_n)^{-n+1/2} \prod_{1\le i<j\le n}(X_i - X_j)(1 - X_i X_j) \prod_{i=1}^{n}(X_i - 1), \qquad (2.4)$$

$$\det_{1\le i,j\le n}\big(X_i^{j-1} + X_i^{-(j-1)}\big) = 2\,(X_1 \cdots X_n)^{-n+1} \prod_{1\le i<j\le n}(X_i - X_j)(1 - X_i X_j), \qquad (2.5)$$

$$\det_{1\le i,j\le n}\big(X_i^{j-1/2} + X_i^{-(j-1/2)}\big) = (X_1 \cdots X_n)^{-n+1/2} \prod_{1\le i<j\le n}(X_i - X_j)(1 - X_i X_j) \prod_{i=1}^{n}(X_i + 1). \qquad (2.6)$$
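Each identity in Lemma 2 can be spot-checked mechanically at concrete rational points. For instance, an exact test of the "symplectic" evaluation (2.3) (illustrative code of ours, not from the paper):

```python
from fractions import Fraction

def det(m):
    """Exact determinant by fraction-arithmetic Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in m]
    n, sign = len(m), 1
    for c in range(n):
        pivot = next((r for r in range(c, n) if m[r][c] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != c:
            m[c], m[pivot] = m[pivot], m[c]
            sign = -sign
        for r in range(c + 1, n):
            factor = m[r][c] / m[c][c]
            m[r] = [a - factor * b for a, b in zip(m[r], m[c])]
    result = Fraction(sign)
    for i in range(n):
        result *= m[i][i]
    return result

def symplectic_sides(xs):
    """Both sides of (2.3): det(X_i^j - X_i^{-j}) versus the product."""
    n = len(xs)
    xs = [Fraction(x) for x in xs]
    lhs = det([[x ** j - x ** (-j) for j in range(1, n + 1)] for x in xs])
    rhs = Fraction(1)
    for x in xs:
        rhs /= x ** n          # the factor (X_1 ... X_n)^{-n}
    for j in range(n):
        for i in range(j):
            rhs *= (xs[i] - xs[j]) * (1 - xs[i] * xs[j])
    for x in xs:
        rhs *= x * x - 1       # the factor prod (X_i^2 - 1)
    return lhs, rhs

print(symplectic_sides([2, 3]))  # both sides equal 10/3
```

At $X_1 = 2$, $X_2 = 3$ both sides equal $10/3$; the same comparison goes through for larger $n$ and any distinct nonzero rational nodes.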

We remark that the evaluations (2.3), (2.4), (2.5) are basically the Weyl denominator factorizations of types C, B, D, respectively (cf. [52, Lemma 24.3, Ex. A.52, Ex. A.62, Ex. A.66]). For that reason they may be called the "symplectic", the "odd orthogonal", and the "even orthogonal" Vandermonde determinant evaluation, respectively. If you encounter generalizations of such determinants of the form $\det_{1\le i,j\le n}(x_i^{\lambda_j+n-j})$ or $\det_{1\le i,j\le n}(x_i^{\lambda_j+n-j} - x_i^{-(\lambda_j+n-j)})$, etc., then you should be aware that what you encounter is basically Schur functions, characters for the symplectic groups, or characters for the orthogonal groups (consult [52, 105, 137] for more information on these matters; see in particular [105, Ch. I, (3.1)], [52, p. 403, (A.4)], [52, (24.18)], [52, (24.40) + first paragraph on p. 411], [137, Appendix A2], [52, (24.28)]). In this context, one has to also mention Okada's general results on evaluations of determinants and Pfaffians (see Section 2.8 for the definition) in [124, Sec. 4] and [125, Sec. 5].

Another standard determinant evaluation is the evaluation of Cauchy's double alternant (see [119, vol. III, p. 311]),

$$\det_{1\le i,j\le n}\left(\frac{1}{X_i + Y_j}\right) = \frac{\prod_{1\le i<j\le n}(X_i - X_j)(Y_i - Y_j)}{\prod_{1\le i,j\le n}(X_i + Y_j)}. \qquad (2.7)$$

Once you have seen the above proof of the Vandermonde determinant evaluation, you will immediately know how to prove this determinant evaluation. On setting $X_i = i$ and $Y_i = i$, $i = 1, 2, \ldots, n$, in (2.7), we obtain the evaluation of our first determinant in the Introduction, (1.1). For the evaluation of a mixture of Cauchy's double alternant and Vandermonde's determinant see [15, Lemma 2].
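As with the previous evaluations, (2.7) and its specialization (1.1) can be verified exactly for small $n$ (illustrative code of ours):

```python
from fractions import Fraction

def det(m):
    """Exact determinant by fraction-arithmetic Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in m]
    n, sign = len(m), 1
    for c in range(n):
        pivot = next((r for r in range(c, n) if m[r][c] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != c:
            m[c], m[pivot] = m[pivot], m[c]
            sign = -sign
        for r in range(c + 1, n):
            factor = m[r][c] / m[c][c]
            m[r] = [a - factor * b for a, b in zip(m[r], m[c])]
    result = Fraction(sign)
    for i in range(n):
        result *= m[i][i]
    return result

def cauchy_sides(xs, ys):
    """Both sides of Cauchy's double alternant (2.7)."""
    n = len(xs)
    lhs = det([[Fraction(1, x + y) for y in ys] for x in xs])
    num, den = Fraction(1), Fraction(1)
    for j in range(n):
        for i in range(j):
            num *= (xs[i] - xs[j]) * (ys[i] - ys[j])
    for x in xs:
        for y in ys:
            den *= x + y
    return lhs, num / den

# X_i = Y_i = i reproduces determinant (1.1), det(1/(i+j)):
print(cauchy_sides([1, 2, 3], [1, 2, 3]))
```

For $n = 2$ with $X_i = Y_i = i$ both sides equal $1/72$, which is indeed the determinant of $\bigl(\begin{smallmatrix}1/2 & 1/3\\ 1/3 & 1/4\end{smallmatrix}\bigr)$.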


Whether or not you tried to evaluate (1.1) directly, here is an important lesson to be learned (it was already mentioned earlier): To evaluate (1.1) directly is quite difficult, whereas proving its generalization (2.7) is almost completely trivial. Therefore, it is always a good idea to try to introduce more parameters into your determinant. (That is, in a way such that the more general determinant still evaluates nicely.) More parameters mean that you have more objects at your disposal to play with. The most stupid way to introduce parameters is to just write $X_i$ instead of the row index $i$, or write $Y_j$ instead of the column index $j$.8 For the determinant (1.1) even both simultaneously was possible. For the determinant (1.2) either of the two (but not both) would work. On the contrary, there seems to be no nontrivial way to introduce more parameters into the determinant (1.4). This is an indication that the evaluation of this determinant is in a different category of difficulty. (Also (1.3) belongs to this "different category". It is possible to introduce one more parameter, see (3.32), but it does not seem to be possible to introduce more.)

2.2. A general determinant lemma, plus variations and generalizations. In this section I present an apparently not so well-known determinant

evaluation that generalizes Vandermonde's determinant, and some companions. As Lascoux pointed out to me, most of these determinant evaluations can be derived from the evaluation of a certain determinant of minors of a given matrix due to Turnbull [179, p. 505], see Appendix B. However, this (these) determinant evaluation(s) deserve(s) to be better known. Apart from the fact that there are numerous applications of it (them) of which I am aware, my proof is that I meet very often people who stumble across a special case of this (these) determinant evaluation(s), and then have a hard time actually doing the evaluation because, usually, their special case does not show the hidden general structure which is lurking behind it. On the other hand, as I will demonstrate in a moment, if you know this (these) determinant evaluation(s), then it is completely mechanical to see whether it (they) is (are) applicable to your determinant or not. If one of them is applicable, you are immediately done.

The determinant evaluation of which I am talking is the determinant lemma from [85, Lemma 2.2] given below. Here, and in the following, empty products (like $(X_i + A_n)(X_i + A_{n-1}) \cdots (X_i + A_{j+1})$ for $j = n$) equal 1 by convention.

Lemma 3. Let $X_1, \ldots, X_n$, $A_2, \ldots, A_n$, and $B_2, \ldots, B_n$ be indeterminates. Then there holds

$$\det_{1\le i,j\le n}\Big((X_i + A_n)(X_i + A_{n-1}) \cdots (X_i + A_{j+1}) \, (X_i + B_j)(X_i + B_{j-1}) \cdots (X_i + B_2)\Big) = \prod_{1\le i<j\le n}(X_i - X_j) \prod_{2\le i\le j\le n}(B_i - A_j). \qquad (2.8)$$
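In the spirit of the author's advice that matching a determinant against this lemma is purely mechanical, here is an exact numerical check of (2.8) itself (illustrative code of ours; A and B are stored as dictionaries indexed 2..n, matching the lemma's indexing):

```python
from fractions import Fraction

def det(m):
    """Exact determinant by fraction-arithmetic Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in m]
    n, sign = len(m), 1
    for c in range(n):
        pivot = next((r for r in range(c, n) if m[r][c] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != c:
            m[c], m[pivot] = m[pivot], m[c]
            sign = -sign
        for r in range(c + 1, n):
            factor = m[r][c] / m[c][c]
            m[r] = [a - factor * b for a, b in zip(m[r], m[c])]
    result = Fraction(sign)
    for i in range(n):
        result *= m[i][i]
    return result

def lemma3_sides(xs, a_vals, b_vals):
    """Both sides of (2.8); a_vals, b_vals list A_2..A_n and B_2..B_n."""
    n = len(xs)
    A = dict(zip(range(2, n + 1), map(Fraction, a_vals)))
    B = dict(zip(range(2, n + 1), map(Fraction, b_vals)))

    def entry(x, j):
        value = Fraction(1)
        for k in range(j + 1, n + 1):      # (X_i + A_n) ... (X_i + A_{j+1})
            value *= x + A[k]
        for k in range(2, j + 1):          # (X_i + B_j) ... (X_i + B_2)
            value *= x + B[k]
        return value

    lhs = det([[entry(Fraction(x), j) for j in range(1, n + 1)] for x in xs])
    rhs = Fraction(1)
    for j in range(n):
        for i in range(j):
            rhs *= xs[i] - xs[j]
    for i in range(2, n + 1):
        for j in range(i, n + 1):
            rhs *= B[i] - A[j]
    return lhs, rhs

print(lemma3_sides([1, 2, 3], [5, 7], [11, 13]))  # both sides equal -288
```

For $n = 3$ with $X = (1,2,3)$, $A_2 = 5$, $A_3 = 7$, $B_2 = 11$, $B_3 = 13$, both sides equal $(-2)\cdot(6\cdot 4\cdot 6) = -288$.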

8 Other common examples of introducing more parameters are: Given that the $(i,j)$-entry of your determinant is a binomial such as $\binom{i+j}{2i-j}$, try $\binom{x+i+j}{2i-j}$ (that works; see (3.30)), or even $\binom{x+y+i+j}{y+2i-j}$ (that does not work; but see (1.2)), or $\binom{x+i+j}{2i-j} + \binom{y+i+j}{2i-j}$ (that works; see (3.32), and consult Lemma 19 and the remarks thereafter). However, sometimes parameters have to be introduced in an unexpected way, see (3.49). (The parameter $x$ was introduced into a determinant of Bombieri, Hunt and van der Poorten, which is obtained by setting $x = 0$ in (3.49).)


Once you have guessed such a formula, it is easily proved. In the proof in [85] the determinant is reduced to a determinant of the form (2.2) by suitable column operations. Another proof, discovered by Amdeberhan (private communication), is by condensation, see Section 2.3. For a derivation from the above-mentioned evaluation of a determinant of minors of a given matrix, due to Turnbull, see Appendix B.

Now let us see what the value of this formula is, by checking if it is of any use in the case of the second determinant in the Introduction, (1.2). The recipe that you should follow is:

1. Take as many factors out of rows and/or columns of your determinant, so that all denominators are cleared.
2. Compare your result with the determinant in (2.8). If it matches, you have found the evaluation of your determinant.

Okay, let us do so:

$$\det_{1\le i,j\le n}\binom{a+b}{a-i+j} = \prod_{i=1}^{n}\frac{(a+b)!}{(a-i+n)!\,(b+i-1)!} \times \det_{1\le i,j\le n}\Big((a-i+n)(a-i+n-1)\cdots(a-i+j+1) \cdot (b+i-j+1)(b+i-j+2)\cdots(b+i-1)\Big)$$

$$= (-1)^{\binom{n}{2}} \prod_{i=1}^{n}\frac{(a+b)!}{(a-i+n)!\,(b+i-1)!} \times \det_{1\le i,j\le n}\Big((i-a-n)(i-a-n+1)\cdots(i-a-j-1) \cdot (i+b-j+1)(i+b-j+2)\cdots(i+b-1)\Big).$$

Now compare with the determinant in (2.8). Indeed, the determinant in the last line is just the special case $X_i = i$, $A_j = -a-j$, $B_j = b-j+1$. Thus, by (2.8), we have a result immediately. A particularly attractive way to write it is displayed in (2.17).

Applications of Lemma 3 are abundant, see Theorem 26 and the remarks accompanying it.

In [87, Lemma 7], a determinant evaluation is given which is closely related to Lemma 3. It was used there to establish enumeration results about shifted plane partitions of trapezoidal shape. It is the first result in the lemma below. It is "tailored" for use in the context of q-enumeration. For plain enumeration, one would use the second result, which is a limit case of the first (replace $X_i$ by $q^{X_i}$, $A_j$ by $q^{A_j}$ and $C$ by $q^C$ in (2.9), divide both sides by $(1-q)^{n(n-1)}$, and then let $q \to 1$).

Lemma 4. Let $X_1, X_2, \ldots, X_n$, $A_2, \ldots, A_n$, and $C$ be indeterminates. Then there hold

$$\det_{1\le i,j\le n}\Big((C/X_i + A_n)(C/X_i + A_{n-1}) \cdots (C/X_i + A_{j+1}) \, (X_i + A_n)(X_i + A_{n-1}) \cdots (X_i + A_{j+1})\Big) = \prod_{i=2}^{n} A_i^{i-1}
