Abhimanyu Pallavi Sudhir 

Research 

Fractional calculus

According to Ortigueira (2004), "the Grunwald-Letnikov fractional derivative is equal to the generalised Cauchy derivative". However, as readers who've studied the Grunwald derivative in some detail may recall, the Grunwald derivative is in fact ill-defined for rather innocuous-looking functions, like $f(x)=x$. Here, the Grunwald derivative is calculated as:

$$\begin{gathered}
D^{1/2}x = \lim_{h \to 0} h^{-1/2} \sum_{k=0}^\infty \binom{1/2}{k} (-1)^k (x - kh) \\
= h^{-1/2} \left[ x \sum_{k=0}^\infty \binom{1/2}{k} (-1)^k - h \sum_{k=0}^\infty \binom{1/2}{k} k\,(-1)^k \right] \\
= \frac{\sqrt{h}}{2} \sum_{k=0}^\infty \binom{-1/2}{k} (-1)^k
\end{gathered}$$

The remaining summation is the binomial expansion of $(1-1)^{-1/2}$, which is infinite, and multiplying it by the infinitesimal $\sqrt{h}/2$ yields an undefined result. However, the Cauchy/Riemann-Liouville derivative does in fact give us a well-defined result: $\frac{2}{\sqrt{\pi}}\sqrt{x}$. How might this be reconciled with the position that the two derivatives are the same, and indeed are the same wherever the Grunwald derivative is convergent? You would declare that the Cauchy derivative is a "principal value" of the Grunwald derivative, and the only area of freedom we have in determining the value of the derivative is in the two limits employed in its calculation, $\lim_{h \to 0}$ and $\lim_{N \to \infty}$ (the limit in which the truncated summation becomes an infinite series).
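As a numerical sketch of this behaviour (the helper `gl_sum` below is my own illustrative name, not notation from the paper): truncating the Grunwald summation at $N$ terms shows the results growing without bound when $h$ is held fixed as $N$ increases, while the coupled choice $h = x/N$ introduced next recovers the Cauchy value.

```python
import math

def gl_sum(f, x, alpha, h, N):
    """Truncated Grunwald sum: h**(-alpha) * sum_{k=0}^{N} (-1)**k * C(alpha, k) * f(x - k*h)."""
    w = 1.0            # w_k = (-1)**k * binom(alpha, k), built by recurrence
    total = f(x)       # the k = 0 term
    for k in range(1, N + 1):
        w *= (k - 1 - alpha) / k
        total += w * f(x - k * h)
    return total / h ** alpha

f = lambda t: t

# Decoupled limits: h fixed, N growing -- the partial results keep growing (~ sqrt(h*N/pi)).
for N in (1_000, 10_000, 100_000):
    print(N, gl_sum(f, 1.0, 0.5, 0.1, N))

# Coupled limits, h = x/N: settles at the Riemann-Liouville value 2*sqrt(x/pi) ~ 1.1284.
print(gl_sum(f, 1.0, 0.5, 1.0 / 2000, 2000))
```

In the decoupled case the printed values scale like $\sqrt{hN/\pi}$, which is exactly the $(1-1)^{-1/2}$ divergence multiplied by $\sqrt{h}/2$; the coupled case agrees with $\frac{2}{\sqrt{\pi}}\sqrt{x}$ to a few decimal places.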
So we declare that $N$ and $h$ are not in fact independent variables whose limits are taken one after the other (which is equivalent to setting $N$ to be "infinity over $h$"), but are instead related by some relation which, when applied to unify the two limits, produces the principal value of the summation: the Cauchy derivative. One may easily calculate this relation for $D^{1/2}x$; as it turns out, the relation which produces the principal value is $h=x/N$. A more general relation is possible for other functions and other derivative orders (it is the solution to a certain characteristic polynomial; read the paper to find out!), but $h=x/N$ is an acceptable relation for any function with a Taylor expansion.

Determinant-like function

Suppose you wanted to extend a function $f:D \to R$ to another function $f':D'\to R'$, where $D\subset D'$, such that for all $x\in D$, $f'(x)=f(x)$. Such a function $f'$ would be called a "generalisation" of $f$. But $f'$ is hardly the only way to generalise $f$ to $D'$; after all, the only constraint on the generalisation is that it agrees with $f$ on $D$. There is no constraint at all on its value anywhere else: for all you know, $f'(x)=x^2+\pi(x-|x|)$ is a valid generalisation of $f(x)=x^2$ from the positive numbers to the real numbers, and $f'(x)=x^2\cos(2\pi x)$ is a valid generalisation of $f(x)=x^2$ from the integers to the real numbers. But these aren't particularly useful generalisations, or at least the first example isn't. A useful generalisation shares certain important properties with the original function. Of course, there can certainly be multiple useful generalisations of a function too, depending on which defining property you choose. What does this have to do with the determinant? Well, at first glance it may seem nonsensical to generalise the determinant to non-square matrices. The determinant shows how volumes transform.
Non-square matrices are interdimensional transformations: they either flatten volumes into lower-dimensional spaces, or raise them into higher-dimensional spaces. In this sense, the determinant of a non-square matrix could only ever be 0 or infinity. But "the determinant is a volume" is not the only property of the determinant; sure, it is often treated as the axiomatic definition of the determinant, but there are several ways to choose an axiomatic basis. Another defining property of the determinant is what I call the "determinant-wedge equivalence": "the determinant of a matrix is equal to the exterior product of its column vectors". For square matrices, this obviously reduces to the volume, as the exterior product of the column vectors equals the volume. But for non-square matrices, it is calculated differently. Indeed, this is the defining property I choose to define the determinant-like function. After a bit of calculation, I proved [1] that:

$$\operatorname{detl}(A)=\sqrt{\frac{1}{(m-n)!}\sum_{k_1,k_2\ldots k_{m-n}}\left(\det\left(\operatorname{cross}_{k_1,k_2\ldots k_{m-n}}A\right)\right)^2}$$

for an $m \times n$ matrix $A$ with more rows than columns (i.e. one which raises the dimensionality of the transformed vector), where the operator $\operatorname{cross}_{k_1,k_2\ldots k_{m-n}}A$ represents deleting the rows $k_1,k_2\ldots k_{m-n}$ from the matrix (thus obtaining a square matrix whose determinant can be calculated conventionally).

There are other generalisations, including H. R. Pyle's (1962) and M. Radic's (1966). Interestingly, the former is a vector-valued determinant whose norm turns out to be equal to the determinant-like function; I discuss this in [2].

Archived

Tensors in unit vector notation?

My first paper was titled "The representation of matrices in unit vector notation". Obviously, this makes no sense. Matrices, as they are introduced in linear algebra for kindergarteners, are linear transformations of vectors. It doesn't make sense to write them in terms of vectors. That is all very true.
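Returning to the determinant-like function: a minimal sketch in code (the helper names `det` and `detl` are my own; the sum runs over unordered row subsets, which absorbs the $(m-n)!$ normalisation), checked on a small $3 \times 2$ example:

```python
import itertools, math

def det(M):
    """Determinant by Leibniz expansion (fine for small matrices)."""
    n = len(M)
    total = 0.0
    for perm in itertools.permutations(range(n)):
        sign = 1
        for i in range(n):            # sign = parity of the permutation
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        prod = 1.0
        for row, col in enumerate(perm):
            prod *= M[row][col]
        total += sign * prod
    return total

def detl(A):
    """Determinant-like function of an m x n matrix (m >= n): the norm of the
    exterior product of the columns, i.e. the square root of the sum of the
    squared n x n minors obtained by deleting m - n rows (the "cross" operator)."""
    m, n = len(A), len(A[0])
    s = 0.0
    for kept in itertools.combinations(range(m), n):   # complement of the deleted rows
        s += det([A[i] for i in kept]) ** 2
    return math.sqrt(s)

A = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
print(detl(A))   # the three 2x2 minors are 1, 1, -1, so detl(A) = sqrt(3)
```

For a square matrix there is only one subset (the whole matrix), so `detl` reduces to `abs(det)`, the usual volume scaling factor.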
However, it is instructive to understand what exactly the representation I introduce means, if it is not a representation of matrices in unit vector notation. In fact, the paper is just a clumsily worded restatement of the fact that tensors can be written as linear combinations of outer products of vectors.

Publication List

(numbering differs from the numbering in inline references above and elsewhere)


EMAIL: [email protected] 