Modulo some borderline cases (and unforeseen bugs), abelfunctions is getting close to being able to compute period matrices of algebraic curves. The final ingredient is an algorithm for computing a basis of holomorphic differentials on the curve. In this article, we review the theory behind the algorithm and discuss how SymPy is used in its implementation.

Holomorphic Differentials and Mňuk's Theorem

Given a Riemann surface $X$ defined as the locus of a complex plane algebraic curve $f(x,y) = 0$, $f \in \mathbb{C}[x,y]$, there is a space of differentials $\omega = h(x,y)dx$, $h \in \mathbb{C}(x,y)$, which are holomorphic on $X$. This space, denoted $\Gamma(X, \Omega^1)$, has dimension $g$, where $g$ is the genus of the Riemann surface.
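
As a concrete example to keep in mind (a standard one, independent of abelfunctions): the hyperelliptic curve $f(x,y) = y^2 - (x^5 - 1)$ has genus $g = 2$, and a basis of holomorphic differentials is

\[ \omega_1 = \frac{dx}{y}, \qquad \omega_2 = \frac{x \, dx}{y}. \]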

Using a theorem of Mňuk [1], a basis for this space of holomorphic differentials can be computed from the integral basis of the algebraic function field

\[ \mathcal{O}_{A(X)} := \overline{\mathbb{C}[x,y] / f(x,y)} = \beta_1 \mathbb{C}[x] + \cdots + \beta_{n} \mathbb{C}[x]. \]

The holomorphic differentials are all of the form $P(x,y) \, dx / \partial_y f(x,y)$ where $P(x,y) = \sum_{i+j \leq d-3} c_{ij}x^iy^j$ are the so-called adjoint polynomials of $X$ and $d$ is the (projective) degree of $f$. Therefore, finding a basis of differentials reduces to finding conditions on the coefficients of the adjoint polynomials. Mňuk proves that the space of adjoint polynomials of the Riemann surface, $\text{Adj}(X)$, is determined completely by the integral basis of the algebraic function field:

\[ \text{Adj}(X) = \left\{ P(x,y) \; | \; \mathcal{O}_{A(X)} P(x,y) \subset \mathbb{C}[x,y] \right\} \]

That is, the adjoint polynomials are those such that $\beta_j P(x,y) \in \mathbb{C}[x,y]$ for all $j=1,\ldots,n$. Since $\beta_j \in \mathbb{C}(x)[y]$ these requirements impose linear conditions on the coefficients $c_{ij}$ of $P$.

SymPy and Polynomial Division

abelfunctions uses SymPy for all of its symbolic computations. However, personal experience shows that it can be difficult to maintain halfway decent performance in the symbolic portions of the code. For example, some manual manipulation trickery is used in both puiseux.py:_new_polynomial() and integral_basis.py:(several functions) instead of built-in SymPy functions. On a personal note, these manual implementations were chosen because I don't understand SymPy well enough and need to take more time to explore its intricacies.

The SymPy Wiki's FAQ Page admits that its current implementation is approximately 10x slower than most other computer algebra systems. This is mostly due to the fact that it is implemented purely in Python. Ondřej Čertík has been working on an experimental SymPy core written in C++ that will hopefully boost performance!

Thankfully, we can determine the aforementioned linear conditions on the coefficients $c_{ij}$ in a reasonable amount of time without having to resort to "manual" approaches.

Working With An Example

Let's look at the example $f(x,y) = y^3 - x^4y + 2x^7$. Here, $d = 7$ so the adjoint polynomials of $X : f(x,y) = 0$ are of the form

\[ \begin{align*} P(x,y) &= \sum_{i+j \leq 4} c_{ij}x^iy^j \\ &= c_{0 0} + c_{0 1} y + c_{0 2} y^{2} + c_{0 3} y^{3} + c_{1 0} x + c_{1 1} x y + c_{1 2} x y^{2} + \\ & \quad c_{1 3} x y^{3} + c_{2 0} x^{2} + c_{2 1} x^{2} y + c_{2 2} x^{2} y^{2} + c_{3 0} x^{3} + c_{3 1} x^{3} y. \end{align*} \]

We need to keep track of these undetermined coefficients so we construct $P$ like so:

import sympy
from sympy.abc import x,y

f = y**3 - x**4*y + 2*x**7
d = 7

c = sympy.symarray('c',(d-3,d-3)).tolist()
P = sum( c[i][j] * x**i * y**j 
         for i in range(d-3) for j in range(d-3) if i+j <= d-3)

print("P(x,y) =")
sympy.pprint(P)
P(x,y) =
                         2          3                         
c_0_0 + c_0_1*y + c_0_2*y  + c_0_3*y  + c_1_0*x + c_1_1*x*y + 

         2            3          2          2            2  2 
c_1_2*x*y  + c_1_3*x*y  + c_2_0*x  + c_2_1*x *y + c_2_2*x *y  

         3          3  
+ c_3_0*x  + c_3_1*x *y

The integral basis can be computed using abelfunctions.

from abelfunctions import integral_basis
beta = integral_basis(f,x,y)

print("beta =")
sympy.pprint(beta)
beta =
        2 
    y  y  
[1, -, --]
    x   3 
       x  
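
As a quick sanity check (a sketch of my own, not something abelfunctions does for you), we can verify by hand that $y/x$ really is integral over $\mathbb{C}[x]$: substituting $y = tx$ into $f$ and dividing out the common factor $x^3$ leaves a monic polynomial in $t$ with coefficients in $\mathbb{C}[x]$.

```python
import sympy
from sympy.abc import x, y, t

f = y**3 - x**4*y + 2*x**7

# substitute y = t*x; every monomial of f then contains a factor x**3
g = sympy.expand(f.subs(y, t*x))
g = sympy.cancel(g / x**3)

# g equals t**3 - t*x**2 + 2*x**4, which is monic in t,
# so t = y/x satisfies a monic polynomial over C[x]
sympy.pprint(g)
```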

Now, at this point, we could take the "manual approach": compute the product of $\beta_j$ with $P$, call sympy.expand(), and then use some expression string parsing to figure out which terms contain negative exponents. Then we would have to parse the expression string again to find the $c_{ij}$ coefficients and set the appropriate expressions equal to zero. This has already been attempted, and after a while my head hurt from all of the spaghetti code and bugs that cropped up.

Instead, let's take a stab at using SymPy's polynomial division algorithms. What's nice here, as opposed to the Puiseux series and integral basis contexts, is that we're only working with symbolic expressions that have integer exponents. (Rational exponents have been difficult to deal with so far.) We cast the appropriate expressions into polynomials in $x$ since we are only dividing by such creatures.

# separate the numerator and denominator so we can use
# polynomial division
b = beta[2]
numer, denom = b.ratsimp().as_numer_denom()

# expressions are interpreted as polynomials in x since 
# that is the only polynomial division we care about
c_flat = [item for sublist in c for item in sublist]
gens = c_flat + [y]
expr = sympy.poly(numer*P, x, domain=sympy.QQ[gens])
denom = sympy.poly(denom, x, domain=sympy.QQ)

# compute the remainder after division
quo, rem = expr.div(denom)

print("Remainder:")
sympy.pprint(rem.as_expr())

rem = rem.as_poly(x,y)
coeffs = rem.coeffs()

print("\nResulting Equations:")
sympy.pprint(coeffs)
Remainder:
       2          3          4          5    2 /       2  
c_0_0*y  + c_0_1*y  + c_0_2*y  + c_0_3*y  + x *\c_2_0*y  +

        3          4\     /       2          3          4 
 c_2_1*y  + c_2_2*y / + x*\c_1_0*y  + c_1_1*y  + c_1_2*y  

         5\
+ c_1_3*y /

Resulting Equations:
[c_2_2, c_2_1, c_2_0, c_1_3, c_1_2, c_1_1, c_1_0, c_0_3, c
_0_2, c_0_1, c_0_0]

The remainder consists precisely of the terms of $y^2 P(x,y)$ that are not divisible by $x^3$, so every one of its coefficients must vanish. Therefore, the set of linear equations we obtain from this particular integral basis element is $c_{00} = c_{01} = \cdots = c_{13} = c_{20} = c_{21} = c_{22} = 0$. We repeat this process for each integral basis element and solve the resulting system of linear equations on $c_{ij}$ to obtain our adjoint polynomials. In this case only $c_{30}$ and $c_{31}$ survive, leaving the polynomial

\[ P(x,y) = c_{30}x^3 + c_{31}x^3y. \]
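
To make the "repeat for each basis element" step concrete, here is a self-contained sketch (with the integral basis hardcoded from the abelfunctions output above) that loops over every $\beta_j$, collects the linear conditions from each remainder, and solves the resulting system.

```python
import sympy
from sympy.abc import x, y

d = 7

# the adjoint polynomial ansatz from before
c = sympy.symarray('c', (d-3, d-3)).tolist()
P = sum(c[i][j] * x**i * y**j
        for i in range(d-3) for j in range(d-3) if i+j <= d-3)
c_flat = [item for sublist in c for item in sublist]

# integral basis of f = y**3 - x**4*y + 2*x**7, as computed above
beta = [sympy.S.One, y/x, y**2/x**3]

conditions = []
for b in beta:
    numer, denom = b.ratsimp().as_numer_denom()
    expr = sympy.Poly(numer*P, x, domain=sympy.QQ[c_flat + [y]])
    den = sympy.Poly(denom, x, domain=sympy.QQ)
    # the remainder collects the terms of numer*P not divisible by denom
    _, rem = expr.div(den)
    conditions.extend(e for e in rem.as_poly(x, y).coeffs() if e != 0)

# each condition is a coefficient that must vanish
sol = sympy.solve(conditions, c_flat, dict=True)[0]
sympy.pprint(P.subs(sol))
```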

Caveats and Concerns

In the event that the denominator of $\beta_j$ contains several factors of the form $(x-\alpha_k)^{n_k}$, we might have to repeat the above process for each such factor. (Note that since we are working over $\mathbb{C}$ the denominator splits completely.) This is because we have to impose regularity at all possible singular points, not just those at $x=0$ as in the above example. Whether we can simply divide by $(x-\alpha_k)^{n_k}$ and easily arrive at the desired conditions on $c_{ij}$ is something to experiment with. One approach would be to shift $x \mapsto x + \alpha_k$ and then divide by $x^{n_k}$. I'm not sure, though, whether this is necessary, nor am I sure of the best way in SymPy to factor polynomials over $\mathbb{C}$. One possibility is to find the roots over $\mathbb{C}$ using Poly.all_roots().
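
Here is a toy sketch of that shift-and-divide idea, using made-up data (the numerator and denominator below are hypothetical, not taken from any particular curve): Poly.all_roots() recovers each complex root with multiplicity, and shifting each root to the origin reduces the problem to the same division we did above.

```python
import sympy
from collections import Counter
from sympy.abc import x

# toy data: a generic numerator with undetermined coefficients, and a
# denominator with poles at x = 0 (order 1) and x = 1 (order 2)
c0, c1, c2 = sympy.symbols('c0 c1 c2')
numerP = c0 + c1*x + c2*x**2
denom = sympy.Poly(x*(x - 1)**2, x, domain=sympy.QQ)

# all_roots() lists the complex roots with multiplicity:
# here, root 0 once and root 1 twice
mults = Counter(denom.all_roots())

conditions = []
for alpha, n in mults.items():
    # shift the root alpha to the origin...
    shifted = sympy.Poly(sympy.expand(numerP.subs(x, x + alpha)), x,
                         domain=sympy.QQ[c0, c1, c2])
    # ...and require divisibility by x**n there
    _, rem = shifted.div(sympy.Poly(x**n, x, domain=sympy.QQ))
    conditions.extend(e for e in rem.as_poly(x).coeffs() if e != 0)

sympy.pprint(conditions)
```

Whether this shift is cheaper than working with the factored denominator directly is exactly the kind of thing that needs experimenting with.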

My hope is to have a decently-performing implementation of the Mňuk-based algorithm by the end of the week. Once this is done, it's time to compute period matrices!

[1] Mňuk, M. Computing adjoint curves. J. Symb. Comput. 23, 229–240 (1997)