# The Spherical Coordinate Convention

It has come to my attention (actually I think I noticed this a long time ago and then forgot…) that there is a correct answer as to the convention used for spherical coordinates.

The options are:

- The physicist’s convention, which has \(\theta\) as the zenithal (polar) angle, the angle away from “north”, which ranges in \((0, \pi)\), and \(\phi\) as the azimuthal (equatorial) angle, the angle away from a designated line of 0 longitude, which ranges in \((0, 2 \pi)\)
- The mathematician’s convention, which has \(\phi\) as the zenithal angle and \(\theta\) as the azimuthal angle.
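To make the difference concrete, here is a minimal sketch (the function names are my own, purely illustrative) of the spherical-to-Cartesian conversion under each convention. The formulas are identical; the two conventions just swap which name is attached to which angle:

```python
import math

def spherical_to_cartesian_math(r, theta, phi):
    """Mathematician's convention:
    theta = azimuthal angle, phi = zenithal (polar) angle."""
    x = r * math.sin(phi) * math.cos(theta)
    y = r * math.sin(phi) * math.sin(theta)
    z = r * math.cos(phi)
    return (x, y, z)

def spherical_to_cartesian_phys(r, theta, phi):
    """Physicist's convention:
    theta = zenithal angle, phi = azimuthal angle.
    Same formula with the two angle names swapped."""
    return spherical_to_cartesian_math(r, phi, theta)
```

So a point on the equator at zero longitude is `(r, theta=0, phi=pi/2)` to a mathematician and `(r, theta=pi/2, phi=0)` to a physicist, and both map to the Cartesian point \((r, 0, 0)\).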

It turns out that there is a right answer. The mathematicians are right (for once).

I mean… look at them:

\[\LARGE{\theta \, \, \, \phi}\]

They’re fucking *pictures* of which angle they are.

That’s right. I do not care if everyone writes spherical harmonics the other way. There is only one right answer. Please update your textbooks.