Hi. This is a blog. It's mostly about math and physics, which I study for fun.

# All the Exterior Algebra Operations

[October 15, 2020]

I’m returning to exterior algebra notes. This time, a reference for all of the many operations that I am aware of throughout the subject. It’s easier to define them all in one place than to spread the definitions over articles that use them.

I will make a point of giving explicit algorithms and an explicit example of each, in the lowest dimension that can still be usefully illustrative.

Warning: very long.

more...

# The essence of complex analysis

[August 10, 2020]

Rapid-fire intuitions for calculus on complex numbers, with little rigor.

Not an introduction to the subject.

more...

# The essence of quantum mechanics

[July 24, 2020]

Here’s what I know about QM. I’m trying to learn QFT, and it helps to have the prerequisites compressed into the simplest possible representation, written down in one place so I can reference them easily.

This will make no sense if you don’t already have a good understanding of quantum mechanics.

Conventions: $$c = 1$$, $$g_{\mu \nu} = \text{diag}(+, -, -, -)$$. I like to write $$S_{\vec{x}}$$ for $$\nabla S$$.

more...

# A possible derivation of the Born Rule?

[December 22, 2019]

I think that the Many-Worlds Interpretation (MWI) of quantum mechanics is probably ‘correct’. There is no reason to think that the rules of atomic phenomena would stop applying at larger scales when an experimenter becomes entangled with their experiment.

However, MWI has the problem (shared with all the other mainstream interpretations of QM) that it does not explain why quantum randomness leads to the probabilities that we observe. The so-called Born Rule says that if a system is in a state $$\alpha \| 0 \> + \beta \| 1 \>$$, upon ‘measurement’ (in which we entangle with one or the other outcome), we measure the eigenvalue associated with the state $$\| 0 \>$$ with probability

$P = \| \alpha \|^2$

The Born Rule is normally included as an additional postulate in MWI, and this is somewhat unsatisfying. Or at least, it is apparently difficult to justify, given that I’ve read a bunch of attempts, each of which talks about how there haven’t been any other satisfactory attempts. I think it would be unobjectionable to say that there is not a consensus on how to motivate the Born Rule from MWI without any other assumptions.

Anyway here’s an argument I found that I find somewhat compelling. It argues that the Born Rule can emerge from interference if you assume that every measurement of a probability that you’re exposed to (which I guess is a Many-Worlds-ish idea) is assigned a random, uncorrelated phase.
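(For concreteness, here is the bare statement of the rule in code, separate from the derivation question that the linked argument addresses. The state is made up; the sampling is exactly what the postulate prescribes.)

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 0.6, 0.8j   # a normalized state alpha|0> + beta|1>: |0.6|^2 + |0.8i|^2 = 1

# the Born Rule: the |0> outcome occurs with probability |alpha|^2 = 0.36
samples = rng.random(100_000) < abs(alpha) ** 2
assert abs(samples.mean() - 0.36) < 0.01
```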

more...

# Fourier Transforms via magic

[November 26, 2019]

$\gdef\F#1{\mathcal{F}[#1]}$

A while ago I found a series of papers which do some weird stuff with derivative operators:

1. New Dirac Delta function based methods with applications to perturbative expansions in quantum field theory by Kempf/Jackson/Morales, 2014
2. How to (Path-) Integrate by Differentiating also by Kempf/Jackson/Morales, 2015
3. Integration by differentiation: new proofs, methods and examples by Jia/Tang/Kempf, 2016

The general theme is: evaluating functions on derivative operators $$f(\p)$$, and applying this to delta functions $$f(\p_x) \delta(x)$$, is occasionally useful and can give weird alternate characterizations of the Fourier transform and can be used to efficiently solve integrals.

The authors are physicists, unsurprisingly, and I’m sure there are a bunch of reasons why these results are either not that surprising or surprising-yet-not-useful, but I found them remarkable. But the whole thing is confusing and hard to make sense of. Here’s a… totally different take, in which I rederive the main result by poking around.

tldr: the Fourier transform of $$f(x, \p_x)$$ is $$f(i \p_k, -ik) 2 \pi \delta(k)$$, whatever that means.
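I won’t reproduce the papers’ results here, but the basic theme of evaluating a function on $$\p_x$$ can be sanity-checked on the classic special case $$e^{a \p_x} f(x) = f(x+a)$$, the shift operator (my own sketch in sympy, not the papers’ method):

```python
import sympy as sp

x, a = sp.symbols('x a')
f = x**3 - 2*x

# e^(a d/dx) acts as the shift operator: sum_n a^n f^(n)(x)/n! = f(x + a).
# For a cubic the series terminates after n = 3, so the sum is exact.
shifted = sum(a**n * sp.diff(f, x, n) / sp.factorial(n) for n in range(4))
assert sp.expand(shifted) == sp.expand(f.subs(x, x + a))
```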

more...

# A brief note about derivatives

[November 9, 2019]

A blog post led me to a paper, “Extending the Algebraic Manipulability of Differentials”, which makes a useful point about the notation we use for derivatives. This is a brief summary so I don’t forget it.

Observation: the derivative operator $$\frac{d}{dx}$$ can be decomposed into two steps: applying the differential operator $$d$$ to the target, then dividing by $$dx$$. It is useful to think of this as occurring in two separate steps because it removes ambiguity in certain notations and allows algebraic manipulations like $$\frac{dy}{dx} \frac{dx}{dt} = \frac{dy}{dt}$$ to work on higher derivatives.

Being precise about what $$d$$ acts on, we compute the expansion of $$\frac{d^2 y}{dx^2}$$:

$\frac{d^2 y}{dx^2} = \frac{d( y_x dx )}{dx^2} = \frac{ y_{xx} dx^2 + y_x d^2 x}{dx^2} = y_{xx} + y_x \frac{d^2 x}{dx^2} \tag{1}$
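The expansion in (1) can be checked mechanically by treating $$d$$ as differentiation along a parameter $$t$$ (a sketch with my own arbitrary choices of $$x(t)$$ and $$y(x)$$):

```python
import sympy as sp

t = sp.symbols('t')
x = t**3 + t      # an arbitrary reparameterization x(t)
y = sp.sin(x)     # y as a function of x, tracked along t

dx, d2x = sp.diff(x, t), sp.diff(x, t, 2)
dy, d2y = sp.diff(y, t), sp.diff(y, t, 2)

y_x  = dy / dx                  # dy/dx
y_xx = sp.diff(y_x, t) / dx     # d(y_x)/dx, the "true" second derivative

# d^2 y = y_xx dx^2 + y_x d^2 x, so d^2 y / dx^2 = y_xx + y_x * d^2 x / dx^2
assert sp.simplify(d2y / dx**2 - (y_xx + y_x * d2x / dx**2)) == 0
```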
more...

[September 15, 2019]

Most of our descriptions of how our brains work are fundamentally vague. We speak of our brains performing verbs like “think”, “realize”, “forget”, or “hope” but we aren’t talking about what’s going on mechanically to result in those qualities.

Sure, these can all be assigned truth values, in the sense that if everyone generally agrees that someone ‘realized’ something, we might define their brain to have performed the objective act of ‘realization’. But this gives no technical understanding of what the process of realization is – beyond, perhaps, some hand-wavey story about connections being bridged between neurons.

So, sometime in the last few years the English-speaking Internet became aware of the condition called aphantasia. Aphantasia is when a person is unable to picture images in their thoughts – they don’t have a “mind’s eye” at all.

This is interesting because, in contrast to the above, aphantasia is a concrete description of how the brain works. Some people see an image in their head when they draw or recall something; others don’t. Their brains work in materially different ways. I would have no idea how to figure out if two people “realize” something via different mechanisms, but I can be sure that two people’s brains operate differently if one sees pictures and the other doesn’t.

more...

# Exterior Algebra Notes #4: The Interior Product

[January 27, 2019]

Vector spaces are assumed to be finite-dimensional and over $$\bb{R}$$. The grade of a multivector $$\alpha$$ will be written $$\| \alpha \|$$, while its magnitude will be written $$\Vert \alpha \Vert$$. Bold letters like $$\b{u}$$ will refer to (grade-1) vectors, while Greek letters like $$\alpha$$ refer to arbitrary multivectors with grade $$\| \alpha \|$$.

More notes on exterior algebra. This time, the interior product $$\alpha \cdot \beta$$, with a lot more concrete intuition than you’ll see anywhere else, but still not enough.

I am not the only person who has had trouble figuring out what the interior product is for. This is what I have so far…

more...

# Exterior Algebra Notes #3: The Hodge Star

[January 26, 2019]

Previously: matrices and inner products on exterior algebras.

Vector spaces are assumed to be finite-dimensional and over $$\bb{R}$$. The grade of a multivector $$\alpha$$ will be written $$\| \alpha \|$$, while its magnitude will be written $$\Vert \alpha \Vert$$. Bold letters like $$\b{u}$$ will refer to (grade-1) vectors, while Greek letters like $$\alpha$$ refer to arbitrary multivectors with grade $$\| \alpha \|$$.

more...

[December 28, 2018]

Here is a survey of understandings on each of the main types of Taylor series:

1. single-variable
2. multivariable $$\bb{R}^n \ra \bb{R}$$
3. multivariable $$\bb{R}^n \ra \bb{R}^m$$
4. complex $$\bb{C} \ra \bb{C}$$

I thought it would be useful to have everything I know about these written down in one place.

These notes are not pedagogical; they’re for crystallizing everything when you already have a partial understanding of what’s going on. Particularly, I don’t want to have to remember the difference between all the different flavors of Taylor series, so I find it helpful to just cast them all into the same form, which is possible because they’re all the same thing (seriously why aren’t they taught this way?).

In these notes I am going to ignore discussions of convergence so that more ground can be covered. Generally it’s important to address convergence in order to, well, not be wrong. And I’m certain that I’ve made statements which are wrong below. But I am just trying to make sure I understand what happens when everything works, because in my interests (physics) it usually does.
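As one concrete instance of the common form (flavor 2, with my own choice of function): the second-order series is $$f(\b{x} + \b{a}) \approx f(\b{x}) + \nabla f \cdot \b{a} + \frac{1}{2} \b{a}^T H \b{a}$$, where $$H$$ is the Hessian.

```python
import numpy as np

# f(x, y) = e^x sin(y), with gradient and Hessian written out by hand
def f(p):
    x, y = p
    return np.exp(x) * np.sin(y)

def grad(p):
    x, y = p
    return np.array([np.exp(x) * np.sin(y), np.exp(x) * np.cos(y)])

def hess(p):
    x, y = p
    return np.array([[np.exp(x) * np.sin(y),  np.exp(x) * np.cos(y)],
                     [np.exp(x) * np.cos(y), -np.exp(x) * np.sin(y)]])

p, a = np.array([0.3, 0.7]), np.array([0.01, -0.02])
second_order = f(p) + grad(p) @ a + 0.5 * a @ hess(p) @ a
assert abs(second_order - f(p + a)) < 1e-4   # error is third-order in |a|
```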

more...

# Infinite Summations and You

[November 1, 2018]

You may have seen that YouTube video by Numberphile that circulated the social media world a few years ago. It showed an ‘astounding’ mathematical result:

$1+2+3+4+5+\ldots = -\frac{1}{12}$

(quote: “the answer to this sum is, remarkably, minus a twelfth”)

Then they tell you that this result is used in many areas of physics, and show you a page of a string theory textbook (oooo) that states it as a theorem.

The video caused a bit of an uproar at the time, since it was many people’s first introduction to the (rather outrageous) idea and they had all sorts of (very reasonable) objections.

I’m interested in talking about this because: I think it’s important to think about how to deal with experts telling you something that seems insane, and this is a nice microcosm for that problem.

Because, well, the world of mathematics seems to have been irresponsible here. It’s fine to get excited about strange mathematical results. But it’s not fine to present something that requires a lot of asterisks and disclaimers as simply “true”. The equation is true only in the sense that if you subtly change the meanings of lots of symbols, it can be shown to become true. But that’s not the same thing as quotidian, useful, everyday truth. And now that this is ‘out’, as it were, we have to figure out how to cope with it. Is it true? False? Something else? Let’s discuss.
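For what it’s worth, here is the precise sense in which the symbols get reinterpreted: the series $$\sum n^{-s}$$ only converges for $$\mathrm{Re}(s) > 1$$, where it defines the Riemann zeta function; $$-\frac{1}{12}$$ is the value of zeta’s analytic continuation at $$s = -1$$, not of the literal sum. A one-line check with sympy:

```python
from sympy import zeta, Rational

# the series sum(1/n**s) only converges for Re(s) > 1; zeta(-1) is the
# value of the analytic continuation, not of the literal sum 1+2+3+...
assert zeta(-1) == Rational(-1, 12)
```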

more...

# Exterior Algebra Notes #2: the Inner Product

[October 9, 2018]

(See this previous post for some of the notations used here.)

(Not intended for any particular audience. Mostly I just wanted to write down these derivations in a presentable way because I haven’t seen them from this direction before.)

(Vector spaces are assumed to be finite-dimensional and over $$\bb{R}$$)

Exterior algebra is obviously useful any time you’re anywhere near a cross product or determinant. I want to show how it also comes with an inner product which can make certain formulas in the world of vectors and matrices vastly easier to prove.
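One concrete instance of that inner product (my own sketch, not from the post itself): on bivectors it is the Gram determinant of ordinary dot products, which in $$\bb{R}^3$$ reproduces the squared length of the cross product.

```python
import numpy as np

def inner_bivectors(u, v, w, x):
    # <u^v, w^x> is the Gram determinant of the ordinary dot products
    return np.linalg.det(np.array([[u @ w, u @ x],
                                   [v @ w, v @ x]]))

u, v = np.array([1.0, 2.0, 0.0]), np.array([0.0, 1.0, 3.0])
# in R^3, |u^v|^2 equals |u x v|^2, the squared length of the cross product
assert np.isclose(inner_bivectors(u, v, u, v), np.cross(u, v) @ np.cross(u, v))
```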

more...

# Exterior Algebra Notes #1: Matrices and Determinants

[October 8, 2018]

(This is not really an intro to the subject. I don’t have an audience in mind for this. I’ve written my notes out in an expository style because it helps me retain what I study.)

(Vector spaces are assumed to be finite-dimensional and over $$\bb{R}$$ with the standard inner product unless otherwise noted.)

Exterior algebra (also known as ‘multilinear algebra’, which is arguably the better name) is an obscure and technical subject. It’s used in certain fields of mathematics, primarily abstract algebra and differential geometry, and it comes up a lot in physics, often in disguise. I think it ought to be far more widely studied, because it turns out to take a lot of the mysteriousness out of the otherwise technical and tedious subject of linear algebra. But most of the places it turns up it is very obfuscated. So my aim is to study exterior algebra and do some ‘refactoring’: to make it more explicit, so it seems like a subject worth studying in its own right.

In general I’m drawn to whatever makes computation and intuition simple, and this is it. In college I learned about determinants and matrix inverses and never really understood how they work; they were impressive constructions that I memorized and then mostly forgot. Exterior Algebra turns out to make them into simple intuitive procedures that you could rederive whenever you wanted.
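As a tiny taste of that refactoring (my own illustration): in $$\bb{R}^2$$, expanding $$\b{u} \wedge \b{v}$$ by bilinearity and antisymmetry produces the 2×2 determinant directly, with no memorized formula.

```python
import numpy as np

def wedge2(u, v):
    # expand (u1 e1 + u2 e2) ^ (v1 e1 + v2 e2) using e^e = 0 and
    # e2^e1 = -e1^e2: the single surviving coefficient is u1 v2 - u2 v1
    return u[0] * v[1] - u[1] * v[0]

u, v = np.array([2.0, 1.0]), np.array([1.0, 3.0])
# that coefficient is exactly the determinant / oriented area of the parallelogram
assert np.isclose(wedge2(u, v), np.linalg.det(np.column_stack([u, v])))
```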

more...

# Oriented Areas and the Shoelace Formula

[August 6, 2018]

Here’s a summary of the concept of oriented area and the “shoelace formula”, and some equations I found while playing around with it that turned out not to be novel.

I wanted to write this article because I think the concept deserves to be better popularized, and it is useful to me to have my own reference on the subject. Some resources I have found, including Wikipedia, cite a 1959 monograph entitled Computation of Areas of Oriented Figures by A.M. Lopshits, originally printed in Russian and translated to English by Massalski and Mills, which I have not been able to find online. I did find a copy via a university library, so I thought I would summarize its contents here to make them more available to a casual Internet reader.

I also wanted to practice making beautiful math diagrams. Which went okay, but god is it ever not worth the effort.
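The formula itself is short enough to state up front: the signed area of a polygon with vertices $$(x_i, y_i)$$ is $$\frac{1}{2} \sum_i (x_i y_{i+1} - x_{i+1} y_i)$$, positive when the vertices run counterclockwise. A minimal sketch:

```python
def shoelace_area(pts):
    # signed area: (1/2) * sum of (x_i * y_{i+1} - x_{i+1} * y_i),
    # positive when the vertices run counterclockwise
    total = 0.0
    for i in range(len(pts)):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % len(pts)]
        total += x0 * y1 - x1 * y0
    return total / 2.0

square = [(0, 0), (1, 0), (1, 1), (0, 1)]     # counterclockwise
assert shoelace_area(square) == 1.0
assert shoelace_area(square[::-1]) == -1.0    # reversed orientation flips the sign
```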

more...

# Geometric Mean and Standard Deviation

[June 15, 2018]

A friend is writing her master’s thesis in a subfield where data is typically summarized using geometric statistics: geometric means and geometric standard deviations (GSD), and sometimes even geometric standard errors – whatever those are. And occasionally ‘geometric confidence intervals’ and ‘geometric interquartile ranges’.

Most of which are (a) not something anyone really has intuition for and (b) surprisingly hard to find references for online, compared to regular ‘arithmetic’ statistics.

I was trying to help her understand these, but it took a lot of work to find easily-readable references online, so I wanted to write down what I figured out.
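For reference, the basic recipe behind all of these: take logs, compute the ordinary statistic, and exponentiate back. (A sketch; note the geometric standard deviation is multiplicative, and whether you use the population or sample standard deviation under the log is a convention.)

```python
import numpy as np

data = np.array([2.0, 8.0])
logs = np.log(data)

gm  = np.exp(logs.mean())    # geometric mean
gsd = np.exp(logs.std())     # geometric standard deviation (multiplicative!)

assert np.isclose(gm, 4.0)   # sqrt(2 * 8)
assert np.isclose(gsd, 2.0)  # "typical" values span gm/gsd to gm*gsd
```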

more...

# Programming: Existence is Pain

### A Rant

[April 19, 2018]

My bike was stolen out of the backyard last night, so I’m feeling a little more aggravated by everything than usual.

This has had the effect of reminding me of a recurring sensation in my life as a software developer: that dealing with technology can be a fundamentally miserable experience, and that the skill of being ‘good’ at software is often mostly the same skill as being able to take a lot of crap from faceless, abusive machines in ways that you feel powerless to do anything about.

So while I’m all for the “let’s teach everybody to code!” movement, I do sometimes wish we’d stop writing yet another Learn Machine Learning With Python Tutorial, or whatever, and just take some time to make the world around us better in little incremental ways, by making what we’ve already got suck less, for ourselves and for all the newcomers and for just everyone, so we can have less stress and more peace in our lives.

Basically some days I can’t honestly tell anyone they should get into this, when on a good day you get to slowly hack your way through bullshit and on a bad day you might just succumb and give up.

more...

# Meditation on Taylor Series

[March 30, 2018]

(Notes. Definitely not interesting unless, at minimum, you really really liked calculus.)

## 1

We can often write a differentiable function $$f(x)$$ as a Taylor series around a point $$x$$, approximating it in terms of its derivatives at that point:

$f(x+a) = \sum_{n=0}^{\infty} \frac{a^{n} f^{(n)}(x) }{n!}$

And, under certain conditions, this series will converge exactly to the values of the function at nearby points.
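A quick numerical sanity check (my own sketch, using exp, whose derivatives at any point are all $$e^x$$):

```python
import math

def taylor_exp(x, a, terms):
    # approximate e^(x+a) from derivatives at x: every derivative of exp is e^x
    fx = math.exp(x)
    return sum(a**n * fx / math.factorial(n) for n in range(terms))

assert abs(taylor_exp(0.0, 0.5, 15) - math.exp(0.5)) < 1e-12
```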

more...

# Some Intuition Around Entropy

[February 23, 2018]

(Only interesting if you already know some things about information theory, probably)
(Disclaimer: Notes. Don’t trust me, I’m not, like, a mathematician.)

I have been reviewing concepts from Information Theory this week, and I’ve realized that I never quite really understood what (Shannon) Entropy was all about.

Specifically: I have finally understood how entropy is not a property of probability distributions per se, but a property of streams of information. When we talk about ‘the entropy of a probability distribution’, we’re implicitly talking about the stream of information produced by sampling from that distribution. Some of the equations make a lot more sense when you keep this in mind.
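As a minimal anchor for the definition involved: the entropy $$H = -\sum p \log_2 p$$ is the average information, in bits, per sample of the stream produced by the distribution.

```python
import math

def entropy(ps):
    # Shannon entropy in bits: the average information per sample drawn
    # from the distribution, -sum of p * log2(p)
    return -sum(p * math.log2(p) for p in ps if p > 0)

assert entropy([0.5, 0.5]) == 1.0    # a fair coin stream: 1 bit per flip
assert entropy([0.25] * 4) == 2.0    # four equally likely symbols: 2 bits
```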

more...

# Blogging

[January 2, 2018]

In 2018 I am going to write. Mostly: because I don’t remember anything unless I write it out for myself. And a little bit: because I have a lot I want to say.

Update: cool, I actually did some writing in 2018.