Analyticity.
So I was thinking about this. I would make a pretty lousy mathematician, and analysis is not precisely my strongest suit, but please humour my attempt.
We have f(a*b) = f(a) + f(b), with f analytic. We want to prove that f is a logarithm.
The idea is to prove that if f and g are two solutions of the problem that satisfy f(a) = g(a) for some a =/= 1, then f = g. Then, since we can always find k such that k*log(a) = f(a), taking g = k*log proves what we set out to prove.
So let's take such f and g. Then f-g still satisfies the conditions of the problem, and (f-g)(a) = 0. Furthermore, for any n in Z, we have (f-g)(a^n) = n*(f-g)(a) = 0: by induction for n >= 0, since f(a*a^n) = f(a) + f(a^n), and for negative n because f(1/a) = -f(a), which follows from f(1) = 0.
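These identities are easy to sanity-check numerically. A minimal sketch in Python, using f = k*log (with k = 2 an arbitrary choice) as a stand-in for a solution:

```python
import math

# Hypothetical solution of f(a*b) = f(a) + f(b): f(x) = k * log(x).
# The value k = 2.0 is an arbitrary choice for illustration.
k = 2.0

def f(x):
    return k * math.log(x)

a, b = 3.0, 5.0

# The functional equation itself: f(a*b) = f(a) + f(b)
assert math.isclose(f(a * b), f(a) + f(b))

# f(1) = 0, hence f(1/a) = -f(a)
assert math.isclose(f(1.0), 0.0, abs_tol=1e-12)
assert math.isclose(f(1.0 / a), -f(a))

# f(a^n) = n * f(a) for integer n, positive and negative
for n in range(-5, 6):
    assert math.isclose(f(a ** n), n * f(a), abs_tol=1e-12)

print("all identities hold numerically")
```

Of course this only illustrates that k*log has the claimed properties; it says nothing about uniqueness, which is the whole point of the argument.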
Here's the part where my lousiness becomes obvious: we've got an analytic function (f-g) vanishing at 1, such that every interval around 1 contains some x =/= 1 with (f-g)(x) = 0. Such points exist because f(x) = f(sqrt(x)*sqrt(x)) = 2*f(sqrt(x)), so (f-g)(a^(1/2)) = (f-g)(a)/2 = 0, and iterating, (f-g)(a^(1/2^k)) = 0 for every k, with a^(1/2^k) tending to 1 as k tends to infinity. I would think that this forces the coefficients in the Taylor expansion of f-g around 1 to all be equal to zero, but I do not know how to prove it rigorously. If it is indeed true, then QED.
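For what it's worth, the step I'm missing is, I believe, the standard "isolated zeros" argument for analytic functions; a sketch, writing h = f - g and x_k for the zeros accumulating at 1:

```latex
Let $h = f - g$, analytic at $1$, with $h(x_k) = 0$, $x_k \neq 1$, $x_k \to 1$.
Expand around $1$:
\[ h(x) = \sum_{m \ge 0} c_m (x-1)^m . \]
If not every $c_m$ is zero, let $M$ be the least index with $c_M \neq 0$; then
\[ h(x) = (x-1)^M \bigl( c_M + O(x-1) \bigr), \]
which is nonzero for all $x \neq 1$ sufficiently close to $1$, contradicting
$h(x_k) = 0$ with $x_k \to 1$. Hence every $c_m = 0$, and $h \equiv 0$ on a
neighbourhood of $1$.
\]
```

(Strictly, this only gives f = g near 1; extending to the whole domain needs the functional equation or analytic continuation, but that part is routine.)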
As such, we would have a definition of log that does not involve e in any obvious way!
(of course, you could define its inverse function, notice that the derivative of that function is proportional to the function itself, and pick the specific solution for which that constant of proportionality happens to be 1, but I would consider this far enough removed to say that e does not necessarily fall out of the definition of the logarithm)
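Spelling out that parenthetical, as a sketch (writing g = f^{-1}, and assuming f is invertible and differentiable):

```latex
From $f(ab) = f(a) + f(b)$, the inverse $g = f^{-1}$ satisfies
\[ g(u + v) = g(u)\, g(v) . \]
Differentiating in $v$ and setting $v = 0$ (note $g(0) = 1$, since $f(1) = 0$):
\[ g'(u) = g'(0)\, g(u) , \]
so $g(u) = e^{g'(0)\, u}$. The solution whose constant of proportionality
is $1$ is $g'(0) = 1$, i.e.\ $g = \exp$ and $f = \log$.
```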