>>13643707>then make a new symbol from it which doesn't collide with normal mathematics
We use '+' for obvious series like 1+1/2+1/4+1/8+... because that's literally what's happening. We use '+' when we say 1+2+3+4+... diverges to infinity. Idiots, and/or mathematicians who were trying to make a point about the zeta function (its analytic continuation assigns ζ(-1) = -1/12) and were taken out of context *by* idiots, may write 1+2+3+... = -1/12, but that doesn't make it true under the ordinary meaning of '+'.
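To make the contrast concrete, here's a quick sketch (the helper name `partial_sums` is just mine, nothing standard): the partial sums of the geometric series settle down near 2, while the partial sums of 1+2+3+... just keep growing.

```python
def partial_sums(terms):
    """Yield the running sums s_n = a_1 + ... + a_n, using ordinary '+'."""
    total = 0.0
    for a in terms:
        total += a
        yield total

# 1 + 1/2 + 1/4 + ... : partial sums approach 2
geometric = list(partial_sums(1 / 2**k for k in range(50)))

# 1 + 2 + 3 + ... : partial sums grow without bound
naturals = list(partial_sums(range(1, 51)))

print(geometric[-1])  # within floating-point error of 2
print(naturals[-1])   # 1275 after 50 terms, and still climbing
```

No amount of ordinary addition ever gets the second one anywhere near -1/12.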
In other words, it's perfectly fine to use the above notation when talking to someone who already knows we're talking about analytic continuation. 'Overloading' symbols like that happens all the time: we use '+' to represent adding integers, adding real or complex numbers, adding sets of numbers, doing the line-and-curve intersection nonsense that's involved in elliptic curve cryptography, appending strings, adding sequences termwise, adding matrices, expressing the direct sum of two groups or vector spaces (though that's more often written with a circled plus), Cesàro summation, modular arithmetic, etc. These all have some conceptual bits in common -- some notion of 'adding' -- but the actual operations vary wildly. And that's okay, *if* everyone involved knows what context we're talking about. The -1/12 sum is taken out of context by normies, and so they get confused because they think ordinary integer summation is being used.
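Python itself demonstrates the overloading point, and Cesàro summation makes a nice example of a '+'-adjacent operation that isn't ordinary addition: instead of asking whether the partial sums converge, you ask whether their running averages do. For Grandi's series 1-1+1-1+... the partial sums oscillate 1,0,1,0,... forever, but their averages converge to 1/2. (A sketch; `cesaro_means` is my own made-up name.)

```python
# The same '+' symbol, three different operations:
print(2 + 3)        # integer addition
print("ab" + "cd")  # string concatenation
print([1] + [2])    # list concatenation

def cesaro_means(terms):
    """Running averages of the partial sums of `terms`."""
    partial = 0.0   # ordinary partial sum s_n
    running = 0.0   # s_1 + s_2 + ... + s_n
    means = []
    for n, a in enumerate(terms, start=1):
        partial += a
        running += partial
        means.append(running / n)
    return means

grandi = [(-1) ** k for k in range(10000)]  # 1, -1, 1, -1, ...
print(cesaro_means(grandi)[-1])  # 0.5
```

Whether "the sum of Grandi's series is 1/2" is true depends entirely on which notion of summation everyone has agreed to use -- exactly the context problem with -1/12.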