(edit ppauper and I cross posted, I was also thinking about Chebyshev)
---
I tried a Chebyshev approximation (which is infinitely differentiable, since it's a finite sum of orthogonal polynomials), but it doesn't converge very fast. I suspect this will be a general issue with any generic (popular) polynomial basis, because of the tension between the discontinuous derivative at the peak on the one hand and the infinite differentiability you want on the other.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-1, 1, 2000)

# hat function
y = 1 - np.abs(x)

# degree-128 Chebyshev least-squares fit
p = np.polynomial.Chebyshev.fit(x, y, 128, domain=(-1, 1))

plt.plot(x, y, 'r.')
plt.plot(x, p(x), 'k', lw=2)
plt.title('fit')
plt.show()

plt.plot(x, p(x) - y, 'k', lw=2)
plt.title('error')
plt.show()

# the coefficients decay only algebraically, not geometrically
plt.semilogy(np.abs(p.coef), '*')
plt.title('magnitude of coefficients')
plt.show()
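To put a number on "doesn't converge very fast", here's a small check (my own addition, not part of the code above) of the maximum fit error as the degree doubles. For a function with a kink, the error decays only algebraically, roughly like 1/n, so doubling the degree only halves the error:

```python
import numpy as np

# hat function on [-1, 1]
x = np.linspace(-1, 1, 2000)
y = 1 - np.abs(x)

# max error of the Chebyshev least-squares fit for increasing degree;
# expect roughly algebraic decay, about O(1/n), because of the kink at 0
for n in (16, 32, 64, 128):
    p = np.polynomial.Chebyshev.fit(x, y, n, domain=(-1, 1))
    print(n, np.max(np.abs(p(x) - y)))
```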
Perhaps cutting off the top (just a tiny bit) and replacing it with a spline that matches a few derivatives on each side would give a good approximation. Note, though, that a polynomial spline can only match finitely many derivatives, so the result is C^k for some finite k, not infinitely differentiable.
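As a sketch of that "round off the peak" idea, here is a hypothetical variant that is genuinely C-infinity: instead of a spline cap, replace |x| with sqrt(x^2 + eps^2), which is analytic on the real line. It differs from the hat function by at most eps, but its Chebyshev approximation now converges geometrically. The value eps = 0.1 below is an arbitrary choice of mine; smaller eps stays closer to the hat but slows the geometric decay:

```python
import numpy as np

# hypothetical smoothing: replace |x| with sqrt(x^2 + eps^2), which is
# analytic, so the Chebyshev series converges geometrically; eps trades
# closeness to the hat against the speed of that geometric decay
eps = 0.1
x = np.linspace(-1, 1, 2000)
y = 1 - np.abs(x)
y_smooth = 1 - np.sqrt(x**2 + eps**2)

# the smoothed function differs from the hat by at most eps (worst at x = 0)
print(np.max(np.abs(y_smooth - y)))

# same degree-128 fit as before, now far more accurate than on the kinked hat
p = np.polynomial.Chebyshev.fit(x, y_smooth, 128, domain=(-1, 1))
print(np.max(np.abs(p(x) - y_smooth)))
```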