An Involved Integral Series Calculation, Part 2

From the first post we know that the integral series for log(x) is f(x) = s1(x) - s2(x). And:

s1(x) = log(x) * e^x
s2(x) = \sum_{n} H_n \frac{x^n}{n!}

s2 has the pattern of an exponential generating function because of the \frac{x^n}{n!} part. The sequence of harmonic numbers is then what is being generated. If we expand s2 power by power, we can see what the harmonic numbers look like in a generating function.

s2 = 0 * \frac{x^0}{0!} + \frac{1}{1} * \frac{x^1}{1!} + (\frac{1}{1} + \frac{1}{2}) * \frac{x^2}{2!} + \cdots

A nice property of exponential generating functions is that taking the derivative just slides each coefficient down to one lower power of x.

s2' = \frac{1}{1} * \frac{x^0}{0!} + (\frac{1}{1} + \frac{1}{2}) * \frac{x^1}{1!} + \cdots

Each term of s2' has one more reciprocal being summed than the same term in s2 itself. At x^1, s2 has 1/1, but s2' has 1/1 + 1/2. So term by term, the \frac{x^n}{n!} coefficient of s2' - s2 is H_{n+1} - H_n = \frac{1}{n+1}.

So, temporarily writing f for s2 to make the differential equation easier to write (this f is just s2, not the full series from the start of the post), we have:

f'(x) - f(x) = \sum_k \frac{1}{k+1} \frac{x^k}{k!}

This is a pretty interesting and not immediately obvious sum. But:

e^x = \sum_{k \geq 0} \frac{x^k}{k!}
\frac{e^x - 1}{x} = \sum_{k \geq 1} \frac{x^{k-1}}{k!}
\frac{e^x - 1}{x} = \sum_{k \geq 1} \frac{1}{k} \frac{x^{k-1}}{(k-1)!}

This is just the sum from f' - f with k shifted forward by one. Since the k = 0 term can't be used there anyway, we can make this shift so the sum validly starts from 0. Which means:

\sum_k \frac{1}{k+1} \frac{x^k}{k!} = \frac{e^x - 1}{x}
f'(x) - f(x) = \frac{e^x - 1}{x}
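As a quick numerical sanity check of this equation (my own script, not part of the derivation), we can compare partial sums of s2' - s2 against (e^x - 1)/x:

```python
from math import exp, factorial

def H(n):
    # nth harmonic number: sum of reciprocals of the first n integers
    return sum(1.0 / k for k in range(1, n + 1))

def s2(x, terms=40):
    # partial sum of sum_n H_n * x^n / n!
    return sum(H(n) * x**n / factorial(n) for n in range(terms))

def s2_prime(x, terms=40):
    # differentiating an EGF slides coefficients down one power:
    # the x^n/n! coefficient of s2' is H_(n+1)
    return sum(H(n + 1) * x**n / factorial(n) for n in range(terms))

for x in (0.5, 1.0, 2.0):
    assert abs(s2_prime(x) - s2(x) - (exp(x) - 1) / x) < 1e-9
```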

This is a first-order linear differential equation in the standard form:

y'(x) + P(x) * y(x) = g(x)
P(x) = -1
g(x) = \frac{e^x - 1}{x}

The integrating factor is e^{\int P(x) dx} = e^{-x}, and multiplying through by it and integrating gives:

f(x) = \frac{\int \frac{e^{-x}(e^x - 1)}{x} dx}{e^{-x}} = e^x \int \frac{1 - e^{-x}}{x} dx

This integral does not have an elementary antiderivative, but it can be written as log(x) - Ei(-x) plus a constant, where Ei is the Exponential Integral special function. Requiring s2(0) = 0 fixes the constant at \gamma, the Euler-Mascheroni constant.

s2 = e^x (\gamma + log(x) - Ei(-x))
s1 = log(x) * e^x
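As a numeric sanity check (my own script, not part of the original post), the partial sums of s2 = \sum H_n \frac{x^n}{n!} can be compared against the closed form e^x (\gamma + log(x) - Ei(-x)). Expanding Ei by its standard power series, that closed form equals e^x \sum_{k \geq 1} (-1)^{k+1} \frac{x^k}{k \cdot k!}, which avoids needing a special-function library:

```python
from math import exp, factorial

def s2_series(x, terms=30):
    # direct partial sum of sum_n H_n * x^n / n!
    H = 0.0
    total = 0.0
    for n in range(1, terms):
        H += 1.0 / n
        total += H * x**n / factorial(n)
    return total

def s2_closed(x, terms=30):
    # e^x * (gamma + log x - Ei(-x)), expanded via the power series for Ei:
    # gamma + log x - Ei(-x) = sum_{k>=1} (-1)^(k+1) x^k / (k * k!)
    inner = sum((-1)**(k + 1) * x**k / (k * factorial(k)) for k in range(1, terms))
    return exp(x) * inner

for x in (0.5, 1.0, 2.0):
    assert abs(s2_series(x) - s2_closed(x)) < 1e-9
```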

So the final answer is not clean or neat, but there it is.

An Involved Integral Series Calculation, pt. 1

I think a better name for the M and P summations I’ve been talking about would be integral series and differential series, in the same sense as geometric series, or any other kind.

I have been working on the calculations for the Integral Series of f(x)=log(x) over the past few weeks and it has been rather involved, but also interesting. I didn’t think I was going to find an explicit formula for the summation, but eventually I found a connection that I want to talk about here with generating functions.
It makes sense to talk about generating functions because they also have a lot of connections with derivatives and integrals. The Integral Series for log(x) leads to what turns out to be an exponential generating function, which is a sum of the form:
\sum a_n \frac {x^n}{n!}
But let me start at the beginning with the iterated integrals of log(x) themselves. Even figuring out what pattern the integrals have in the arbitrary case was pretty tricky.
There is no simple rule for finding the integral of log(x). You have to reason backwards from the derivative of log(x):
\frac{D(log(x))}{dx} = \frac{1}{x}, by definition of the logarithm.
This definition is interesting when you combine it with the product rule:
\frac{D(x * log(x))}{dx} = x * \frac{1}{x} + 1 * log(x) = 1 + log(x)
The derivative of x log(x) is very close to log(x), because the 1 / x and the x cancel in the left hand of the product rule, and the x when differentiated becomes 1 on the right side. From this derivative we can get the integral of log(x):
\frac{D(x * log(x))}{dx} = 1 + log(x)
\frac{D(x * log(x))}{dx} - 1 = log(x)
Using the fact that the derivative of a sum is the sum of the derivatives, and that the derivative of x is 1, we can fold the -1 inside:
\frac{D(x * log(x) - x)}{dx} = log(x)
x * log(x) - x = \int{log(x) dx}
Then taking the 2nd integral, the first term is something we will have to figure out, but the second term is just x, so that part is easy:
\int \int{log(x) dx} = \int{x log(x) dx} - \frac{x^2}{2}
The integral of x log(x) is not that different from log(x) itself:
\frac{D(x^2 * log(x))}{dx} = x^2 * \frac{1}{x} + 2 x * log(x) = x + 2 x * log(x)
\int {x * log(x) dx} = \frac{1}{2} x^2 * log(x) - \frac{1}{4} x^2
The clear pattern is that all of the parts of the expression have the same power of x.
Generalizing:
\int{x^{n} * log(x) dx} = \frac{1}{n+1} * x^{n+1} * log(x) - \frac{1}{(n+1)^2} x^{n+1}
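A quick finite-difference check of this rule (my own script, not from the post): differentiating the right-hand side numerically should give back x^n log(x).

```python
from math import log

def antideriv(n, x):
    # claimed antiderivative of x^n * log(x):
    # x^(n+1)/(n+1) * log(x) - x^(n+1)/(n+1)^2
    return x**(n + 1) / (n + 1) * log(x) - x**(n + 1) / (n + 1)**2

h = 1e-6
for n in range(0, 5):
    for x in (0.5, 1.0, 2.0):
        # central difference approximation to d/dx antideriv(n, x)
        deriv = (antideriv(n, x + h) - antideriv(n, x - h)) / (2 * h)
        assert abs(deriv - x**n * log(x)) < 1e-5
```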
Applying this rule we will have an extra polynomial term with each integration of log(x), as well as the integrals of the previous polynomial terms:
Using the notation L(n) to denote the nth integral of log(x), here are a few of them:
L(0) = log(x)
L(1) = x * log(x) - x
L(2) = \frac{1}{2} x^2 * log(x) - \frac{1}{2*2} x^2 - \frac{1}{2} x^2
L(3) = \frac{1}{2*3} x^3 * log(x) - \frac{1}{2*3*3} x^3 - \frac{1}{2*2*3} x^3 - \frac{1}{2*3} x^3
Grouping the fractions in L(3) hints towards the pattern for L(n):
L(3) = \frac{1}{1*2*3} x^3 * log(x) - \frac{1}{1*2*3} x^3 * (\frac{1}{3} + \frac{1}{2} + \frac{1}{1})
The more terms you take the more messy it gets but the pattern in L(3) holds:
L(n) = \frac{1}{n!} x^n * log(x) - x^n \frac{1}{n!} * \sum_{k=1}^{n} \frac{1}{k}
Or:
L(n) = \frac{x^n}{n!} (log(x) - H_n)
The H_n is called the nth harmonic number. It is simply the sum of the reciprocals of the first n integers. 
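The formula can be checked numerically by verifying that differentiating L(n) gives back L(n-1). Here's a quick script of my own, using the sign convention L(n) = \frac{x^n}{n!}(log(x) - H_n), which matches L(1) = x log(x) - x:

```python
from math import log, factorial

def H(n):
    # nth harmonic number
    return sum(1.0 / k for k in range(1, n + 1))

def L(n, x):
    # nth iterated antiderivative of log(x), constants dropped
    return x**n / factorial(n) * (log(x) - H(n))

# check d/dx L(n) == L(n-1) with a central finite difference
h = 1e-6
for n in range(1, 6):
    for x in (0.5, 1.0, 3.0):
        deriv = (L(n, x + h) - L(n, x - h)) / (2 * h)
        assert abs(deriv - L(n - 1, x)) < 1e-5
```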
So the Integral Series is:
\sum_{n} \frac{x^n}{n!} (log(x) - H_n)
So we have something to work with, but the Harmonic numbers in the sum make things more complicated. In order to simplify things, I broke the summation above apart into s1 and s2. You can't always break up an infinite sum, because the convergence of the parts could be different, but it seems to have worked out this time.
M(log(x)) = s1 - s2
s1 = \sum_{n} \frac{x^n * log(x)}{n!}
s2 = \sum_{n} \frac{H_n * x^n}{n!}
So I just distributed the \frac{x^n}{n!} across log(x) and H_n and separated the sum that way.
For s1, log(x) doesn’t depend on the summation variable n, so we can separate it as we would in an integral.
s1 = \sum_{n} \frac{x^n * log(x)}{n!} = log(x) \sum_{n} \frac{x^n}{n!}
But then the sum itself is simply e^x.
s1 = log(x) * e^x
I'll be discussing the calculations for s2 next time. I went through a lot of dead ends when analyzing s2, but the final solution isn't too bad.

Sum of Integrals, Derivatives

The M and P functions that I mentioned in "Higher Order Operators and Functions" are a lot more interesting than I expected.

First, a recap on the definition:

M(f(x)) = f(x) + \int f(x) + \int \int f(x) + … (general antiderivatives here)
P(f(x)) = f(x) + f'(x) + f''(x) + f'''(x) + …
Originally I guessed that there was no function for which both M(f(x)) and P(f(x)) converge, but I have found out how wrong I was.
For the simple function f(x) = x,
M(f(x)) = \sum_{n=1}^{\infty} \frac{x^n}{n!}, which is the Taylor series expansion of e^x minus the constant term, i.e. e^x - 1.
P(x) = x + 1 + 0 + 0 + … = x + 1
So x is doubly convergent.
For a general monomial a x^b, we can multiply by a constant k so that it takes the Taylor-series form \frac{1}{b!} x^b; that requires k = \frac{1}{a * b!}. Then:
M(a x^b) = \frac{1}{k} (\sum_{i=0}^{\infty} \frac{x^i}{i!} - \sum_{i=0}^{b-1} \frac{x^i}{i!})
M(a x^b) = \frac{1}{k} e^x - \frac{1}{k} \sum_{i=0}^{b-1} \frac{x^i}{i!}
Which is convergent. And we showed before that all polynomials are P convergent.
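Here's a quick numeric check of the monomial result (my own script, not from the post). It restates the closed form as M(a x^b) = a * b! * (e^x - \sum_{i=0}^{b-1} \frac{x^i}{i!}), since \frac{1}{k} = a * b!:

```python
from math import exp, factorial

def iterated_integral_term(a, b, n, x):
    # nth antiderivative (constants dropped) of a*x^b is a * b! * x^(b+n) / (b+n)!
    return a * factorial(b) * x**(b + n) / factorial(b + n)

def M_partial(a, b, x, terms=60):
    # partial sum of the integral series M(a x^b)
    return sum(iterated_integral_term(a, b, n, x) for n in range(terms))

def M_closed(a, b, x):
    # claimed closed form: a * b! * (e^x - partial Taylor sum up to x^(b-1))
    return a * factorial(b) * (exp(x) - sum(x**i / factorial(i) for i in range(b)))

for a, b, x in [(1, 1, 1.0), (2, 3, 0.7), (5, 4, 2.0)]:
    assert abs(M_partial(a, b, x) - M_closed(a, b, x)) < 1e-9
```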
Integration is distributive over addition, so we can add the above expression for each term of a polynomial together to get a combined M(p(x)) for any polynomial. All polynomials are doubly convergent then.
Because integration and differentiation are distributive over addition, the M and P functions are also distributive (modulo some technical difficulties that may occur re: convergence).
An example polynomial:
M(x + x^3) = M(x) + M(x^3) = (e^x - 1) + 6 M(\frac{1}{6} x^3)
M(x + x^3) = 7 e^x - 3 x^2 - 6 x - 7
Rationals:
f(x) = 1 / x looks like a candidate for P convergence, but the k-th derivative of \frac{1}{x} is \frac{(-1)^k k!}{x^{k+1}}, so:
P(\frac{1}{x}) = \sum_{k=0}^{\infty} \frac{(-1)^k k!}{x^{k+1}}
The factorial growth of the terms makes this diverge for every x, so 1 / x is not P convergent after all. The value and convergence status of M(1 / x) is also unknown.

Alien Math #2

“Space Beat” has aliens that live as plants until they release their offspring seeds to the winds, and then they become mobile. So I thought I’d think up another Alien Math system. I’ll just put together a few example calculations for now, because I haven’t thought of all of the complexities yet, just the basic idea.

(Couldn’t get latex to work at all this morning).

2 e^(Pi * i / 2) <&>  1-1i = -2.2209 + 1.7659 i
2 e^(Pi * i / 2) (&) 1-1i = i

-3 <%> i = 0.9314 + 1.0784 i
-3 (%) i = i + 1

10 e^(Pi * i / 10) <?> 4 e^(Pi * i / 3) = 2.4322 + 0.5780 i
10 e^(Pi * i / 10) (?) 4 e^(Pi * i / 3) = 7.5105 + 0.8920 i

Most binary functions in Alien #2 have these <> and () versions. They generally behave very differently, but there are a few exception cases where they get the same value for a pair of inputs. There are a lot of other functions I'll introduce next time as well.

2 <&> 2 = 4
2 (&) 2 = 4

10 <%> 100 = undefined

Novels with two equal sets of characters in opposition?

The new novel I'm writing for NaNoWriMo, tentatively called Space Beat, has brought up an interesting question. Can you have a story with two sides where it's not clear which side is good, or even which is the protagonist, if any?

I originally thought I would make my two sides into pretty clear good guys / bad guys, but then I started writing and I realized the characters I created are not so clear cut.

I started creating regular characters rather than one-offs to work for the "big bad". Those characters have seemed more interesting than my protagonists. And on the good side, we have two characters that so far seem like they might be interesting, but they are really just pawns controlled by the government they work for. They are police, but as of yet my writing with them has been more technical and focused on their work, whereas the bad guys know how to have fun while waiting to meet with their boss.

I think I will try to write equal amounts for the two groups and put them in parallel when I can to make the comparison interesting. It will be a story where the reader has to decide what team to root for in a way. That may be too confusing and weird, so maybe I will change that in editing, but for now I have the sense that if I just wrote about the rookie and veteran space cops, it would be too dry.

I guess I need to also come up with some spice for Jordan and Quintin, the cops I feature that are part of the Space Fugitive Task Force.

Joseph nominally works for the "Thunder Man" of Thunder Mining Corporation, and also works for the T Man in secret on a number of not-so-above-board things. The Thunder Man and Joseph's shared goal is to push for independence for the Ganymede colony, however it can be achieved.

The new recruit that Joseph goes to pick up seems a bit more dry so far, but I can probably find some interesting foil-type behavior for him, so he reflects whatever Joe is not. The Thunder Man himself is more of a man behind the curtain who gives Joe orders, but my idea is to make his presence felt more indirectly, in the deference Joe gives him, than in actual conversation and actions with the T Man.

Now that our space cops have finished their first mission and are docked on the Moon, maybe they can lighten up and take a day off, and that will let me write them more interestingly. But regardless, I’m going to write equal amounts for both of these groups, or I may end up making Joe the main character and switch things completely.

Differential inequalities

The other thing I was thinking about recently was the idea of differential inequalities I mentioned before.

I think in the case of x'(t) > x(t) when x(t) > 0, we can say something about what x(t) must be, as long as x'(t) is continuous.

I think with careful development this could be used to prove that there is no infinitely differentiable f(x) whose integral and derivative are both less than f(x) for all x > 0.

If x'(t) is continuous and x'(t) > x(t) for all t > 0, and both x and x’ are always greater than 0 on that range, then set a = x(1), whatever that positive value is.

You can find an exponential function that also equals a at t = 1. The function would be:

g(t) = \frac{a}{e} e^t

Now notice:
x'(1) > a, because x'(t) > x(t) and x(1) = a

And
g'(t) = D(\frac{a}{e} e^t) = \frac{a}{e} e^t
g'(1) = \frac{a}{e} * e = a

Call x'(1) = b > a. Even if x'(t) starts dropping back towards g'(t) immediately after t = 1, there must still be some interval of length h before x'(t) = g'(t) again, because a continuous function cannot have a jump discontinuity; it must merge smoothly back into the curve of g'(t).

That is, for some 1 < t < 1 + h, x'(t) > g'(t).

\int_{1}^{1+h} x'(t) dt > \int_{1}^{1+h} g'(t) dt
\int_{1}^{1+h} x'(t) dt = x(1+h) - x(1) > g(1+h) - g(1)
x(1+h) - x(1) > g(1+h) - g(1) \rightarrow x(1+h) > g(1+h) - g(1) + x(1)

But since g(1) and x(1) are both a:

x(1+h) > g(1+h) - g(1) + x(1) \rightarrow x(1+h) > g(1+h) - a + a
x(1+h) > g(1+h) -a + a \rightarrow x(1+h) > g(1+h)

This relationship holds anywhere in 1 < t \leq 1 + h, because we can rerun the argument with some other j < h. So throughout this range, x(t) > g(t), or x(t) > \frac{a}{e} e^t.

However, there is no actual h where g'(t) = x'(t) again, because g''(t) = g'(t), x''(t) > x'(t), and x'(t) > g'(t), so x''(t) > g''(t). x is accelerating away from g, so it cannot rejoin the slope of g(t). Similarly we can show that for all t > 1, x^{(n)}(t) > g^{(n)}(t).

Since x(t) > 0 for all t > 0, we can pick any arbitrary p > 0 and run this same argument to show that x(t) > c e^t for all t > p, for some constant c. Since we can do so for any p > 0, there is some c such that x(t) is greater than c e^t for all t > 0.

For the other direction, x' < x, we can use the same argument to show that from some point p, x(t) cannot grow as fast as or faster than an exponential that has the same value at p.

If x(2) = 1, then \frac{1}{e^2} e^t has the same value at t = 2. x'(2) is less than 1 there, and stays less up to some distance h from there, so x must fall behind in the same way.
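The comparison argument can be illustrated with concrete functions (these examples are my own, chosen to satisfy the hypotheses): x1 satisfies x1' = 1.1 x1 > x1, so it should pull ahead of the exponential matching it at t = 1, while x2 satisfies x2' = 0.9 x2 < x2, so it should fall behind the exponential matching it at t = 2.

```python
from math import exp

def x1(t): return exp(1.1 * t)                  # x1' = 1.1*x1 > x1 everywhere
def g1(t): return exp(1.1) / exp(1) * exp(t)    # exponential with g1(1) = x1(1)

def x2(t): return exp(0.9 * t)                  # x2' = 0.9*x2 < x2 everywhere
def g2(t): return exp(1.8) / exp(2) * exp(t)    # exponential with g2(2) = x2(2)

# x1 pulls ahead of its matching exponential for t > 1
for t in [1.01, 1.5, 3.0, 6.0]:
    assert x1(t) > g1(t)

# x2 falls behind its matching exponential for t > 2
for t in [2.01, 2.5, 4.0, 8.0]:
    assert x2(t) < g2(t)

# and at the matching points they agree
assert abs(x1(1) - g1(1)) < 1e-12
assert abs(x2(2) - g2(2)) < 1e-12
```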

Now my question is how far this line of argument can be taken, and what the pitfalls are: statements about differential inequalities that look true but aren't.

It seems like the inequalities largely tell us about the growth rate of a function and nothing about its current value or velocity. We could have a function with x(1) = 1000000, but by the argument it still could not grow exponentially, because the derivative must be less than 1000000 there. It might have a connection with big-O notation.

Space Beat, The Blackboard and Nanowrimo

I picked the name for this blog based on the idea that I’d be writing the story about the blackboard with the power the change the rules of the universe. I think I’m going to wait on that one and let some more ideas come to me, I don’t seem to have a full novel’s worth yet.

I have now brainstormed three different ideas for my Nanowrimo novel, but I think I’m going to go back to the Hard SF idea I originally had. I backed away from it, because it felt like it may be too confining, but I think I will give it a go, and just skip the boring early parts to get right into the action.

I call it Space Beat, but that's only an early title. Russia, America and Canada are connected by a massive Siberia-Alaska bridge complex, and by the ease of trans-Arctic trade in the aftermath of global warming. They formed the Northern Economic Union more than 150 years before the story's start, along with the NEU Bureau of Law Enforcement to coordinate investigations within the three countries, as well as their off-world colonies.

The NEU is only one of three major space-faring powers; the EU and Greater China are the others. All three have colonies and space stations scattered as far out as Jupiter, with two new stations built on Saturn's moons a few years before the story's beginning.

The NEU has set up a space department charged with tracking suspects who have fled off planet. The department is small, and outfitting even one team of officers with the latest cruiser ship is expensive, especially when fuel costs are taken into account.

Unlike the vast majority of space themed stories, I am taking into account gravity and orbital mechanics, as well as the fact that you can’t really accelerate at more than a few g’s without crushing your officers.

The story is primarily about the Rookie Jordan Galloway and the Veteran Quintin “Q” Bass, and their pursuit of suspects, especially a group who are connected to a new criminal syndicate.

The "villain", Ethan "Thunder Man", has adopted the moon Ganymede, in orbit around Jupiter, as his home, and he has concocted a scheme to divert attention and police resources to issues elsewhere in the system, so he can build up Ganymede's defenses and declare independence.

While Jordan and Quintin are the protagonists, I intend to make Ethan's goals at least reasonable, if not his methods.