Pondering higher order functions and derivatives

So I started thinking about other possible alien math systems over the past few days, and that got me onto the topic of higher order functions. These go by many names, and a lot of the complexities of things like functional analysis are beyond me, but I do understand a few of the basics, and I understand higher order functions from the perspective of programming.

A higher order function does the same thing with functions that ordinary functions do with numbers: it takes functions as its parameters, and can hand back a function, instead of numbers. The derivative is a classic example:

\lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

If we consider x to be a general variable rather than a specific value to be calculated at, then this definition is a higher order function: we provide a function and get another function back, just like with sin(x) we provide a number and get a number back.
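In programming terms, here's a minimal sketch of that idea in Python (the names derivative and df are just mine): a function that takes a function and returns a new function, with a small fixed h standing in for the limit.

```python
import math

# A higher order function: it takes a function and returns a new function.
# A small fixed h stands in for the limit as h goes to 0.
def derivative(f, h=1e-6):
    def df(x):
        return (f(x + h) - f(x)) / h
    return df

df = derivative(math.sin)  # df is itself a function
print(df(0.0))             # ~1.0, since the derivative of sin is cos
```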

There are tons of higher order functions, but I’m mostly interested in the ones related to the derivative and integral. Of course the indefinite integral doesn’t give us back a single function (it’s only defined up to a constant), so I turn the integral into a higher order function by always taking it from 0 to x.
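A quick sketch of that integral operator using SymPy (the name I and the dummy variable t are my choices):

```python
import sympy as sp

x, t = sp.symbols('x t')

# Integral from 0 to x as a higher order function: expression in, expression out
def I(f):
    return sp.integrate(f.subs(x, t), (t, 0, x))

print(I(x**2))       # x**3/3
print(I(sp.cos(x)))  # sin(x)
```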

You could also have functions that take higher order functions themselves as parameters, such as the following:

N(O(f(x)), r) = O(O(\cdots O(f(x)) \cdots)) \quad (r \text{ iterations of } O)
N(D(x^3), 3) = 6

That’s just a function that iterates a higher order function a given number of times.
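Here's a sketch of N in Python with SymPy, passing the operator and the function as separate arguments rather than in the composed N(O(f(x)), r) form above:

```python
import sympy as sp

x = sp.symbols('x')

# N applies the higher order function O to f, r times over
def N(O, f, r):
    for _ in range(r):
        f = O(f)
    return f

D = lambda f: sp.diff(f, x)
print(N(D, x**3, 3))  # 6, matching N(D(x^3), 3) = 6
```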

Both of the higher order functions I was thinking of are similar to infinite series in a way. They bring up the same questions of convergence, but it’s harder for me to say in what sense a sequence of functions converges.

G(f(x), n) = N(D(f(x)), n), n \geq 0
G(f(x), n) = N(I(f(x)), -n), n < 0

So G is a generalized differentiator or integrator that gives the nth derivative for positive n, or the (-n)th integral for negative n.
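As a sketch, here's one way G could look in SymPy (again with the integral always taken from 0 to x):

```python
import sympy as sp

x, t = sp.symbols('x t')

def D(f):
    return sp.diff(f, x)

def I(f):  # integral from 0 to x, with t as the dummy variable
    return sp.integrate(f.subs(x, t), (t, 0, x))

# G(f, n): nth derivative for n >= 0, (-n)th repeated integral for n < 0
def G(f, n):
    op, r = (D, n) if n >= 0 else (I, -n)
    for _ in range(r):
        f = op(f)
    return f

print(G(x**3, 2))   # 6*x
print(G(x**3, -2))  # x**5/20
```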

P(f(x)) = \sum_{i=0}^{\infty} G(f(x), i)
M(f(x)) = \sum_{i=0}^{\infty} G(f(x), -i)

P represents the sum of all derivatives of a function, and M is the sum of all integrals of a function. P only exists where f(x) is infinitely differentiable, and M exists wherever f(x) is continuous. P seems to “converge” for a lot more functions than M does.
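Since I can't sum infinitely many terms on a computer, here's a truncated stand-in for P and M that I'll lean on in the examples below (the `terms` cutoff is my own addition):

```python
import sympy as sp

x, t = sp.symbols('x t')

def G(f, n):  # same G as above: derivatives for n >= 0, integrals for n < 0
    if n >= 0:
        return sp.diff(f, x, n)
    for _ in range(-n):
        f = sp.integrate(f.subs(x, t), (t, 0, x))
    return f

# Truncated versions of P and M: sum only the first `terms` terms
def P(f, terms=8):
    return sum(G(f, i) for i in range(terms))

def M(f, terms=8):
    return sum(G(f, -i) for i in range(terms))

print(sp.expand(P(x**2)))  # x**2 + 2*x + 2 -- the series stops at the 0s
```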

Examples of P converging:

1. Any polynomial converges, because the derivatives are eventually 0
ex. x^3 \rightarrow x^3 + 3 x^2 + 6 x + 6 (checked in the sketch after this list)

2. a^x for 1/e < a < e converges, because the nth derivative is (\log a)^n a^x, so the terms shrink like a geometric series with ratio \log a (we need |\log a| < 1, which is why a has to stay above 1/e as well as below e)
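To check both examples numerically, here's a rough SymPy sketch (the truncation depths are arbitrary picks of mine):

```python
import sympy as sp

x = sp.symbols('x')

def P(f, terms):  # same truncated P as before
    return sum(sp.diff(f, x, i) for i in range(terms))

print(sp.expand(P(x**3, 8)))  # x**3 + 3*x**2 + 6*x + 6, and deeper sums add 0

# a = 2 sits between 1/e and e; at x = 1 the sums close in on 2/(1 - log(2))
for n in (5, 10, 20):
    print(n, P(2**x, n).subs(x, 1).evalf())
```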

Examples of M converging:

1. a^x for a > e converges in the same kind of geometric series, except the ratio is 1 / \log(a) instead of \log(a). (Strictly, each repeated integral from 0 also picks up correction terms from the lower limit, but those seem to sum to something finite as well when a > e.)
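A rough numeric check of this, with a = e^2 so that log(a) = 2 (again the truncation depths are arbitrary):

```python
import sympy as sp

x, t = sp.symbols('x t')

def M(f, terms):  # truncated M: add f, integrate from 0 to x, repeat
    total = 0
    for _ in range(terms):
        total += f
        f = sp.integrate(f.subs(x, t), (t, 0, x))
    return total

# a = e**2: each integration roughly divides the leading term by log(a) = 2
for n in (5, 10, 15):
    print(n, M(sp.exp(2*x), n).subs(x, 1).evalf())  # the values settle down
```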

The question that I’m thinking about now is whether any function has the property that both P and M converge for it.

Sin(x) might seem like a candidate for both P and M, since its derivatives cycle and cancel each other out. Unfortunately, the partial sums of P just cycle through sin(x), sin(x) + cos(x), cos(x), and 0 without ever settling on one value, and the repeated integrals from 0 pick up extra polynomial terms, so M fares no better.
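A quick check makes the cycling visible:

```python
import sympy as sp

x = sp.symbols('x')

partial = 0
for n in range(8):
    partial += sp.diff(sp.sin(x), x, n)
    print(n, sp.simplify(partial))
# prints sin(x), sin(x) + cos(x), cos(x), 0, sin(x), ... round and round
```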

If P and M both converge for a function, I think it would be a rather special function. The thing is, in order for P to converge, the derivatives have to shrink: Abs(G(f(x), n)) has to go to 0 as n grows. But that seems to imply that the integrals are generally increasing in magnitude.

It seems to me that in order for both P and M to converge, there has to be a function that’s the “max” in some sense, so that both the derivative and the integral of that function are smaller everywhere than the function itself. That function, if found, would define a whole range of functions which have converging P and M values.

Does there exist a function f(x) such that for all x > 0:

0 < \int_0^x f(t)\,dt < f(x) \;\land\; 0 < D(f(x)) < f(x) ?

We can think of this, in a way, as a differential inequality. I don’t know how much is known about this sort of thing, and I certainly don’t know how to take it much further myself. If we call the integral g(x) and focus on that, we get:

g' = h (so h = f)
x > 0 \rightarrow g' > g > 0
x > 0 \rightarrow 0 < h' < h

Functions that grow very rapidly have larger derivatives, and even larger second derivatives, and so on. Other functions, such as polynomials, have integrals that grow until, past some positive a, the higher-degree terms dominate and make the integral larger than the function.

So our f(x) would have to be some strange function: small enough that its derivatives are not any larger, but also strange enough for its integrals to always be smaller. I can imagine this being true on a small interval, but not on the entire positive real line.
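As one small check on that intuition, here's a candidate I tried, f(x) = e^{x/2} (my own pick, nothing special about it): the derivative condition holds everywhere, but the integral condition only survives on a small interval.

```python
import sympy as sp

x, t = sp.symbols('x t')

# Candidate f(x) = exp(x/2): f' = f/2, so 0 < f' < f holds for all x
f = sp.exp(x / 2)
F = sp.integrate(f.subs(x, t), (t, 0, x))  # 2*exp(x/2) - 2

# Does 0 < F < f hold? Only until x reaches 2*log(2), about 1.386
for x0 in (0.5, 1.0, 1.5, 2.0):
    print(x0, F.subs(x, x0).evalf() < f.subs(x, x0).evalf())
# True, True, False, False
```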

Getting a better idea of which functions have smaller derivatives for all positive x, and which functions have smaller integrals, would go a long way toward figuring this out, so I will think on it some more.
