egg changed the topic of #principia to: READ THE FAQ: http://goo.gl/gMZF9H; The current version is Гельфонд. We currently target 1.8.1, 1.9.1, and 1.10.1. <scott_manley> anyone that doubts the wisdom of retrograde bop needs to get the hell out | https://xkcd.com/323/ | <egg> calculating the influence of lamont on Pluto is a bit silly… | <egg> also 4e16 m * 2^-52 is uncomfortably large
egg|laptop|egg has quit [Remote host closed the connection]
egg|cell|egg has joined #principia
egg|laptop|egg has joined #principia
Mike` has quit [Ping timeout: 194 seconds]
Mike` has joined #principia
egg|laptop|egg has quit [Remote host closed the connection]
<discord->
Butcher. — Hmm, flight planner struggles a little with a 14,400 day plan.
<raptop>
...are you visiting a KBO?
<discord->
Butcher. — Pluto.
<discord->
Butcher. — So almost. 😉
<discord->
Butcher. — I am doing Earth -> Jupiter -> Saturn -> Pluto
<raptop>
KBO (subset: plutino instead of classical)
<discord->
Butcher. — Hmm, 37 years to pluto, damn. (edited)
* raptop
has questions, since the proposed eg: Voyager 1 mission would have gotten there before Voyager 2 reached Neptune
<discord->
Butcher. — This is essentially early Voyager - I'm in 1972 so it's probably not the best window.
<discord->
Butcher. — Going to launch another probe on the next window in about 18 months.
<discord->
Ashnoom. — My PC gave up after plotting 1400 days of an L5 orbit
<discord->
Ashnoom. — (increasing step count); didn't touch the accuracy slider, or whatever it's called
egg|laptop|egg has joined #principia
<discord->
egg. — yeah it probably is a good idea to use a higher tolerance instead of cranking the step count until your machine gives up
<raptop>
hrm, pioneer 10 launched in 1972 and 11 in 1973
<discord->
Butcher. — I was on 16000 steps at 4m granularity.
<discord->
Butcher. — The plotted lines stopped about halfway to Pluto, but the apsis and node markers were present.
<discord->
egg. — yeah the plotting stops eventually even if the computing happens, because filling the screen with noodles quickly makes things grind to a halt otherwise
<discord->
egg. — Looks like we do the limiting before we do the culling (we project and cull Principia-side), maybe we could be smarter about that
egg|laptop|egg_ has quit [Remote host closed the connection]
<_whitenotifier-8975>
[Principia] pleroy opened pull request #2850: Adjust tolerances for macOS/Linux and use Ge instead of Gt - https://git.io/JLNki
<_whitenotifier-8975>
[Principia] eggrobin labeled pull request #2850: Adjust tolerances for macOS/Linux and use Ge instead of Gt - https://git.io/JLNki
<discord->
egg. — Now, here, you see, it takes all the running you can do, to keep in the same place.
<discord->
Standecco. — the principia gang might like the idea and what could be done with this (+ KerbalismContracts in a very ideal world): https://forum.kerbalspaceprogram.com/index.php?/topic/199347-18x-111x-kerbal-weather-project-kwp-v100/ (edited)
<_whitenotifier-8975>
[Principia] pleroy closed pull request #2850: Adjust tolerances for macOS/Linux and use Ge instead of Gt - https://git.io/JLNki
<_whitenotifier-8975>
[Principia] pleroy pushed 2 commits to master [+0/-0/±2] https://git.io/JLNqK
<_whitenotifier-8975>
[Principia] pleroy 1d9ca42 - Adjust tolerances and use Ge matcher.
<_whitenotifier-8975>
[Principia] pleroy a532acc - Merge pull request #2850 from pleroy/Tolerances Adjust tolerances for macOS/Linux and use Ge instead of Gt
<discord->
Paculino (ŝi/ri/she/they). — But since most RSS games never make it to the future, one could simply make it require internet and check the weather occasionally.
<discord->
lamont. — KWP is atmospheric, principia is gravity, they seem very orthogonal
<discord->
Standecco. — @egg got any suggestions on how I would evaluate an infinite sum up to very high indices?
<discord->
Standecco. — the terms of the sequence have n! and n^n, and I need to get at least up to n > 2000, because that's the most I've been able to test and it still doesn't seem to converge
<discord->
Paculino (ŝi/ri/she/they). — Is it for modding?
<discord->
Standecco. — but ideally it should converge, eventually, therefore I'm trying to get to even higher numbers, but my python script (with Decimals) crashes above 2000 or so
<discord->
Standecco. — no, it's for university
<discord->
Paculino (ŝi/ri/she/they). — Do you have numpy?
<discord->
Paculino (ŝi/ri/she/they). — It might have a term over 2^64
<raptop>
...can you carefully stay in integer land with python?
<discord->
Standecco. — of course I have numpy
<discord->
Standecco. — I'm relatively sure it has terms above googol
<discord->
Paculino (ŝi/ri/she/they). — When you say ideally it should converge, do you mean that you know it does, or that you hope it does?
<raptop>
The limits on large integers with numpy (or maybe python in general) are more about your ram than anything else. Not sure what the best practice is for large floats
<discord->
Standecco. — correction, well above a googol
<discord->
Standecco. — raptop: unfortunately not, I need floats
<raptop>
blarg
<discord->
Standecco. — python integers are unlimited, and so are decimals, but the issue is that it just crashes at some point
<discord->
Paculino (ŝi/ri/she/they). — Have you tried testing it in WolframAlpha?
<discord->
Standecco. — WA will give up well before it gets even remotely close
<raptop>
...what values of x are we looking at?
<discord->
Stonesmile. — Do you want a value or proof that it converges?
<discord->
Paculino (ŝi/ri/she/they). — Do you know what value x has?
<discord->
Paculino (ŝi/ri/she/they). — Or do you need a general solution?
<discord->
Standecco. — I need to evaluate the series at x = -e, because with that value it becomes an alternating series, which ideally should converge because after some index it should become monotonic
<discord->
Standecco. — (should've said that earlier)
<discord->
Standecco. — I don't need a value, I have a proof that it converges but so far it doesn't look like it
<discord->
Standecco. — but it really should converge
* raptop
is pretty sure that it converges for any finite value of x, since my vague recollection is that n^n grows rather faster than n!
<discord->
Stonesmile. — Oh, so Stirling's approximation is a lower bound?
<discord->
Standecco. — not sure, but the approximation is exact only for n → +∞
<discord->
Standecco. — this being an infinite series, it should eventually get there and therefore be monotonic
<discord->
Standecco. — but I don't know enough to be able to say it for certain, my professor doesn't know either, and numerically I can't get an answer
<discord->
Paculino (ŝi/ri/she/they). — What about the integral approximation? (edited)
<discord->
Stonesmile. — Was this the original question, or is this a modified one?
<discord->
Standecco. — not applicable here, because the series isn't always positive
<discord->
Paculino (ŝi/ri/she/they). — You can break it apart to be the positive section and the negative and add the two integrals
<discord->
Standecco. — slightly modified, because the original one was centered at x = 5, but otherwise it's the same
<discord->
Standecco. — no, because per Riemann's rearrangement theorem the infinite sum of 2 non-absolutely-converging-but-simply-converging series can be made to be any number
<discord->
Standecco. — aka, if I have sum((-1)^n/n) + sum((-2)^n/x) it can be literally any number by rearranging the terms (edited)
<discord->
Standecco. — (in fact, you don't even need 2 sums, either one of them is enough, as they both have infinite terms) (edited)
<discord->
Paculino (ŝi/ri/she/they). — I meant make a(n)=f(x), and finding int(f(x)*sgn(f(x))) / 2 + int(-f(x)*sgn(f(x))) / 2
<discord->
Standecco. — yes, but I'm saying that mathematically that wouldn't be a proof because of said theorem
<discord->
Stonesmile. — Using x = -e and Stirling's formula on the inf sum should give you sqrt(2 pi n)/(1+n), which tends to 0 as n goes to inf
<discord->
Standecco. — yeah as I said, it is at some point equivalent to 1/sqrt(n), but that alone isn't enough to guarantee convergence, to my knowledge
<discord->
Stonesmile. — oh, and (-1)^n
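For reference, the exact series never appears in this log; a term of the form a_n = n! x^n / ((n+1) n^n) is consistent with the sqrt(2 pi n)/(1+n) estimate above, and under that assumption the Stirling substitution at x = -e reads

    \[
      a_n = \frac{n!\,(-e)^n}{(n+1)\,n^n}
          \approx \frac{\sqrt{2\pi n}\;n^n e^{-n}\,(-1)^n e^n}{(n+1)\,n^n}
          = (-1)^n\,\frac{\sqrt{2\pi n}}{n+1} \longrightarrow 0,
    \]

i.e. the terms alternate in sign and vanish like 1/sqrt(n), matching the behaviour mentioned above.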
<discord->
Stonesmile. — Isn't Leibniz basically (-1)^n * 'something that tends to 0' => converges
<discord->
Paculino (ŝi/ri/she/they). — What if you used a transformed square wave multiplied by a(n)? Would each portion then converge absolutely?
<discord->
Stonesmile. — It seems neatly designed to use Stirling's and then Leibniz, but I might be missing something
<discord->
Standecco. — yes, _iff_ you can prove that the series is monotonic after a while
<discord->
Paculino (ŝi/ri/she/they). — I thought you said you already did
<discord->
Stonesmile. — And sqrt(2 pi n)/(1+n) *is* monotonic I think
<discord->
Standecco. — for example, (-1)^n * (1+cos(n))/n can't be determined using Leibniz (edited)
<discord->
Stonesmile. — Sure, but sqrt(n)/n should be monotonic, right?
<discord->
Standecco. — afaik, you have to prove that the original series is monotonic after a certain index, not that its equivalent is, but this is where I might be wrong
<discord->
Stonesmile. — Very possible, if you can't work with the result of Stirling's, then my reasoning doesn't hold
<discord->
Standecco. — I found out why my python script was crashing: apparently there's a recursion depth limit in python
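For what it's worth, here is a minimal non-recursive sketch of the partial sums, assuming the term n! x^n / ((n+1) n^n) and using a hypothetical partial_sum helper; it updates each term from the previous one, so neither n! nor n^n ever appears as a huge intermediate, plain floats suffice (the terms stay of order 1/sqrt(n)), and there is no recursion-depth limit to hit:

    import math

    def partial_sum(x: float, n_terms: int) -> float:
        # Partial sum of the (assumed) series  sum_{n>=1} n! x^n / ((n+1) n^n),
        # built iteratively via  t_{n+1} = t_n * x (n+1) / ((n+2) (1 + 1/n)^n).
        term = x / 2.0          # n = 1:  1! * x / (2 * 1^1)
        total = term
        for n in range(1, n_terms):
            term *= x * (n + 1) / ((n + 2) * math.exp(n * math.log1p(1.0 / n)))
            total += term
        return total

    if __name__ == "__main__":
        for n in (100, 1_000, 10_000, 100_000):
            print(n, partial_sum(-math.e, n))

Because the terms shrink only like 1/sqrt(n), the partial sums creep towards the limit very slowly, so either a large N or some series acceleration (as Mathematica does) is needed to see the convergence clearly.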
<discord->
Standecco. — if you still have your old calculus textbook and a few spare minutes, it might be really helpful if you could check it, because mine doesn't say anything about this very specific case
<discord->
Stonesmile. — I found another definition of Stirling's formula, which is *exact*; n! = sqrt(2 π n)*n^(n+1/2)*e^(-n)*(1+ε), where ε tends to 0 as n tends to inf
<discord->
Stonesmile. — With that definition you can use it to prove Leibniz I think
<discord->
Standecco. — interesting, let me try
Jesin has joined #principia
<discord->
Standecco. — looking at the function in desmos, and apart from it abruptly stopping after overflow, I can conclude this entire endeavour by saying that it is indeed monotonic
<discord->
egg. — @Standecco looking at a graph is satisfying and all, but that is not a proof of anything about the function
<discord->
Standecco. — I don't have any more tricks up my sleeve at this point though
<discord->
egg. — What is the actual question anyway, glancing at the giant backlog it seems one hour ago you had already gone off into numerical evaluation (which can be interesting but doesn’t seem to be what the underlying question is about)
<discord->
Standecco. — basically, convergence of a series through Leibniz's criterion
<discord->
sichelgaita. — Write that the term for n+1 is smaller in absolute value than the term for n. After some algebra you get e < (1+1/n)^n (n+2)/(n+1). Take the log on both sides and expand to order 2. You get: 1 < 1 - 1/(2n) + 1/(n+1) + O(1/n^2). This is true for n sufficiently large.
<discord->
sichelgaita. — Does that work?
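Spelled out under the same assumed term a_n = n! x^n / ((n+1) n^n), the check described here is

    \[
      \frac{|a_{n+1}|}{|a_n|} = \frac{e\,(n+1)}{(n+2)\,(1+1/n)^n} < 1
      \iff e < \Bigl(1+\tfrac{1}{n}\Bigr)^{n}\,\frac{n+2}{n+1}
      \iff 1 < n\ln\Bigl(1+\tfrac{1}{n}\Bigr) + \ln\Bigl(1+\tfrac{1}{n+1}\Bigr)
            = 1 - \frac{1}{2n} + \frac{1}{n+1} + O\Bigl(\frac{1}{n^2}\Bigr),
    \]

and since -1/(2n) + 1/(n+1) > 0 for n >= 2, the terms |a_n| are eventually decreasing, which together with a_n -> 0 is what Leibniz's criterion needs.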
<discord->
Standecco. — I had found that already, here
<discord->
Standecco. — seems reasonable, but so far I've tried up to N = 20 000 and the sum keeps increasing
<discord->
egg. — slowly convergent sum is slowly convergent
<discord->
Stonesmile. — Yeah, there is no reason to check it numerically
<discord->
Standecco. — I'm not sure if that kind of reasoning is correct, so I was checking it numerically
<discord->
Standecco. — eventually it should stop increasing and start converging
<discord->
Standecco. — it's the definition of converging
<discord->
egg. — no, eventually you need to learn numerical analysis if you are going to play with sums like that
<discord->
sichelgaita. — @Standecco but the last inequality on the penultimate line of your paper is wrong, (1+1/n)^n is < e.
<discord->
sichelgaita. — So you need order 2.
<discord->
Standecco. — you're right, I made a bit of a mess when writing that down, but the end result should be consistent with what you've found nonetheless, right?
<discord->
sichelgaita. — Are these the partial sums in the snippet above? In my world they quickly converge to -0.7. (I also conjecture that -e is the minimum of the generating function, but that seems hard to prove.)
<discord->
Standecco. — the python things are the partial sums, yes
<discord->
egg. — what value of x are you using
<discord->
Standecco. — -e, which is the only interesting value
<discord->
Standecco. — how did you get that convergence?
<discord->
sichelgaita. — Mathematica. For N = 100, 200, ... 1000 I get -0.587347, -0.623127, -0.639157, -0.648753, -0.655317, -0.660169, -0.663945, -0.66699, -0.669514, -0.671649
<discord->
egg. — I am not sure what your python script does, but it is rather surprising that calling it twice as `python sum.py 15` yields two different results with opposing signs
<discord->
Standecco. — what the heck is going on in my script indeed
<discord->
Standecco. — oh yeah, that's simply when I noticed that range(1, 15) goes from 1 to 14 and adjusted the script
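(As a reminder of that off-by-one: Python ranges exclude the upper bound, so summing the terms 1..N needs range(1, N + 1).)

    >>> list(range(1, 15))[-1]
    14
    >>> list(range(1, 15 + 1))[-1]
    15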
<discord->
Standecco. — but the wrong result is not expected anyway
<discord->
sichelgaita. — I surmise that this interesting nerd-sniping thread will end with "oops, my Python is botched". 😛
<discord->
egg. — anyway, whatever it does it bears little resemblance to the sum
<discord->
Standecco. — god I hate python and its indenting
<discord->
Standecco. — I swear I'll never use it again
<discord->
Standecco. — now it's converging indeed
<discord->
Standecco. — thanks a lot for showing me it converged and letting me notice that I don't understand python indenting and that I hate its "beginner friendly" structure (edited)
<discord->
Stonesmile. — Yeah, python indenting is awkward, to put it kindly
<raptop>
It enforces a degree of readability
<discord->
Standecco. — I find that brackets + indents enforce a higher degree of readability