raptop changed the topic of #kspacademia to: https://gist.github.com/pdn4kd/164b9b85435d87afbec0c3a7e69d3e6d | Dogs are cats. Spiders are cat interferometers. | Cosmism today! | Document well, for tomorrow you may get mauled by a ネコバス. | <UmbralRaptor> egg|nomz|egg: generally if your eyes are dewing over, that's not the weather. | <ferram4> I shall beat my problems to death with an engineer. | We can haz pdf
<UmbralRaptor>
What's two and a half orders of magnitdue between friends?
<UmbralRaptor>
*magnitude
_whitelogger has joined #kspacademia
ferram4_ has joined #kspacademia
ferram4 has quit [Ping timeout: 198 seconds]
e_14159 has quit [Ping timeout: 190 seconds]
e_14159 has joined #kspacademia
Technicalfool_ has joined #kspacademia
Technicalfool has quit [Ping timeout: 198 seconds]
ferram4__ has joined #kspacademia
ferram4_ has quit [Ping timeout: 183 seconds]
<egg|cell|egg>
Bofh: uncertainty, see backlog
<bofh>
egg|cell|egg: yeah, I'm still trying to meaningfully figure this out :P
<UmbralRaptor>
Silly thought: sampling 'densely' over the time series, what would a histogram of semi-major axis, etc. look like?
<bofh>
uniformly distributed makes sense, actually.
<egg|zzz|egg>
bofh: so eggsperimenting with sums of sinusoids, um
<egg|zzz|egg>
bofh: the stackoverflow formula, but up to lag n/4 instead of sqrt n, and with an absolute value in front of the sum to avoid negative variance (??) seems to work Ꙩ_ꙩ
<egg|zzz|egg>
well, it still overestimates the error eventually, but less so?
<egg|zzz|egg>
also it makes no sense
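(A minimal sketch of what the "stackoverflow formula" above presumably refers to, i.e. the usual autocorrelation-corrected standard error of the mean, with the lag cutoff at n/4 and the absolute value egg mentions; errorOfMean and series are made-up names, not egg's actual code:)

    (* standard error of the mean of a correlated series, corrected by the
       sample autocorrelations up to lag kMax; Abs guards against the
       negative-variance estimates discussed above *)
    errorOfMean[x_List, kMax_Integer] := Module[{n = Length[x], xc, c, rho},
      xc = x - Mean[x];
      c = Table[Total[xc[[;; n - k]] xc[[k + 1 ;;]]]/n, {k, 0, kMax}];
      rho = Rest[c]/First[c];
      Sqrt[Abs[(First[c]/n) (1 + 2 Total[(1 - Range[kMax]/n) rho])]]];

    errorOfMean[series, Floor[Length[series]/4]]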
<egg|zzz|egg>
moo
<bofh>
HUH.
<bofh>
That makes *absolutely* no sense, what the hell.
<UmbralRaptor>
I mean, at this timescale, it's not all that stochastic?
<egg|zzz|egg>
bofh: it does nicely dip when the sample size aligns with the period though :-p
<egg|zzz|egg>
not that anything makes sense,
<egg|zzz|egg>
bofh: 1993MNRAS.263..287K seems to have stuff involving fourier magic to estimate some manner of standard deviation but I'm not sure 1. whether it's applicable 2. how it works
<egg|zzz|egg>
if you like DOIs more than adsabs codes
<egg|zzz|egg>
10.1093/mnras/263.2.287
<egg|zzz|egg>
(I do not have a catpic with this paper, sorry)
<egg|zzz|egg>
bofh: meow
<egg|zzz|egg>
UmbralRaptor: well, it's all going to be a combination of periodic effects at any timescale (not sure what this is, it doesn't look like a year, maybe Kozai?)
<egg|zzz|egg>
UmbralRaptor: but I want to estimate its mean :-/
<bofh>
20:35:17 <@egg|zzz|egg> bofh: 1993MNRAS.263..287K seems to have stuff involving fourier magic to estimate some manner of standard deviation but I'm not sure 1. whether it's applicable 2. how it works
<bofh>
I'm trying to figure out 2. right now, moment.
<egg|zzz|egg>
meow
* egg|zzz|egg
meows repeatedly at bofh's door
<bofh>
adsabs is kinda sketchy for me atm? keep getting root@localhost errors
<kmath>
<✔Alex_Parker> We got our first high-resolution MVIC panchromatic image of 2014 MU69 down. With a little deconvolution magic, it i… https://t.co/MpFHNHC8p4
<egg|zzz|egg>
bofh: okay, on my test sum of sinusoids, Sqrt@Total[(Abs[Fourier[#][[;; 3]]])^2/Length[#]]& is a really good and tight majoration of the error of the mean??? Ꙩ_ꙩ
<egg|zzz|egg>
bofh: (vaguely inspired by that paper)
<egg|zzz|egg>
even ;;2
<egg|zzz|egg>
hmmm
<egg|zzz|egg>
wait with ;;1 it's eggsactly the absolute value of the mean, maybe my test vector is too regular,
<egg|zzz|egg>
yeah nevermind
<egg|zzz|egg>
ah derp I thought I was doing 2;;, but yes, taking the mean is taking the mean, news at 11
<bofh>
22:06:10 <@egg|zzz|egg> bofh: okay, on my test sum of sinusoids, Sqrt@Total[(Abs[Fourier[#][[;; 3]]])^2/Length[#]]& is a really good and tight majoration of the error of the mean??? Ꙩ_ꙩ
<bofh>
WHAT THE HELL that doesn't *make* any sense?
<egg|zzz|egg>
bofh: no, it does if the mean is 0, because it's the mean + 2 terms of the Fourier eggspansion
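(For reference: with Mathematica's default Fourier convention, F = Fourier[x] has F[[k]] = (1/Sqrt[n]) Sum over j of x_j Exp[2 Pi I (j-1)(k-1)/n], so Abs[F[[1]]] is Sqrt[n] Abs[Mean[x]], which is why the ;;1 version above collapses to the absolute value of the mean. A commented equivalent of the one-liner; errorFromDFT is a made-up name:)

    errorFromDFT[x_List, kMax_: 3] := Module[{n = Length[x], f},
      f = Fourier[x];                      (* default normalization 1/Sqrt[n] *)
      Sqrt[Total[Abs[f[[;; kMax]]]^2]/n]   (* low-frequency power divided by n *)
    ];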
<egg|zzz|egg>
bofh: more interesting is the 2;;10
<egg|zzz|egg>
i.e. drop the mean, take low frequencies
<egg|zzz|egg>
I feel like I'm missing a square somewhere compared to the Koen & Lombard paper
<egg|zzz|egg>
2;;5 is OK tbh
<egg|zzz|egg>
bofh: tbh just the 2nd term of the DFT seems to do the job?? Ꙩ_ꙩ
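(The 2;; variants being discussed, i.e. dropping the mean term and keeping only a few low frequencies, would then look like this; again a sketch with a made-up name:)

    errorFromLowFrequencies[x_List, kMax_: 5] :=
      Sqrt[Total[Abs[Fourier[x][[2 ;; kMax]]]^2]/Length[x]]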
<bofh>
I mean the first term of the DFT would be your input mean, the second term would... I'm not sure how to describe it but that makes *sense* to me intuitively.
<egg|zzz|egg>
bofh: the terms are related to the autocorrelation function by (2) in Koen & Lombard
<bofh>
Ahh, right, that makes sense.
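(Without the paper's exact numbering to hand, the standard identity behind this, presumably what (2) expresses, is the finite-sample Wiener-Khinchin relation between the periodogram and the sample autocovariances, valid at the nonzero Fourier frequencies:)

    I(\omega_k) = \frac{1}{n}\Bigl|\sum_{t=1}^{n} x_t\, e^{-i\omega_k t}\Bigr|^2
                = \sum_{|j| < n} \hat c(j)\, e^{-i\omega_k j},
    \qquad
    \hat c(j) = \frac{1}{n}\sum_{t=1}^{n-|j|} (x_t - \bar x)(x_{t+|j|} - \bar x),
    \quad \omega_k = \frac{2\pi k}{n},\ k \neq 0.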
<egg|zzz|egg>
bofh: and then you have (19) that seems like dark magic
<bofh>
I'm still not sure I understand (19) at all.
<egg|zzz|egg>
I think (19) gets mentioned in 1996AJ....111..541F in the context of what I'm trying to do
<egg|zzz|egg>
bofh: you'll like 1996AJ....111..541F, it does stats with bra-ket notation
<egg|zzz|egg>
cc UmbralRaptor
<bofh>
okay I actually strongly approve of that.
<egg|zzz|egg>
bofh: look at the paragraph after the one containing (5.4) in 1996AJ....111..541F, that's where they cite Koen and Lombard (and how I found the latter)
<bofh>
sec, opening
<UmbralRaptor>
!?
<egg|zzz|egg>
UmbralRaptor: the mean is just the projection on |1>
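(Spelling that out: assuming a uniform inner product ⟨f|g⟩ = (1/N) Σ_i f(t_i) g(t_i) — Foster's actual definition uses his own weights — the mean is just the component of the data along the constant function |1⟩:)

    \bar x = \frac{\langle 1 | x \rangle}{\langle 1 | 1 \rangle}
           = \frac{1}{N}\sum_{i=1}^{N} x(t_i),
    \qquad \langle 1 | 1 \rangle = 1.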
<bofh>
holy fuck this is actually *readable*
<bofh>
YES
<egg|zzz|egg>
bofh: but then there's a part II to that paper, where he does basically the same paper but in---galaxy brain---tensor index notation
<bofh>
augh please no
<egg|zzz|egg>
bofh: index notation is good actually
<egg|zzz|egg>
that's in 1996AJ....111..555F
<egg|zzz|egg>
bofh: from the abstract: "It is quite convenient for distinguishing a variety of different vector spaces, and is the most compact notation for all the sums which arise in the analysis."
<egg|zzz|egg>
bofh: but anyway, there's something going on between (4.5) from Foster and (19) from Koen and Lombard
<bofh>
Yeah, let me grab my copy of Koen and Lombard first
<egg|zzz|egg>
bofh: by the way, how do I compute the 2nd coefficient of the Fourier transform?
<kmath>
<mrkgrnao> There's a MathOverflow comment about how the correct name for a morphism of Poisson manifolds is obviously "ichthyo… https://t.co/QcMS0wNmIc
<SnoopJeDi>
Yea, agreed
<SnoopJeDi>
especially if 'boisson' is fair game
<bofh>
Wait, 'Boisson'? :p
<SnoopJeDi>
oh, I didn't realize it was a surname too
<egg|zzz|egg>
bofh: drink
<bofh>
I mean I know, I'm just wondering where it was used in the context of Poisson Geometry.
<SnoopJeDi>
I wasn't thinking of anything in particular, I just always tie the two together mentally
<SnoopJeDi>
I guess because fish <> water <> drink
<egg|zzz|egg>
Twitter truncations of adsabs links look silly....…
<egg|zzz|egg>
SnoopJeDi: "distribution de poissons" certainly sounds weirder than "distribution de boissons"
* egg|zzz|egg
hands out fish and drinks
<SnoopJeDi>
egg|zzz|egg, btw did you hear about the new French fish sauce company using novel statistics to responsibly manage their fishing practices?
<SnoopJeDi>
boisson poisson: Poisson moisson de poisson ("fish drink: a Poisson harvest of fish")
<egg|zzz|egg>
but Poisson isn't an adjective (and French adjectives don't generally precede the noun anyway), it's poissonien(ne) or de Poisson
<SnoopJeDi>
yea I figured I'd probably screwed that one up but had already sunk too much into forcing it as it was
<egg|zzz|egg>
:D
<SnoopJeDi>
would "moisson de poisson de Poisson" be sensible, or would "moisson poissonien de poisson" be more idiomatic?
<egg|zzz|egg>
<egg|zzz|egg> bofh: but anyway, there's something going on between (4.5) from Foster and (19) from Koen and Lombard << that would be (5.4), not 4.5
<egg|zzz|egg>
SnoopJeDi: moisson is feminine, so it would be a moisson poissonienne
<egg|zzz|egg>
which really sounds like Groethendieck got into stats
<egg|zzz|egg>
where are the sheathes
<SnoopJeDi>
bah, I never was much of a gambler
<egg|zzz|egg>
Grothendieck*
<egg|zzz|egg>
sheaves*
<egg|zzz|egg>
bofh: so assuming τ^2 in (5.4) is S0(ε) from (19) and above, what is r in the sense of (5.4) if I pick only one frequency (the 2nd coefficient), is it 2 or 1?
<egg|zzz|egg>
hm, I guess it's 1 because I'm only modeling the mean, and the number of frequencies I pick is how I estimate τ^2, not a different basis
<egg|zzz|egg>
bofh: if I'm only computing one coefficient of the dft, isn't it easier to do so naively from the definition? I don't see what I gain from the fft algorithms there
<egg|zzz|egg>
bofh: i.e. why do it in n log n then eggstract one coefficient that I could have computed in n
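(Indeed, a single coefficient comes straight from the definition in O(n). A sketch matching Mathematica's default Fourier convention, so that dftCoefficient[x, 2] agrees with Fourier[x][[2]] up to rounding; dftCoefficient is a made-up name:)

    (* the k-th DFT coefficient computed directly, without an FFT *)
    dftCoefficient[x_List, k_Integer] := Module[{n = Length[x]},
      Total[x Exp[2 Pi I (Range[n] - 1) (k - 1)/n]]/Sqrt[n]];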