raptop changed the topic of #kspacademia to: https://gist.github.com/pdn4kd/164b9b85435d87afbec0c3a7e69d3e6d | Dogs are cats. Spiders are cat interferometers. | Космизм сегодня! | Document well, for tomorrow you may get mauled by a ネコバス. | <UmbralRaptor> egg|nomz|egg: generally if your eyes are dewing over, that's not the weather. | <ferram4> I shall beat my problems to death with an engineer. | We can haz pdf
<bofh> Yeah, standard error of the mean is what you sensibly want here.
<bofh> I'm pretty sure this is close enough to a point at which central limit thm applies, at least for sigdec purposes.
<egg|zzz|egg> bofh: okay, so that covers the things I compute as a mean
<egg|zzz|egg> bofh: (what estimator should i use for the standard deviation, but let's leave that question aside for a minute)
<egg|zzz|egg> bofh: what about the things that I compute as a slope, rather than a mean
<egg|zzz|egg> bofh: where the slope is computed as covariance/variance
<egg|zzz|egg> bofh: what's the error on that slope
<bofh> ugh.
<bofh> I'm going to assume you don't have a moment generating function handy
<bofh> hrm.
<egg|zzz|egg> bofh: I'm finding https://www.physicsforums.com/threads/estimating-error-in-slope-of-a-regression-line.194616/, does this make any form of sense?
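A sketch of the approach in that thread: the slope as cov/var, with the standard error coming from the residual variance, as in statdad's reply (and, as turns out later in the log, GUM eqn H.13d). The function name and interface are illustrative, not from either source.

```python
import math

def fit_line_with_uncertainty(t, y):
    """Least-squares fit y ~ a + b*t.

    Returns (a, b, s_a, s_b): intercept, slope, and their standard
    errors. The slope is cov(t, y)/var(t); the standard errors come
    from the residual variance s^2 = SSR/(n - 2).
    """
    n = len(t)
    tbar = sum(t) / n
    ybar = sum(y) / n
    var_t = sum((ti - tbar) ** 2 for ti in t)  # n times the variance of t
    cov_ty = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
    b = cov_ty / var_t
    a = ybar - b * tbar  # the ordinate at origin
    ssr = sum((yi - a - b * ti) ** 2 for ti, yi in zip(t, y))
    s2 = ssr / (n - 2)  # residual variance
    s_b = math.sqrt(s2 / var_t)
    s_a = math.sqrt(s2 * sum(ti * ti for ti in t) / (n * var_t))
    return a, b, s_a, s_b
```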
<egg|zzz|egg> bofh: other question: if I have quantities equipped with a standard error, can I do arithmetic to them?
UmbralRaptor has quit [Remote host closed the connection]
<bofh> egg|zzz|egg: huh, I *like* that approach. never seen it before, but it's brilliant.
<bofh> and yes, you should be able to do arithmetic to them given some fairly tame assumptions.
<egg|zzz|egg> bofh: which approach, there are several posts
<egg|zzz|egg> bofh: and how (I can happily ignore the assumptions, do i look like a statistician,)
<egg|zzz|egg> the statdad reply
<bofh> also I apologize for incoherence but I have somewhat of a headache and am waiting for the bloody ibuprofen/caffeine to kick in.
<egg|zzz|egg> though to get y hat I need to get the ordinate at origin of the linear regression, not just its slope
<egg|zzz|egg> bofh: ow
<egg|zzz|egg> now i am reminded of "депакотя"
<kmath> <bofh453> OH: "депакотя"
<bofh> oh, I guess I've been working under the assumption that the coordinate at origin is either 0 or easy to acquire, which I am now realizing may not be the case for you.
<bofh> hrm.
<egg|zzz|egg> is it that hard to acquire in general?
<egg|zzz|egg> the slope is cute because cov / var of course
<egg|zzz|egg> bofh: also how does arithmetic work on quantities with errors
<bofh> I just realized it's trickier than I thought since the error of the sum of two quantities with given errors is not guaranteed at all to be the sum of the respective errors.
<bofh> in general I don't think so? like I said I'm used to the coordinate at origin being immediate :p
<egg|zzz|egg> principia algebraic trolling: but here the abscissa is time, so there's no origin
<egg|zzz|egg> affine spaces ftw
<egg|zzz|egg> bofh: also entertaining eggsercise (fairly easy), given a sequence of messily drifting angles that are computed geometrically, thus mod 2π, add cycles as needed so that you can fit a line to things
<egg|zzz|egg> works as long as you don't have noise > π against the trend
<egg|zzz|egg> (> π between successive elements that is)
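The eggsercise can be sketched as follows, under the stated assumption that successive elements never jump by more than π against the trend (function name mine):

```python
import math

def unwrap(angles):
    """Add multiples of 2π so that successive angles differ by less
    than π, turning geometrically-computed (mod 2π) angles into a
    sequence one can fit a line to. Fails if the noise against the
    trend exceeds π between successive elements.
    """
    result = [angles[0]]
    for a in angles[1:]:
        # Number of whole cycles needed to bring a next to result[-1].
        k = round((result[-1] - a) / (2 * math.pi))
        result.append(a + 2 * math.pi * k)
    return result
```

(NumPy ships this as `numpy.unwrap`; the above is the one-liner-ish version of the same idea.)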
<bofh> ooooooh
<bofh> this is *handy*
UmbralRaptop has joined #kspacademia
<egg|zzz|egg> bofh: eggsample https://i.imgur.com/IWIPTrG.png
<egg|zzz|egg> bofh: 0.1 t + a sin t, sampled at integer t, for a in 0 .. 3
<egg|zzz|egg> bofh: I guess if I am using standard errors, I should do arithmetic to my errors like this? https://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables#Independent_random_variables
<egg|zzz|egg> (and wave my hands while screaming "independently distributed")
<egg|zzz|egg> the product and the inverse are messy though
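Taking the Wikipedia rule at face value, arithmetic on quantities with standard errors can be sketched as first-order propagation under an independence assumption; the messiness of product and inverse is visible in that they are only first-order approximations, while the sum is eggsact for independent normals. Class name and interface are mine:

```python
import math

class UncertainValue:
    """A value with a standard uncertainty, propagated to first
    order assuming independent (uncorrelated) inputs."""

    def __init__(self, value, u):
        self.value = value
        self.u = u  # standard uncertainty

    def __add__(self, other):
        # u(x + y) = sqrt(u(x)^2 + u(y)^2); exact for independent normals.
        return UncertainValue(self.value + other.value,
                              math.hypot(self.u, other.u))

    def __mul__(self, other):
        # Relative uncertainties add in quadrature; first order only.
        v = self.value * other.value
        return UncertainValue(
            v, abs(v) * math.hypot(self.u / self.value,
                                   other.u / other.value))

    def reciprocal(self):
        # u(1/x) ≈ u(x)/x^2, again first order only.
        return UncertainValue(1 / self.value, self.u / self.value ** 2)
```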
_whitelogger has joined #kspacademia
<egg|zzz|egg> UmbralRaptop: should I look at Evaluation of measurement data – Supplement 1 to the "Guide to the expression of uncertainty in measurement" – Propagation of distributions using a Monte Carlo method
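The Supplement 1 approach propagates distributions by Monte Carlo rather than by first-order formulas. A minimal sketch in its spirit, here assuming all inputs normal (which Supplement 1 does not require); names and interface are mine:

```python
import random

def propagate_mc(f, inputs, n=100_000, seed=1):
    """Monte Carlo propagation of distributions, JCGM 101 style:
    sample each input from its assumed distribution, push the samples
    through the measurement model f, and report the mean and the
    experimental standard deviation of the output.

    inputs is a list of (mean, standard_uncertainty) pairs, each
    treated as normal here (an assumption, not a requirement).
    """
    rng = random.Random(seed)
    samples = [f(*[rng.gauss(mu, u) for mu, u in inputs])
               for _ in range(n)]
    m = sum(samples) / n
    s = (sum((y - m) ** 2 for y in samples) / (n - 1)) ** 0.5
    return m, s
```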
<egg|zzz|egg> bofh: "NOTE 3 “Experimental standard deviation of the mean” is sometimes incorrectly called standard error of the mean. "
<egg|zzz|egg> bofh: help i am drowning in metrology
<egg|zzz|egg> bofh: UmbralRaptop: okay so i guess i am in JCGM 100:2008 4.2 Type A evaluation of standard uncertainty
<UmbralRaptop> The numerics papers have been replaced by metrology ones?
<egg|zzz|egg> UmbralRaptop: I'm sure there is interesting numerics lurking there
<egg|zzz|egg> all those differences between measurements and the mean
<bofh> 00:51:05 <@egg|zzz|egg> bofh: I guess if I am using standard errors, I should do arithmetic to my errors like this? https://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables#Independent_random_variables
<bofh> Yeah, that seems reasonable enough here.
<bofh> Also wow I have forgotten a lot of metrology, holy shit.
<egg|zzz|egg> bofh: wait, you're a physicist, isn't this actually relevant to what you do,
<egg|zzz|egg> *swat*
<bofh> Yes, yes it is. Hence rereading.
<egg|zzz|egg> bofh: "the term “confidence level” is not used in connection with that interval but rather the term “level of confidence”" halp
e_14159 has quit [Ping timeout: 190 seconds]
<egg|zzz|egg> bofh: okay, possibly relevant:
<egg|zzz|egg> 4.2.5 Often an estimate xi of an input quantity Xi is obtained from a curve that has been fitted to experimental data by the method of least squares. The estimated variances and resulting standard uncertainties of the fitted parameters characterizing the curve and of any predicted points can usually be calculated by well-known statistical procedures (see H.3 and Reference [8]).
e_14159 has joined #kspacademia
<egg|zzz|egg> *clicks* H.3 Calibration of a thermometer
<egg|zzz|egg> uuuuuh.
Ellied has joined #kspacademia
<bofh> 01:50:50 <@egg|zzz|egg> bofh: "the term “confidence level” is not used in connection with that interval but rather the term “level of confidence”" halp
<bofh> yep. it's annoying.
<egg|zzz|egg> bofh: is Confidence of Level weird sun twitter
<UmbralRaptop> Confidence interval ≠ confidence level
<bofh> what the hell is Weird Sun Twitter?
<egg|zzz|egg> bofh: https://twitter.com/CurlOfGradient is one
<bofh> oh cool someone's finally made a list
<egg|zzz|egg> bofh: clearly LevelOfConfidence should be GUM, THIS.
<bofh> YES.
<bofh> You should make it. :p
<egg|zzz|egg> you're the physicist :-p
<egg|zzz|egg> also you are better at twitter
<egg|zzz|egg> bofh: actually I have no idea how to say confidence level differently from level of confidence in french
<egg|zzz|egg> I wonder what VIM says
<egg|zzz|egg> since VIM is bilingual
<bofh> ooh.
<egg|zzz|egg> bofh: NOTE 2 The coverage probability is also termed “level of confidence” in the GUM.
<egg|zzz|egg> bofh: NOTE 2 Il convient de ne pas confondre ce concept avec le concept statistique de niveau de confiance, bien que le terme «level of confidence» soit utilisé en anglais dans le GUM.
<egg|zzz|egg> bofh: so that's 2.37 coverage probability | 2.37 probabilité de couverture, f
<egg|zzz|egg> bofh: I like how they eggsplicitly give the genders in the VIM
<egg|zzz|egg> bofh: also, 2.36 coverage interval has this note: NOTE 2 A coverage interval should not be termed “confidence interval” to avoid confusion with the statistical concept (see GUM:1995, 6.2.2).
<egg|zzz|egg> 2.36 intervalle élargi, m NOTE 2 Il convient de ne pas appeler «intervalle de confiance» un intervalle élargi pour éviter des confusions avec le concept statistique (voir le GUM:1995, 6.2.2).
<UmbralRaptop> aaaaaa
<egg|zzz|egg> bofh: not sure why it's intervalle élargi for coverage interval but probabilité de couverture for coverage probability
<bofh> Yeah that strikes me as weird.
<egg|zzz|egg> bofh: okay do you have the GUM open
<egg|zzz|egg> bofh: can you look at eqn H.13f
<egg|zzz|egg> bofh: I think that's the one for the error on least squares slope that is given by statdad
<egg|zzz|egg> bofh: and above there's the eggspression for least-squares fitting too
<egg|zzz|egg> bofh: but what's D
<bofh> egg|zzz|egg: uhh sec, let me actually find my copy of GUM
<egg|zzz|egg> ... actually what's θ even since above there's only t
<egg|zzz|egg> θ is properly defined below, but what's D
<bofh> okay, grabbed it
<egg|zzz|egg> also blarg GUM is particularly poorly typeset
<egg|zzz|egg> the SI brochure is very nicely set, but GUM is just a mess
<bofh> wtf, that's just n * the variance?
<egg|zzz|egg> bofh: what is
<egg|zzz|egg> ah H.13d?
<egg|zzz|egg> yes but what's D
<bofh> I'm referring to H.13g
<bofh> which defines D
<egg|zzz|egg> ah D is given in H.13g
<egg|zzz|egg> derp
<egg|zzz|egg> i am good at reading,
* bofh pets egg|zzz|egg
* egg|zzz|egg considers purring
<egg|zzz|egg> bofh: right, and so H.13d is eggsactly what statdad gives
<egg|zzz|egg> except it's slightly more satisfying to get it from GUM than statdad i guess,
<egg|zzz|egg> mrow
<egg|zzz|egg> bofh: I feel like I am opening a rabbit hole by writing template<typename T> struct MeasurementResult
<bofh> well honestly I feel like you open a rabbit hole every time you type "template<typename T>" tbh
<bofh> (is this something that even needs templating for that matter?)
<egg|zzz|egg> bofh: sure, depends on the type of the measurand
<bofh> when would it not be "double"?
<egg|zzz|egg> bofh: when it's Length
<egg|zzz|egg> or Angle
<egg|zzz|egg> or Instant
<egg|zzz|egg> etc.
<bofh> oh right I forgot you have a bazillion incompatible forms of double
<bofh> I guess dimensional analysis is handy.
<egg|zzz|egg> bofh: dimensional analysis is good
<egg|zzz|egg> bofh: Type B evaluation of measurement uncertainty: evaluation of a component of measurement uncertainty determined by means other than a Type A evaluation of measurement uncertainty
<egg|zzz|egg> Monsieur Jourdain, métrologue
<egg|zzz|egg> bofh: when averaging 0 samples, the average is NaN, but what is the standard uncertainty,
<egg|zzz|egg> (seems it's NaN too, however the experimental standard deviation ends up being -1 :-p)
<egg|zzz|egg> wait no, -0
<egg|zzz|egg> bofh: back to numerics, should one do something smart to compute the average
<egg|zzz|egg> bofh: advantage of MeasurementResult<double>, there's finally a good answer to "how many digits should one display" (arguably that's why I'm looking at it even)
<egg|zzz|egg> bofh: problem: I need to write that logic :-p
<bofh> 04:06:00 <@egg|zzz|egg> bofh: advantage of MeasurementResult<double>, there's finally a good answer to "how many digits should one display" (arguably that's why I'm looking at it even)
<bofh> I mean imho the answer is "as many as machine precision gives you, b/c why display *less*?"
<bofh> 03:10:52 <@egg|zzz|egg> bofh: back to numerics, should one do something smart to compute the average
<bofh> possibly doubledouble and checking for NaN before adding to it? running sum is a bit trickier, there are algs to do that nicely.
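One numerically stable way to do the running mean that bofh gestures at (a sketch of Welford's online algorithm with the NaN check folded in, offered as an alternative to double-double rather than what bofh had in mind) also yields the experimental standard deviation of the mean of JCGM 100:2008 4.2.3 for free:

```python
import math

def mean_with_standard_uncertainty(samples):
    """Welford's online algorithm: returns (mean, standard
    uncertainty of the mean). NaN inputs are skipped; zero samples
    yield (nan, nan), one sample (mean, nan).
    """
    n = 0
    mean = 0.0
    m2 = 0.0  # running sum of squared deviations from the mean
    for x in samples:
        if math.isnan(x):
            continue
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    if n == 0:
        return math.nan, math.nan
    if n == 1:
        return mean, math.nan
    # Experimental standard deviation of the mean: s / sqrt(n).
    return mean, math.sqrt(m2 / (n - 1) / n)
```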
<egg|zzz|egg> bofh: because most of them are distracting garbage if they come from a process that tries to compute some property from finitely many samples eggstracted from numerical eggsperiments
<egg|zzz|egg> bofh: actually even there
<egg|zzz|egg> T = +4.308190(18) 10^+4 s
<egg|zzz|egg> T = +4.306150(78) 10^+4 s
<egg|zzz|egg> maybe my uncertainty is a bit larger than i think,
<egg|zzz|egg> (that's the same value, moar samples on the second)
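The parenthesis notation in the T values above, where the standard uncertainty fixes how many digits are worth displaying, can be sketched like this (a toy formatter of mine, not Principia code; sign conventions and corner cases such as u = 0 are ignored):

```python
import math

def format_with_uncertainty(value, u, u_digits=2):
    """Format value with its standard uncertainty in concise
    notation, e.g. +4.308190(18) × 10^+4: show mantissa digits down
    to the last digit of the (two-digit) uncertainty.
    """
    exponent = math.floor(math.log10(abs(value)))
    mantissa = value / 10 ** exponent
    u_mantissa = u / 10 ** exponent
    decimals = -math.floor(math.log10(u_mantissa)) + (u_digits - 1)
    u_int = round(u_mantissa * 10 ** decimals)
    return f"{mantissa:+.{decimals}f}({u_int}) × 10^{exponent:+d}"
```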
<egg|zzz|egg> also why am i awake
UmbralRaptop has quit [Remote host closed the connection]
<egg|zzz|egg> okay so
<egg|zzz|egg> T = +4.308190(18) × 10^+4 s with n = 9
<egg|zzz|egg> T = +4.307039(73) × 10^+4 s with n = 99
<egg|zzz|egg> T = +4.30441(15) × 10^+4 s with n = 1000
<egg|zzz|egg> T = +4.306494(78) × 10^+4 s with n = 4497
<egg|zzz|egg> hmm
<egg|cell|egg> Bofh: uncertaintea
<bofh> like I feel like your uncertainties should be a *bit* larger than that?
<egg|cell|egg> The n = 9 one is garbage, what about the others though
<egg|cell|egg> Bear in mind standard uncertainties are one standard deviation
UmbralRaptop has joined #kspacademia
<bofh> Ahh. Okay, tossing out the n=9 one, the rest look actually okay.
<kmath> <infty_dril> mean while, while you were "Diagram Chasing ", i tasted 100 different constructions of the real numbers in a cave b… https://t.co/I7rxA9jipE
_whitelogger has joined #kspacademia
<egg|zzz|egg> <bofh> Ahh. Okay, tossing out the n=9 one, the rest look actually okay. << well, they are still quite a few σ out...
<egg|zzz|egg> bofh: none of those measurements are within 99 % confidence intervals of each other...
<egg|zzz|egg> I guess 4.2.7 If the random variations in the observations of an input quantity are correlated, for example, in time, the mean and experimental standard deviation of the mean as given in 4.2.1 and 4.2.3 may be inappropriate estimators (C.2.25) of the desired statistics (C.2.23). In such cases, the observations should be analysed by statistical methods specially designed to treat a series of correlated,
<egg|zzz|egg> randomly-varying measurements. :-/
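One standard remedy for the correlated-observations situation of 4.2.7 is batch means (my choice of method, not one the GUM prescribes): split the series into contiguous batches, average each, and take the experimental standard deviation of the batch means, which decorrelates provided the batches are long compared to the correlation time.

```python
import math

def batch_means_uncertainty(samples, n_batches=10):
    """Batch-means estimate of (mean, standard uncertainty of the
    mean) for a serially correlated series. Any ragged tail after
    dividing into n_batches equal batches is dropped.
    """
    n = len(samples) // n_batches * n_batches
    size = n // n_batches
    means = [sum(samples[i:i + size]) / size for i in range(0, n, size)]
    m = sum(means) / n_batches
    s2 = sum((b - m) ** 2 for b in means) / (n_batches - 1)
    return m, math.sqrt(s2 / n_batches)
```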
<egg|zzz|egg> bofh: haha: The GUM uses the term “level of confidence” as a synonym for coverage probability, drawing a distinction between “level of confidence” and “confidence level” [GUM:1995 6.2.2], because the latter has a specific definition in statistics. Since, in some languages, the translation from English of these two terms yields the same expression, the use of these terms is avoided here.
_whitelogger has joined #kspacademia
<egg|zzz|egg> bofh: *plots* WTF, that's *REALLY* not normally distributed
<egg|zzz|egg> bofh: it's sort of uniformly distributed???
<egg|zzz|egg> bofh: yeah it's eggstremely not normally distributed
<egg|zzz|egg> bofh: so yeah the standard uncertainty assuming normality is going to be garbage
<egg|zzz|egg> bofh: okay, what's the standard error of the mean---er, i mean, eggsperimental standard deviation of the mean, for uniformly-distributed measurements
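For the uniform case asked about here: the standard deviation of a uniform distribution on [a, b] is (b - a)/√12, so the standard deviation of the mean of n samples is (b - a)/√(12 n); the CLT still applies, so the usual s/√n recipe survives, it is just fed by a uniform rather than a normal spread.

```python
import math

def uniform_mean_standard_uncertainty(a, b, n):
    """Standard deviation of the mean of n samples drawn uniformly
    from [a, b]: (b - a)/sqrt(12 n).
    """
    return (b - a) / math.sqrt(12 * n)
```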
<egg|zzz|egg> *stares*
<egg|zzz|egg> no, I want the times *between* periapsides, silly egg.
<egg|zzz|egg> not the times *of* periapsides
<egg|zzz|egg> (those are uniformly distributed, but i knew that....)
<egg|zzz|egg> okay, now actually looking at the thing i want, it's definitely not normal, also it's a mess
<egg|zzz|egg> it's, uh, trimodal
<egg|zzz|egg> correction, it's a mess
<egg|zzz|egg> bofh: https://i.imgur.com/fs3MQ8K.png PDF histograms for increasing sample sizes
<egg|zzz|egg> bofh: how do i model that
<egg|zzz|egg> bofh: n = 4497 is over 1 cycle of the precession of the nodes, and it's more uniform than n = 1000
<egg|zzz|egg> bofh: basically all my uncertainties are going to come from cyclic madness...
_whitelogger has joined #kspacademia
egg|cell|egg has quit [Ping timeout: 180 seconds]
egg|cell|egg has joined #kspacademia
<egg|work|egg> hm where is my phone
<egg|work|egg> egg
<egg|work|egg> egg
<egg|work|egg> egg
<bofh> 10:29:32 <@egg|zzz|egg> bofh: https://i.imgur.com/fs3MQ8K.png PDF histograms for increasing sample sizes
<bofh> 10:29:36 <@egg|zzz|egg> bofh: how do i model that
<bofh> bleh. hrm.
<bofh> yeah this is all UNIFORMLY distributed what the hell.
<egg|work|egg> bofh: it's probably just a sum of long-period sinusoids
<egg|work|egg> bofh: not sure how i model that, esp. if I don't know all periods involved
<egg|work|egg> (they might be longer than the sampling interval)
<egg|work|egg> gah did i leave my phone on the tram?
<bofh> sum of long-period sinusoids makes sense, but modeling that is actually quite horrid if I recall.
<egg|work|egg> gah, my phone is in a weird place
_whitelogger has joined #kspacademia
<bofh> egg|work|egg: on the tram?
<egg|zzz|egg> bofh: in a town south of zurich
<egg|zzz|egg> bofh: locked it and gave my landline
<egg|zzz|egg> had to go back home to do so because my device where I'm logged in with my non-corp account and also my authenticator are uh,
<egg|zzz|egg> this has been a productive workday
<bofh> This sounds very, erm, great.
<bofh> At least it's not Senegal.
<kmath> <bofh453> So, um, I found one of my stolen cellphones (the one stolen more recently *inside* Gare Saint-Lazare). ⏎ ⏎ It's in Sen… https://t.co/U4KnPlFpif
<egg|zzz|egg> bofh: yes, also I can conceivably imagine that the person who has it might want to return it
<egg|zzz|egg> bofh: in Paris there would be no uncertainty,
<kmath> <lukebovard> @bofh453 wait *one of*? how many burner phones do you have?
<egg|zzz|egg> bofh: do you have a phone btw, I've lost track
<bofh> I'm getting one again now that I'm in the states and kinda need one to do things.
<egg|zzz|egg> yay
<bofh> (I'm actually surprised at how well I got by in FR w/o one, tbh. I mean I guess I had to borrow fib's twice to call LaPoste, but that's the extent of my phone usage, and I could've just done that via SIP tunnel in retrospect).
<bofh> (also yes figures the two times I had to call someone it was LaPoste/chronopost).
<egg|zzz|egg> bofh: from location history, I seem to have left mine in the tram, and then it was taken onto a train to that town where it's sitting in a supermarket, so there's a chance somebody picking it up on their way to work to return it or something
<egg|zzz|egg> bofh: huh, Kempf is knight of the ONM https://fr.wikipedia.org/wiki/Jean-Baptiste_Kempf
<bofh> Huh, TIL.
<bofh> Oh wow that was RECENT too.
<bofh> I mean it makes sense.
<egg|zzz|egg> bofh: okay but how do i uncertainty
<egg|zzz|egg> uncertaintea
<egg|zzz|egg> bofh: okay yeah this is more illuminating than the histogram
<mlbaker> exponential sums are pretty terrifying
<mlbaker> if you figure out how to understand them let me know lol
<egg|zzz|egg> bofh: how do I model that to get its mean and what's the standard uncertainty https://i.imgur.com/SAgBNRi.png
<egg|zzz|egg> mlbaker: ^
<egg|zzz|egg> no idea what the period of that thing is even, this is about 6 years in a Molniya orbit
<egg|zzz|egg> bofh: meow
* UmbralRaptop is eggstremely unconvinced that standard uncertainty is meaningful.
<UmbralRaptop> I guess you could look at peak to peak or trough to trough distance and get a period of sorts.
_whitelogger has joined #kspacademia
<egg|zzz|egg> UmbralRaptop: well, standard uncertainty as computed assuming normally distributed measurements is meaningless, yes
<egg|zzz|egg> UmbralRaptop: the question is, what estimator should one use for the mean of that thing, then its eggsperimental standard deviation might make a good standard uncertainty
<egg|zzz|egg> uncertaintea
<egg|zzz|egg> bofh: so i just got called about my phone
<egg|zzz|egg> bofh: it's at the lost & found in the train station of that smol town
<egg|zzz|egg> (Adliswil)
<egg|zzz|egg> bofh: not actually at the supermarket
<egg|zzz|egg> bofh: see, in CH you lose your phone, you pick it up at the lost & found next town over, in FR you pick it up in SN,
<UmbralRaptop> "Formula (4.2) differs from the usual formulas for transformation under rotation of the coordinate axes in having hyperbolic functions in place of trigonometric functions. This is the difference between pseudo-euclidean and euclidean geometry."
<UmbralRaptop> TIL pseudo-euclidian geometry is a thing.
<UmbralRaptop> CATS!
<UmbralRaptop> Yeah, modeling periodic components is highly relevant.
<egg|zzz|egg> UmbralRaptop: is CATS a well-known thing, aside from cats?
<UmbralRaptop> Not that I'm aware of.
<bofh> 16:48:33 <@egg|zzz|egg> bofh: see, in CH you lose your phone, you pick it up at the lost & found next town over, in FR you pick it up in SN,
<bofh> lolsob
<egg|zzz|egg> bofh: okay but if you have a signal that has long-period components of unknown (potentially long) periods, how should you estimate the uncertainty of the sample average as an estimator of its mean
<egg|zzz|egg> bofh: aside from screaming
<bofh> I should know this since I actually had a problem eggsactly like that back 3 years ago, but I'm not fully certain about it. Rereading my assignments right now.
<egg|zzz|egg> bofh: meow
<egg|zzz|egg> bofh: okay that uncertainty is much more reasonable, eggscept in the n=9 case which is a mess
<bofh> egg|zzz|egg: OKAY I've seen that formula before and was puzzled as to why it was used, this makes sense.
<egg|zzz|egg> bofh: is there a way to estimate the error for smol n
<egg|zzz|egg> (maybe it's just "use higher integration tolerance, integrate moar orbits")
<bofh> (I think that should work fine, tbh)
<egg|zzz|egg> bofh: do you know of a nicer reference than "that stack overflow answer over there"?
* UmbralRaptop screams internally https://social.mecanis.me/@er1n/101455835452586146
<SnoopJeDi> does the paper linked in the answer address the same concerns?
<bofh> egg|zzz|egg: uhh few moments, tho this stackoverflow answer is very well written
<bofh> skimming the linked paper currently
<SnoopJeDi> seems like the paper is heavier on building up the mechanics of dealing with it, although the references may hold something more discussion-y
<egg|zzz|egg> bofh: SnoopJeDi: the linked paper is in econometrics, it amuses me that I end up reading that to do astronomy Ꙩ_ꙩ
<SnoopJeDi> stats is stats
<bofh> ^
<UmbralRaptop> egg|zzz|egg: I mean, galaxy people yoinked the gini coefficient from econ…
<bofh> wait, the heck are Gini coeffs used for in Astrophys?
<egg|zzz|egg> bofh: oh also I need to buy a polar scope illuminator since I lost the one at ANBO somehow
<bofh> egg|zzz|egg: how the shit did that happen btw? I thought that thing screws in quite well?
<egg|zzz|egg> I have no idea
<egg|zzz|egg> it just wasn't here when I looked
<egg|zzz|egg> I don't know when it vanished
<egg|zzz|egg> we didn't polar align last time, so maybe it was already gone then, or we lost it that time, who knows
* UmbralRaptop is going to assume that a cow stole it.
<bofh> Yeah, I'm confused too.
<SnoopJeDi> I wish my stats-fu was good enough to even follow the problem being posed here, much less the answers in that SE. I guess that's stats!
<egg|zzz|egg> SnoopJeDi: my stats fu is nonexistent
<egg|zzz|egg> brb food
<SnoopJeDi> hm, perhaps it's just the phrasing, then. I know enough to be leary of it, at any rate.
<egg|zzz|egg> bofh: meeeooow
_whitelogger has joined #kspacademia
<egg|zzz|egg> bofh: meow
<bofh> egg|zzz|egg: still reading, but also not sure what question you still have at this point since that SO post should have answered the last one you asked me
<bofh> +?*
<egg|zzz|egg> bofh: dunno, do you have other material, other than that my question is mainly meowing
* egg|zzz|egg meows at bofh's door
<egg|zzz|egg> bofh: completely unrelated, came across this video of the ceremony for those two firemen https://esper.irclog.whitequark.org/kspacademia/2019-01-12#1547315154-1547315402; https://twitter.com/PompiersParis/status/1085970063562293248
<kmath> <✔PompiersParis> A nos frères d’armes. Sergent Simon Cartannaz, Caporal Nathanaël Josselin. https://t.co/8Qyn0NGCcG
<egg|zzz|egg> bofh: huh, the paris fire brigade is also in charge of firefighting at the space centre in Kourou
* bofh pets egg|zzz|egg
pizzaoverhead has joined #kspacademia
<egg|zzz|egg> pizzaoverhead!
pizzaoverhead has quit [Ping timeout: 189 seconds]
* UmbralRaptop saw someone wearing shorts today. It's 266 K right now.
* UmbralRaptop assumes that they're a witch, or something.
<egg|zzz|egg> bofh: hm, but that paper cited in the stackoverflow answer has multiple individuals with independent time series, whereas I have just one time series
<egg|zzz|egg> bofh: hm 10.2307/2237156 seems relevant?
pizzaoverhead has joined #kspacademia
<egg|zzz|egg> hm
<egg|zzz|egg> i zhould zzz
<egg|zzz|egg> esp since i need to pick up my phone tomorrow morning
pizzaoverhead has quit [Quit: Leaving]
<UmbralRaptop> "Houck would become the PI for SIRTF’s infrared spectrograph in 1984."
<UmbralRaptop> 19 years before it launched D:
_whitelogger has joined #kspacademia