egg|nomz|egg changed the topic of #kspacademia to: https://gist.github.com/pdn4kd/164b9b85435d87afbec0c3a7e69d3e6d | Dogs are cats. Spiders are cat interferometers. | Космизм сегодня! | Document well, for tomorrow you may get mauled by a ネコバス. | <UmbralRaptor> egg|nomz|egg: generally if your eyes are dewing over, that's not the weather. | <ferram4> I shall beat my problems to death with an engineer.
armed_troop has joined #kspacademia
e_14159 has quit [Ping timeout: 186 seconds]
e_14159 has joined #kspacademia
<egg|zzz|egg> hm where is my phone
<egg|zzz|egg> egg|phone|egg:
<egg|phone|egg> Meow
* UmbralRaptop did not know that one could use a cat as a phone.
<Ellied> yeah, so it looks like I'm not getting that LED panel working properly today.
armed_troop has quit [Read error: -0x7880: SSL - The peer notified us that the connection is going to be closed]
awang has joined #kspacademia
<kmath> <iximeow> i wonder how many people have experienced the problem i just caused for myself, because i think it's probably <10
<iximeow> bofh: sysv x86_64 calling convention only matches linux syscall convention for the first three parameters
<iximeow> fourth parameter for sysv x86_64 is in RCX, fourth parameter for linux x86_64 syscalls is in r10
<iximeow> i just spent all day learning that, in a very roundabout manner, after ptrace seemed to take correct arguments and then do incorrect things with them (i have no libc but i'm targeting linux for Reasons)
<bofh> iximeow: ...fucking incredible I actually got hit by that exact issue myself. (also it differs *only* in the fourth parameter, 5 and 6 are the same too).
<iximeow> YES
<bofh> wasted way too much time trying to figure out why mremap() wasn't working; unfortunately ptrace() was unhelpful for the same reason. then it occurred to me to objdump a glibc syscall wrapper and I facepalmed.
<iximeow> bofh: why did you run into it?
<iximeow> aha
<iximeow> i wrote the same functionality using glibc ptrace and stepped through until the syscall
<bofh> like I was generating my own syscall wrappers b/c I needed to target a system lacking a libc.
<iximeow> seeing 0 in rcx, but knowing i'd passed a pointer, clued me in...
<iximeow> that's roughly what i'm doing as well
<iximeow> my "clever" implementation was just "mov rax, SYSCALL_NUM; syscall; ret"
<iximeow> works great for read/write/exit.... :(
<bofh> like that's basically it, but you need a mov r10, rcx for >=4 param syscalls.
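A minimal sketch of the wrapper being described, assuming GCC-style inline asm on libc-free x86_64 Linux; the name raw_syscall4 is made up here, not taken from either paste:
```c
/* Libc-free 4-argument Linux x86_64 syscall: the SysV function ABI delivers
 * the 4th argument in RCX, but the kernel expects it in R10, hence the
 * register shuffle below.  The first three argument registers (RDI, RSI, RDX)
 * coincide with the function ABI, which is why 3-argument stubs "just work". */
static inline long raw_syscall4(long nr, long a1, long a2, long a3, long a4)
{
    register long r10 __asm__("r10") = a4;   /* 4th arg goes to r10, not rcx */
    long ret;
    __asm__ volatile ("syscall"
                      : "=a"(ret)
                      : "a"(nr), "D"(a1), "S"(a2), "d"(a3), "r"(r10)
                      : "rcx", "r11", "memory");  /* syscall clobbers rcx/r11 */
    return ret;
}
```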
<bofh> oh, other ""fun"" thing I got hit with that took FAR too long to figure out: https://pastebin.com/AiZpmqu8
<bofh> specifically, stack and pointer canaries are in thread-local storage on x86_64, so I had to figure out how to do a minimal init of TLS.
<bofh> also I just noticed my comment says FS when in reality I am correctly setting GS, oops.
<bofh> https://pastebin.com/ADb0Tnj4 there we go, typo fixed in case anyone for whatever inexplicable reason wants to use this code (please don't tho).
<iximeow> hahahahahaha
<iximeow> jesus
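A hedged sketch of the minimal TLS init bofh is describing (not his pastebin): gcc's stack protector reads the canary from %fs:0x28 on x86_64 and the pointer guard sits at %fs:0x30, so a libc-free binary can point %fs at a small block via arch_prctl. Offsets and constants are the usual glibc tcbhead_t / kernel values as remembered here, so verify them; the code leans on the raw_syscall4 sketch above.
```c
#define ARCH_SET_FS   0x1002            /* arch_prctl code */
#define NR_arch_prctl 158               /* x86_64 syscall number */

static unsigned long tcb[64];           /* stand-in thread control block */

static void tls_init_minimal(unsigned long canary)
{
    tcb[0] = (unsigned long)tcb;        /* tcbhead_t.tcb: points to itself */
    tcb[2] = (unsigned long)tcb;        /* tcbhead_t.self */
    tcb[5] = canary;                    /* offset 0x28: stack_guard */
    tcb[6] = canary ^ 0x5a5a5a5a;       /* offset 0x30: pointer_guard (any secret value) */
    raw_syscall4(NR_arch_prctl, ARCH_SET_FS, (long)tcb, 0);
}
```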
<iximeow> oh the other fun "syscall" i dealt with recently is sleep()
<iximeow> which doesn't exist from the kernel perspective (libc wraps nanosleep())
<bofh> yeah so just call nanosleep lol
<iximeow> nanosleep involves all this junk about a struct and *defining* the struct and i was feeling the right kind of lazy to just https://www.iximeow.net/h/h4f5317xc9.pl
<iximeow> which worked right off the bat \o/
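The linked paste isn't quoted here, but the shape being described is roughly the following, again using the raw_syscall4 sketch from above; 35 is __NR_nanosleep on x86_64 and the struct is just two longs:
```c
/* sleep() as the kernel actually sees it: nanosleep(&req, NULL). */
struct timespec_raw { long tv_sec; long tv_nsec; };

static void sleep_seconds(long secs)
{
    struct timespec_raw req = { secs, 0 };
    raw_syscall4(35 /* __NR_nanosleep */, (long)&req, 0 /* no remainder */, 0, 0);
}
```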
<bofh> AT&T Syntax :(
<iximeow> easier to just write it and move on than do the gas incantation to switch to/from :(
<iximeow> since most of these are five lines and three instructions
<iximeow> ... i bet a macro would clean up a lot of this. hm.
<bofh> .intel_syntax noprefix at the first line
<bofh> and yeah, use a macro. uh sec.
<bofh> http://csclub.uwaterloo.ca/~pbarfuss/crt1_amd64.asm is what I have for syscalls I needed as well as init code and some simple functions. also implements thread-local errno in a glibc-compatible manner.
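The macro iximeow wants is a gas .macro in asm; a C-preprocessor analogue of the same idea, stamping one thin stub per syscall on top of the raw_syscall4 sketch above (syscall numbers are the usual x86_64 ones, quoted from memory):
```c
#define DEFINE_SYSCALL4(name, nr)                                  \
    static inline long name(long a1, long a2, long a3, long a4)   \
    {                                                              \
        return raw_syscall4((nr), a1, a2, a3, a4);                 \
    }

DEFINE_SYSCALL4(sys_ptrace, 101)   /* __NR_ptrace: 4 args, the case that bites */
/* mremap (__NR_mremap = 25) takes a 5th argument when MREMAP_FIXED is used,
 * so it would want a raw_syscall5 variant with the 5th arg in r8. */
```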
<bofh> oh no, it *is* fs: that is used, right. I keep mixing that up with Windows, which uses gs: for its struct TEB pointer.
<iximeow> only on 64bit :)
<iximeow> though i think both are valid iff you're a WoW64 process?
<bofh> so all I care about is 64-bit, lol.
<iximeow> i'm jealous
armed_troop has joined #kspacademia
Majiir is now known as Snoozee
* UmbralRaptop 🔪 sleep again
<UmbralRaptop> Anyway, if you're bored with float insanity, have some flux insanity http://www.sdss.org/dr12/algorithms/magnitudes/
<UmbralRaptop> "The transformation from linear flux measurements to asinh magnitudes is designed to be virtually identical to the standard astronomical magnitude at high signal-to-noise ratio, but to behave reasonably at low signal-to-noise ratio and even at negative values of flux, where the logarithm in the Pogson magnitude fails."
<UmbralRaptop> negative flux. Ꙫ_ꙫ
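For reference, the asinh ("luptitude") definition the SDSS page is quoting, in the Lupton, Gunn & Szalay (1999) form with softening parameter b (written from memory, so check the page):
```latex
m = -\frac{2.5}{\ln 10}\left[\operatorname{asinh}\!\left(\frac{f/f_0}{2b}\right) + \ln b\right]
```
which stays finite and real even for f ≤ 0, unlike the Pogson magnitude −2.5 log₁₀(f/f₀).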
<bofh> Negative flux? That sounds more like a silly mathematical quirk in how we define flux at low intensities than an actual real thing.
<egg|zzz|egg> meow
tawny has quit [Ping timeout: 198 seconds]
tawny has joined #kspacademia
tawny has quit [Ping timeout: 383 seconds]
<UmbralRaptop> bofh: something about subtracting backgrounds, usually,
<UmbralRaptop> Also, the shuttle orbiters were heavier than Skylab? Ꙩ_ꙩ https://twitter.com/planet4589/status/977789605171924992
<kmath> <✔planet4589> OK you've all been trying hard, here's a labelled version to help you out. DOS-3 and DOS-6 are better known by thei… https://t.co/zMcmkjoF3J
<kmath> <JCTArtStudio> This is #Microraptor, AKA Shiny Shiny Micro Babe. Relatively adorable, but is essentially an airborne collection of… https://t.co/Wp2U71VCZD
<egg|zzz|egg> UmbralRaptop: birb!
<UmbralRaptop> Dragonfly (well, meganeuropsis) sized birb.
* UmbralRaptop bites his tongue to avoid saying "IDL" https://twitter.com/PlavchanPeter/status/977882550445268993
<kmath> <PlavchanPeter> The Perl programming language was such a flash in the pan in hindsight. I was pretty proficient with it. It was on… https://t.co/t3EHRjcm7z
<UmbralRaptop> note: this is my advisor.
tawny has joined #kspacademia
<egg|zzz|egg> !wpn UmbralRaptop
* Qboid gives UmbralRaptop a Fréchet mine with a katar attachment
<UmbralRaptop> !wpn egg|virus|egg
* Qboid gives egg|virus|egg an isobaric feathered electron
<egg|zzz|egg> bofh: yeah I think I see the general idea of obtaining the magic constant
tawny has quit [Ping timeout: 186 seconds]
<egg|zzz|egg> bofh: at least in the rsqrt paper, and the [cq]brt thing feels similar (and rootn is basically the same?)
<egg|zzz|egg> bofh: we should write something about the general case,
<egg|zzz|egg> [something something ANBO letters]
<whitequark> !wpn egg
* Qboid gives egg a radon spheromak
* UmbralRaptop grumbles at the lack of buses before 10 am today.
<egg|zzz|egg> !wpn whitequark
* Qboid gives whitequark a subsonic frangible combinator
<egg|zzz|egg> bofh: I have a closed form for Kahan's 0.1009678 that an IRC message is too short to contain
<egg|zzz|egg> bofh: I can express it as a root though, γ such that (12 - 4 * 2**(2/3) + 2**(2/3) * γ - 6 / (1 + γ)**(1/3)) / 6 = 0
<UmbralRaptop> And he didn't set Brownback or Kobach on fire? https://twitter.com/johnregehr/status/976704889765023744
<kmath> <johnregehr> shoot once again my dad is burning pastures in Kansas and instead of helping I'm stuck in stupid Utah doing this "j… https://t.co/FMPqGrs0xZ
<bofh> egg|zzz|egg: okay my next question is: what's special about that polynomial?
<bofh> I am extremely curious now
<bofh> http://mathb.in/ is handy for this sorta stuff btw
<egg|zzz|egg> bofh: it's fairly trivial and tedious: the relative error is periodic over three powers of two, you get nice expressions for the result of the integer arithmetic in there, you find where the maxima are depending on γ, and you minimize the maximal maximum; basically exactly the same as that invsqrt paper from uwaterloo, the plots even look similar
<bofh> Oh, wow. So it probably directly generalizes to the rootn case, even.
Technicalfool_ has joined #kspacademia
<bofh> (I put off trying to directly apply that paper yesterday since I needed to do research yesterday, well, condensed matter research. Heh).
UmbralRaptop has quit [Quit: Bye]
UmbralRaptop has joined #kspacademia
<egg|zzz|egg> bofh: now, interestingly, this constant is *not* optimal after one round of Newton or after one round of Halley!
Technicalfool has quit [Ping timeout: 186 seconds]
<UmbralRaptop> !wpn
* Qboid gives UmbralRaptop a mad commutator
<egg|zzz|egg> bofh: the constant that is optimal after one round of Newton is the root of a 6th degree polynomial, 0.09859583240; the constant that is optimal after one round of Halley is probably something even less sane, and is 0.099187461529
<egg|zzz|egg> bofh: which furiously resembles the constant in that libm
UmbralRaptop has quit [Ping timeout: 186 seconds]
<egg|zzz|egg> bofh: so not only does the general method from that invsqrt BS thesis apply, but the caveat that optimizing the approximation doesn't optimize the result of an iterate also applies
UmbralRaptop has joined #kspacademia
<bofh> egg|zzz|egg: does the difference in the constant between optimal overall and optimal over one round of Newton matter considerably?
<bofh> I mean the difference in those values is tiny, when converted to fixed-point it'll be < 30 of a difference.
<egg|zzz|egg> bofh: in the Newton case, max. relative error of 0.00099297280 instead of 0.0010393036 after one iterate; in the Halley case, 0.00002051443 instead of 0.0000219579
<egg|zzz|egg> bofh: so not much of a change, but that's true for invsqrt too
UmbralRaptop has quit [Quit: Bye]
<bofh> Huh. Not much, but larger than I thought, particularly in the Newton case.
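The machinery under discussion, as a hedged C sketch (binary32 for brevity; the function names and the free correction term sigma are illustrative, not egg's γ or Kahan's 0.1009678, though they play the same role — sigma = 0 gives the uncorrected seed, and the "optimal after one round of Newton" constants correspond to tuning it for the post-iteration error instead of the seed error):
```c
#include <stdint.h>
#include <string.h>
#include <math.h>

/* x^(1/n) seed via the exponent-bits trick: the bit pattern I_x of a binary32
 * x > 0 is roughly 2^23 * (log2 x + 127 - sigma), so
 * I_x / n + (1 - 1/n) * 2^23 * (127 - sigma) approximates the bits of x^(1/n). */
static float nth_root_seed(float x, int n, double sigma)
{
    uint32_t i;
    memcpy(&i, &x, sizeof i);                        /* bits of x */
    double magic = (1.0 - 1.0 / n) * 8388608.0 * (127.0 - sigma);
    i = (uint32_t)(i / n + magic);                   /* affine map on the integer representation */
    float y;
    memcpy(&y, &i, sizeof y);
    return y;
}

static float nth_root(float x, int n, double sigma)
{
    float y = nth_root_seed(x, n, sigma);
    /* one Newton step on f(y) = y^n - x; powf only to keep the sketch short,
     * real code would do the small integer power by repeated multiplication */
    float yn1 = powf(y, (float)(n - 1));
    return y - (y * yn1 - x) / (n * yn1);
}
```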
<kmath> <DynamicsSIAM> Apparently, #ChaosIn3Words is a trending meme. ⏎ ⏎ How about: positive Lyapunov exponent
<egg|zzz|egg> (why is that a meme though)
<bofh> ¯\_(ツ)_/¯
tawny has joined #kspacademia
tawny has quit [Ping timeout: 198 seconds]
StCipher has joined #kspacademia
StCypher has quit [Quit: Leaving]
StCipher has quit [Client Quit]
StCypher has joined #kspacademia
<egg|zzz|egg> bofh: so, this should generalize to rootn; are there any other useful things that are nicely approximated by an affine map of the integer representation?
UmbralRaptop has joined #kspacademia
<egg|zzz|egg> !wpn UmbralRaptop
* Qboid gives UmbralRaptop a superconducting involution
Snoozee is now known as Majiir
* UmbralRaptop implodes.
<bofh> egg|zzz|egg: exp(x): https://pastebin.com/WvBuFKHQ
<egg|zzz|egg> RMS? but why
<bofh> I needed to minimize L^2 error in my application (DSP); ideally you want to minimize L^{\infty} error, yeah.
<egg|zzz|egg> also, approximating exp by a line in log-log? Ꙩ_ꙩ
<SilverFox> !wpn Majiir
* Qboid gives Majiir an ones-complement demi-culverin
<Majiir> !wpn egg|zzz|egg
* Qboid gives egg|zzz|egg a xenon ≈
<bofh> egg|zzz|egg: it works better than you'd naïvely expect, try it.
<UmbralRaptop> Numerics is definitely closer to magic than science.
<bofh> I use it in code that emulates a pair of MOSFETs in saturation mode; it's audio synthesis, so the output is truncated to 16 bits anyhow, but an exponential characteristic is vital.
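A sketch of the exp-by-affine-map trick in the Schraudolph (1999) style, not bofh's pastebin: compute the desired exponent field directly as a*x + b and drop it into the high 32 bits of a binary64. The 60801 offset below is the value commonly quoted for the RMS-flavoured variant, stated from memory; the function name is made up.
```c
#include <stdint.h>
#include <string.h>

/* Valid roughly for |x| < 700 (the same range where real exp() doesn't
 * overflow); outside that, the exponent field wraps and the result is junk. */
static double fast_exp(double x)
{
    /* 1048576 / ln 2 = 1512775.395... scales x into the binary64 exponent
     * field; 1072693248 = 1023 << 20 is the exponent bias in position;
     * 60801 is the error-minimizing offset (quoted from memory). */
    int32_t hi = (int32_t)(1512775.395195186 * x) + (1072693248 - 60801);
    uint64_t bits = (uint64_t)(uint32_t)hi << 32;    /* low 32 mantissa bits left zero */
    double y;
    memcpy(&y, &bits, sizeof y);
    return y;
}
```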
<egg|zzz|egg> UmbralRaptop: if done on gpus is it alchemy
<UmbralRaptop> FMA?
<UmbralRaptop> egg|zzz|egg: maybe
UmbralRaptor has joined #kspacademia
UmbralRaptor has quit [Client Quit]
UmbralRaptor has joined #kspacademia
UmbralRaptop has quit [Ping timeout: 190 seconds]
<egg|zzz|egg> bofh: I mean obviously there's pown but since pow (which is evil) coincides with pown when the exponent is integer (except perhaps on special values) you are more likely to have a pown than a rootn
<bofh> a sentence I just read, somehow: "This allows service techs to come out and update the firmware installed on the toilet."
<bofh> yeah
UmbralRaptop has joined #kspacademia
UmbralRaptop has quit [Client Quit]
UmbralRaptop has joined #kspacademia
UmbralRaptor has quit [Ping timeout: 198 seconds]
<egg|zzz|egg> bofh: I'm not sure I see why IEEE recommends a (1+x)^n; where is the ill-conditioning that this works around?
<bofh> Nor am I, for integer n that's silly.
<egg|zzz|egg> bofh: hm, maybe smol x?
<egg|zzz|egg> but not sure what the use case is
<UmbralRaptop> Random annoyance: anbo.blogspot.com and anbol.blogspot.com are taken.
<egg|zzz|egg> .github.io?
<UmbralRaptop> Looks open?
<egg|zzz|egg> UmbralRaptop: or I can just write things in the documentation directory of principia :-p
<UmbralRaptop> Hah
<egg|zzz|egg> bofh: it seems the constant grows with higher roots
<egg|zzz|egg> bofh: it also seems extremely tedious to find it for general n, since it's not always the same maxima crossing
<bofh> Yeah, I'm not sure if it's worth it for higher roots, given the complexity of root-finding on high-order polynomials.
<egg|zzz|egg> bofh: I mean, whatever you're using higher roots for, it's not solving polynomial roots (theta functions!), but occasionally they appear
<egg|zzz|egg> bofh: but yeah, it means if you want to implement a general rootn eventually it makes sense to give up on that magic number and compute offline the values for small n
<egg|zzz|egg> bofh: amusingly even for n=2^63-1 rootn is not constant, even when restricted to normalized values (it is constant at 2^64-1 though)
<bofh> FUCK THETA FUNCTIONS
<egg|zzz|egg> :D
<bofh> 22:20 <mlbaker> there honestly needs to be an ISO spec for theta functions
<bofh> 22:20 <mlbaker> so annoying
<bofh> 22:54 <mlbaker> you know, i think that'll actually be my first paper
<bofh> 22:54 <mlbaker> a natural structure of smooth deligne-mumford stack on the moduli space of all possible fucking *systems* of notation for this bullshit
<egg|zzz|egg> :D
<egg|zzz|egg> bofh: s/FU/MO
<Qboid> egg|zzz|egg thinks bofh meant to say: 22:54 <mlbaker> a natural structure of smooth deligne-mumford stack on the moduli space of all possible MOcking *systems* of notation for this bullshit
<egg|zzz|egg> no that's not what I meant Qboid
<egg|zzz|egg> bofh: s/^FU/MO
<Qboid> egg|zzz|egg thinks bofh meant to say: MOCK THETA FUNCTIONS
<bofh> rofl
<SnoopJeDi> bofh, I had the pleasure of reading this today, thought you'd enjoy: https://twitter.com/europlanetmedia/status/977924338753134592
<kmath> <europlanetmedia> A Unique View Of The Moon by Lunar Reconnaissance Orbiter's Wide Angle Camera https://t.co/565Wgqbek9 https://t.co/1rAXGuP7h8
<SnoopJeDi> although since you also follow @RidingWithRobots you may have seen it :)
<bofh> I do but I somehow scrolled past it, so the re-link is extremely appreciated. :3
UmbralRaptor has joined #kspacademia
UmbralRaptop has quit [Ping timeout: 186 seconds]
tawny has joined #kspacademia
<kmath> <BMatB> Currently writing up our new #EBSD indexing algorithm - and it's super cool. Originally the method was developed to… https://t.co/mmEeEpOOl7
UmbralRaptor has quit [Quit: Bye]
UmbralRaptop has joined #kspacademia
UmbralRaptor has joined #kspacademia
UmbralRaptor has quit [Client Quit]
UmbralRaptor has joined #kspacademia
UmbralRaptop has quit [Ping timeout: 186 seconds]
<SnoopJeDi> bofh, whoa, neat!
UmbralRaptop has joined #kspacademia
UmbralRaptor has quit [Ping timeout: 198 seconds]
<bofh> https://twitter.com/PeroxideFormer/status/977995308239081473 OKAY I NEED THIS SET TO MUSIC SOMETIME.
<kmath> <PeroxideFormer> Don't stop, make the hop ⏎ F-C, vibe my level down ⏎ photon, im not done ⏎ till we hit the S1 ⏎ tick tock, on the clock ⏎ but… https://t.co/HdO7Cqpnre
egg|phone|egg has quit [Remote host closed the connection]
<egg|zzz|egg> !wpn bofh
* Qboid gives bofh a [DATA EXPUNGED] centripetal barber
tawny has quit [Read error: Connection reset by peer]
tawny has joined #kspacademia
<egg|zzz|egg> !wpn wwh
* Qboid gives wwh a Carnot sabre
<egg|zzz|egg> um
<egg|zzz|egg> !wpn whitequark
* Qboid gives whitequark an unitary 💩
<egg|zzz|egg> ...
<egg|zzz|egg> !wpn bofh
* Qboid gives bofh an unique durandal-like heat pump
<bofh> !wpn egg|zzz|egg
* Qboid gives egg|zzz|egg a Renesas tungsten rope
<egg|zzz|egg> UmbralRaptop: not sure if telescope or actual scope https://twitter.com/chordowl/status/977262327308398592
<kmath> <chordowl> tired: scope too big ⏎ wired: don't even have a scope
<UmbralRaptop> egg|zzz|egg: unsure if telescope or project scope.
<bofh> ^
tawny has quit [Quit: 「Roundabout」 - To Be Continued]
tawny has joined #kspacademia
<kmath> <paniq> Q: what is square and pink and goes "oink"? ⏎ A: a pigxel ⏎ ⏎ Q: what is cubic and red and has a bushy tail? ⏎ A: a foxel
<egg|zzz|egg> bofh: the constants minimizing the error of the initial approximation (so like Kahan's constant, not that libm's), for roots 3rd through 64th https://i.imgur.com/vGYFLjh.png
<egg|zzz|egg> almost but not quite a line?
<egg|zzz|egg> (minimization done in 100 sig. dec. arithmetic so any visual nonlinearity is probably actually a thing)
<bofh> that's unexpected. like there's slight nonlinearity there around n ~ 23, but otherwise yeah, that's weirdly linear.
<egg|zzz|egg> bofh: almost-but-not-quite results can be interesting and also really tricky :-p
<egg|zzz|egg> something something 163
<bofh> There's a clear relation there at the very least, of *some* form.
<egg|zzz|egg> bofh: note: those are constants based on the infinite-mantissa piecewise representation of the affine map, floating-point-format-independent, so fairly simple (like the analysis done by your colleague robertson)
<bofh> Yeah but that's the format that likely will have the more fundamental underlying mathematical structure if there *is* any.
<egg|zzz|egg> yeah, I would not expect adding roundings to 53 bits to add clarity to that soup
<egg|zzz|egg> bofh: um, the relative errors are weird https://i.imgur.com/gxbPplm.png
<bofh> rofl
<bofh> what the hell happens for n=23?
<egg|zzz|egg> bofh: I mean 23 is a special value, in that the intersection determining the minimum is between, uh, different things? (you should fly over here and look at the graphs...)
<egg|zzz|egg> there's one particular segment of the approximation for γ>0 which behaves differently (and has maximal error at a value of 1 on the interval [1+γ, (1+γ) 2**n[ afaict)
<egg|zzz|egg> and this segment has the maximal error up to 23
<bofh> Iiiiiiiiiiiiiiiinteresting.
<egg|zzz|egg> (23rd roots are interesting?!)
<egg|zzz|egg> is this going to start summoning simple groups,
<bofh> apparently!
<bofh> at the "23rd roots are interesting" bit. no idea if simple groups are going to get invoked yet.
<bofh> I feel like you have enough material for a paper with a bit more staring and maths. :P
<egg|zzz|egg> yeah
<egg|zzz|egg> bofh: also I feel like we should summon atlas to see whether this is all obvious and already known, it feels like it should be
<egg|zzz|egg> (I mean, rsqrt is famously old, and that Kahan cube root similarly, rootn is an IEEE recommendation)
<bofh> I don't think anyone's done the higher-order root results, and even if they have, I don't know if they've done general trend analysis. Hm.
<bofh> Like I need to stare at that paper a bit and reread your mathb.in, sec.
<tawny> what's this all for in the first place anyway
<bofh> curiosity, largely. :P
<tawny> fair enough haha
<egg|zzz|egg> bofh: I mean the mathb.in is actively useless tbh
<bofh> oh?
<egg|zzz|egg> it's just whatever closed form Mathematica gives for the root of 12 - 4 * 2**(2/3) + 2**(2/3) * γ - 6 / (1 + γ)**(1/3), which I wouldn't expect to be nice
<egg|zzz|egg> bofh: so the thing I'm minimizing here is Max @@ Join[{relativeError[1, (1 + \[Gamma])^(1/n)]}, Table[relativeError[(1 + k + n - \[Gamma])/n, (2^(1 + k))^(1/n)], {k, 0, n - 1}]]
<egg|zzz|egg> bofh: do you have a Mathematica license at your institution? otherwise I can eggsplain it enough that you can feed it to whatever you want
<egg|zzz|egg> (my relaiveError takes actual, expected, not the reverse)
<egg|zzz|egg> s/aive/ative/ argh
<egg|zzz|egg> tawny: I blame the flu tbh
<egg|zzz|egg> also principia legitimately has a rootn with compile-time n somewhere (but one that requires low precision really)
<egg|zzz|egg> !wpn bofh
* Qboid gives bofh a global Toblerone-like 2N3906
<kmath> <sigfig> so now i have a symplectic integrator that needs optimizing and a messy combinator approach to constructing hamiltonians
<bofh> Ahh.
<bofh> So like, I think that's enough of an explanation for me to figure out the rest (I prolly do have a mathematica license but this is more interesting for me to rederive from this).
<egg|zzz|egg> bofh: Mathematica is really nice to poke at this stuff fwiw
<egg|zzz|egg> (if you want IEEE float you'll want your own soft-float, but I have that; the thing natively has arbitrary precision float (probably decimal?) as well as some sort of "machine" float that looks like binary64 on a binary64 exponent or something weird like that)
<egg|zzz|egg> bofh: also sigfig is apparently redoing principia?
<bofh> well binary64 on a binary64 (or uint64_t?) exponent is the standard matlab/octave format for many things.
<bofh> also huh, you should probably let her know :P
<egg|zzz|egg> bofh: she has starred our repo fwiw https://github.com/mockingbirdnest/Principia/stargazers?page=3
<egg|zzz|egg> (or at least someone who is also in austin and also called sig and has the same profile pic :-p)
<bofh> heh
TonyC has joined #kspacademia
TonyC1 has quit [Ping timeout: 190 seconds]
<egg|zzz|egg> bofh: btw, https://i.imgur.com/Yq3LWpU.png
<egg|zzz|egg> a clearer picture of what happens
<egg|zzz|egg> this is the (positive n, large n) analogue of figure 4.3 of https://cs.uwaterloo.ca/~m32rober/rsqrt.pdf
<egg|zzz|egg> (probably t is nontrivially related to γ too but it's basically the same thing)
<egg|zzz|egg> but interestingly, the error without a correction factor gets significantly worse for large roots, it's > 6% at the 23rd
<egg|zzz|egg> <bofh> also huh, you should probably let her know :P <<< also unsure how to do that because I'm bad at social interaction esp. on the diapsid website >_>
<bofh> 23:14:38 <@egg|zzz|egg> but interestingly, the error without a correction factor gets significantly worse for large roots, it's > 6% at the 23rd
<bofh> I mean that feels like it would make sense intuitively, just not sure how to explain why.
<egg|zzz|egg> mathematica notebooks are nice to tinker with but gods do they make for an insane codebase ~instantly
* egg|zzz|egg stares
<egg|zzz|egg> now I'm not sure what's actually going on at that 23?
<egg|zzz|egg> bofh: hm, I'm not sure I trust that 23, it might just be a bug in my piecewise mess that manifests only at 23?
<bofh> so there's still a clear (mostly) monotonically decreasing rel. error curve to 23, and a monotonically increasing one from 23, so...
<egg|zzz|egg> I think my piecewise decomposition is wrong for >= 23 somehow?
<egg|zzz|egg> yeah there's something fishy
<egg|zzz|egg> bofh: yeah *something* happens at 23, and from there onwards my analysis is incorrect
<egg|zzz|egg> 23 is still magic tho