egg|nomz|egg changed the topic of #kspacademia to: https://gist.github.com/pdn4kd/164b9b85435d87afbec0c3a7e69d3e6d | Dogs are cats. Spiders are cat interferometers. | Космизм сегодня! | Document well, for tomorrow you may get mauled by a ネコバス. | <UmbralRaptor> egg|nomz|egg: generally if your eyes are dewing over, that's not the weather. | <ferram4> I shall beat my problems to death with an engineer.
UmbralRaptop has quit [Ping timeout: 186 seconds]
<bofh>
UmbralRaptor: sounds like Jackson
<UmbralRaptor>
>_<
e_14159 has quit [Ping timeout: 186 seconds]
e_14159 has joined #kspacademia
egg|zzz|egg has quit [Read error: -0x1: UNKNOWN ERROR CODE (0001)]
StCypher has joined #kspacademia
StCipher has joined #kspacademia
* UmbralRaptor
blinks in confusion at whitequark's twitter avatar.
StCypher has quit [Ping timeout: 383 seconds]
<whitequark>
bwahaha
tawny has quit [Remote host closed the connection]
tawny has joined #kspacademia
<UmbralRaptor>
!choose Astronautics Day|Cosmonautics Day|Yuri's Night
<Qboid>
UmbralRaptor: Your options are: Astronautics Day, Cosmonautics Day, Yuri's Night. My choice: Cosmonautics Day
<whitequark>
tired: Yuri's Night
<whitequark>
wired: Yuri Night
* UmbralRaptor
admittedly does run into that namespace collision some years.
<bofh>
I mean I keep misreading it as Yuri Night constantly >_>
<whitequark>
!choose Yuri's Night|Yuri Night
<Qboid>
whitequark: Your options are: Yuri's Night, Yuri Night. My choice: Yuri Night
<kmath>
<robert_watson> Great talk on scattering from bees. Room is buzzing. @eucap2018 https://t.co/1ZHR41Blm2
egg has joined #kspacademia
<egg>
whitequark: so you are a cat now?
<whitequark>
always
<egg>
yay
* egg
pets whitequark
* egg
gives whitequark a moth
egg|cell|egg has joined #kspacademia
egg|phone|egg has quit [Ping timeout: 186 seconds]
tawny has quit [Ping timeout: 198 seconds]
egg|phone|egg has joined #kspacademia
egg|cell|egg has quit [Read error: Connection reset by peer]
egg|cell|egg has joined #kspacademia
egg|mobile|egg has joined #kspacademia
egg|cell|egg has quit [Read error: Connection reset by peer]
egg|phone|egg has quit [Ping timeout: 186 seconds]
awang has joined #kspacademia
egg|phone|egg has joined #kspacademia
egg|mobile|egg has quit [Read error: Connection reset by peer]
egg|cell|egg has joined #kspacademia
egg|phone|egg has quit [Read error: Connection reset by peer]
egg|phone|egg has joined #kspacademia
egg|cell|egg has quit [Read error: Connection reset by peer]
egg|cell|egg has joined #kspacademia
egg|phone|egg has quit [Read error: Connection reset by peer]
<egg|work|egg>
!wpn Iskierka
* Qboid
gives Iskierka a retrobop/haloalkane hybrid
<egg|work|egg>
!wpn Thomas
* Qboid
gives Thomas an elliptical heat sink
UmbralRaptor is now known as EntropicRaptor
<egg|work|egg>
!wpn EntropicRaptor
* Qboid
gives EntropicRaptor a recurve Sagan
<egg|work|egg>
!wpn whitequark
* Qboid
gives whitequark a tin Orcrist
<EntropicRaptor>
!wpn egg|work|egg
* Qboid
gives egg|work|egg a Sumerian trapezohedron
<bofh>
!wpn EntropicRaptor
* Qboid
gives EntropicRaptor a Jeans grammar
<egg|work|egg>
>> (+ 1 1)
<egg|work|egg>
hm
<egg|work|egg>
I thought we had a lispbot
tawny has joined #kspacademia
egg|phone|egg has joined #kspacademia
egg|cell|egg has quit [Read error: Connection reset by peer]
egg|cell|egg has joined #kspacademia
egg|phone|egg has quit [Read error: Connection reset by peer]
egg|phone|egg has joined #kspacademia
egg|cell|egg has quit [Read error: Connection reset by peer]
egg|phone|egg has quit [Read error: Connection reset by peer]
egg|phone|egg has joined #kspacademia
APlayer has joined #kspacademia
Parenthesie has quit [Remote host closed the connection]
Parenthesie has joined #kspacademia
tawny- has joined #kspacademia
tawny has quit [Ping timeout: 383 seconds]
egg|cell|egg has joined #kspacademia
egg|phone|egg has quit [Read error: Connection reset by peer]
<egg>
!wpn whitequark
* Qboid
gives whitequark a caffeinated nitrate
<egg>
!wpn bofh
* Qboid
gives bofh a [DATA EXPUNGED] salmon
<egg>
!wpn Thomas
* Qboid
gives Thomas a partially-ordered resistor/cardinal hybrid
<Thomas>
!wpn egg
* Qboid
gives egg a Brillouin omicron edge
<iximeow>
!wpn iximeow
* Qboid
gives iximeow a 5-choosable first/follow 2N3904
<iximeow>
hm not enough
<iximeow>
!wpn iximeow
* Qboid
gives iximeow a mu monopoly
<egg>
!wpn -add:wpn rat
<Qboid>
egg: Weapon added!
<egg>
!wpn whitequark
* Qboid
gives whitequark a peculiar adjective/pharmacy hybrid
<iximeow>
that seems fitting
<iximeow>
also whitequark not being a black squa^Wcircle is weird still
<egg>
iximeow: yeah, but cat
* egg
pets whitequark
<iximeow>
oh yes it's a v good image
<iximeow>
i can't seem to trick myself into getting work done today
<bofh>
iximeow: same, tho it's less bad than the complete writeoff that was yesterday
<egg>
bofh: so Sun's cbrt says < 0.667 and I measure 0.667106 ULPs
<egg>
also it's slow, though not quite as much as microsoft's
<bofh>
egg: close enough :P also not surprised it's slow.
<bofh>
clearly yours is good enough to ship
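(For reference, a minimal sketch of how an ULP figure like the 0.667106 above can be measured: compare the double-precision cbrt under test against a higher-precision reference and take the worst observed difference in units of the spacing between adjacent doubles. This assumes long double is wider than double, which holds on x86 glibc but not on MSVC; it is an illustration, not egg's actual harness.)

```cpp
// Sketch: estimate the worst-case error, in ULPs, of double-precision cbrt
// by comparing it against a long double reference over random inputs.
#include <cmath>
#include <cstdio>
#include <random>

int main() {
  std::mt19937_64 rng(42);
  std::uniform_real_distribution<double> dist(1.0, 8.0);  // cube roots span one binade
  double worst = 0.0;
  for (int i = 0; i < 10'000'000; ++i) {
    double x = dist(rng);
    double y = std::cbrt(x);                                   // implementation under test
    long double ref = std::cbrt(static_cast<long double>(x));  // higher-precision reference
    // 1 ULP at y is the spacing between y and the next representable double.
    double ulp = std::nextafter(y, INFINITY) - y;
    double err = std::fabs(static_cast<long double>(y) - ref) / ulp;
    if (err > worst) worst = err;
  }
  std::printf("max observed error: %.6f ULPs\n", worst);
}
```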
<APlayer>
Is sci-hub working at all?
<bofh>
Works for me, sci-hub.hk
<APlayer>
Hmm, couldn't get a single paper from it for a few weeks now
<APlayer>
Must be doing it wrong
<APlayer>
Huh
<APlayer>
So when I open it via IP (which I have now switched to, because the domains are unreliable), it fails to open things
<APlayer>
Guess I /will/ have to keep looking up valid domains
<APlayer>
SnoopJeDi: So how would I go about implementing LSQ for my given problem?
<APlayer>
Wikipedia says something about two parameters, but I have 6
<APlayer>
I could probably partially differentiate all 9 of my equations and have 6 unknowns, and... I have no idea what to do then.
<APlayer>
(I would actually do this by hand and re-use the equations in a program)
<SnoopJeDi>
it'd be two parameters for fitting a line (which is a common LSQ), but you're doing a regression over more parameters. I think you can view it as picking the best hyperplane?
<SnoopJeDi>
anyway, solve grad(e_i) = 0
<SnoopJeDi>
i.e. calculate grad(e_i), which gives you 6 vector "slots" per equation (i.e. 6 variables to differentiate with respect to)
<SnoopJeDi>
err sorry that's wrong innit
<APlayer>
Uh, sorry?
<SnoopJeDi>
grad((Σe_i)^2) is what you really want
<SnoopJeDi>
...dammit, grad(Σ(e_i)^2)
<SnoopJeDi>
the sum of all the squared errors is like a "megaerror" and you're trying to minimize that
<SnoopJeDi>
i.e. the least of the squares
<APlayer>
Yes
<SnoopJeDi>
so yea, work out grad() of that sum and set it to zero, giving you 6 equations (one equation per "slot" of the gradient vector) in 6 variables
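(Spelled out, the condition being described: with e_i the per-sample error and c_1, ..., c_6 the six calibration parameters, minimize the sum of squared errors by setting every component of its gradient to zero. The notation is a sketch of the discussion above, not anyone's exact equations.)

```latex
S(c_1,\dots,c_6) = \sum_{i=1}^{9} e_i(c_1,\dots,c_6)^2,
\qquad
\nabla S = 0
\;\Longleftrightarrow\;
\sum_{i=1}^{9} 2\, e_i \,\frac{\partial e_i}{\partial c_k} = 0
\quad\text{for } k = 1,\dots,6.
```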
<APlayer>
But solving it is pretty much my problem, heh
<APlayer>
So I differentiate the whole sum with respect to every variable individually
<SnoopJeDi>
oh hrm, maybe that system is only sometimes invertible though
<kmath>
<DrAstroStu> Just found this, a beautiful infographic about Cassini's path through the Saturnian system over the course of 13 ye… https://t.co/1vz67ds5Wm
<SnoopJeDi>
APlayer, how many sample points are you fitting to? More than 6, right?
<APlayer>
9, exactly
* egg
pets bofh
<egg>
(is bofh also a cat)
<SnoopJeDi>
okay I think you're guaranteed to have a unique minimum then
<SnoopJeDi>
it doesn't go terribly in-depth, but I think your problem is conceptual at this point
<APlayer>
I am struggling with this linear thing
<APlayer>
I can't understand how fitting a best-fit line to a set of data relates to my problem
<APlayer>
I do have a linear adjustment for every axis, but it's hidden behind a square, a sum, a square root, another sum and another square, heh
<SnoopJeDi>
APlayer, that's because you're forcing yourself to think about the problem as if lines have literally anything to do with it
<SnoopJeDi>
fitting a line is a *special case* of the technique, and by far the most common one
<APlayer>
Ah
<APlayer>
Alright
<SnoopJeDi>
the idea is "pick the line that minimizes the errors, i.e. fits the data the best"
<APlayer>
In my case, the data is the vector magnitudes
<SnoopJeDi>
there is one and only one such line for N >= 2 points
<SnoopJeDi>
because there are 2 free parameters, slope and intercept
<APlayer>
The error is the deviation of the vector magnitude from 1
<APlayer>
But... No, that can't be it. Else I'd be looking for a linear function again, which I am not
<SnoopJeDi>
"looking for a linear function?"
<SnoopJeDi>
you're finding the best hyperplane
<SnoopJeDi>
which you can consider an extension to the notion of fitting a line
<APlayer>
I am looking for parameters that are already "included" in my data, which kind of confuses me
<SnoopJeDi>
if you're free to choose the c_j then what you just said is not true
<SnoopJeDi>
you seek the least-crappy choice of those parameters, though
<SnoopJeDi>
i.e. the ones that minimize the errors
<APlayer>
Yes, I do
<APlayer>
But if I consider the vector magnitudes to be my data (I do, right?), then the c_j things are already part of the calculation I use to get the magnitudes
<SnoopJeDi>
that's vague
<SnoopJeDi>
are you saying the errors e_i depend on the c_j?
<SnoopJeDi>
or something more nuanced? the x_i do?
<APlayer>
I have data points that consist of the readouts for three axes. I adjust each of the axes with two c_j's and then I get the vector's magnitude, which is the number that I try to get as close to 1 as possible
<APlayer>
Yes, e_i depend on c_j
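(One plausible reading of the model being described, reconstructed from the conversation rather than taken from APlayer's code: each raw per-axis reading r_{i,j} gets an offset c_{j,2} and a scale c_{j,1}, and the per-sample error is how far the corrected magnitude falls from 1. Writing the scale as a division is consistent with SnoopJeDi's later suggestion to treat 1/c_{j,1} as the free parameters.)

```latex
e_i = \sqrt{\sum_{j=1}^{3}\left(\frac{r_{i,j} - c_{j,2}}{c_{j,1}}\right)^{2}} - 1,
\qquad
S = \sum_{i=1}^{9} e_i^2 \;\to\; \min,
```

with the six unknowns being the scale c_{j,1} and the offset c_{j,2} for each of the three axes.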
<SnoopJeDi>
well yea, they'd damn well better depend or you don't have an optimization problem
<SnoopJeDi>
a knob that's not connected to anything can't be tuned, heh.
<APlayer>
Well, I'd argue that it can be tuned more easily than if it is connected to something that adds resistance or inertia. :P
<APlayer>
turned*
<SnoopJeDi>
I don't understand what that might mean
<APlayer>
Nevermind, I am just being silly
<APlayer>
So, again, I take the whole sum of squared errors as a function, partially differentiate it with respect to every c_j, and then solve that as a system of 6 equations in 6 unknowns?
<APlayer>
Which is what Wikipedia says
<APlayer>
You talk about a gradient, which is the same thing as differentiation, I guess
<SnoopJeDi>
yea you can think of the gradient as a vector built from derivatives
<SnoopJeDi>
all the derivatives need to be zero for a minimum (although this does not *guarantee* a minimum, cf. saddle points)
<SnoopJeDi>
although looking back at your equations, I guess you can't actually write this as a linear system...
<SnoopJeDi>
oh nvm yes you totally can, you just need to consider 1/c_j1 as your free parameters to solve for
<SnoopJeDi>
the c_j2 are fine the way they are
* EntropicRaptor
runs out of phonons.
<APlayer>
Well, let me just do a gradient descent, then, and be done with it.
<APlayer>
Too much machinery for too small a task.
<APlayer>
I am still just calibrating a single sensor.
<SnoopJeDi>
that's exactly how I would describe gradient descent for this, incidentally
<SnoopJeDi>
but it will probably require sufficiently few iterations that you can still bazooka it that way
<SnoopJeDi>
keep it in mind if you're short on cycles later
<APlayer>
That should be a standalone Arduino sketch; I won't be short on cycles unless the whole board overloads
<SnoopJeDi>
probably particularly if you take your previous set of parameters as your initial condition, since I would guess most perturbations cause small changes in the problem space
<APlayer>
I have no previous parameters as of yet
<SnoopJeDi>
I meant in the realtime app you're building. This is continually running on a quadcopter, IIRC?
<APlayer>
No, not the calibration, heh
<SnoopJeDi>
oh
<APlayer>
I shall calibrate it once every few weeks or so, and use the parameters obtained that way in the meantime
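(A minimal sketch of the gradient-descent route APlayer settles on: nine 3-axis samples, six parameters, a numerical gradient, and a fixed step size. The names, placeholder data, and finite-difference gradient are illustrative assumptions; APlayer's actual Arduino sketch will differ, and an analytic gradient or a line search would be preferable if iteration count ever matters.)

```cpp
// Hypothetical sketch of the calibration loop discussed above: 9 samples of
// 3-axis readings, 6 parameters (per-axis scale and offset), plain gradient
// descent on the sum of squared magnitude errors.
#include <cmath>
#include <cstdio>

const int kSamples = 9;
// Raw 3-axis readings; placeholder values, replace with real sensor data.
double readings[kSamples][3] = { /* ... */ };

// params[0..2] = per-axis scale, params[3..5] = per-axis offset.
double magnitudeError(const double params[6], const double r[3]) {
  double sq = 0.0;
  for (int j = 0; j < 3; ++j) {
    double adj = (r[j] - params[3 + j]) / params[j];  // remove offset, rescale
    sq += adj * adj;
  }
  return std::sqrt(sq) - 1.0;  // deviation of the corrected magnitude from 1
}

// Sum of squared errors over all samples: the "megaerror" being minimized.
double totalError(const double params[6]) {
  double s = 0.0;
  for (int i = 0; i < kSamples; ++i) {
    double e = magnitudeError(params, readings[i]);
    s += e * e;
  }
  return s;
}

int main() {
  double params[6] = {1, 1, 1, 0, 0, 0};  // start from "no correction"
  const double step = 0.01, h = 1e-6;
  for (int iter = 0; iter < 10000; ++iter) {
    double grad[6];
    for (int k = 0; k < 6; ++k) {  // central-difference gradient
      double p = params[k];
      params[k] = p + h; double fp = totalError(params);
      params[k] = p - h; double fm = totalError(params);
      params[k] = p;
      grad[k] = (fp - fm) / (2 * h);
    }
    for (int k = 0; k < 6; ++k) params[k] -= step * grad[k];
  }
  std::printf("residual sum of squares: %g\n", totalError(params));
}
```

Starting from the previous calibration, as SnoopJeDi suggests, would amount to replacing the {1, 1, 1, 0, 0, 0} initial guess with the stored parameters.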
<egg>
!wpn котя and whitequark
* Qboid
gives котя and whitequark an oxygenated /kick