UmbralRaptor changed the topic of #kspacademia to: https://gist.github.com/pdn4kd/164b9b85435d87afbec0c3a7e69d3e6d | Dogs are cats. Spiders are cat interferometers. | Космизм сегодня! | Document well, for tomorrow you may get mauled by a ネコバス. | <UmbralRaptor> … one of the other grad students just compared me to nomal O_o | <ferram4> I shall beat my problems to death with an engineer.
<kmath>
<DrAndrewThaler> Science news this week: You can succeed as long as you have a side hustle as a running model or Joe DiMaggio randomly lends you a hand.
icefire has quit [Remote host closed the connection]
<Ellied>
Nice. My {channels I was called an idiot on}/{channels talked on} ratio for Freenode in the last month is a solid 1.0
<Ellied>
Every single time was a case of "your experience differs very slightly in a largely superficial way from mine; therefore you are completely unintelligent"
awang has quit [Read error: -0x1: UNKNOWN ERROR CODE (0001)]
awang has joined #kspacademia
<bofh>
Ellied: yaey
<FluffyFoxeh>
that's unfortunate. I've had good experiences on freenode
<FluffyFoxeh>
THEREFORE YOU MUST BE AN IDIOT
<FluffyFoxeh>
(no :p)
<Fiora>
i've had good experiences on freenode too
<Fiora>
for example, disconnecting from freenode is very cathartic
<Greys>
"This media may contain sensitive material. Your media settings are configured to warn you when media may be sensitive. "
<Greys>
twitter really needs to implement the complicated multilayer vetting necessary to allow a "no this is not sensitive" button to be functional
APlayer has quit [Ping timeout: 383 seconds]
<UmbralRaptor>
Greys: I would assume Google in general and Google Scholar in particular? Phase diagrams don't seem like the sort of thing that would get too much woo.
<kmath>
<OrgaP> @OhEmmeG I passed out in a bus because I was too broke to eat and had to tell people not to call 911.
<bofh>
UmbralRaptor: yeah, I've never seen a disreputable phase diagram, they're not something cranks'd care about...
<UmbralRaptor>
bofh: that ellipsis is worrying me.
<bofh>
well it's more I'm trying to think of examples of bullshit ones and failing :P
<UmbralRaptor>
I wouldn't trust the high pressure and/or temperature ends, though? Presumably measurements get hard somewhere?
<bofh>
like I'm not sure I trust any experimental evidence of generation of metallic hydrogen in an earth lab *yet*, but that's more "it's incredibly difficult, requires absurd pressures & is tricky to verify" :P
<bofh>
but basically any unknowns I can think of are more likely due to simple difficulty of the task and/or lack of understanding
<bofh>
(for another example, I am 100% certain ANY phase diagram of a Cuprate is missing some states)
<bofh>
(but that's simply b/c I'm fairly convinced they exhibit the kitchen sink given the right combination of pressure, temperature & ambient magnetic field)
APlayer has joined #kspacademia
<APlayer>
Hi!
* UmbralRaptor
pokes Type II superconductors with a B field.
<UmbralRaptor>
!wpn APlayer
* Qboid
gives APlayer a pointy rectifier-like banana
<APlayer>
!wpn UmbralRaptor
* Qboid
gives UmbralRaptor an infinity
<APlayer>
Boo, that was boring ;P
<egg|work|egg>
!wpn UmbralRaptor
* Qboid
gives UmbralRaptor a lugubrious ꙮ
<APlayer>
Actually, which is the kind of transistor that only permits flow when the base is on? The PNP or NPN one?
<APlayer>
!wpn egg|work|egg
* Qboid
gives egg|work|egg a Bloch Newtonian PDF
<APlayer>
Also, Iskierka: I checked out that RPi pin. It indeed goes high on boot and low on shutdown. Couldn't that be inverted using a single transistor, of the one kind that closes collector/emitter flow when the base is connected?
<APlayer>
It delivers 3.3V, it seems, though, so it will require a stronger than usual transistor
<egg|work|egg>
!wpn bofh
* Qboid
gives bofh a Shenzhen Norman sphere which vaguely resembles a valve
<Qboid>
kd: Prints out the details for Kountdown Events
<Qboid>
parameters: -add (Add Kountdown event. Syntax: !kountdown -add name|description|time.), -list (List pending Kountdown Events or subscribers), -remove (Delete Kountdown by id.), -edit (Edits a Kountdown by id.), -subscribe (Subscribe yourself or a channel to Kountdown.), -unsubscribe (Unsubscribes yourself or a channel from the Kountdown.)
<Qboid>
example: !kountdown 1
<Iskierka>
APlayer, Ellied said there's a few ways you can set up the circuit to give the desired behaviour, so it'd be better for her to talk to you to go through the ways
<APlayer>
Okay, thanks! When is Ellied usually online (relative to now)?
APlayer has quit [Ping timeout: 204 seconds]
<egg|work|egg>
!kd -add Ураган-М №752/Фрегат-М/Союз-2.1б|A Союз-2.1б with Фрегат-М upper stage will launch the ГЛОНАСС satellite Ураган-М №752 (http://space.skyrocket.de/doc_sdat/uragan-m.htm) into a circular orbit with 64.8° inclination.|2017-09-22T00:02:32Z
<Qboid>
egg|work|egg: Added event #9
APlayer has joined #kspacademia
<bofh>
!wpn egg|work|egg
* Qboid
gives egg|work|egg an average python
<bofh>
so, Python 3.15? :P
<UmbralRaptor>
πthon
<egg|work|egg>
!kd -add AsiaSat 9/Бриз-М/Протон-М|A Протон-М with Бриз-М upper stage will launch the SSL-1300S-based AsiaSat 9 (http://space.skyrocket.de/doc_sdat/asiasat-9.htm) from Байконур into GEO.|2017-09-28T18:50Z
<egg|work|egg>
!kd -add 「みちびき3号機」/H-IIAロケット35号機|An H-IIA rocket will launch the DS-2000-based みちびき4号機 navigation spacecraft (http://space.skyrocket.de/doc_sdat/qzs-2.htm) from 種子島宇宙センター into an eccentric geosynchronous orbit with an inclination of 41°.|2017-10-09T22:00Z
<Qboid>
egg|work|egg: Added event #11
<egg|work|egg>
wait wrong name
<UmbralRaptor>
So, they need my mail stop (Not sure we have one), and my advisor's phone # (not listed in the campus directory) to sign up…
<egg|work|egg>
!kd edit:11 name 「みちびき4号機」/H-IIAロケット35号機
<Qboid>
egg|work|egg: Invalid ID!
<egg|work|egg>
!kd -edit:11 name 「みちびき4号機」/H-IIAロケット35号機
<Qboid>
egg|work|egg: Updated event #11: 「みちびき4号機」/H-IIAロケット35号機 - An H-IIA rocket will launch the DS-2000-based みちびき4号機 navigation spacecraft (http://space.skyrocket.de/doc_sdat/qzs-2.htm) from 種子島宇宙センター into an eccentric geosynchronous orbit with an inclination of 41°. - 2017-10-09 22:00:00
<egg|work|egg>
contrast with !kd 5 which went into GEO
<kmath>
<demitrimuna> Are you an astronomy educator? What would help you prepare your students with respect to programming & data science? Please contact me / RT!
egg|phone|egg has joined #kspacademia
egg|mobile|egg has quit [Read error: Connection reset by peer]
egg|phone|egg has quit [Read error: Connection reset by peer]
egg|phone|egg has joined #kspacademia
egg|cell|egg has joined #kspacademia
egg|phone|egg has quit [Ping timeout: 183 seconds]
APlayer has joined #kspacademia
egg|phone|egg has joined #kspacademia
egg|cell|egg has quit [Read error: -0x1: UNKNOWN ERROR CODE (0001)]
awang has quit [Read error: Connection reset by peer]
<APlayer>
Does simulated annealing cool down even when not accepting a change?
<APlayer>
Also, where is a good place to start with cooling schedules and accepting probability functions?
<SnoopJeDi>
APlayer, I believe most implementations would cool down, yes. The fact that you're less likely to transition out of that state acts in that state's favor
<APlayer>
Humm, okay
<UmbralRaptor>
Yeah, the cooling is purely time based in the simulations I've seen.
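The behaviour SnoopJeDi and UmbralRaptor describe — purely time-based cooling with Metropolis acceptance — can be sketched as follows (a minimal sketch with placeholder energy/neighbour functions, not anyone's actual implementation):

```python
import math
import random

def anneal(energy, neighbour, state, t0=100.0, iterations=10_000):
    """Minimal simulated annealing: temperature depends only on the
    iteration count, so it keeps cooling even when a move is rejected."""
    current = energy(state)
    for k in range(1, iterations + 1):
        t = t0 / math.log(1 + k)  # purely time-based logarithmic schedule
        candidate = neighbour(state)
        delta = energy(candidate) - current
        # Metropolis criterion: always accept improvements, accept
        # worse moves with probability exp(-delta / t).
        if delta <= 0 or random.random() < math.exp(-delta / t):
            state, current = candidate, current + delta
    return state, current

# Toy usage: minimise x^2 over the integers with random +/-1 steps.
best, e = anneal(lambda s: s * s,
                 lambda s: s + random.choice((-1, 1)),
                 state=50)
```

Note that a rejected move still advances k, so the temperature drops regardless — which is exactly why lingering in a good state works in that state's favour.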
<APlayer>
So I can basically define my cooling schedule as a LUT of iteration -> temperature?
* APlayer
approves of sci-hub
<APlayer>
Thanks for teaching me that, guys :D
<UmbralRaptor>
>_>;;
<UmbralRaptor>
LUT?
<APlayer>
Lookup table
<UmbralRaptor>
Ah, I think so.
<SnoopJeDi>
You could, but it shouldn't be an expensive calculation so I think most of the time people would just perform the calculation
<APlayer>
Humm
<SnoopJeDi>
otherwise you'd have to re-create the table each time you change schedule parameters and god forbid you forget to update the table and *think* you're using schedule X but actually using schedule Y
<APlayer>
The thing is, I am just looking for a fool-proof tuning of the algorithm with a /LOT/ of iterations and not likely to be ever tuned by the user
<APlayer>
s/tuned/changed/
<Qboid>
APlayer meant to say: The thing is, I am just looking for a fool-proof tuning of the algorithm with a /LOT/ of iterations and not likely to be ever changed by the user
<UmbralRaptor>
Uh, cooling rate tends to be useful to vary.
<SnoopJeDi>
APlayer, my advice would be not to prematurely optimize
<SnoopJeDi>
If you find at some point that calculating such a thing is expensive, *that* is the time to start scratching your head looking for perf savings, imo
* SnoopJeDi
has lost many many hours to "optimization" in order to learn this lesson
<APlayer>
And I've never done such a thing before... Does T_0 / ln(1 + iteration) play well with a truly large (10 000) amount of iterations?
<SnoopJeDi>
10,000 is not typically considered a large number, not even close
<SnoopJeDi>
but if you're really worried about it, profile it
<APlayer>
To describe the problem, the program may (or may not) be used in about a week at my school to assign students to a small variety of projects (4 projects, about 80 students) based on a first, second and third choice from everyone
<APlayer>
And I am not writing it on a terribly fast system. JavaScript, in fact.
<APlayer>
So I am not sure whether 10,000 is much, little or appropriate
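The assignment problem as described (≈80 students, 4 projects, ranked first/second/third choices) could plug into such a loop roughly like this (a hypothetical sketch; the 0/1/2 scoring and the penalty for unranked projects are assumptions, not APlayer's actual code):

```python
import random

# choices[s] is student s's ranked project ids, best first, e.g. [2, 0, 1].
def energy(assignment, choices, penalty=10):
    """Total cost: 0 per first choice, 1 per second, 2 per third,
    and a large penalty for a project the student didn't rank."""
    total = 0
    for student, project in enumerate(assignment):
        ranks = choices[student]
        total += ranks.index(project) if project in ranks else penalty
    return total

def neighbour(assignment):
    """Swap the projects of two random students (keeps group sizes fixed)."""
    a = list(assignment)
    i, j = random.sample(range(len(a)), 2)
    a[i], a[j] = a[j], a[i]
    return a
```

A swap neighbour preserves whatever group sizes the initial assignment had, which sidesteps having to re-balance the four projects on every move.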
Majiir has quit [Ping timeout: 183 seconds]
<SnoopJeDi>
well, the line you wrote is 3 operations worth: an addition, a logarithm, and a division. so you're looking at 30,000 operations. Assuming you're running on a machine capable of more than, say, 30 MHz, you're talking about less than 1 ms of time spent doing that work (we're being naive here because it's an estimate)
<SnoopJeDi>
but more to the point: don't go out of your way to make code go fast until it's slow
Snoozee has joined #kspacademia
Snoozee is now known as Majiir
<SnoopJeDi>
modern machines are sufficiently fast that doing "the dumb thing" often pays off pretty well in wall-clock time anyway, and the resulting code is easier to understand. If it *is* slow, then you can bust out a profiling tool and say "where is it slow and why?"
<APlayer>
Okay
<APlayer>
I'll first see how well or bad it runs :D
* egg|afk|egg
groans at SnoopJeDi for counting addition and logarithms equally
egg|afk|egg is now known as egg
<SnoopJeDi>
You'll notice I didn't call them FLOPs, egg
<egg>
SnoopJeDi: yes, but then you added them
<UmbralRaptor>
egg: Logarithms run in O(log(n)) time, right?
* UmbralRaptor
ducks
* egg
throws a duck at UmbralRaptor
<APlayer>
(I am just a bit afraid of the numbers. I learned programming and did it mostly on a 6 MHz calculator with a language that was thousands of times slower than assembly on the same machine)
<SnoopJeDi>
egg, APlayer is sufficiently lost in the weeds as it is, there's no need to be deliberately obtuse
<SnoopJeDi>
and certainly nothing to be gained at this level of detail
<egg>
SnoopJeDi: it's not about being obtuse, but identifying things two orders of magnitude away is evil
<APlayer>
egg: No worries, I noticed that, but I believe in a language as high level as JS, it matters comparatively little
<SnoopJeDi>
Well, I didn't *do* that, so methinks the egg doth protest too much
<SnoopJeDi>
It's an *estimate*, egg
<egg>
then just count the logs :-p
<SnoopJeDi>
See above remark about lost in the weeds
<SnoopJeDi>
Detail for detail's sake wastes everyone's time :P
<egg>
...
egg is now known as egg|afk|egg
<APlayer>
Anyway... Do you think 10,000 is appropriate for the current problem?
<SnoopJeDi>
I have no idea APlayer. There are some ways to qualify these sorts of problems, but I'm not all that familiar with them
<SnoopJeDi>
Optimization is a Hard Problem™ and it comes off rather more like an art than a science sometimes
<APlayer>
No worries
<SnoopJeDi>
But that's kinda why being able to change your parameters is important: if you can only get convergence with a magical set of annealing parameters, your solution probably isn't very robust
<APlayer>
Let me just look at the mathematical properties of the log cooldown function
<APlayer>
At iteration = 10000 it is still T_0 / 10
<SnoopJeDi>
APlayer, in particular, to know how a particular algorithm will behave for a particular problem, you have to be able to say stuff about the problem space
<UmbralRaptor>
I just decided to try taking 10k logs on my MBA in an open Python session. I, uh, didn't notice it running.
<SnoopJeDi>
...and if you knew the problem space inside and out, you could imagine you might just be able to pick the best solution :P
<UmbralRaptor>
(granted this is a recentish i5)
awang has quit [Read error: -0x1: UNKNOWN ERROR CODE (0001)]
<SnoopJeDi>
UmbralRaptor, yea before APlayer mentioned that it was JS, I was mocking up a naive Python loop with timeit
<APlayer>
Wut
<SnoopJeDi>
I was setting up an experiment to do exactly what you were describing
<APlayer>
You were mocking up an algorithm a random stranger asked you about via IRC?
<APlayer>
o.o
<APlayer>
My respect to you, Sir
<SnoopJeDi>
It's hard to overstate the value of just *trying* the thing you wanna do
<bofh>
^
<SnoopJeDi>
UmbralRaptor, you know about the %timeit magic in IPython notebooks, right? <3
<APlayer>
I am trying to try to do it :P
<UmbralRaptor>
SnoopJeDi: If I were good about using Jupyter, probably. >_>;;
<SnoopJeDi>
you can use it in a "normal" IPython kernel too, it just runs timeit.timeit on that cell, which is neat
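Outside IPython, the stdlib gives the same measurement %timeit performs; e.g., timing the 10,000 logarithms UmbralRaptor tried:

```python
import timeit

# Total seconds for 100 repetitions of 10,000 logarithms.
seconds = timeit.timeit(
    "for k in range(1, 10_001): log(1 + k)",
    setup="from math import log",
    number=100,
)
print(f"{seconds / 100 * 1e3:.3f} ms per 10,000 logs")
```

On any recent machine this lands comfortably in the low single-digit milliseconds per pass, consistent with "I didn't notice it running."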
<APlayer>
So, I want the temperature to be pretty near to 0 at iteration = 10,000, right?
<SnoopJeDi>
there's also %time which just counts CPU and wall-clock time
<UmbralRaptor>
Shiny
<UmbralRaptor>
APlayer: yeah.
<APlayer>
Which means that for T_0 / something, "something" should approach T_0
<APlayer>
And ln(1 + iteration) does definitely not do that
<UmbralRaptor>
Depends, is T_0 ~10?
<APlayer>
T_0 is at 100, but I may change that if it is silly
<APlayer>
The starting assignment of students is alphabetical
<APlayer>
Or rather, in the order they were put into the system. Most likely alphabetical.
<APlayer>
Would taking another log base help here?
<SnoopJeDi>
well, if you want it to approach T_0, make it T_0 * log(10001) / log(1 + it)
<SnoopJeDi>
or more generally log(1 + max_it) / log(1 + it)
<APlayer>
Seems promising...
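A quick check of the value quoted earlier (still about T_0/10 at iteration 10,000, with T_0 = 100) — a throwaway sketch:

```python
import math

T0 = 100.0

def temperature(k):
    """Logarithmic cooling schedule from the discussion: T_0 / ln(1 + k)."""
    return T0 / math.log(1 + k)

print(temperature(10_000))  # ~10.86: after 10,000 iterations, still ~T0/9.2
```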
icefire has joined #kspacademia
<UmbralRaptor>
…you'd think that by now, I would remember how to do matrix multiplication…
egg|afk|egg is now known as egg
<egg>
UmbralRaptor: *wags tail*
* UmbralRaptor
blinks.
<egg>
UmbralRaptor: it's easier to remember if you use Einstein notation tbh
<egg>
(and also harder to screw things up as far as confusing the space with its dual is concerned)
<egg>
UmbralRaptor: a^i_k b^k_j \o/
<UmbralRaptor>
o_O
<egg>
UmbralRaptor: the \o/ is not part of Einstein notation
<UmbralRaptor>
No, at the ^i part
* UmbralRaptor
would have eggspected a_{ik} b_{kj}
<egg>
UmbralRaptor: but that's the whole point of Einstein notation, the covariant and contravariant indices are distinguished!
<egg>
UmbralRaptor: and that way you don't need to say what you sum over, because you can only sum over one covariant and one contravariant index
<egg>
UmbralRaptor: e.g. the trace is a^k_k
<egg>
UmbralRaptor: and you can only take the trace of a linear endomorphism, if you have a bilinear form a_{ij} you need to raise an index before you can take a trace
<egg>
(so a_{ij}g^{ij})
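Einstein-notation expressions map directly onto numpy's einsum, which contracts over repeated index letters (a quick illustrative sketch):

```python
import numpy as np

a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([[0.0, 1.0], [1.0, 0.0]])

product = np.einsum("ik,kj->ij", a, b)  # a^i_k b^k_j: matrix product
trace = np.einsum("kk", a)              # a^k_k: trace of an endomorphism
raised = np.einsum("ij,ij", a, b)       # full contraction, a la a_{ij} g^{ij}
```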
awang has joined #kspacademia
<awang>
join #principia
<egg>
I'm there already :-p
<egg>
you might want to put a / there :-p
<egg>
!wpn awang
* Qboid
gives awang an infinite duck
<UmbralRaptor>
Another duck!
<egg>
!wpn -add:wpn bear
<Qboid>
egg: Weapon added!
<APlayer>
!wpn People
* Qboid
gives People a geodesic walnut
<APlayer>
I've got to go for today... But I will be stronger when I return next time :P
<SnoopJeDi>
I wish my supervisor did not hate useful and legible data.
* SnoopJeDi
stares at ad-hoc excel spreadsheet with randomly selected columns
<SnoopJeDi>
He's produced about a dozen different such files from the results of his Fortran tracker, one would think it would be so much LESS work to standardize it and report all relevant data.
<UmbralRaptor>
egg: uh, I think I need to look up most of those terms.
<egg>
UmbralRaptor: so the idea with Einstein notation is that the things you're manipulating are tensor products of V and V*
<SnoopJeDi>
not directly, there's an intermediate human step
<SnoopJeDi>
(which is all the more infuriating)
<egg>
UmbralRaptor: a (column) vector is an element of V; a linear form (a row vector) is an element of V*
<egg>
UmbralRaptor: a linear endomorphism of V (a square matrix) is an element of V ⊗ V*
<UmbralRaptor>
SnoopJeDi: Have you tried separating your supervisor with commas?
<SnoopJeDi>
He's a whitespace kind of guy
<SnoopJeDi>
UmbralRaptor, I should thank my lucky stars that he included column names and units, honestly
<UmbralRaptor>
egg: * as in transpose?
<egg>
* as in dual
<egg>
the space of linear forms
<SnoopJeDi>
"the space wot you enter when you turn the vector sideways"
<SnoopJeDi>
the (column) vector*
<egg>
UmbralRaptor: let's say you have a vector space V, V* is the set of functions f with values in the field such that for all v, w in V, and α in the field, f(v+αw) = f(v) + α f(w)
APlayer has quit [Ping timeout: 383 seconds]
<SnoopJeDi>
(i.e. linear forms)
<UmbralRaptor>
SnoopJeDi: No, no. Separate *him*, not the data.
<SnoopJeDi>
heh
<SnoopJeDi>
today's one of those "can I meet the bare minimum competence" days
<egg>
UmbralRaptor: for column vectors, row vectors (which apply by multiplication with a column vector) do the job as SnoopJeDi says; let's say more interestingly that V is the space of polynomials with real coefficients, an example of an element of V* is the integral of your polynomial on [0,1]
<egg>
UmbralRaptor: or evaluation at any point
<SnoopJeDi>
Right, the idea of duality goes much deeper than the subject at hand (and away from the subject)
<UmbralRaptor>
AAAAAAA
<SnoopJeDi>
I particularly like the way duality intersects graph theory
<SnoopJeDi>
(even if I don't understand it very much at all)
<egg>
yeah, but let's stay with vector space duality (but not entirely inside matrices otherwise the abstraction looks pointless)
<SnoopJeDi>
egg, so as to avoid needless tangents? ;P
<egg>
oh we can bring in tangent spaces at some point if we want to teach differential geometry to a helpless diapsid :D
<SnoopJeDi>
hehe
<egg>
UmbralRaptor: so, let's say you have a basis e_i for V, then that gives you a basis for V*: e^i is the function that maps e_i to 1, and e_k to 0 for k not equal to i
<SnoopJeDi>
I confess that my unfamiliarity with duality makes my degree feel rather fraudulent, so I'm equally a pupil here.
<egg>
UmbralRaptor: eggsample, for the space of polynomials, we can take the monomial basis {1, x, x^2, x^3, ...}, in that case the dual basis is {evaluate at 0, evaluate the derivative at 0, evaluate the second derivative at 0, ...}
<egg>
[IMPORTANT NOTE: I lied! this is not a basis! infinite dimensional vector spaces are tricky]
<egg>
eggsample: look at the linear form "evaluate the polynomial at 1"
<egg>
by Taylor, it is evaluate at 0 + evaluate the derivative at 0 + 1/2 evaluate the second derivative at 0 + ...
<egg>
but that's an infinite sum! linear combinations in linear algebra are finite!
<egg>
so, this is not in the span of that "dual basis"
<egg>
UmbralRaptor: SnoopJeDi: so, let's switch back to a finite dimensional vector space, the space of polynomials of degree less than or equal to 9223372036854775807
<egg>
{evaluate at 0, evaluate the derivative at 0, evaluate the second derivative at 0, ...} *is* a basis of V*, which is dual to {1, x, x^2, x^3, ...}
<egg>
UmbralRaptor: and "evaluate the polynomial at 1" = evaluate at 0 + evaluate the derivative at 0 + 1/2 evaluate the second derivative at 0 + ... which is a finite sum (albeit a rather long one)
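egg's dual-basis construction for polynomials can be checked mechanically in coordinates (a sketch with a tiny degree rather than 2^63 − 1):

```python
import math

# p = 3 + 2x + x^3, as coefficients in the monomial basis.
p = [3.0, 2.0, 0.0, 1.0]

def eval_poly(coeffs, x):
    return sum(c * x**k for k, c in enumerate(coeffs))

def deriv(coeffs):
    """Formal derivative in the monomial basis."""
    return [k * c for k, c in enumerate(coeffs)][1:]

def dual(coeffs, k):
    """k-th dual basis element: evaluate the k-th derivative at 0, over k!."""
    for _ in range(k):
        coeffs = deriv(coeffs)
    return eval_poly(coeffs, 0.0) / math.factorial(k)

# The dual basis recovers the coefficients, and "evaluate at 1" is the
# (finite) Taylor sum of those dual-basis values.
assert [dual(p, k) for k in range(4)] == p
assert eval_poly(p, 1.0) == sum(dual(p, k) for k in range(4))
```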
<egg>
UmbralRaptor: does that make sense?
SilverFox is now known as SilverFox1
<egg>
!wpn UmbralRaptor
* Qboid
gives UmbralRaptor a bolter
SilverFox1 is now known as SilverFox
<egg>
!wpn
* Qboid
gives egg an acid-sensing photomultiplier tube
* egg
pokes UmbralRaptor with the photomultiplier tube
* UmbralRaptor
may have gotten lost at the duals not working like transposes of row/column vectors part.
<egg>
UmbralRaptor: well they are, but that's just a special case
<egg>
UmbralRaptor: does the concept of "V* is the set of functions f with values in the field such that for all v, w in V, and α in the field, f(v+αw) = f(v) + α f(w)" make sense
<egg>
UmbralRaptor: and do you see why row vectors are those if V is column vectors
<UmbralRaptor>
No, those seem unrelated.
<egg>
UmbralRaptor: so, let n be the dimension of V, let f be a row vector of dimension n, define f(v) as f * v
<egg>
UmbralRaptor: (a row vector times a column vector, if the dimensions match, is a scalar)
<egg>
(if the dimensions don't match it's not defined)
<UmbralRaptor>
Yay, inner product (sort of)
<egg>
UmbralRaptor: matrix multiplication is linear, so you do get f(v+αw) = f(v) + α f(w)
<egg>
UmbralRaptor: now take any function f such that f(v+αw) = f(v) + α f(w); let e_i be (column) basis vectors, you have f(v^1 e_1 + v^2 e_2 + ... + v^n e_n) = v^1 f(e_1) + ... + v^n f(e_n)
<egg>
(I'm writing the indices of v in superscript)
<egg>
UmbralRaptor: but that means that f as a function behaves exactly as the row vector (f(e_1), ..., f(e_n))
<egg>
UmbralRaptor: does it make sense that row vectors are eggsactly the linear forms on column veggtors now?
<UmbralRaptor>
I think so.
<egg>
UmbralRaptor: okay, so now polynomials are veggtors, too, right?
<egg>
(you can add them, multiply them by a scalar, there's a 0 polynomial, all those operations have the right properties)
<egg>
UmbralRaptor: e.g. (2x^17+3x+2) + 3 * (x^17-x+1) = 5x^17+3
<egg>
UmbralRaptor: and polynomials with degree less than or equal to 9223372036854775807 are also a veggtor space (because by adding polynomials and multiplying them by a scalar you can't increase the degree)
<SnoopJeDi>
i.e. the space is closed under addition and scalar multiplication
* egg
pokes UmbralRaptor with a linear form
<egg>
UmbralRaptor: and then evaluation of a polynomial at 42 is linear: (p+λq)(42) = p(42) + λ * q(42)
<UmbralRaptor>
ok
<egg>
UmbralRaptor: so is integration over an interval: ∫_[0,1] (p+λq)(x) dx = ∫_[0,1] p(x)+λq(x) dx = ∫_[0,1] p(x) dx + λ ∫_[0,1] q(x) dx
<egg>
UmbralRaptor: so is evaluation of the derivative at 54: (p+λq)'(54) = p'(54) + λ * q'(54)
<egg>
UmbralRaptor: so those are all linear forms on the space of polynomials
<egg>
UmbralRaptor: now the space of polynomials with degree <= 9223372036854775807 has many interesting bases (something something Чебышёв polynomials), it also has the fairly boring monomial basis: 1, x, x^2, ..., x^9223372036854775807
<egg>
UmbralRaptor: this is obviously a basis, because that's how you write a polynomial: as a linear combination of monomials
<egg>
(it's even in the name!)
<SnoopJeDi>
!wpn -add:adj obvious
<Qboid>
SnoopJeDi: Adjective added!
<egg>
!wpn -add:adj trivial
<Qboid>
egg: Adjective added!
<UmbralRaptor>
Wait, is 9223372036854775807 a specific max int, or just a big number?
<egg>
it's 2^64-1, but I could take any other number
<UmbralRaptor>
ah
<SnoopJeDi>
2^63 - 1 :P
<egg>
argh yes
<SnoopJeDi>
but that is obviously an eggsceptional number in the context of #kspacademia
<SnoopJeDi>
oops, I used the o word
<egg>
that was mostly me being silly because I had started with unbounded degree (in which case the dimension is countably infinite), but that broke things so I picked a silly large number instead
<egg>
SnoopJeDi: it is a good word
<SnoopJeDi>
For very special definitions of "good"
<UmbralRaptor>
!wpn -add:adj obvious
<Qboid>
UmbralRaptor: Adjective already added!
<egg>
UmbralRaptor: so, now, for column vectors with the basis (1, 0, ..., 0)^T, (0, 1, ..., 0)^T, ..., (0, 0, ..., 1)^T, you get the dual basis (1, 0, ..., 0), (0, 1, ..., 0), ..., (0, 0, ..., 1)
<egg>
UmbralRaptor: why is that? because if you multiply that first row vector with the column vectors, it yields 1 for the first, and 0 for the others; the second row vector yields 1 on the second column vector, 0 on the others; etc.
<egg>
UmbralRaptor: now, similarly, the dual basis to the basis of monomials is the one whose first element maps 1 to 1 and x, x^2, etc. to 0
<egg>
UmbralRaptor: whose second element maps 1 to 0, x to 1, and x^2, x^3, etc. to 0,
<egg>
and so on
<egg>
UmbralRaptor: evaluation at 0 fits the bill for the first one: (x -> 1)(0) = 1, (x -> x)(0) = 0, (x -> x^2)(0) = 0...
<X>
Hello.
<X>
I am squared today.
<X>
Multiply me by myself.
<SnoopJeDi>
!wpn X
* Qboid
gives X a late 80s offensive blizzard which strongly resembles an invariant
<egg>
UmbralRaptor: then evaluation of the derivative at 0 does the job for the second one: (x -> 1)'(0) = (x -> 0)(0) = 0, (x -> x)'(0) = (x -> 1)'(0) = 1, (x -> x^2)'(0) = (x -> 2x)(0) = 0...
<X>
Dear UmbralRaptor, please teach me differential equations through a subliminal or unconscious medium.
<SnoopJeDi>
the way things wiggle sometimes depends on the things and also how they've been wiggling, class dismissed
<egg>
UmbralRaptor: then, evaluation of the second derivative, *halved*, at 0, does the job for the third one: (x -> x^2)''(0)/2 = (x -> 2x)'(0)/2 = (x -> 2)(0)/2
<egg>
UmbralRaptor: the fourth element of the basis will be one sixth of the evaluation of the third derivative, etc.
<kmath>
<TheQueenInGlory> vampire mayor, who is an on duty soldier, going to jail. busy guy bonus: his current thought https://t.co/Jup8E5Wal8
<UmbralRaptor>
yes, I think
<UmbralRaptor>
… on an unrelated note, I should probably eat today.
<egg>
UmbralRaptor: nah, no need for that :-p
<egg>
UmbralRaptor: so now, if p has degree <= 9223372036854775807, p(1) = p(0) + p'(0) + p''(0)/2 + p'''(0)/6 + ... + p^{(9223372036854775807)}(0)/(9223372036854775807!)
<egg>
UmbralRaptor: which means that the linear form "evaluate at 1" is expressed as (1, 1, 1, ...) in the dual basis of the monomial basis
<egg>
UmbralRaptor: more generally, since f(v^i e_i) is v^i f(e_i), the coordinates of any linear form in the dual basis are its values on the basis elements, so e.g. "integrate on [0,1]" is (1, 1/2, 1/3, 1/4, ...)
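Those coordinate lists can be read off numerically too (a sketch; a form is represented here by its values on the monomials x^k, with fractions to keep 1/3 exact):

```python
from fractions import Fraction

def coords(form, degree):
    """Coordinates of a linear form in the dual of the monomial basis:
    its values on the monomials 1, x, ..., x^degree (form takes the
    exponent k and returns the form's value on x^k)."""
    return [form(k) for k in range(degree + 1)]

eval_at_1 = lambda k: Fraction(1)         # "evaluate at 1": 1^k = 1
integrate = lambda k: Fraction(1, k + 1)  # "integrate on [0,1]": 1/(k+1)

print([str(c) for c in coords(eval_at_1, 3)])  # ['1', '1', '1', '1']
print([str(c) for c in coords(integrate, 3)])  # ['1', '1/2', '1/3', '1/4']
```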
<UmbralRaptor>
;8ball Can I use dual forms to prove tr(XY) = tr(YX), where X and Y are Hermitian matrices?
<kmath>
UmbralRaptor: Concentrate and ask again
<SnoopJeDi>
slightly related: I did not realize until *very* recently that cheese has a distinct radial gradient (making non-wedge serving practices a big cocktail faux pas)
<egg>
UmbralRaptor: well it's obvious, x^i_k y^k_i = y^k_i x^i_k
<UmbralRaptor>
SnoopJeDi: some men just want to watch the world burn.
<egg>
UmbralRaptor: no need for the matrices to be hermitian
<UmbralRaptor>
Or even square, as the next problem does so with vectors.
<SnoopJeDi>
cheesewheels: basically Laplace's equation on a disc?
<egg>
UmbralRaptor: uh how are you going to have both XY and YX be defined if they're not square
<UmbralRaptor>
egg: well, <φ|θ> and |θ><φ|
<egg>
UmbralRaptor: but yeah, cyclic trace is just x^i_k y^k_i = y^k_i x^i_k (if you don't like Einstein notation drop a summation sign on either side)
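The cyclic-trace identity is easy to spot-check numerically, with no hermiticity assumed and rectangular shapes where both products are defined:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 5))  # X is 3x5, Y is 5x3: XY and YX both defined
y = rng.normal(size=(5, 3))

# tr(XY) = x^i_k y^k_i = y^k_i x^i_k = tr(YX)
assert np.isclose(np.trace(x @ y), np.trace(y @ x))
assert np.isclose(np.einsum("ik,ki->", x, y), np.trace(x @ y))
```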
<egg>
UmbralRaptor: uh btw <φ| lives in the dual, that's another eggsample
egg is now known as egg|z|egg
<egg|z|egg>
|egg> lives in a vector space, <egg| in its dual, the inner product relates the two
<egg|z|egg>
UmbralRaptor: see the discussion with rqou a couple of days back
<egg|z|egg>
UmbralRaptor: if φ is a vector, you define <φ| := <φ| . > (. being a placeholder, so the function that maps v to the inner product <φ|v>), and then it looks cute if you call φ |φ> instead, that way <φ|(|φ>) = <φ|φ>
<egg|z|egg>
UmbralRaptor: so back to |θ><φ|, that's a different animal entirely; it's a tensor product!
<egg|z|egg>
it's really morally |θ>⊗ <φ|
<soundnfury>
It's an entirely different animal, altogether!
<egg|z|egg>
(whose coordinates are the θ^i φ_j, and whose trace is θ^i φ_i, which is clearly the same as <φ|θ>)
<egg|z|egg>
UmbralRaptor: but I guess that's a good opportunity to introduce the fact that endomorphisms on V are actually the same as elements of V⊗V*
* egg|z|egg
pokes UmbralRaptor in the feathers
<UmbralRaptor>
ow
<egg|z|egg>
UmbralRaptor: do you know about the tensor product
<egg|z|egg>
(besides having been sneakily taught about it hidden in brakets like |θ><φ|
<egg|z|egg>
)
<UmbralRaptor>
I'm increasingly thinking I'll need 2 years of linear algebra to get through 1 semester of QM.
<egg|z|egg>
UmbralRaptor: nah that fits in one year and tbh that probably fits in one hour
<egg|z|egg>
okay maybe two
<FluffyFoxeh>
I think it'd be neat to understand these advanced maths but since I'm not required to have a lot of exposure to it (computer science major) it's too easy for me to just avoid the heavy maths courses
<egg|z|egg>
fortunately, you can't escape them here muahahahaha
<UmbralRaptor>
I'm aware of the tensor product giving you a matrix from 2 vectors, but not in the detail you're going to suggest.
<FluffyFoxeh>
if I want to do an honours degree (which I might) I'll have to take another algorithms course
<egg|z|egg>
bofh: I wonder whether I should eggsplain the tensor product to UmbralRaptor in coordinates or go with the friendly universal property; the former might be quicker, but ew
awang has quit [Read error: Connection reset by peer]
awang has joined #kspacademia
<UmbralRaptor>
(also, wait. FluffyFoxeh is a Brit?)
<FluffyFoxeh>
no
<FluffyFoxeh>
canadian
<rqou>
hey egg|z|egg i have something to make you sad again
<rqou>
this class really likes the "bag of numbers" "shove everything into huge vectors" view of math
<egg|z|egg>
*sob*
<rqou>
also "This node also handles a lot of conversions between ROS, OpenCV, and NumPy array information."
<rqou>
apparently bikesheds never end in robotics :P
<UmbralRaptor>
Then what do bikesheds end in?
<UmbralRaptor>
Yak barbers?
<FluffyFoxeh>
yes
<egg|z|egg>
UmbralRaptor: okay, so first, the word bilinear means "linear in both arguments"; so b(v, w) is bilinear if it's linear as a function of v for fixed w, and linear as a function of w for fixed v
<egg|z|egg>
UmbralRaptor: now, a linear function F from a vector space V to a vector space W, which maps v in V to F(v) in W (in coordinates, a matrix) is the same thing as a bilinear scalar-valued function of V and W*: given an element v of V and an element ω of W*, ω(F(v)) is a scalar, and this is linear in ω and in v.
<egg|z|egg>
(and you can also take a bilinear form of V and W* and make it a function from V to W, which I leave as an eggsercise to the reader)
<egg|z|egg>
UmbralRaptor: so Linear Homomorphisms(V, W) = Bilinear Forms(V, W*).
* egg|z|egg
ponders
<egg|z|egg>
no, starting with the universal property is *really* going to be confusing I think
<egg|z|egg>
!wpn bofh
* Qboid
gives bofh a plane tofu/conifer hybrid
<UmbralRaptor>
… homomorphism?
<egg|z|egg>
UmbralRaptor: "function that preserves the structure of the thing we're talking about today", in this case that means "function that preserves the vector space structure" i.e. "linear map"
<UmbralRaptor>
ah
<egg|z|egg>
UmbralRaptor: okay I'll just go with coordinates; take a n-dimensional vector space V with a basis e_i and an m-dimensional vector space W with a basis e'_j, define the vector space V⊗W as the nm-dimensional space of linear combinations of the e_i ⊗ e'_j (which I declare to be linearly independent and say no more about for now)
<egg|z|egg>
UmbralRaptor: then, for any vectors v^i e_i in V and w^j e'_j in W, define v⊗w as v^i w^j e_i ⊗ e'_j
<egg|z|egg>
UmbralRaptor: for now there's not much to this, I'm just defining piles of chalk
<UmbralRaptor>
Uh
* UmbralRaptor
chooses to imagine egg having a large chalkboard in his office.
<egg|z|egg>
sadly I don't
<egg|z|egg>
I have a whiteboard
<egg|z|egg>
there's a cheshire cat on it
<egg|z|egg>
(blame phl)
<UmbralRaptor>
Kitty!
<egg|z|egg>
UmbralRaptor: I claim (and leave it to the reader to look up the proof) the following eggstremely weird-looking property: given a bilinear function B of V and W to a third vector space U, there eggsists a unique linear function f_B from the vector space V⊗W to U such that the following is true:
<egg|z|egg>
for all v in V, for all w in W, B(v, w) = f_B(v⊗w)
<egg|z|egg>
UmbralRaptor: what this means is that ⊗ allows you to turn bilinear maps into linear maps (of the tensor product)
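[A scalar-valued instance of that property can be checked numerically; a NumPy sketch, with M an illustrative component array for B:]

```python
# A bilinear B(v, w) = v^i M_{ij} w^j factors through ⊗ as the *linear*
# map f_B(T) = M_{ij} T^{ij} on V ⊗ W, so that B(v, w) = f_B(v ⊗ w).
import numpy as np

M = np.array([[1.0, 0.0, 2.0], [0.0, 3.0, 1.0]])  # components of B

B = lambda v, w: v @ M @ w      # bilinear in v and w
f_B = lambda T: np.sum(M * T)   # linear in the tensor T

v = np.array([1.0, -2.0])
w = np.array([0.5, 1.0, 2.0])
assert np.isclose(B(v, w), f_B(np.outer(v, w)))
```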
<egg|z|egg>
UmbralRaptor: now, this is increasingly weird-sounding, so let's get back to the remark above that Linear Homomorphisms(V, W) = Bilinear Forms(V, W*)
<egg|z|egg>
UmbralRaptor: Bilinear Forms(V, W*) are bilinear maps that take an argument in V and an argument in W* and whose value is a scalar
<egg|z|egg>
UmbralRaptor: so now take the weird property above, let U be the scalars
<egg|z|egg>
(and substitute W* instead of W because we're looking at a bilinear map that takes a vector and a form)
<egg|z|egg>
UmbralRaptor: then f_B becomes a linear function of V⊗W* with scalar values, i.e., a linear form on V⊗W*, i.e., an element of (V⊗W*)*
<egg|z|egg>
UmbralRaptor: now, it can be shown that in finite dimension, (V⊗W*)* = V*⊗W
<egg|z|egg>
UmbralRaptor: okay now let's go back to your outer product of kets and bras
<egg|z|egg>
UmbralRaptor: |θ> is a linear form, I'll write it θ_i e^i; <φ| is a vector, φ^i e_i.
<egg|z|egg>
UmbralRaptor: now, how is that a matrix? well, with bras and kets you can see that given a vector |egg>, |θ><φ|(|egg>) is <φ|egg> |θ>
<egg|z|egg>
uuuuh sorry I screwed things up
<egg|z|egg>
UmbralRaptor: I meant " |θ> is a vector, I'll write it θ^i e_i; <φ| is a linear form, φ_i e^i. |θ>⊗<φ| is (by definition above) θ^i φ_j e_i ⊗ e^j"
<egg|z|egg>
UmbralRaptor: so, <φ|egg> |θ> is φ_j egg^j θ^i e_i
<egg|z|egg>
UmbralRaptor: how is it also a bilinear form on (linear forms, vectors)? |θ><φ|(<stabbity|,|egg>) = <stabbity|θ><φ|egg> = stabbity_i θ^i φ_j egg^j
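[Both readings of |θ><φ| check out numerically; a NumPy sketch with made-up coordinates:]

```python
# |θ><φ| has coordinates θ^i φ_j (an outer product matrix); it acts on a
# ket as <φ|egg> |θ>, and on a (bra, ket) pair as <stabbity|θ><φ|egg>.
import numpy as np

theta = np.array([1.0, 2.0])      # ket |θ>, coordinates θ^i
phi = np.array([3.0, -1.0])       # bra <φ|, coordinates φ_j
egg = np.array([0.5, 2.0])        # ket |egg>
stabbity = np.array([1.0, 1.0])   # bra <stabbity|

op = np.outer(theta, phi)         # θ^i φ_j e_i ⊗ e^j

# as a linear map: |θ><φ| (|egg>) = <φ|egg> |θ>
assert np.allclose(op @ egg, (phi @ egg) * theta)
# as a bilinear form on (bras, kets):
assert np.isclose(stabbity @ op @ egg, (stabbity @ theta) * (phi @ egg))
```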
<egg|z|egg>
UmbralRaptor: now in particular, since |θ>⊗<φ| is a linear map, it makes sense to talk about its trace, which is why the question makes sense in the first place
<egg|z|egg>
UmbralRaptor: and so θ^i φ_i is its trace, which also is the application of the linear form to the vector <φ|θ>
<egg|z|egg>
UmbralRaptor: does some of this make sense
<UmbralRaptor>
uh, I got lost
<egg|z|egg>
UmbralRaptor: that's not entirely surprising, I went too fast and sketchy, but do any of the lower lines make sense once I stop blathering about the universal property
<egg|z|egg>
!wpn котя
* Qboid
gives котя an injective reduced µa741
<SnoopJeDi>
18:11 <@egg|z|egg> no, starting with the universal property is *really* going to be confusing I think
<SnoopJeDi>
HAH
<egg|z|egg>
SnoopJeDi: but it's so pretty! you get to show that linear homomorphisms are (V⊗W*)* :D
<SnoopJeDi>
egg|z|egg, you're a true mathematician
<egg|z|egg>
but that's also why Einstein notation makes sense!
<egg|z|egg>
it's all tensor products of spaces and their duals
<egg|z|egg>
whose coordinates have upper and lower indices respectively
<SnoopJeDi>
yea, the duality notion becomes really important in...well, all the things wot need the differential geometry fixins, mostly
<egg|z|egg>
well really in QM too, something something bras and kets
* SnoopJeDi
is still not entirely comfortable with it, but doesn't use these concepts so it works out
<egg|z|egg>
but they have an entirely different way of writing it :D
<SnoopJeDi>
yea sure I'm including QFT in that for sure
<SnoopJeDi>
covariant derivatives yada yada
<egg|z|egg>
SnoopJeDi: even just boring matrix mechanics
<SnoopJeDi>
egg|z|egg, I'd argue it's a superset
<SnoopJeDi>
but I'm a lousy mathematician ;)
<UmbralRaptor>
>_>;;
<egg|z|egg>
<egg| is a linear form, |egg> is a vector, |egg><egg| is a tensor product of the two (and is a pure state)
* egg|z|egg
gives UmbralRaptor a bag of indices
<UmbralRaptor>
s/pure state/operator/
<UmbralRaptor>
I thought?
<egg|z|egg>
well yes, but by virtue of being the result of a tensor product it's a very special kind of tensor