egg|nomz|egg changed the topic of #kspacademia to: https://gist.github.com/pdn4kd/164b9b85435d87afbec0c3a7e69d3e6d | Dogs are cats. Spiders are cat interferometers. | Космизм сегодня! | Document well, for tomorrow you may get mauled by a ネコバス. | <UmbralRaptor> egg|nomz|egg: generally if your eyes are dewing over, that's not the weather. | <ferram4> I shall beat my problems to death with an engineer.
<e_14159>
Interesting observation: Apparently, an ML algorithm is unable to learn to predict the next character in Wikipedia if you're too stupid and mess up the data generation.
<e_14159>
<asDEm\n" <-- What's the next character? :-/
<egg|cell|egg>
喵 probably
<e_14159>
egg|cell|egg: Two-byte unicode is, in this case, actually encoded as two separate bytes
<e_14159>
(input space is 205)
APlayer has joined #kspacademia
egg|cell|egg is now known as egg|train|egg
egg|phone|egg has joined #kspacademia
egg|train|egg has quit [Ping timeout: 383 seconds]
egg|phone|egg has quit [Ping timeout: 182 seconds]
egg|phone|egg has joined #kspacademia
<egg|phone|egg>
!Wpn whitequark
* Qboid
gives whitequark a molten cipher-like segfault
egg|cell|egg has joined #kspacademia
egg|phone|egg has quit [Ping timeout: 207 seconds]
egg|phone|egg has joined #kspacademia
APlayer has quit [Ping timeout: 182 seconds]
APlayer has joined #kspacademia
egg|cell|egg has quit [Ping timeout: 207 seconds]
egg|cell|egg has joined #kspacademia
egg|phone|egg has quit [Ping timeout: 207 seconds]
<kmath>
<collinkrum> Who decided to call it “the list of satellites in graveyard orbit” and not “orbituaries”
<egg|cell|egg>
Hah
egg|cell|egg is now known as egg|train|egg
<egg|train|egg>
Second train
<egg|train|egg>
!Wpn UmbralRaptor
* Qboid
gives UmbralRaptor a subnormal edge
<UmbralRaptor>
!wpn egg|train|egg
* Qboid
gives egg|train|egg a Kozai trapezohedron
<UmbralRaptor>
AAAAAAAAA
<egg|train|egg>
古在!
<e_14159>
UmbralRaptor: Someone sane?
<UmbralRaptor>
e_14159: uh, it's a silly pun.
<e_14159>
UmbralRaptor: Oh, I know
egg|train|egg has quit [Ping timeout: 198 seconds]
egg|phone|egg has joined #kspacademia
<UmbralRaptor>
First, you need to determine where you are going to place all of the executable binaries. In most cases, you will want to place the executable binaries in /usr/local/bin, but if you are on a multiuser system, and you are the only one who will be using SPECTRUM, it might be better to place the executables in a local ``bin'' directory. Consult your system administrator
<UmbralRaptor>
uhm.
* UmbralRaptor
parties like it's 1988?
icefire has joined #kspacademia
egg|phone|egg has quit [Ping timeout: 207 seconds]
egg|phone|egg has joined #kspacademia
<bofh>
UmbralRaptor: welcome to scientific code!
<UmbralRaptor>
bofh: There are nice example graphs of results with test data. In gnuplot. O_o
<kmath>
<bofh453> Convinced GNUPlot was created via genetic algorithm optimizing for "maximal frustration while simultaneously technically also plotting data"
<UmbralRaptor>
Thankfully I don't have to use it. But that does make me feel sorry for the document writer.
<whitequark>
hahahahaha
UmbralRaptor is now known as SpectralRaptor
egg|phone|egg has quit [Ping timeout: 198 seconds]
<kmath>
<bofh453> @whitequark @erincandescent "Permission to modify the software is granted, but not the right to distribute the comp… https://t.co/Fr3eUqXYVK
egg|phone|egg has quit [Ping timeout: 207 seconds]
egg|phone|egg has joined #kspacademia
egg|phone|egg has quit [Ping timeout: 207 seconds]
egg|phone|egg has joined #kspacademia
egg|cell|egg has joined #kspacademia
egg|phone|egg has quit [Ping timeout: 383 seconds]
* egg
pets ANBOcat
ferram4_ has quit [Read error: Connection reset by peer]
<kmath>
<StefanEJones> Mehhhh. Good night Twitter. One of you go write an award winning fantasy trilogy based on this photo. Would read. https://t.co/ZdnCbbwYL8
<APlayer>
For an Arduino project, I intend to use a camera as an optical flow sensor. The Arduino is not quite made for image processing so I need either a very simple algorithm or some piece of hardware that would add raw processing power, like some sort of "embedded GPU". But first, I want to see what I can do regarding the algorithm
<APlayer>
To break it down, I either plan to run some sort of feature detection (optionally scaling the image down while preserving contrast, somehow, if that saves CPU time), or, as an alternative to feature detection, I wanted to split the image into a few areas, average the color values over them, and use the color gradients between areas as the "features" of the image
<SpectralRaptor>
A RasPi would presumably have useful amounts of processing power. Beyond that, unsure.
<APlayer>
Next, I will need to map corresponding features from one image to the next, calculate how much they have moved and average the values to get actual image motion
<APlayer>
Sure, it would, and it would be my choice. But the project is meant to be done in a team of two, and my teammate is slightly uncomfortable with more complex systems such as the RPi. We settled on an Arduino as an alternative that is easier to program
<bofh>
I can't help but think this would be better done simply by storing past frames in a buffer and using uneven multi-hex motion estimation with a sum-of-absolute-differences (SAD) cost function over, say, 8x8 blocks.
<APlayer>
bofh: Uh, and now in English, please :P
<APlayer>
As for buffers, I am not sure if even one image would completely fit into the RAM. That is, I am fairly sure it would not. I need to read it out of flash as I am processing it
<bofh>
Basically take two images and find which, say, 8x8 pixel blocks match the previous image's blocks when shifted slightly. Then the direction of motion is merely the negation of the vector you needed to add to the block's position to make it match the one in the previous frame.
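The block search bofh describes can be written down compactly. A minimal Python sketch, assuming grayscale frames stored as nested lists of pixel values (on an Arduino the same loops would run in C over a byte buffer); the exhaustive search window here stands in for the hexagonal pattern a real multi-hex search would use:

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized blocks."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def block(frame, y, x, size):
    """Extract a size-by-size block with top-left corner at (y, x)."""
    return [row[x:x + size] for row in frame[y:y + size]]

def best_shift(prev, curr, y, x, size=8, radius=2):
    """Find the shift (dy, dx) that best maps the block at (y, x) in
    `prev` onto `curr`, by exhaustive SAD search over a small window.
    The estimated scene motion is then the negation, (-dy, -dx)."""
    ref = block(prev, y, x, size)
    best, best_cost = None, float('inf')
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ny, nx = y + dy, x + dx
            # Skip candidate blocks that fall outside the frame.
            if ny < 0 or nx < 0 or ny + size > len(curr) \
                    or nx + size > len(curr[0]):
                continue
            cost = sad(ref, block(curr, ny, nx, size))
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best, best_cost
```

Repeating this over several blocks and averaging the resulting vectors gives the overall image motion; a production search (x264's uneven multi-hexagon search, for instance) replaces the exhaustive window with a coarse-to-fine hexagonal pattern to cut the number of SAD evaluations.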
<APlayer>
Except if the camera I choose brings a sufficiently big buffer
<bofh>
But like, doing *any* sort of motion vector gathering on an arduino is hard IMO.
<APlayer>
bofh: "find which pixel blocks match the previous image's blocks" sounds easier than it is
<APlayer>
In fact, it's the pairing of "features" (as I called them) between the first and the second frame that I am concerned about
<APlayer>
How would I cheaply compare two things?
<APlayer>
And, in your method, how would I shift the blocks, additionally? I can't just go and start randomly trying shifts and rotations of pixel blocks
<APlayer>
Ah, so sum of absolute differences is the way to compare things?
<bofh>
It's a way, it just happens to usually be by far the cheapest possible in terms of computational cost.
<APlayer>
I see
<APlayer>
What I researched regarding the multi-hex thing does not look promising to me, though (sum of absolute differences sure does!)
<APlayer>
I mean, the multi-hex algorithm seems like it will cause a lot of problems with the SAD if I shift slightly off the target location, but not by enough to land on a neighbouring search point
<APlayer>
In that regard, feature detection by finding high-contrast areas seems more useful to me, TBH
<APlayer>
That is, find a few high-contrast areas, compare them, estimate motion
<APlayer>
To visualize what I mean, imagine an extreme case: frame 1 is a uniform white image with a single black speck of dust on it. Multi-hex compares the 8x8 block containing the speck with the same speck in frame 2, but is off by a few pixels, so the positions of the speck in the two blocks mismatch. That gives a total error of two specks of dust. Any uniformly white area in frame 2 will score better than that,
<APlayer>
because its total error is only one speck of dust
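APlayer's worry can be checked with actual SAD numbers. A toy calculation, assuming white pixels are 0 and the speck is 255, on flat 8-pixel blocks (all values and names here are illustrative):

```python
def sad(a, b):
    """Sum of absolute differences between two flat pixel blocks."""
    return sum(abs(x - y) for x, y in zip(a, b))

WHITE, SPECK = 0, 255

# Reference block from frame 1: the speck sits at index 0.
ref = [SPECK] + [WHITE] * 7

# Candidate A: the right region of frame 2, sampled a few pixels
# off, so the speck lands at index 3 instead of index 0.
off_target = [WHITE] * 3 + [SPECK] + [WHITE] * 4

# Candidate B: a uniformly white region elsewhere in frame 2.
plain_white = [WHITE] * 8

print(sad(ref, off_target))   # 510: an error of two specks
print(sad(ref, plain_white))  # 255: an error of one speck, so this
                              # (wrong) block wins the SAD search
```

This is exactly the failure mode described above: unless the search pattern lands close enough to the true offset, a featureless block beats the slightly misaligned correct one, which is an argument for either a denser search near the minimum or for matching only high-contrast regions.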