Learn more about AI, math, and physics with courses such as Neural Networks on Brilliant! First 200 to use our link https://brilliant.org/sabine will get 20%...
Most of the largest datasets are kind of garbage because of this. I've had this idea: run the data through the network every epoch and evict samples whose outputs are too similar to the targets for the next epoch, but I've never tried it. Probably someone smarter than me has already tried that and it didn't work. I just feel like there's some mathematical way around this we aren't seeing. Humans are great at filtering out the cruft, so there must be some indicators there.
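A minimal sketch of what that idea might look like, assuming a toy linear model and a hand-picked error threshold: after each epoch, the full dataset is re-scored and any sample the model already reproduces to within `tau` of its target is evicted from the next epoch's training set. The names (`tau`, `active`) and the model itself are illustrative assumptions, not the commenter's actual recipe.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.01 * rng.normal(size=200)  # near-noiseless toy targets

w = np.zeros(5)
active = np.arange(len(X))  # indices trained on in the current epoch
tau = 0.05                  # eviction threshold on per-sample error

for epoch in range(50):
    # gradient step on the surviving samples only
    pred = X[active] @ w
    grad = X[active].T @ (pred - y[active]) / len(active)
    w -= 0.1 * grad
    # re-score the whole dataset; only samples the model still gets
    # wrong survive into the next epoch ("evict for the next epoch")
    active = np.where(np.abs(X @ w - y) > tau)[0]
    if len(active) == 0:
        break

print(f"{len(active)} of {len(X)} samples still active")
```

Re-scoring the full dataset each epoch (rather than evicting permanently) matches the "for the next epoch" phrasing: a sample evicted early can re-enter if the model drifts away from it later.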