I finished a first implementation of a small framework for particle swarms. Even though they are not seldom handled in a standard, off-the-shelf way, their function is by nature such that you should implement them yourself if you use them, because understanding how they work is required for them to be meaningful. Without that understanding you are better off choosing other approximation methods, since the parameters need to be steered either for the specific problem or "dynamically" via other algorithms if the problem changes over time.
I usually amuse myself with quick private notes - originally as a case study of statistical properties of certain linguistic components for which, as far as I know, no finished statistics yet exist (cf. what one looks for in the zebra finch, for example, as expressed in Neural Network Models for Zebra Finch Song Production and Reinforcement Learning).
The following is an example of such notes that I saw some value in publishing for possible comments. The implementation of the particle swarm I will probably publish today or tomorrow, Tuesday at the latest, and a link to it will surely be found from Svärmintelligens är självorganiserande intelligens and Kreativ intelligens för artificiellt liv. A generally very good resource on the whole field - comprehensive, high-quality authors, well-written articles - is also Scholarpedia, even if the article on particle swarms is perhaps not the one given the highest priority; still an excellent start: Particle swarm optimization.
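Until then, and for orientation only, here is a minimal sketch of the standard global-best update I am talking about - not my framework. The inertia, cognitive and social weights w, c1, c2 and the sphere fitness are just placeholder choices:

```python
import random

def pso(fitness, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal global-best particle swarm; minimizes `fitness`."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # personal best positions
    pbest_val = [fitness(p) for p in pos]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Placeholder fitness: the sphere function, minimum at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```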
Funny thing about the particle swarm. I can see it work when I visualize it - the particles and the data moving. The n-dimensional space really has more dimensions than that - the fitness function and the data express the same thing - an idea of the world that perhaps reduces unlikeliness more than other things do. Bayesian statistics.
Isn't everything about that? Visual attention focusing on the object we understood least about before? People wasting money on the lottery? Commercials being that unique high, driving on visual energy - entropy high in light, speed and proportion - while for the orbitofrontal cortex forming a discrete, never-spoken riddle? Or the unknown danger that voices speak about, time after time, before a country goes to war?
And I see why we all just love the particle swarm. Simple, easy to get working - for sure it costs some, yet not much as long as your hands stay on the parameters; nothing there that does not want to find a solution good enough. Also a bit of efectos especiales (special effects) - The Magic of Perception - making it in itself a riddle filled with stimulating entropy, both for the programmer and for the one ordering the system built.
But it is as if it points to something general - in this concept - something one sees, and then it glides away, because you fall back down to the written algorithm and that misses part of it.
The data and the fitness function being separate, while perhaps they do not have to be. One should always ask, in this area, which math comes naturally to us. We spend years in school learning the easiest non-linear math by heart because we lack any natural ability to form it - the multiplication tables.
Perhaps it should here only have a fuzzy direction - much like our general motivations or fears, easy to rule us but never anything we can express in direct words even to ourselves - while the rest works its own way.
All of it being self-organization, whatever problem it solves. Particles form themselves expressing the data, and on the other side of the same thing sits the fitness function, which by itself could just as well form a solution to a problem you did not know existed - a possibility.
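To make at least the trivial direction of that concrete: nothing stops the fitness function from being built directly out of the data, so that the "problem" the swarm solves is itself an expression of the data. A toy sketch, reusing the pso() sketch above, with made-up data:

```python
import math
import random

# Hypothetical data set; in practice this is whatever the particles describe.
data = [(random.gauss(2.0, 0.5), random.gauss(-1.0, 0.5)) for _ in range(50)]

def fitness_from_data(candidate):
    """Fitness defined by the data itself: total Euclidean distance from the
    candidate point to every data point (so the optimum is a geometric median)."""
    x, y = candidate
    return sum(math.hypot(x - dx, y - dy) for dx, dy in data)

# The swarm now optimizes a function that is nothing but the data.
center, total_dist = pso(fitness_from_data, dim=2)
```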
It isn't about solving one problem but about solving problems that come and go - the weights are kept. A general strategy might have worked, being bossy. But also something more concrete pops up from the past, activated by a bold touch of the data, given a tool boldly put in, forming a fitness function that activates the tool's associated data.
This slips away though. Colors, contrast and proportions, memory, idea, experience - all being data, while knowledge is a tool, and no tools exist for forming the fitness function - all being the same thing, but a general view of where you are going.
I do wonder if it might be good to run a self-organizing map before it. Really, in a way I see little difference: it is all Euclidean distances between data points, where the relative distances express a simplification of the data, both for the particle swarm and the Kohonen map, or for that matter nearest neighbor and all the rest. And for sure I see no difference either compared to forming temporal correlation weights, which of course is very important. Still, it might teach you things, seeing the problem - or a part of a bigger problem area - in a slightly different way.
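A minimal sketch of a Kohonen map makes the point: the same Euclidean distances pick the winner, and the neighborhood pull is just another weighted move toward a data point. The grid size, learning rate and radius below are arbitrary choices, not recommendations:

```python
import math
import random

def train_som(data, grid=(5, 5), dim=2, iters=500, lr0=0.5, radius0=2.0):
    """Minimal Kohonen self-organizing map on a small 2-D grid."""
    nodes = {(i, j): [random.uniform(-1, 1) for _ in range(dim)]
             for i in range(grid[0]) for j in range(grid[1])}
    for t in range(iters):
        frac = t / iters
        lr, radius = lr0 * (1 - frac), radius0 * (1 - frac) + 0.5
        x = random.choice(data)
        # Winner: the node whose weight vector is closest (Euclidean) to the sample.
        win = min(nodes, key=lambda k: sum((a - b) ** 2 for a, b in zip(nodes[k], x)))
        for k, w in nodes.items():
            grid_dist = math.hypot(k[0] - win[0], k[1] - win[1])
            if grid_dist <= radius:
                h = math.exp(-grid_dist ** 2 / (2 * radius ** 2))
                for d in range(dim):
                    w[d] += lr * h * (x[d] - w[d])
    return nodes

# For instance on the same made-up data as above:
# som = train_som(data)
```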
Here I also come back to why I feel it holds value to express the neuron at least partly close to more realistic functions. For solving problems, for sure, no such value is obvious. The leaky integrate-and-fire, though, costs little - probably equal or similar to the "normal" sigmoids built on tanh or e.
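As a rough cost comparison only: a discretized leaky integrate-and-fire update is a couple of multiply-adds and a threshold test per step, so it lands in the same ballpark as one tanh or logistic evaluation. The constants below are illustrative, not fitted to anything:

```python
import math

def lif_step(v, i_in, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """One Euler step of a leaky integrate-and-fire neuron.
    Returns (new membrane potential, whether it spiked)."""
    v = v + dt / tau * (-(v - v_rest) + i_in)
    if v >= v_thresh:
        return v_reset, True
    return v, False

def logistic(x):
    """Logistic unit for comparison: one exp, one divide per evaluation."""
    return 1.0 / (1.0 + math.exp(-x))
```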
Perhaps it is something in the relative "speed" or "conductance" or particle flow across the change of medium. Yes, that might perhaps be it. The neuron is just the neuron, expressing the "sum" and the "knowledge". The dendrites forming the path of the solution - the alternatives, a road map - while you have a floating, fluid state forming a solution, seen expressed before in a bigger probability cloud which would follow Bayes in expressing the neuron, but which also holds, through rare events, more possibilities in the chemicals released at the synapses; and the sheer numbers, quantum probabilities, in-the-moment metabolic changes and fluctuating nutrition, as well as the immune system, add instability. All of that one can for sure simplify away, but perhaps some value or importance exists here that I touched on but forgot? The enzymes perhaps? MAO and so on? The immune system? Well, if it is important it will come back to me.