Iterated Function Systems are given by compositions of randomly chosen transformations of a separable metric space (OK, sometimes also complete). They are natural objects arising in modeling, and serve as a coordinatization for Markov chains with stationary transition probabilities. They have been going in and out of style, but they are guaranteed to raise a smile, in the sense that they have often been rediscovered in all innocence with new terminology. Here, in this part of the talk, I will insert a list of famous people over the last century, plus some friends (the intersection is non-empty), who have thought about these models. Iterated Function Systems have seen a lot of exposure lately because of Fractal Image Compression, which exploits the idea that rough textures often supply enough detail, and that such textures can be simulated with fast, simple code. In fact, before they were used to produce detailed and attractive pictures, they were used as models of learning and adaptation. We give a very simple proof of the main idea, explore how they could model learning, give some results, and hint at the methodology we used. If you just want the pictures, go to www.electricsheep.org, a free, open-source, downloadable software package whose name alludes to the greatest science fiction writer of all time.
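To make "composition of randomly chosen transformations" concrete, here is a minimal sketch (not from the talk; all names and parameters are my own) of the classic chaos-game construction: three contractive affine maps on the plane are applied in random order, and the orbit settles onto the Sierpinski triangle, the attractor of this IFS.

```python
import random

def sierpinski_ifs(steps=10_000, seed=0):
    """Iterate a randomly chosen contraction at each step and collect
    the orbit, which accumulates on the attractor of the system."""
    rng = random.Random(seed)
    # Each map contracts the point halfway toward one triangle vertex.
    vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
    x, y = 0.5, 0.5  # arbitrary starting point; the attractor forgets it
    points = []
    for _ in range(steps):
        vx, vy = rng.choice(vertices)      # pick one of the three maps uniformly
        x, y = (x + vx) / 2, (y + vy) / 2  # apply the chosen contraction
        points.append((x, y))
    return points

orbit = sierpinski_ifs()
```

Plotting `orbit` as a scatter of points reveals the fractal; the same random-composition recipe, with more elaborate affine or nonlinear maps, drives the pictures mentioned below.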