Reacting in realtime
All three of these effects run on realtime input, meaning they work on the signals around you, sound and light, and do not depend on any pre-analysis. For example, Beatblitz analyses the incoming audio input and applies a Fast Fourier Transform (FFT) to split the signal into frequency bands (see picture). The app then watches the bass band for large jumps within its time buffer. The result: your camera light flashes whenever the drummer hits the bass drum.
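The idea described above can be sketched in a few lines: FFT each audio frame, sum the low-frequency magnitudes, and flag a beat when the bass energy spikes above its recent average. This is only an illustrative sketch, not the app's actual code; the names and constants (frame size, bass cutoff, threshold) are assumptions.

```python
# Minimal sketch of FFT-based bass-beat detection.
# All constants are illustrative assumptions, not taken from Beatblitz.
import numpy as np

SAMPLE_RATE = 44100
FRAME_SIZE = 1024          # samples per FFT frame (~23 ms at 44.1 kHz)
BASS_CUTOFF_HZ = 150       # upper bound of the "bass" band
HISTORY_FRAMES = 43        # ~1 s time buffer of recent bass energies
THRESHOLD = 1.5            # beat when bass is 1.5x the recent average

def bass_energy(frame: np.ndarray) -> float:
    """FFT one frame and sum the magnitudes below the bass cutoff."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    return float(spectrum[freqs <= BASS_CUTOFF_HZ].sum())

def detect_beats(samples: np.ndarray) -> list[int]:
    """Return frame indices where the bass band spikes above its history."""
    history: list[float] = []
    beats: list[int] = []
    for i in range(0, len(samples) - FRAME_SIZE, FRAME_SIZE):
        energy = bass_energy(samples[i:i + FRAME_SIZE])
        if history and energy > THRESHOLD * (sum(history) / len(history)):
            beats.append(i // FRAME_SIZE)  # here the app would flash the light
        history.append(energy)
        history = history[-HISTORY_FRAMES:]  # keep only the time buffer
    return beats
```

Feeding this one second of quiet audio with a single loud low-frequency burst in the middle yields exactly the behaviour the text describes: the burst frame crosses the threshold and triggers a "flash".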
Sensors * Android = Fireworks
The Zipper, meanwhile, is a visual effect that runs on your smartphone's light and position sensors. Here the algorithm analyses the GPS position and kicks in when the same position is detected twice; in other words, it maps the motion of you waving your phone back and forth, like pulling a zipper, onto a mathematical correlation.

The Painter also relies on the light sensors in the camera, the same ones responsible for properly exposing your pictures. When a certain difference in brightness is detected, the display starts to flash. This effect only works with 23 smartphones or more. Each smartphone represents one pixel: in a crowd of 300 people at a concert, you get a 17 x 17 pixel grid and can draw anything you want into the crowd. Maybe you want the crowd to display your logo? You should give it a try!
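The crowd-to-pixel mapping behind The Painter can be sketched as below, following the article's own numbers: 300 phones give a 17 x 17 grid because 17 is the largest integer whose square fits into 300. The function and parameter names are hypothetical, not from the app.

```python
# Sketch of mapping a crowd of phones onto a square pixel grid.
# Illustrative only; names and the grid-shape choice are assumptions.
import math

MIN_PHONES = 23  # minimum crowd size quoted in the article

def grid_side(phone_count: int) -> int:
    """Largest square grid the crowd can fully fill: floor(sqrt(n))."""
    if phone_count < MIN_PHONES:
        raise ValueError(f"need at least {MIN_PHONES} phones, got {phone_count}")
    return math.isqrt(phone_count)

def assign_pixel(phone_index: int, side: int) -> tuple[int, int]:
    """Map the n-th phone to its (row, column) pixel in the grid."""
    return divmod(phone_index, side)
```

With `grid_side(300)` returning 17, phone 0 becomes pixel (0, 0), phone 18 becomes pixel (1, 1), and so on row by row; any leftover phones beyond the 289-pixel grid simply go unused in this sketch.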