In class on Monday (4/30) we had project critiques. We weren't done with the project yet and still had a list of tasks. With feedback from the class and the professor, we nailed down a final to-do list to complete the project and make it look great for presentation at Vizagogo!
At this point, we needed:
1. A projector mount so the projector sits upright and projects onto the ceiling.
2. White mat-board attached to the ceiling to project onto.
3. Support for tracking multiple storms, color coded by the keyword each one tracks.
4. A wander function to make the storms move around the screen.
5. The keywords/hashtags displayed at the ends of the lightning bolts.
6. Interactivity, so that if someone tweets "Vizagogo" it does something cool.
#1 - Projector Mounting
Jake went to Home Depot and built a great projector stand out of a large piece of wood, a curved metal pipe, and a mounting device. He tested it and attached the projector to it - success!
#2 - White Mat-Board on the Ceiling
We went to Hobby Lobby and bought four squares of white mat-board and some clear duct tape. We attached it to the ceiling using the large ladder at the exhibition and used a ton of tape to secure it.
#3 - Multiple Storms and Color Coding
David worked on the functions for multiple storms, while Jake worked on the color coding. Before rendering any lightning bolt, we can now change a single variable to set that bolt's color. This is crucial because we never know which tweet will arrive next or which storm will need to spark a bolt! As for the "multiple" storms, there are actually two separate storms with their centers right on top of each other. This is what lets different words, like "Happy" and "Sad", shoot from the same area.
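The scheme above can be sketched in plain Java (the real sketch uses Processing's color type and its own classes; every name below is an illustrative assumption, not the project's actual code):

```java
// Minimal sketch: two storms share a center, each owns a keyword and a
// color, and a single shared variable is set just before a bolt renders.
public class Storms {
    // the one variable we change before rendering each bolt (packed ARGB)
    public static int boltColor;

    public static class Storm {
        public final String keyword;
        public final int color;      // packed ARGB, like Processing's color type
        public final float cx, cy;   // storm center

        public Storm(String keyword, int color, float cx, float cy) {
            this.keyword = keyword;
            this.color = color;
            this.cx = cx;
            this.cy = cy;
        }

        // called when a matching tweet arrives, just before the bolt draws
        public void spark() {
            boltColor = color;
        }
    }

    public static void main(String[] args) {
        // two storms with centers stacked on top of each other
        Storm happy = new Storm("Happy", 0xFFFFFF00, 400, 300);
        Storm sad   = new Storm("Sad",   0xFF4060FF, 400, 300);
        happy.spark();  // next bolt renders yellow
        sad.spark();    // next bolt renders blue
    }
}
```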
#4 - Wander Function
The previous wander function David wrote was very basic. It worked more like a jitter, telling the center of the storm to move up, down, left, or right at random. The new wander function is based on one of Processing's example boid functions. It uses variables for speed, acceleration, and turning rate, which allows for a far more interesting and organic-looking wander, and it also wraps the storm around to the other side if it moves outside the boundaries. The wrap-around also plants the idea in the audience's mind that the "Twitterverse" is never-ending. The internet cloud has no boundaries!
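A simplified version of this kind of wander can be sketched in plain Java: a small random acceleration clamped by a max force, a speed limit, and edge wrap-around. (This is a stand-in under assumptions, not our actual boid code - the real version also uses a turning rate, and all names here are made up.)

```java
import java.util.Random;

// Minimal boid-style wanderer: random steering force, limited speed,
// and wrap-around at the screen edges.
public class Wanderer {
    public double x, y;      // storm center position
    public double vx, vy;    // velocity
    public final double maxSpeed = 2.0;  // pixels per frame
    public final double maxForce = 0.1;  // steering acceleration limit
    public final int width, height;
    private final Random rng = new Random(42);

    public Wanderer(int w, int h) {
        width = w;
        height = h;
        x = w / 2.0;
        y = h / 2.0;
    }

    public void update() {
        // small random steering force, bounded by maxForce
        double ax = (rng.nextDouble() - 0.5) * 2 * maxForce;
        double ay = (rng.nextDouble() - 0.5) * 2 * maxForce;
        vx += ax;
        vy += ay;

        // cap the speed so the motion stays smooth rather than jittery
        double speed = Math.hypot(vx, vy);
        if (speed > maxSpeed) {
            vx = vx / speed * maxSpeed;
            vy = vy / speed * maxSpeed;
        }
        x += vx;
        y += vy;

        // wrap around the edges: the "Twitterverse" has no boundaries
        if (x < 0) x += width;
        if (x >= width) x -= width;
        if (y < 0) y += height;
        if (y >= height) y -= height;
    }

    public static void main(String[] args) {
        Wanderer w = new Wanderer(800, 600);
        for (int i = 0; i < 100; i++) w.update();
    }
}
```

Capping the acceleration and speed separately is what makes the path read as wandering rather than the old up/down/left/right jitter.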
#5 - Keyword/Hashtag Change
Previously, we had the keywords written at the centers of the storms and "TWEET" optionally written at the end of each lightning bolt. Based on class feedback, we decided it was best to remove the keyword from the center of each storm and replace the "TWEET" at the end of each bolt with the keyword that the bolt represents. This required some large coding changes, but after a lot of tweaking and simplifying, it looks nice and is much more organized. The words "Happy" and "Sad" or "Love" and "Hate" are also color coded when displayed at the ends of the bolts.
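The placement itself is simple once a bolt is a list of points: the label just goes at the bolt's final point. (The representation below is an assumption for illustration - our actual bolt code is based on Richard Wong's lightning effect and is structured differently.)

```java
import java.util.List;

// Sketch: given a bolt's points, ordered from the storm center outward,
// the keyword is drawn at the last point, in the storm's color.
public class BoltLabel {
    public static double[] labelPosition(List<double[]> boltPoints) {
        return boltPoints.get(boltPoints.size() - 1);
    }

    public static void main(String[] args) {
        List<double[]> bolt = List.of(
            new double[]{400, 300},   // storm center
            new double[]{420, 340},
            new double[]{415, 390});  // bolt tip - keyword goes here
        double[] tip = labelPosition(bolt);
    }
}
```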
#6 - Interactivity
At first, we wanted the user to be able to change the keywords on-the-fly to see more of what is being tweeted about. Unfortunately, this would require the user to close the current process, change the keywords, re-save the file, recompile it, and re-run it in fullscreen mode with the cursor off the screen. Also, some keywords overloaded the Twitter streaming API, and we would get track-limited. After some debate about the possibilities, we concluded that it would be really cool if people could tweet "Vizagogo" and actually see their tweet come up on the screen. Now, if someone tweets it, "VIZAGOGO!" appears on the screen in large capital yellow letters!
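The trigger check for this can be sketched as a simple substring test on each incoming tweet (an assumed, simplified version of the logic - the method name and case-insensitive matching are illustrative choices, not the project's actual code):

```java
// Sketch: when an incoming tweet's text contains the trigger word,
// the "VIZAGOGO!" overlay is shown on screen.
public class TweetTrigger {
    public static final String TRIGGER = "vizagogo";

    // case-insensitive, so "Vizagogo", "VIZAGOGO", etc. all count
    public static boolean shouldShowOverlay(String tweetText) {
        return tweetText != null
            && tweetText.toLowerCase().contains(TRIGGER);
    }

    public static void main(String[] args) {
        boolean hit = shouldShowOverlay("Just saw VIZAGOGO at the show!");
    }
}
```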
For the Vizagogo Exhibition:
We managed to complete everything on our list, work out all of the errors, and smooth out all of the lag and other rendering problems on time and for display at the exhibition! We also added a text description projected onto the ceiling because many people weren't reading the title cards and we wanted to promote the interactivity of tweeting while watching it!
Looking back on the project:
For our first time writing a large Processing-based program and learning a complicated real-time streaming API, I feel that this project was a huge success. I think both of us would like to use this project as a stepping stone toward larger-scale work like Jer Thorpe's, or toward other interesting generative art explorations! Special thanks to Richard Wong for providing the source code for the lightning effect!
Screen Recording of the Final Project
Here are some screen recordings of the final project in action! I also tweeted "Vizagogo" a few times in each clip. Sorry for the frame rate - screen recording while running Processing fullscreen isn't the best quality!
Twitterverse - Happy vs Sad: http://youtu.be/T9TaIy5N2jM?hd=1
Twitterverse - Love vs Hate: http://youtu.be/bQ-zX2xASZ8?hd=1