Wednesday, May 9, 2012

Final Project Update - 5/9

The project was a great success! But before we get into that, let's walk through what led up to it in chronological order, starting with Monday, April 30th...

In class on Monday (4/30) we had project critiques. We were not yet done with the project, and with the class's and professor's feedback we nailed down a final list of things to do to complete it and make it look great for presentation at Vizagogo!

At this point, we needed:
1.  A mounting device to hold the projector upright so it projects onto the ceiling.
2.  White mat-board attached to the ceiling to project onto.
3.  Support for tracking multiple storms, color-coded by connotation.
4.  A wander function to make the storms move around the screen.
5.  The keywords/hashtags moved to the ends of the lightning bolts.
6.  Interactivity, so that if someone tweets "Vizagogo" it does something cool.

#1 - Projector Mounting
Jake went to Home Depot and built a great projector stand from a large piece of wood, a curved metal pipe, and a mounting device. Jake tested it and attached the projector to it - success!

#2 - White Mat-Board on the Ceiling
We went to Hobby Lobby and bought four squares of white mat-board and some clear duct tape.  We attached it to the ceiling using the large ladder at the exhibition and used a ton of tape to secure it.

#3 - Multiple Storms and Color Coding
David worked on the functions for multiple storms, while Jake worked on the color coding function. We made it so that before we render any lightning bolt, we can change a single variable to set that bolt's color. This is crucial because we can't know which tweet will arrive next, and therefore which storm will need to spark a bolt! As for the "multiple" storms, there are actually two different storms with their centers right on top of each other. This is what allows us to have multiple different words, like "Happy" and "Sad", shooting from the same area.
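The color switch boils down to one mutable variable consulted at render time. A minimal sketch in plain Java - the class, method names, and color values here are illustrative assumptions, not our exact Processing code:

```java
// Sketch of the per-bolt color switch: set one variable before rendering.
public class BoltColor {
    // One mutable color, consulted every time a bolt is drawn.
    static int currentColor = 0xFFFFFFFF; // default: white (ARGB)

    // Pick the color for the storm the incoming tweet belongs to.
    static int colorFor(String storm) {
        if (storm.equals("Happy")) return 0xFFFFFF00; // yellow
        if (storm.equals("Sad"))   return 0xFF4444FF; // blue
        return 0xFFFFFFFF;                            // unmatched: white
    }

    // Called once per bolt: switch the color, then draw with it.
    static int renderBolt(String storm) {
        currentColor = colorFor(storm);
        // ...draw the lightning segments using currentColor...
        return currentColor;
    }
}
```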

#4 - Wander Function
The previous wander function David wrote was very basic. It worked more like a jitter, telling the center of the storm to move up, down, left, or right at random. The new wander function is based on one of Processing's example boid functions. It uses variables for speed, acceleration, and turning rate. This allows for a far more interesting and organic-looking wander, and it also implements wrap-around if the storm goes outside the boundaries. The wrap-around also plants the idea in the audience's mind that the "Twitterverse" is never-ending. The internet cloud has no boundaries!
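The boid-style wander with wrap-around can be sketched like this in plain Java - the field names and constants are illustrative, not our exact code:

```java
// Minimal wander-and-wrap sketch in the spirit of Processing's boid examples.
public class Wander {
    double x, y;                 // storm center
    double heading;              // direction of travel, in radians
    double speed = 2.0;          // pixels advanced per frame
    double turnRate = 0.3;       // max heading change per frame (radians)
    final double width, height;  // screen bounds for wrap-around

    Wander(double width, double height) {
        this.width = width;
        this.height = height;
    }

    // One animation frame: turn a little, step forward, wrap at the edges.
    void step(double desiredTurn) {
        // Clamp the turn so the path stays smooth and organic-looking.
        heading += Math.max(-turnRate, Math.min(turnRate, desiredTurn));
        x += speed * Math.cos(heading);
        y += speed * Math.sin(heading);
        // Wrap-around: the Twitterverse has no boundaries.
        x = ((x % width) + width) % width;
        y = ((y % height) + height) % height;
    }
}
```

The clamp on the turn is what separates this from the old jitter: the storm can only bend its path a little each frame, so it traces smooth arcs instead of twitching in place.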

#5 - Keyword/Hashtag Change
Previously, we had the keywords written at the centers of the storms and "TWEET" optionally written at the end of each lightning bolt. Based on class feedback, we decided it was best to remove the keyword from the center of each storm and replace the "TWEET" at the end of each bolt with the keyword that the bolt represents. This required some large coding changes, but after a lot of tweaking and simplifying, it looks nice and is much more organized. The words "Happy" and "Sad" or "Love" and "Hate" are also color-coded when displayed at the ends of the bolts.
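The core of the change is just where the label is anchored; a tiny sketch (names are illustrative, not our exact code):

```java
// The label moves from the storm center to the bolt's last vertex.
public class BoltLabel {
    // A bolt is an ordered polyline of {x, y} points; the keyword label
    // is anchored at the final point, where the bolt strikes.
    static double[] labelAnchor(double[][] points) {
        return points[points.length - 1];
    }
}
```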

#6 - Interactivity
At first, we wanted the user to be able to change the keywords on the fly and explore more of what is being tweeted about. Unfortunately, this would require the user to close the current process, change the keywords, re-save the file, recompile it, and re-run it in fullscreen mode with the cursor off the screen. Also, some keywords would overload the Twitter Streaming API and get us track-limited, meaning the stream drops tweets beyond its delivery cap. After some debate about the possibilities, we concluded that it would be really cool if people could tweet "Vizagogo" and actually see their tweet come up on the screen. If someone tweets it, "VIZAGOGO!" appears on the screen in large capital yellow letters!
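The trigger itself is just a substring check on each incoming tweet. A sketch - the class and method names are our own, and in practice the text comes from the streaming client:

```java
// Sketch of the "Vizagogo" trigger on incoming tweet text.
public class Trigger {
    // Case-insensitive, so "vizagogo", "Vizagogo", and "VIZAGOGO" all fire.
    static boolean isVizagogo(String tweetText) {
        return tweetText.toLowerCase().contains("vizagogo");
    }
}
```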

For the Vizagogo Exhibition:
We managed to complete everything on our list, work out all of the errors, and smooth out all of the lag and other rendering problems in time for display at the exhibition! We also projected a text description onto the ceiling, because many people weren't reading the title cards and we wanted to promote the interactivity of tweeting while watching!

Looking back on the project:
For our first time writing a large Processing-based program and learning a complicated real-time streaming API, I feel this project was a huge success. I think both of us would like to use this project as a stepping stone toward larger-scale work like Jer Thorp's, or toward other interesting generative art explorations! Special thanks to Richard Wong for providing the source code for the lightning effect!

Screen Recording of the Final Project
Here are some screen recordings of the final project in action! I also tweeted "Vizagogo" a few times in each clip. Sorry about the frame rate - screen recording while running Processing fullscreen doesn't produce the best quality!

Twitterverse - Happy vs Sad: http://youtu.be/T9TaIy5N2jM?hd=1

Twitterverse - Love vs Hate: http://youtu.be/bQ-zX2xASZ8?hd=1

Wednesday, April 25, 2012

Final Project Update - 4/25

-We now have the lightning changing color and then fading back to white on its own.

-Have started trying to get typography to appear when tweets are read in. The desired effect is for the words to appear at the end of the lightning bolt, hover for a while, and then fade out again.

-Have contacted Cody at the Federal Building to set up a time to get wifi access for our project. We want the tweets being read in to be from a live stream, not from a preset library.

-Have checked out a projector from Glen and will be coming up with ways to stabilize it in order for it to project onto the ceiling. Concerns are that the power cord plugs into the back of the projector, the projector is designed for air flow to be horizontal, not vertical, and that the projector is very heavy and not easily stood up on end. These things are being figured out.
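The color-then-fade behavior in the first bullet boils down to nudging each RGB channel back toward 255 every frame. A rough sketch - the fade rate and names are illustrative, not our actual code:

```java
// Each frame, move every RGB channel a fixed fraction toward 255 (white).
public class FadeToWhite {
    // rgb: current {r, g, b} channels; rate: fraction of the gap to close.
    static int[] fadeStep(int[] rgb, double rate) {
        int[] out = new int[3];
        for (int i = 0; i < 3; i++) {
            out[i] = (int) Math.round(rgb[i] + (255 - rgb[i]) * rate);
        }
        return out;
    }
}
```

Because each step closes a fraction of the remaining gap, the fade starts fast and eases out, which reads more naturally than a linear fade.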

Wednesday, April 18, 2012

Final Project Update - 4/18

-We have decided to use a projector setup to project our display on the ceiling. We liked the idea of having it be symbolic of the tweets entering the "Twitterverse", the invisible network of tweets flying above our heads to their destinations.

-Using Processing, we have gotten to the point where the lightning shoots out from a central point, and wanders around the screen randomly. We have also gotten the lightning to change color depending on a variable. The "tweet-reading" technique is still being perfected.

Tuesday, April 10, 2012

Final Project Aesthetic Crit - 4/11

This twitter data visualization will have 3 specific elements to it:

1) "3d" objects representing the different subjects tweeted about

The different subjects of the tweets will be represented as cubes, spheres, and prisms. They will cluster together according to their type, and will appear individually as tweets come in. The objects fade over time so the visualization doesn't become too cluttered.

2) color changes depending on positive or negative connotation of tweets

If positively associated words are used, such as "awesome", "good", or "best", the objects will be blue. If negatively associated words are used, the objects will be red.
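A minimal sketch of that rule in plain Java - the positive words are the examples above, while the negative word list here is invented purely for illustration:

```java
import java.util.Set;

// Sketch of the proposed connotation-to-color rule.
public class ToneColor {
    static final Set<String> POSITIVE = Set.of("awesome", "good", "best");
    // Hypothetical negative examples; the post doesn't list any.
    static final Set<String> NEGATIVE = Set.of("terrible", "bad", "worst");

    // Blue for positive words, red for negative, white otherwise.
    static String colorFor(String word) {
        String w = word.toLowerCase();
        if (POSITIVE.contains(w)) return "blue";
        if (NEGATIVE.contains(w)) return "red";
        return "white";
    }
}
```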

3) visually stimulating effects, such as lightning "synapses" between objects and a "gravity swarm" effect.

http://www.openprocessing.org/sketch/2363
http://rdwong.net/archive/lightning/

Other elements:

-A virtual camera will rotate around the "scene" 360 degrees the entire time.
-The background is completely black.

Saturday, March 31, 2012

Final Project Concept Crit - 3/28


Using Twitter and Processing, we will create a visualization from the number of tweets referring to certain hot-button topics, such as the Republican Nominees, Kony 2012, Occupy Wall Street, Celebrity Gossip, Snooki, Tim Tebow, etc.

The visualization will be a simulation of critters that battle for power based upon how many times their assigned topic is being tweeted.

Example parameters:
Critter anger level is determined by how many times "War" is tweeted.
Critter reproduction rate is determined by how many times "Snooki" is tweeted.
Food amount is determined by how many times "McDonalds" is tweeted.
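The example parameters could be wired up roughly like this - the class name and scaling constants are invented placeholders, not settled values:

```java
// Hypothetical sketch of mapping per-topic tweet counts to simulation
// parameters; the multipliers are illustrative only.
public class CritterParams {
    double anger, reproductionRate, food;

    void update(int warTweets, int snookiTweets, int mcdonaldsTweets) {
        anger = warTweets * 0.1;              // "War" drives aggression
        reproductionRate = snookiTweets * 0.05; // "Snooki" drives breeding
        food = mcdonaldsTweets * 1.0;         // "McDonalds" drives food supply
    }
}
```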

The simulation will be a playful look into what people are interested in, and possibly provide predictions for future tweets, trends, and more.

Friday, March 30, 2012

Jer Thorp presentation

JER THORP
Biography (including birthday and date of death if no longer living (or you are psychic))
  • Jer Thorp is an artist and educator from Vancouver, Canada, currently living in New York. Coming from a background in genetics, his digital art practice explores the many-folded boundaries between science and art.
  • Thorp’s award-winning software-based work has been exhibited in Europe, Asia, North America, South America, and Australia and all over the web.
  • Jer has over a decade of teaching experience, in Langara College’s Electronic Media Design Program, at the Vancouver Film School, and as an artist-in-residence at the Emily Carr University of Art and Design. Most recently, he has presented at Carnegie Mellon’s School of Art, at Eyebeam in New York City, and at IBM’s Center for Social Software in Cambridge.
  • He is currently Data Artist in Residence at the New York Times, and is an adjunct Professor in New York University’s ITP program.
Generative Art Connections
  • Thorp’s work takes the appeal of infographics into the realm of art, as he reminds us of our shared immersion in concepts and words while presenting a gorgeous image to contemplate our connectedness.
  • Artist whose medium is data
  • Expert in the processing language
  • Data visualization
  • Current events
  • Data in a human context
Generative Works
  • 138 Years of Popular Science (2011)
    • graphic that showed how different technical and cultural terms have come in and out of use in the magazine since its inception.
  • Project Cascade (2010 – 2011)
    • project that visualizes the sharing activity of New York Times content over social networks.
  • Random Number Multiples (2011)
    • Screenprints from the “Random Number Multiple” series. The first, titled ‘RGB – NYT Word Frequency’, shows usage of the words ‘red’, ‘green’, and ‘blue’ in the Times between 1981 and 2011. The second print visualizes the terms ‘hope’ and ‘crisis’ over the same time period.
  • Sustained Silent Reading (2010)
    • uses semantic analysis to ‘read’ through a base of content.
  • Wired UK, August 2010 (2010)
    • visual representation of cellular phone records from a pool of 10 million users in an anonymous European country.
  • Haiti Earthquake aid – in Avatar minutes (2010)
    • a visualization tool showing how much different countries and organizations pledged to the Haiti earthquake aid effort, represented as how many minutes of the film Avatar the aid would pay for.
  • Code.lab (2010)
    • A combination of pedagogy, performance, and interactive installation, Code.lab was a unique collaboration between artists, students, and the public during the 2010 Olympic Games in Vancouver.
  • 9/11 Memorial Names Arrangement Algorithm & Placement Tool (2010)
    • algorithm and an accompanying software tool to aid in the placement of the nearly 3,000 names on the 9/11 Memorial in Manhattan.
  • Two Sides of the Same Story (2009)
    • Built in Processing, this tool allows for free comparison of any two bodies of text.
  • Wired UK, July 2009 (2009)
    • Using a series of generative graphics, the piece investigates the discrepancies between the demographics of the UK’s National DNA Database and of the UK population in general.
  • GoodMorning! (2009)
    • GoodMorning! is a Twitter visualization tool that shows about 11,000 ‘good morning’ tweets over a 24 hour period, rendering a simple sample of Twitter activity around the globe.
  • Just Landed (2009)
    • Just Landed finds tweets containing the phrases ‘just landed in…’ and ‘just arrived in…’ and provides map-based visualization of these tweets over time.
  • NYTimes: 365/360 (2009)
    • Built in Processing, this set of visualizations shows the top organizations and personalities for every year from 1985 to 2001, by occurrence in the New York Times. Connections between these people & organizations are indicated by lines.
  • Glocal Image Breeder (2008)
    • Using genetic algorithms, the system can suggest images from a database of 8,000 images which could conceivably be ‘children’ of any two images that the user suggests.
Quotes
  • “The amount of available data, I think, is quickly outpacing our ability to use it in useful and novel ways.”
  • “This project was a very real reminder that information carries weight. While names of the dead may be the heaviest data of all, almost every number or word we work with bears some link to a significant piece of the real world. It’s easy to download a data set – census information, earthquake records, homelessness figures - and forget that the numbers represent real lives. As designers, artists, and researchers, we always need to consider the true source of data, and the moral responsibility which they carry.”
  • “I know I’ve said this before, but be patient.”
  • “The art itself is the software.”
Bibliography

  • http://blog.blprnt.com/
  • http://www.niemanlab.org/2010/06/the-art-itself-is-the-software-jer-thorp-on-the-aesthetics-of-data/
External Links
  • Jer Thorp’s official website: http://blog.blprnt.com