Visualizing A Real-time Trivia Game


On April 10, 2011, MIT held the Next Century Convocation as a centerpiece of its 150th anniversary celebration. 10,000 people attended the event at the Boston Convention and Exhibition Center. Before the program began, guests participated in a trivia game designed by our team.

The game was a crowdsourced experience, with participants posing their own questions to the crowd. Participants sent text messages to a short code through Ken’s awesome mSurvey system. Messages ending with “?” were recognized as questions and appeared on the screen with a sequential number. Messages starting with “Q” plus a number, followed by a space, were recognized as answers to that question. Questions and answers were displayed in real time on a 90-foot screen.
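The message convention above could be parsed with a few lines of code. This is a minimal sketch, not the actual mSurvey backend; the function name and the exact format rules (trailing “?”, a “Q” plus digits plus a space) are assumptions based on the description:

```python
import re

def classify_message(text, next_question_id):
    """Classify an incoming SMS as a question, an answer, or other.

    Returns ("question", id), ("answer", target_id, body), or ("other",).
    Assumed format: messages ending in "?" are questions; messages
    starting with "Q<number> " are answers to that question.
    """
    text = text.strip()
    m = re.match(r"^Q(\d+)\s+(.*)$", text, re.IGNORECASE)
    if m:
        return ("answer", int(m.group(1)), m.group(2))
    if text.endswith("?"):
        return ("question", next_question_id)
    return ("other",)
```

Note that the answer check runs first, so a question beginning with the letter “Q” (e.g. “Quick quiz?”) is still classified as a question unless it is immediately followed by digits and a space.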

My goal in designing this visualization was to give direct feedback to the users, spark conversation among them, and encourage participation. Given the dimensions of the space, it was critical to keep the interface simple and learnable. Our team came up with the idea of “questions competing for answers”: each question moved from the left edge of the screen to the right, gradually accelerating if nobody responded to it, until it drifted out of sight. When an answer came in, it was attached to the target question, slowing it down. Each question left a trail behind it whose width varied with its speed, so the more popular questions stayed longer on the screen. It was a competition for both good questions and interesting answers.
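The motion rules could be sketched roughly as follows. The original was written in Processing; this Python toy model, with assumed acceleration and braking constants, only illustrates the mechanic (note the guessed mapping from speed to trail width – slower, better-answered questions get wider trails, consistent with them staying on screen longer):

```python
class Question:
    def __init__(self, text):
        self.text = text
        self.x = 0.0          # horizontal position, 0 = left edge
        self.speed = 1.0      # pixels per tick (constants are assumptions)
        self.answers = []

    def add_answer(self, answer, brake=0.5):
        # Each incoming answer slows the question down.
        self.answers.append(answer)
        self.speed *= brake

    def tick(self, accel=1.05):
        # Unanswered questions gradually accelerate off-screen.
        if not self.answers:
            self.speed *= accel
        self.x += self.speed

    def trail_width(self, base=20.0):
        # Assumed mapping: slower (more popular) questions leave wider trails.
        return base / self.speed
```

After a few ticks, an answered question lags behind an unanswered one and leaves a wider trail, which is the visible “competition” the paragraph describes.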

Screenshot from the game:
Screenshot from the Trivial Game at MIT150 Convocation

Picture at the event:
Live photo at the convocation

I was a bit unhappy that the video staff insisted on adding moving backgrounds to the visualization. Then, during the real run, we hit a technical problem that caused answers to be mismatched for a while (in fact, a bug in the last-minute message-censorship system – censorship sucks; I knew it better than anyone). Anyway, both the team and the guests had a lot of fun with the game. Some of us were sleepless for a few days to make this work, especially our great leader Ken, who has been running a fever since – I hope he gets better soon. ♥

Tools used: Processing

Visualization: Xiaoji Chen, Yanni Loukissas
Backend: Kenfield Griffith, Reid Williams
And thanks to the rest of the team: Michael Berry, Kristyn Maiorca, and Ella Peinovich, who made this happen.

The Slow Glass I


This is a project for a Media Lab class, New Paradigms for Human-Computer Interaction, taught by Pattie Maes and Hiroshi Ishii. Slow glass was imagined by Bob Shaw in his science-fiction story “Light of Other Days”: light travels so slowly through the material that it takes months or even years to see what was once on the other side.

We treat slow glass as an architectural element that provides a window into another space and time, changing people’s perception of their surroundings, and we tried to implement the concept elegantly. The screen is located in the lobby of the new Media Lab building. One camera captures sequential images of the lobby and tracks people’s coordinates using background subtraction. Another set of cameras on the back of the screen records a panorama of the lobby. The video is played back a few hours later, with the perspective adjusted according to the viewer’s position relative to the screen. From the user’s perspective, the screen appears transparent – except that through it, one sees the past.
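The background-subtraction step can be illustrated with a toy version. Our installation used OpenCV; this pure-Python sketch, with frames as 2-D lists of grayscale values and an assumed threshold, just shows the idea of finding a person as the centroid of pixels that differ from a reference background:

```python
def foreground_centroid(frame, background, threshold=30):
    """Locate a person by background subtraction on grayscale frames.

    frame and background are 2-D lists of pixel intensities (0-255).
    Pixels differing from the background by more than `threshold` count
    as foreground; their centroid approximates the person's position.
    Returns (x, y), or None if nothing moved.
    """
    xs, ys = [], []
    for y, (row, bg_row) in enumerate(zip(frame, background)):
        for x, (p, b) in enumerate(zip(row, bg_row)):
            if abs(p - b) > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```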

The tracking system, powered by OpenCV:

A diagram of perspective simulation:
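The perspective simulation amounts to ray geometry: for each point on the screen, take the ray from the viewer through that point and show the corresponding slice of the recorded panorama. A minimal sketch, where the function name, the coordinate setup (screen along the x-axis, viewer at a distance in front of it), and the linear angle-to-column mapping are all assumptions:

```python
import math

def panorama_column(viewer_x, viewer_d, screen_x,
                    fov=math.pi, pano_width=3600):
    """Map a screen point to a column of the recorded panorama.

    The screen lies along the x-axis; the viewer stands at
    (viewer_x, viewer_d) in front of it. The angle of the ray from
    the viewer through screen_x is mapped linearly onto the panorama,
    so the played-back image shifts as the viewer moves, as if looking
    through glass.
    """
    angle = math.atan2(screen_x - viewer_x, viewer_d)  # 0 = straight ahead
    # Map [-fov/2, fov/2] onto [0, pano_width).
    frac = (angle + fov / 2) / fov
    return int(max(0, min(pano_width - 1, frac * pano_width)))
```

Standing directly in front of a screen point yields the panorama's center column; stepping to the left swings the view to the right, which is what makes the delayed playback read as a window rather than a flat recording.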

A video of the concept and the first prototype that we presented in the class review. We are currently working on making it a permanent installation in the Media Lab building:

Tools used: Open Frameworks, OpenCV, iMovie
Collaborators: Polychronis Ypodimatopoulos, Daniel Rosenburg