Seeing Buildings Shake With Software
Written by David Conrad   
Sunday, 26 April 2015

Using motion magnification, monitoring large structures could become cheap enough to be routine. All you need is a video camera and you can literally see rigid buildings move like reeds in the wind.

Software-based motion and color magnification has been around for a while. It first came to public attention at Siggraph in 2005. That early approach directly tracked changes in pixels and then re-rendered the image with the detected movement amplified.

In 2012 a team from MIT CSAIL discovered that you could get motion magnification by applying filtering algorithms to the color changes of individual pixels. The method didn't track movement directly, but instead used the color changes that result from the movement. With this technique the team could do some amazing things that depended on the direct observation of color change, such as seeing the pulse in a wrist or the blood pulsing in a face.

The important, and almost accidental, discovery was that in magnifying color changes they also magnified motion. The new method is fast enough to work in real time. 
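The Eulerian idea behind the technique can be sketched in a few lines of NumPy/SciPy. This is a simplified illustration, not the MIT code: a real implementation works on a spatial pyramid and tunes the amplification per frequency band, but the core step is just a temporal band-pass filter applied to each pixel, with the filtered signal amplified and added back.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def magnify_motion(frames, fps, low_hz, high_hz, alpha=20.0):
    """Minimal Eulerian magnification sketch.

    frames: float array of shape (T, H, W), grayscale values in [0, 1].
    Each pixel's intensity over time is band-pass filtered to isolate
    the frequency band of interest; the filtered signal is amplified
    by alpha and added back to the original video.
    """
    b, a = butter(2, [low_hz, high_hz], btype="band", fs=fps)
    # Filter along the time axis: one temporal signal per pixel.
    filtered = filtfilt(b, a, frames, axis=0)
    return np.clip(frames + alpha * filtered, 0.0, 1.0)
```

Tiny sub-pixel intensity oscillations, invisible in the raw footage, become obvious once multiplied by a factor of 20 or more.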

 


 

Now another MIT team, led by Oral Buyukozturk, has attempted to put the technique to use in monitoring structures - to directly see the vibrations in buildings, bridges and other constructions. Currently such monitoring means instrumenting the building with accelerometers, which is expensive and generally doesn't give a complete "picture" of what is happening to the structure. 

It would be much simpler to point a video camera at the building and use motion magnification software to really see the vibrations. The problem is that you can't be sure that what the software extracts from the video really is the motion of the object, or how it relates to the data you would get from the standard accelerometer-based method. 

To confirm and calibrate the approach, the team used it to video a PVC pipe struck with a hammer and also measured the outcome using accelerometers and laser vibrometers. The motion magnified video revealed the pipe vibrating in the expected normal modes. You can see how well it worked in the following video: 
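One natural way to compare the video-derived signal against the accelerometer and vibrometer readings is to check whether both show the same dominant vibration frequencies - the normal modes. The following is a hypothetical sketch of such a check, not the team's actual pipeline:

```python
import numpy as np

def dominant_frequencies(signal, fs, n_peaks=3):
    """Return the strongest frequency components of a vibration
    signal, e.g. a displacement trace extracted from video or an
    accelerometer recording sampled at fs Hz."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    strongest = np.argsort(spectrum)[::-1][:n_peaks]
    return sorted(freqs[strongest])
```

If the peaks found in the motion-magnified video match those from the accelerometers, the optical measurement can be trusted - and calibrated against - the conventional one.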

 

This is a technique that clearly has other potential uses, and the good news is that the original software, though not the improved version used here, is available on GitHub under an open source license.

The next stage is to use the method to monitor MIT's Green Building, the Zakim Bridge and the John Hancock Tower in Boston. I wonder if they will put up a monitor to allow people in the buildings, or passing over the bridge, to see just how much they move! It could be an unnerving experience.

 




 


 


Last Updated ( Sunday, 26 April 2015 )