Please use this identifier to cite or link to this item: http://ir.juit.ac.in:8080/jspui/jspui/handle/123456789/6737
Title: Motion magnification
Authors: Parmar, Yashika
Chahal, Sumeet Singh
Bansal, Puvail
Sharma, Neeru [Guided by]
Keywords: Frequency domain
MATLAB
Issue Date: 2017
Publisher: Jaypee University of Information Technology, Solan, H.P.
Abstract: Every day, many things around us go unnoticed because our eyes cannot detect them. This thesis focuses on video magnification, a technique that reveals the small or subtle movements of an object or a person that are impossible to see with the naked eye, letting us view our surroundings from a new perspective. The method used here is Eulerian Video Magnification, which takes a video of an object as input, applies spatial and temporal processing to it, and produces a magnified version of the original video as output. For motion magnification to work, visual motions must be measured accurately, and a pair or more of neighbouring pixels must be modified together. The final output reveals, in amplified form, small and subtle motions that could not be detected in the original sequence. The method requires the motions to be very small, which is often not the case. The motion of any layer can be magnified by a user-specified factor. Using this method, we can visualize the amplified motion of the pulse in a hand, or the flow of blood from the heart to the face, which changes the colour of the face. The method runs in real time, with the user selecting the temporal frequencies to amplify, and it has applications in many fields.
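The record itself contains no code; as a rough illustration only, and assuming MATLAB (listed under Keywords), a minimal sketch of the temporal-amplification step of an Eulerian-style pipeline might look like the following. The function name magnify_motion and all parameters are hypothetical, and the spatial pyramid decomposition used by the full method is omitted here for brevity.

    % Minimal sketch (not the report's implementation): temporally band-pass
    % filter each pixel of a grayscale video and add the amplified band back.
    % frames : H-by-W-by-T array (one grayscale frame per time step)
    % fLo,fHi: pass band in Hz, fs: frame rate in Hz, alpha: amplification factor
    function out = magnify_motion(frames, fLo, fHi, fs, alpha)
        [b, a] = butter(2, [fLo fHi] / (fs/2), 'bandpass');  % 2nd-order Butterworth band-pass
        sz = size(frames);
        signals = reshape(double(frames), [], sz(3));  % one time series per pixel
        band = filter(b, a, signals, [], 2);           % filter along the time dimension
        out = reshape(signals + alpha * band, sz);     % amplify the band and recombine
    end

A call such as magnify_motion(frames, 0.8, 1.5, 30, 50) would, under these assumptions, amplify temporal variations in roughly the human pulse-rate band for a 30 fps video.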
URI: http://ir.juit.ac.in:8080/jspui/jspui/handle/123456789/6737
Appears in Collections: B.Tech. Project Reports

Files in This Item:
File: Motion Magnification.pdf
Size: 597.18 kB
Format: Adobe PDF

