Sensor-Based Liveness Verifications for Mobile Data


Citizen journalism ranges from videos of conflicts in areas with little professional journalism presence to videos of spontaneous events (e.g., tsunamis, earthquakes, meteorite landings, abuse of authority). Such videos are often distributed through sites such as CNN's iReport, NBC's Stringwire or YouTube's CitizenTube, and are even aired on news networks.

This development raises important questions concerning the credibility of impactful videos, e.g., fake news. The potential impact of such videos, coupled with financial incentives, can motivate workers to fabricate data. Videos from other sources can be copied, projected and recaptured, or cut and stitched before being uploaded as genuine to social media sites. For instance, plagiarized videos with fabricated location and time stamps can be created through "projection" attacks: the attacker uses specialized apps to set the GPS position of the device to a desired location, then uses the device to shoot a projected version of the target video.

To address this problem, in this project we exploit the observation that for plagiarized videos, the motion encoded in the video stream is likely inconsistent with the motion captured by the inertial sensors (e.g., the accelerometer) of the device. As illustrated below, we leverage the motion sensors of mobile devices to verify the liveness of video streams. We exploit the inherent movement of the user's hand when shooting a video to verify the consistency between the motion inferred from the captured video and the inertial sensor signals. Our intuition is that, being simultaneously captured, these signals necessarily bear certain relations that are difficult to fabricate or emulate. In particular, the movement of the scene in the video stream should be similar to the movement of the device registered by its motion sensors.

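The consistency check behind this intuition can be sketched as a correlation test between the two motion traces. The sketch below is a minimal, hypothetical illustration in Python, not the actual system: it assumes both streams have already been reduced to 1-D motion magnitudes sampled at the same rate (in practice the video motion would come from optical flow, and the two streams would need temporal alignment), and the real systems use trained classifiers over much richer features.

```python
import numpy as np

def motion_similarity(video_motion, accel_motion):
    """Normalized cross-correlation between two motion traces.

    `video_motion` would come from frame-to-frame optical flow and
    `accel_motion` from the device accelerometer; here both are plain
    1-D arrays assumed to be sampled at the same rate and aligned.
    """
    v = (video_motion - video_motion.mean()) / (video_motion.std() + 1e-9)
    a = (accel_motion - accel_motion.mean()) / (accel_motion.std() + 1e-9)
    return float(np.dot(v, a) / len(v))

# Synthetic demo: a shared hand-shake component makes the two streams
# of a genuine recording agree; an unrelated video does not.
rng = np.random.default_rng(0)
shake = np.sin(np.linspace(0, 8 * np.pi, 200))
video = shake + 0.2 * rng.standard_normal(200)
accel = shake + 0.2 * rng.standard_normal(200)
forged = rng.standard_normal(200)  # motion from an unrelated video

live_score = motion_similarity(video, accel)
fake_score = motion_similarity(video, forged)
print(live_score > fake_score)  # prints True
```

On this toy data the genuine pair scores close to 1 while the forged pair scores near 0, which is the gap a threshold or classifier would exploit.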

Movee. We developed Movee, a system that provides CAPTCHA-like verifications by including the user, through her mobile device, in the verification process. However, instead of using the cognitive strength of humans to interpret visual information, we rely on their innately flawed ability to hold a camera still.


Vamos. To address Movee's limitations, we introduced Vamos, a Video Accreditation through Motion Signatures system. Vamos provides liveness verifications for videos of arbitrary length. Vamos is completely transparent to the users: it requires no special user interaction, nor any change in user behavior. Instead of enforcing an initial verification step, Vamos uses the entire video and acceleration streams for verification (see illustration below). It divides the video and acceleration data into fixed-length chunks, classifies each chunk, and uses the results, along with a suite of novel features that we introduce, to classify the entire sample. Vamos does not impose a dominant motion direction and thus does not constrain the user's movements. Instead, Vamos verifies the liveness of the video by extracting features from all directions of movement, from both the video and acceleration streams.
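As a toy illustration of this chunk-then-aggregate pipeline, the Python sketch below splits two pre-aligned 1-D motion traces into fixed-length chunks, scores each chunk with a simple correlation (standing in for Vamos's per-chunk classifier), and takes a majority vote over the chunk verdicts (standing in for the second-stage classification over the aggregate features). The chunk length and threshold are illustrative values of our choosing, not Vamos's actual parameters.

```python
import numpy as np

def chunk_scores(video_motion, accel_motion, chunk_len=100):
    """Score each fixed-length chunk by video/accelerometer agreement.

    Pearson correlation is a stand-in for the per-chunk classifier;
    chunk_len=100 is an illustrative value, not Vamos's setting.
    """
    scores = []
    for start in range(0, len(video_motion) - chunk_len + 1, chunk_len):
        v = video_motion[start:start + chunk_len]
        a = accel_motion[start:start + chunk_len]
        scores.append(float(np.corrcoef(v, a)[0, 1]))
    return scores

def classify_sample(scores, threshold=0.5):
    """Majority vote over per-chunk verdicts (a stand-in for the
    second-stage classification over the aggregate features)."""
    verdicts = [s > threshold for s in scores]
    return sum(verdicts) > len(verdicts) / 2

# Synthetic demo: a shared hand-shake component appears in both streams
# of a genuine recording, but not when the video motion is unrelated.
rng = np.random.default_rng(1)
shake = np.sin(np.linspace(0, 20 * np.pi, 500))
video = shake + 0.3 * rng.standard_normal(500)
accel = shake + 0.3 * rng.standard_normal(500)
forged = rng.standard_normal(500)

print(classify_sample(chunk_scores(video, accel)))   # genuine sample
print(classify_sample(chunk_scores(video, forged)))  # plagiarized sample
```

Because each chunk is judged independently, a long video with only a few corrupted chunks can still be flagged, which is what makes the chunk-level decomposition suitable for videos of arbitrary length.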

Applications

Liveness verifications are a cornerstone of a variety of practical applications that use the mobile device camera as a trusted witness. The examples below discuss several application scenarios:

Publications

  • [IEEE TMC] "Video Liveness for Citizen Journalism: Attacks and Defenses"
    Mahmudur Rahman, Mozhgan Azimpourkivi, Umut Topkara, Bogdan Carbunar.
    IEEE Transactions on Mobile Computing (TMC), Volume 16, Issue 11, November 2017. [pdf]

  • [IEEE TMC] "Movee: Video Liveness Verification for Mobile Devices with Built-in Motion Sensors"
    Mahmudur Rahman, Umut Topkara, Bogdan Carbunar.
    IEEE Transactions on Mobile Computing (TMC), Volume 15, Issue 5, May 2016. [pdf]

  • [ACM WiSeC] "Liveness Verifications for Citizen Journalism Videos"
    Mahmudur Rahman, Mozhgan Azimpourkivi, Umut Topkara, Bogdan Carbunar.
    In Proceedings of the 8th ACM Conference on Security and Privacy in Wireless and Mobile Networks (WiSeC) [acceptance rate=19%], New York City, June 2015. [pdf]

  • [ACSAC] "Seeing is Not Believing: Visual Verifications through Liveness Analysis using Mobile Devices"
    Mahmudur Rahman, Umut Topkara, Bogdan Carbunar.
    In Proceedings of the 29th Annual Computer Security Applications Conference (ACSAC) [acceptance rate=19%], New Orleans, December 2013. [pdf]

Data and Code

  • Main code for Vamos and Movee Vamos.zip
  • Google Glass code Glass.zip
  • Video and accelerometer data (2.4 GB) captured by 5 users. For each user U, we also include the results of sandwich attacks launched by 2 other users on the videos of U.

Funding

We are thankful to the following agencies for funding this work: