No Comment - Kinect & HoloLens, People Tracking & Kinect Calibration
Written by David Conrad
Sunday, 19 February 2017
• Scanning physical objects with an Xbox One Kinect to use as Holograms in HoloLens
• Online People Tracking and Identification with RFID and Kinect
• Robust Intrinsic and Extrinsic Calibration of RGB-D Cameras
Sometimes the news is reported well enough elsewhere and we have little to add other than to bring it to your attention. No Comment is a format where we present original source information, lightly edited, so that you can decide if you want to follow it up.
Scanning physical objects with an Xbox One Kinect to use as Holograms in HoloLens

It is fairly obvious that Microsoft has two devices that fit together very well - Kinect and HoloLens. After seeing demos of a Kinect used as a scanner to create objects that could be displayed using HoloLens, Joost van Schaik decided it was time to implement and document the technique so that others could do the same. It is a step-by-step procedure using existing hardware and software. You basically need a Kinect 2 working on a PC, a free 3D scan app, a copy of the open source CloudCompare and a copy of Unity. The link to CloudCompare given in the write-up isn't working. Instead you can use:

You can see from the demo video that there is still work to be done:
Online People Tracking and Identification with RFID and Kinect

Kinect can be used to work out where people are. RFID can be used to identify people, but isn't so good at getting their position. Put the two together and you have a people tracker:

We introduce a novel, accurate and practical system for real-time people tracking and identification. We used a Kinect V2 sensor for tracking, which generates a body skeleton for up to six people in the view. We perform identification using both Kinect and passive RFID, by first measuring the velocity vector of each person's skeleton and of their RFID tag, using the positions of the RFID reader antennas as reference points, and then finding the best match between skeletons and tags. We introduce a method for synchronizing Kinect data, which is captured regularly, with irregular or missing RFID data readouts. Our experiments show centimeter-level people tracking resolution with 80% average identification accuracy for up to six people in indoor environments, which meets the needs of many applications. Our system can preserve user privacy and work under different lighting conditions.
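The matching step described in the abstract - pairing each tracked skeleton with the RFID tag whose velocity vector it most closely follows - can be sketched as an assignment problem. The function names, data layout and use of a brute-force search over permutations are assumptions for illustration, not the paper's implementation; exhaustive search is feasible here only because Kinect tracks at most six bodies (6! = 720 permutations).

```python
# Hypothetical sketch of matching skeletons to RFID tags by comparing
# velocity vectors; names and structure are illustrative assumptions.
from itertools import permutations


def velocity_distance(v1, v2):
    """Squared Euclidean distance between two velocity vectors."""
    return sum((a - b) ** 2 for a, b in zip(v1, v2))


def match_skeletons_to_tags(skeleton_vels, tag_vels):
    """Find the skeleton-to-tag assignment minimizing the total
    velocity mismatch, by exhaustive search over tag permutations.
    Both arguments map ids to (vx, vy) velocity vectors."""
    skels = list(skeleton_vels.keys())
    tags = list(tag_vels.keys())
    best_cost, best_match = float("inf"), None
    for perm in permutations(tags, len(skels)):
        cost = sum(velocity_distance(skeleton_vels[s], tag_vels[t])
                   for s, t in zip(skels, perm))
        if cost < best_cost:
            best_cost, best_match = cost, dict(zip(skels, perm))
    return best_match


# Two skeletons, two tags: the tag moving like skeleton s1 is "bob".
skel = {"s1": (1.0, 0.0), "s2": (0.0, 1.0)}
tags = {"alice": (0.0, 0.9), "bob": (1.1, 0.1)}
print(match_skeletons_to_tags(skel, tags))  # {'s1': 'bob', 's2': 'alice'}
```

In a real deployment the velocities would come from the synchronized Kinect and RFID streams the paper describes, and a polynomial-time assignment algorithm (e.g. the Hungarian method) could replace the brute-force search.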
Robust Intrinsic and Extrinsic Calibration of RGB-D Cameras

If you are using, say, the Kinect for serious applications - and who is using it for games any more? - then calibration is vital. We have a new implementation of a novel calibration procedure and the good news is that the code is open source and you can use it in your own projects:

Color-depth cameras (RGB-D cameras) have become the primary sensors in most robotics systems, from service robotics to industrial robotics applications. Typical consumer-grade RGB-D cameras are provided with a coarse intrinsic and extrinsic calibration that generally does not meet the accuracy requirements needed by many robotics applications (e.g., high accuracy 3D environment reconstruction and mapping, high precision object recognition and localization, ...). In this paper, we propose a human-friendly, reliable and accurate calibration framework that makes it easy to estimate both the intrinsic and extrinsic parameters of a general color-depth sensor pair. Our approach is based on a novel, two-component measurement error model that unifies the error sources of different RGB-D pairs based on different technologies, such as structured-light 3D cameras and time-of-flight cameras. The proposed correction model is implemented using two different parametric undistortion maps that provide the calibrated readings by means of linear combinations of control functions. Our method provides some important advantages compared to other state-of-the-art systems: it is general (i.e., well suited for different types of sensors), it is based on an easy and stable calibration protocol, it provides greater calibration accuracy, and it has been implemented within the ROS robotics framework. We report detailed and comprehensive experimental validations and performance comparisons to support our statements.
Results of the calibration applied to a set of clouds of a wall at different distances. You can find more information about the code at: http://iaslab-unipd.github.io/rgbd_calibration/
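The abstract's "parametric undistortion maps" express the calibrated depth as a linear combination of control functions evaluated at each pixel. The sketch below is only an illustration of that idea under assumed choices - Gaussian bumps on normalized image coordinates as the control functions, a multiplicative correction, and invented names - not the actual model or API from the rgbd_calibration code.

```python
# Hypothetical sketch of a parametric undistortion map: the corrected
# depth is the raw depth scaled by a weighted sum of smooth control
# functions. Basis choice, names and weights are illustrative assumptions.
import math


def gaussian_basis(u, v, cu, cv, sigma=0.25):
    """A smooth control function centred at (cu, cv), with (u, v)
    being normalized image coordinates in [0, 1]."""
    return math.exp(-((u - cu) ** 2 + (v - cv) ** 2) / (2 * sigma ** 2))


def undistort_depth(u, v, raw_depth, weights, centres):
    """Apply the correction: scale the raw depth reading by a linear
    combination of control functions evaluated at this pixel."""
    correction = sum(w * gaussian_basis(u, v, cu, cv)
                     for w, (cu, cv) in zip(weights, centres))
    return raw_depth * correction


# A single control function centred mid-image with unit weight leaves
# a reading at the image centre unchanged.
print(undistort_depth(0.5, 0.5, 2.0, [1.0], [(0.5, 0.5)]))  # 2.0
```

In the real framework the weights are what calibration estimates, by observing a known planar target (such as the wall shown above) at multiple distances and fitting the map so the corrected point clouds come out flat.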
Last Updated ( Sunday, 19 February 2017 )