Augmented reality has long suffered from a lack of appropriate content to show users. Rendering purely virtual models in AR, given the instability of tracking even with state-of-the-art techniques, is no better than showing the same models on a smartphone screen. We developed an application that directly visualizes environmental information (in this case, the dimensions of objects), which avoids both simultaneous localization and mapping (SLAM) and the preparation of CAD models.
Specifically, we analyze the RGB and depth images directly to extract annotations in image space. We also attached an RGB-D camera to a tablet to make the system truly mobile, which in turn replaces a 3D navigation GUI with real-world locomotion.
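The core measurement step can be sketched as follows: two annotated pixels are back-projected into camera space using the depth image and the camera intrinsics, and the metric distance between the resulting 3D points gives the dimension to display. This is a minimal illustration, not the paper's implementation; the function names and the pinhole intrinsics `(fx, fy, cx, cy)` are assumptions for the sketch.

```python
import numpy as np

def backproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into 3D camera space
    using the pinhole model (illustrative; intrinsics are assumed known)."""
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def pixel_distance_m(p1, p2, depth, intrinsics):
    """Metric distance between two annotated pixels, read off the depth image.

    p1, p2     : (u, v) pixel coordinates of the annotation endpoints
    depth      : H x W depth image in meters
    intrinsics : (fx, fy, cx, cy) pinhole camera parameters
    """
    fx, fy, cx, cy = intrinsics
    a = backproject(p1[0], p1[1], depth[p1[1], p1[0]], fx, fy, cx, cy)
    b = backproject(p2[0], p2[1], depth[p2[1], p2[0]], fx, fy, cx, cy)
    return float(np.linalg.norm(a - b))
```

Because both endpoints come straight from the depth image, no global map or prior model of the scene is needed; the annotation lives entirely in image space and remains valid as long as the two pixels stay in view.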