Carnegie Mellon University
April 13, 2018

Faculty and students win at IPSN 2018

Carnegie Mellon University had a strong showing at this year’s International Conference on Information Processing in Sensor Networks (IPSN) in Porto, Portugal, winning Best Paper and Best Demo and taking first and second place in the Microsoft Indoor Localization Competition.

Best Paper

Professors Bob Iannucci, Swarun Kumar, and Anthony Rowe teamed up with ECE Ph.D. students Artur Balanuta, Adwait Dongare, and Akshay Gadre, along with Anh Luong and Revathy Narayanan from CyLab, to present their paper, “Charm: Exploiting Geographical Diversity Through Coherent Combining in Low-Power Wide-Area Networks.” The team won Best Paper for their research on allowing wireless base stations to collaboratively decode weak signals from low-power devices.
Low-Power Wide-Area Networks (LPWANs) are an emerging wireless platform that can support battery-powered devices lasting ten years while communicating at low data rates to gateways several kilometers away. The work in this paper explores how to combine extremely faint signals from multiple gateways to receive packets that would normally have been dropped. This has enormous potential to increase the number of sensors supported in an area and to dramatically improve the battery life of those sensors. Over the next few years, society will likely see many LPWAN systems deployed across the US as part of ongoing smart city projects. This work is part of the OpenChirp project currently being deployed on Carnegie Mellon’s campus to help transform how future communities sense, reason about, and manage utilities, roads, traffic lights, bridges, parking complexes, agriculture, waterways, and the broader environment.
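To give a flavor of the underlying idea, the toy Python sketch below (not the Charm system itself; the waveform, noise levels, and number of gateways are made-up assumptions) shows how several copies of the same weak signal, each too noisy on its own, gain signal-to-noise ratio when their phases are estimated and the copies are summed coherently.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical narrowband packet: a known complex baseband reference waveform.
n_samples = 1000
signal = np.exp(1j * 2 * np.pi * 0.05 * np.arange(n_samples))

def received_at_gateway(per_sample_snr_db):
    """Attenuated copy of the signal with a random phase offset plus unit-power noise."""
    amplitude = 10 ** (per_sample_snr_db / 20)
    phase = rng.uniform(0, 2 * np.pi)
    noise = (rng.normal(size=n_samples) + 1j * rng.normal(size=n_samples)) / np.sqrt(2)
    return amplitude * np.exp(1j * phase) * signal + noise

# Four gateways each hear the packet too faintly to decode it alone.
gateways = [received_at_gateway(per_sample_snr_db=-5) for _ in range(4)]

# Coherent combining: estimate each gateway's phase against the reference,
# rotate it out, then sum the aligned copies.
aligned = []
for rx in gateways:
    phase_est = np.angle(np.vdot(signal, rx))   # correlate against the reference
    aligned.append(rx * np.exp(-1j * phase_est))
combined = np.sum(aligned, axis=0)

def estimate_snr_db(rx):
    """Rough SNR estimate: project onto the reference, measure the residual."""
    corr = np.vdot(signal, rx) / np.vdot(signal, signal)
    residual = rx - corr * signal
    sig_energy = np.abs(corr) ** 2 * np.vdot(signal, signal).real
    return 10 * np.log10(sig_energy / np.vdot(residual, residual).real)

print("per-gateway SNR (dB):", [round(float(estimate_snr_db(g)), 1) for g in gateways])
print("combined SNR (dB):", round(float(estimate_snr_db(combined)), 1))
```

With four gateways, the combined copy gains roughly 6 dB over any single one in this simulation, which is the intuition behind recovering packets that each gateway would have dropped.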

Best Demo

Professor Anthony Rowe and students Niranjini Rajagopal, John Miller, Krishna Kumar, and Anh Luong won Best Demo for their paper, “Welcome to My World: Demystifying Multi-User AR with the Cloud.” Recently, there has been rapid growth in the platforms and devices that support Augmented Reality (AR) and Virtual Reality (VR) applications, which have the potential to revolutionize how virtual information is connected with physical environments. Over the past year, the emergence of Apple’s ARKit and Google’s ARCore has made AR a reality on mobile devices and accessible to the masses. These toolkits allow people to visualize virtual holograms, viewed through phones, that interact with the real world. Current mobile AR systems provide three key features: (1) visual tracking to precisely monitor the device’s motion in a scene, (2) scene understanding to find floor surfaces and walls, and (3) rendering to draw realistic 3D objects. However, the AR experience on mobile devices is limited by the lack of data that persists across sessions and the lack of multi-user interaction. In this demo, the team showed how, using precise localization and information from magnetic fields, it is possible to create a truly persistent AR experience that can support any number of users simultaneously. The demo allowed multiple people to independently drop Minecraft-style blocks into the world that immediately appeared in everyone’s view. This demonstrated one of the critical missing building blocks for what will likely be an exciting future for mobile augmented reality.
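As a rough illustration of why a shared localization frame enables this kind of multi-user persistence, the hypothetical Python sketch below (not the team’s actual software; the poses and frame names are invented) shows how a block dropped by one phone can be re-expressed in another phone’s frame once both phones know their pose in a common building frame.

```python
import numpy as np

def pose(rotation_z_deg, translation_xyz):
    """4x4 homogeneous transform: rotation about the z axis, then translation."""
    theta = np.radians(rotation_z_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(theta), -np.sin(theta)],
                 [np.sin(theta),  np.cos(theta)]]
    T[:3, 3] = translation_xyz
    return T

# Hypothetical poses: where a shared localization system says each phone is,
# expressed in a common building ("world") frame.
world_T_phoneA = pose(30, [4.0, 2.0, 1.4])
world_T_phoneB = pose(-90, [7.5, 3.0, 1.3])

# User A drops a block half a metre in front of their camera.
phoneA_T_block = pose(0, [0.0, 0.0, -0.5])
world_T_block = world_T_phoneA @ phoneA_T_block

# User B's renderer needs the same block in its own frame to draw it in place.
phoneB_T_block = np.linalg.inv(world_T_phoneB) @ world_T_block
print("block position seen by phone B:", np.round(phoneB_T_block[:3, 3], 2))
```

The essential point is that once every device is localized in the same frame, any object placed by one user has a well-defined position for every other user and for future sessions.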

Microsoft Indoor Localization Competition

Two Carnegie Mellon teams consisting of Professor Anthony Rowe and students Niranjini Rajagopal, John Miller, Krishna Kumar, Anh Luong, Patrick Lazik, and Nick Wilkerson came in first and second place out of 34 international teams from academia and industry in the Microsoft Indoor Localization Competition. In this year’s competition, each team placed a mobile device on a backpack-mounted, LiDAR-based ground-truth rig to see how accurately it could measure 3D location while in motion. The winning team, consisting of Niranjini Rajagopal, John Miller, and Anh Luong, achieved an average accuracy of 27 cm using a combination of ultra-wideband (UWB) ranging and visual inertial odometry. The second Carnegie Mellon team, consisting of Patrick Lazik and Nick Wilkerson, achieved 47 cm accuracy using a standard, completely unmodified smartphone that listened to ultrasonic beacons. This type of localization is critical for a wide range of applications, from navigation for the visually impaired to optimizing warehouse and hospital operations.
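For a flavor of the range-based part of such a system, the minimal sketch below is a generic least-squares trilateration example with made-up anchor positions and noise, not the winning team’s pipeline (which also fuses visual inertial odometry): it estimates a 3D position from noisy UWB-style range measurements.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

# Hypothetical anchor positions (metres) and the device's true position.
anchors = np.array([[0.0, 0.0, 2.5],
                    [8.0, 0.0, 2.5],
                    [8.0, 6.0, 2.5],
                    [0.0, 6.0, 2.5]])
true_pos = np.array([3.0, 2.0, 1.2])

# Simulated range measurements with a few centimetres of noise.
ranges = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, 0.05, size=4)

def residuals(p):
    """Difference between predicted and measured anchor-to-device distances."""
    return np.linalg.norm(anchors - p, axis=1) - ranges

estimate = least_squares(residuals, x0=np.zeros(3)).x
print("estimated position (m):", np.round(estimate, 2))
```

In a real system these range-only estimates would be fused with odometry so the position stays smooth and accurate while the device is moving.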