Gone are the days when architects, remodelers, and interior designers had to walk a house with a 30-foot tape measure. That laborious process meant hours of measuring and sketching, then hours more translating those sketches into CAD drawings. Now, with a tool as common as an iPhone Pro equipped with the Canvas app, you can measure an entire home in less than half an hour. This shift not only saves significant time but also reduces the errors inherent in manual methods. And it’s all thanks to a technological advancement called LiDAR. Since 2020, Apple has included LiDAR sensors on all iPad Pros and iPhone Pros (from the iPhone 12 Pro on).
Understanding LiDAR Technology
LiDAR, which stands for Light Detection and Ranging, represents a quantum leap in scanning technology. It works by emitting pulses of infrared light that bounce off surfaces. By measuring the time it takes each pulse to travel to a surface and back, the sensor calculates the distance to each point. The resulting mesh is a 99% accurate representation of a space.
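The time-of-flight math behind this is straightforward: distance is the speed of light times the measured round-trip time, divided by two. A minimal sketch (the function name and nanosecond figure are illustrative, not Canvas's actual implementation):

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(seconds: float) -> float:
    """One-way distance to a surface, given the round-trip flight time
    of a light pulse. Divide by two because the pulse travels out and back."""
    return SPEED_OF_LIGHT * seconds / 2

# A wall 3 meters away returns a pulse in roughly 20 nanoseconds:
round_trip = 2 * 3.0 / SPEED_OF_LIGHT  # ≈ 2e-8 s
print(distance_from_round_trip(round_trip))  # → 3.0
```

Those nanosecond-scale timings are why LiDAR needs dedicated sensor hardware rather than an ordinary camera.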
Canvas has devised a distinctive two-step approach to streamline the transformation of scans into CAD models. First, for scanning, we use our proprietary technology, combining depth images, video frames, IMU data, and SLAM and 3D reconstruction technology to create detailed scans, or meshes, of the space. Then, we use a separate set of semi-automated tools, a fusion of artificial intelligence and human expertise, to produce a CAD model from the generated scan.
An iPad Pro or iPhone Pro’s IMU (inertial measurement unit) detects and tracks a device's movement and tilt by measuring forces and rotation. It's commonly used in smartphones to detect if you're holding your phone vertically or horizontally, and in drones to help them navigate and balance in the air.
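To make that concrete, here is a minimal sketch of how a device can infer its tilt from a 3-axis accelerometer reading of the gravity vector (the function and values are illustrative, not Apple's or Canvas's actual code):

```python
import math

def tilt_degrees(ax: float, ay: float, az: float) -> float:
    """Angle between the device's z-axis and gravity, in degrees,
    from an accelerometer reading (in g units)."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

# Device flat on a table: gravity entirely along z → 0° tilt
print(tilt_degrees(0.0, 0.0, 1.0))  # → 0.0
# Device held upright in portrait: gravity along y → 90° tilt
print(tilt_degrees(0.0, 1.0, 0.0))  # → 90.0
```

In practice the IMU fuses this accelerometer data with gyroscope rotation rates, which is what lets a scanning app track the phone's motion smoothly rather than just its static orientation.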
SLAM (simultaneous localization and mapping) technology can map and understand your surroundings while simultaneously tracking the phone’s own movements within that environment. Apple's ARKit, a framework for Augmented Reality app development, relies on SLAM technology. This makes it simpler for developers, including us at Canvas, to craft advanced apps for compatible iPhones. However, we don't stop at ARKit: We use it as a starting point and then add our unique layer of SLAM and 3D reconstruction technology. This extra layer greatly boosts the accuracy of scans and offers users more guidance during the capture process.
“The traditional way of building as-builts is measuring manually and drafting a model manually,” said Anton Yakubenko, PhD, VP of Product at Canvas. “But this process itself is pretty time-consuming. So, first, we developed proprietary modeling and tooling for this process of converting scans into CAD models. Second is automation — we have our computer vision and AI technologies to automate parts of this process.”
The Canvas Difference
Canvas' technology stands out in the realm of scanning and mapping because our foundation was laid long before Apple introduced its LiDAR sensor. Canvas has been scanning spaces and creating CAD models since 2016, when we relied on an external sensor to capture dimensional information. We’ve since switched to Apple’s built-in sensor; however, our experience from before integrated LiDAR sensors benefits our customers. This isn't just about having a head start; it's about the years of refining, iterating, and perfecting our algorithms.
While modern iPhones and iPads come with built-in algorithms for simultaneous localization and mapping, ours have a rich history, too. ARKit was primarily designed to be a lightweight technology layer for AR applications on iPhones, prioritizing real-time performance over global accuracy. In contrast, our SLAM technology was purpose-built for high-quality space scanning. Here, we can dedicate more processing time to achieve superior results in scan data, emphasizing precision and quality over real-time constraints.
“One of the benefits of this accumulated experience is that we know how to support multiple formats, and we can easily support more formats,” said Anton. “So, now, we already support SketchUp, Revit, Chief Architect, Archicad, 2020 Design Live, and plans in PDF and AutoCAD format as well as the proprietary Canvas Measurement Report.”
Anton says that soon, we can expect to see support for even more formats.
High-Fidelity Level of Detail
In 2022, Apple introduced RoomPlan, a built-in room mapping technology, which has led to a proliferation of scanning apps. While helpful for consumers, our testing determined it isn't yet powerful enough for professional use.
For those familiar with BIM level of detail standards, our as-built outputs are targeted at LOD 200. The major value is that Canvas allows you to create a 3D model of a property faster and easier than ever before — without the cost and time of a full-scale survey. With Autodesk Revit supported as a native output, many customers use Canvas as part of a complete, end-to-end BIM workflow, making BIM (and 3D scanning) more accessible.
“We deliver a pretty high level of detail — LOD 200 specification — but we often deliver even more than that level of detail,” said Anton.
When customers model spaces themselves, it can be hard to justify spending the time needed to reach the level of detail we offer — baseboards, crown molding, light switches, and other details that are both built in and affect how the space is used — but remodelers, architects, and interior designers appreciate this high level of fidelity.
“This level of detail is possible because we have proprietary tooling to model many of those templates,” said Anton. “And our machine learning algorithms that speed up the process have been trained on tens of thousands of scans and CAD models, which might be harder for a smaller company or an individual to do on their own.”
Harness the precision of the Canvas: LiDAR 3D Measurements app to transform your interior design, remodeling, or architectural projects. Use Canvas to measure and model so you can focus on design.