In October 2022, I was asked to help with a project to document the Quincy Dockside Warehouse located in the Keweenaw National Historical Park. The project was done in collaboration with the Douglass Houghton Student Chapter of the National Society of Professional Surveyors.
The goal of the project was to scan the Quincy Smelting Works Dockside Warehouse and create a virtual model of the building. The model would be used to take specific measurements, such as the building's height and width, and to draft floor plans.
We used the FARO Focus 3D, which is a portable scanner that uses LiDAR to create a dense point cloud.
Wes, another student in DHSC, and me scanning the building.
The scanner is mounted on a tripod and moved between stations to capture the entire exterior. The interior is captured the same way, by moving the scanner from room to room and scanning the ceiling, walls, and floor.
I set up targets around the building to help register the individual scans. The targets are small spheres that the scanner can see and use to determine its position. In addition to the spheres, I placed checkerboard patterns on the walls, which we would later use to geo-reference the point cloud into Michigan North State Plane coordinates.
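At its core, the geo-referencing step solves for the rigid transformation (rotation plus translation) that best maps the checkerboard centers from the scanner's local frame onto their surveyed State Plane coordinates. Below is a minimal numpy sketch of that least-squares solve using the Kabsch algorithm; all coordinate values are made up for illustration, and real workflows would do this inside the scanner's registration software.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping
    src -> dst via the Kabsch algorithm. Both inputs are (N, 3) arrays."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical checkerboard centers: scanner-frame coordinates versus
# surveyed Michigan North State Plane coordinates (values are invented).
scan_pts = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0],
                     [0.0, 8.0, 0.0], [0.0, 0.0, 5.0]])
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
plane_pts = scan_pts @ R_true.T + np.array([250000.0, 180000.0, 200.0])

R, t = rigid_transform(scan_pts, plane_pts)
georeferenced = scan_pts @ R.T + t  # same transform applies to the whole cloud
```

With at least three well-distributed, non-collinear control points, the recovered transform can then be applied to every point in the cloud.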
Due to time constraints, we were only able to scan the exterior and first floor of the building in one day. A second day was spent scanning the second floor of the building.
The point cloud containing the exterior of the building and the first floor.
The south-side of the saltbox roof was out of sight from all possible scanning positions, so no data was collected for that side. For visualization purposes, a plane was later created to fill in the missing data.
After the point cloud was captured, it was imported into Autodesk Revit and I began modeling the building by tracing the point cloud. This model would be used to draft a floor plan.
An early, incomplete model of the building in Revit.
I eventually reached the limit of my Revit skills and exported a .DWG file to AutoCAD to finish the floor plan. Once the floor plan was finished, I also used the point cloud to create a cross-section of the building.
The floor plan and cross section of the building. You can view a high-resolution version of the floor plan here.
After the floor plan was finished, I began working on the 3D model. The model was created in RealityCapture, a software package that can—among other features—create a 3D model from a point cloud. The model was then brought into Blender to be rendered. The roof was modeled in Blender as well to account for missing data.
An interactive 3D model of the Dockside Warehouse.
To create elevations, I simply rendered orthographic views of the model in Blender.
The west elevation of the building. View the other elevations here.
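Conceptually, each elevation is just a parallel (orthographic) projection: every point is flattened onto a vertical plane by discarding its depth coordinate. Here is a minimal numpy sketch of the idea; the axis conventions (X = east, Y = north, Z = up) are assumptions for illustration, not how Blender's renderer is configured internally.

```python
import numpy as np

def ortho_elevation(points, view="west"):
    """Parallel-project (N, 3) points to a 2D elevation by dropping the
    depth axis. Assumed conventions: X = east, Y = north, Z = up."""
    axes = {
        "west": (1, 2),   # looking east along +X: keep (north, up)
        "east": (1, 2),
        "north": (0, 2),  # looking along the Y axis: keep (east, up)
        "south": (0, 2),
    }
    keep = axes[view]
    return points[:, keep]

# Hypothetical corner points of a simple box-shaped building.
pts = np.array([[0.0,  0.0, 0.0],
                [12.0, 0.0, 0.0],
                [12.0, 30.0, 6.5],
                [0.0,  30.0, 6.5]])
west = ortho_elevation(pts, "west")  # (N, 2) array of (north, up) pairs
```

In Blender the same effect is achieved by setting the camera type to orthographic, which removes perspective foreshortening so distances in the render stay to scale.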
Both floors of the building were scanned in a separate session using a Matterport Pro2 camera. This enabled us to create a navigable tour of the building.
A screenshot from the navigable tour of the building. You can view the tour of the second floor north, second floor south, and first floor.
As the icing on the cake, I created a couple of animations of the point cloud, each showing one side of the second floor of the warehouse. The animations were created in CloudCompare, a free software package for point cloud visualization and processing.
An animation of the point cloud of the second floor of the warehouse.
I put together a flowchart that illustrates the workflow of this project.
I'm really happy with the final results of this project! But as always, there are some things I would do differently if I were to do this project again.
I would have taken a photogrammetry approach to creating the exterior 3D model. The 3D model made from the LiDAR point cloud is generally high-quality and clean, but some interpolated areas are visually unappealing and could have been eliminated with photogrammetry. The scanner was also subjected to variable lighting conditions: some stations were in shadow and others in direct sunlight, which resulted in lighting artifacts on the model's texture.
If I were to do this project again, I would fly a UAV and collect aerial imagery to produce a photogrammetric model.
When scanning the second floor, I forgot to include enough coincident geometry with the first floor, meaning registration was less than ideal. I relied on the position of the roof on the north side to register the north second floor. I then used the finished floor elevation of the north side to vertically position the south floor, and the south, west, and east faces of the building for horizontal registration. While not incorrect, I would not consider this registration technique rigorous.
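The vertical part of that registration, shifting the south scan so its finished-floor elevation matches the north side's, reduces to a single Z offset. A minimal numpy sketch with invented elevations (in practice this was done interactively, and the horizontal alignment against the building faces is a separate step):

```python
import numpy as np

# Hypothetical finished-floor elevations sampled from each scan (meters).
north_floor_z = np.array([200.41, 200.39, 200.40])  # already georeferenced
south_floor_z = np.array([0.02, -0.01, 0.00])       # still in scanner frame

# Vertical registration: shift the south scan so the floor medians agree.
# The median is used so a few stray points don't skew the offset.
dz = np.median(north_floor_z) - np.median(south_floor_z)
south_registered_z = south_floor_z + dz
```

A more rigorous approach would use shared targets or overlapping geometry between floors, so the whole transform is solved at once instead of axis by axis.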
Throughout the scanning process, my peers and I were constantly in sight of the scanner, which left a lot of "people" points in the point cloud that I had to remove manually before processing. If I were to do this project again, I would move out of the scanner's line of sight during each scan to avoid this.
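In our case the cleanup was done by hand in CloudCompare, but the same idea can be scripted: mark a bounding box around each spot where someone stood and drop every point inside it. A minimal numpy sketch, with made-up coordinates:

```python
import numpy as np

def crop_boxes(points, boxes):
    """Remove points that fall inside any axis-aligned bounding box.
    boxes is a list of (min_corner, max_corner) pairs, each of shape (3,)."""
    keep = np.ones(len(points), dtype=bool)
    for lo, hi in boxes:
        inside = np.all((points >= lo) & (points <= hi), axis=1)
        keep &= ~inside
    return points[keep]

cloud = np.array([[1.0, 1.0, 0.5],   # building point
                  [5.2, 3.1, 1.1],   # "person" point
                  [9.0, 2.0, 0.2]])  # building point
# Hypothetical box around where we stood during one scan.
person_box = (np.array([5.0, 3.0, 0.0]), np.array([5.5, 3.5, 2.0]))
cleaned = crop_boxes(cloud, [person_box])
```

For moving people a statistical outlier filter can help as well, but simply staying out of the scanner's line of sight avoids the problem entirely.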
When scanning the second floor, the amount of ambient light was less than ideal. This resulted in a dark, grainy image for the navigable tours. We were able to mitigate this slightly by using an array of floor lights that we constantly moved around as we scanned. If I were to do this project again, I would have used a more powerful light source.
When I first agreed to help with the project, the scope was somewhat unclear—there was a general goal set in place, but no specific deliverables. I would have liked to have a more concrete scope of the project before I started to avoid scope creep.