ShapeFindAR: Exploring In-Situ Spatial Search for Physical Artifact Retrieval using Mixed Reality
Personal fabrication is made more accessible through repositories like Thingiverse, as they replace modeling with retrieval. However, they require users to translate spatial requirements into keywords, which paints an incomplete picture of physical artifacts: proportions and morphology are hard to convey through text alone. We explore a vision of in-situ spatial search for (future) physical artifacts and present ShapeFindAR, a mixed-reality tool for searching 3D models using in-situ sketches blended with textual queries. With ShapeFindAR, users search for geometry rather than precise labels, while coupling the search process to the physical environment (e.g., by sketching in-situ, extracting search terms from objects present, or tracing them). We developed ShapeFindAR for HoloLens 2, connected to a database of 3D-printable artifacts. We specify in-situ spatial search, describe its advantages, and present walkthroughs using ShapeFindAR, which highlight novel ways for users to articulate their wishes without requiring complex modeling tools or profound domain knowledge.
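The abstract does not detail how sketch geometry and text are combined for retrieval. As a minimal, purely illustrative sketch (not the authors' pipeline), the following Python blends a crude geometric similarity between a sketched point set and catalog models with keyword overlap; every name, descriptor, and weight here is an assumption.

```python
# Illustrative only: this is NOT ShapeFindAR's matching pipeline.
# It sketches one possible way to rank 3D models by a blended
# spatial (sketch) + textual query. All names and weights are assumed.
from dataclasses import dataclass
from typing import List, Sequence, Tuple

Point = Tuple[float, float, float]


@dataclass
class CatalogModel:
    name: str            # hypothetical record in a 3D-printable model database
    tags: List[str]      # textual metadata (e.g., repository tags)
    points: List[Point]  # sampled surface points of the model


def aspect_descriptor(points: Sequence[Point]) -> Tuple[float, float]:
    """Very crude shape descriptor: bounding-box aspect ratios (w/h, d/h)."""
    xs, ys, zs = zip(*points)
    w, h, d = max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs)
    h = h or 1e-9  # guard against degenerate (flat) sketches
    return (w / h, d / h)


def hybrid_score(sketch: Sequence[Point], terms: Sequence[str],
                 model: CatalogModel, w_shape: float = 0.6,
                 w_text: float = 0.4) -> float:
    """Blend geometric similarity of an in-situ sketch with keyword overlap."""
    qs, ms = aspect_descriptor(sketch), aspect_descriptor(model.points)
    shape_dist = sum((a - b) ** 2 for a, b in zip(qs, ms)) ** 0.5
    shape_sim = 1.0 / (1.0 + shape_dist)          # map distance to (0, 1]
    q_terms = {t.lower() for t in terms}
    text_sim = len(q_terms & {t.lower() for t in model.tags}) / len(q_terms) if q_terms else 0.0
    return w_shape * shape_sim + w_text * text_sim


def search(sketch: Sequence[Point], terms: Sequence[str],
           catalog: Sequence[CatalogModel], k: int = 5) -> List[CatalogModel]:
    """Return the top-k catalog models for a blended sketch + text query."""
    return sorted(catalog, key=lambda m: hybrid_score(sketch, terms, m),
                  reverse=True)[:k]
```

In such a scheme, the sketch contributes proportions and rough morphology while the text narrows the semantic category; the actual system likely uses richer shape descriptors than bounding-box ratios.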