This presentation summarizes a solution that helps visually impaired people navigate unfamiliar areas. While previous solutions have been relatively successful, many lack two key features that ours aims to provide: affordability and customizability for users with compounding disabilities. Our solution consists of two main parts: (1) a user interface for Fusion 360, a popular 3D-modeling application, built on an existing framework called PARTs (Parameterized Abstractions of Reusable Things), detailed in Hofmann (2018), and (2) an optimization algorithm that generates maps tailored to individual users. Through PARTs, we developed variations of modular map pieces (e.g., roads, buildings, and sidewalks), which eases customization. After the user specifies personal information and preferences through the PARTs UI (such as finger width, physical limitations, braille literacy, and desired map features), the optimization algorithm selects the best combination of features from the PARTs database for that user. At the end of the process, the user has a model of a tactile map in Fusion 360 that can be printed on a commercially available 3D printer. As 3D printers become more affordable, this solution is significantly less cost-prohibitive than earlier means of generating tactile maps, which required an initial investment upwards of a thousand dollars. Through user studies, we also test how blind users interpret these maps, which guides future design improvements. In this presentation, we discuss the efficacy of our solution by comparing it to previous work and detail our plans to improve the system by making the PARTs user interface more accessible and by incorporating user feedback about the maps themselves.
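The selection step described above, in which the optimization algorithm matches PARTs variants to a user's stated preferences, could be sketched roughly as follows. This is a minimal illustrative sketch only: the class names, field names (`min_feature_width`, `reads_braille`), and scoring rules are hypothetical assumptions, not the actual algorithm or PARTs schema.

```python
from dataclasses import dataclass

# Hypothetical record for one PARTs variant; fields are illustrative only.
@dataclass
class PartVariant:
    category: str             # e.g., "road", "building", "sidewalk"
    min_feature_width: float  # width (mm) of the narrowest tactile feature
    uses_braille: bool

# Hypothetical user profile gathered through the PARTs UI.
@dataclass
class UserProfile:
    finger_width: float       # mm
    reads_braille: bool

def score(variant: PartVariant, user: UserProfile) -> float:
    """Assumed scoring rule: reward features wide enough to feel,
    penalize braille labels the user cannot read."""
    s = 0.0
    if variant.min_feature_width >= user.finger_width:
        s += 1.0
    if variant.uses_braille and not user.reads_braille:
        s -= 1.0
    return s

def select_best(variants: list[PartVariant], user: UserProfile) -> dict[str, PartVariant]:
    """Keep the best-scoring variant for each map-piece category."""
    best: dict[str, PartVariant] = {}
    for v in variants:
        current = best.get(v.category)
        if current is None or score(v, user) > score(current, user):
            best[v.category] = v
    return best

# Toy database and user for demonstration.
parts_db = [
    PartVariant("road", 3.0, False),
    PartVariant("road", 1.5, True),
    PartVariant("building", 4.0, True),
]
user = UserProfile(finger_width=2.0, reads_braille=False)
chosen = select_best(parts_db, user)
# The wide, braille-free road variant wins for this user.
```

In practice the real algorithm would weigh many more preferences at once, but the shape of the problem (score each variant against the profile, keep the best per category) is the same.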