Every year, hundreds of people die from police violence, yet there is no official repository that records this information. Over the past decade, several crowd-sourced efforts have emerged to fill this gap, creating online repositories that compile the geospatial and demographic information of victims of fatal police force. Our team is working with three of these repositories. The oldest and largest data set contains information dating back to 2000 and roughly 17,000 observations, while the other two date back to 2013 and contain closer to 4,400 observations. All of the repositories have missing data, and the missingness levels are particularly high for the race of the victim, a variable of interest. We know a priori that there is substantial overlap in the cases covered by these three data sets, so we hope to use record linkage methods to combine information across the data sets to recover or impute the missing values. We have previously harmonized the three data sets, so inconsistencies in variable names, formatting, and other irregularities have already been addressed.

We now have four sequential, dependent goals. Our first goal is to perform record linkage within each data set to eliminate duplicate records. Our second goal is to perform record linkage across the three data sets, using the victim's state as a blocking key to reduce the computational burden. Our third goal is to address the remaining missing values through statistical imputation. Our final goal is to provide a public repository with a clean, unified data set and the code needed to reproduce it from the original raw data, paired with a corresponding browser-based public tool for data exploration in the form of an online R Shiny app.
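To make the blocking idea concrete, the sketch below shows how restricting comparisons to records sharing a blocking key (here, the victim's state) shrinks the pairwise comparison space. This is an illustrative Python sketch only: the field names, the string-similarity measure, and the 0.8 threshold are assumptions for demonstration, not our actual schema or linkage model.

```python
from difflib import SequenceMatcher
from itertools import combinations
from collections import defaultdict

# Hypothetical records; field names are illustrative, not the
# harmonized data sets' actual schema.
records = [
    {"id": 1, "name": "John A. Smith", "state": "TX", "year": 2015},
    {"id": 2, "name": "Jon Smith",     "state": "TX", "year": 2015},
    {"id": 3, "name": "Maria Lopez",   "state": "CA", "year": 2014},
]

def similarity(a, b):
    """Crude string similarity in [0, 1]; a real pipeline would use a
    probabilistic linkage model rather than this stand-in."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_pairs(records, block_key="state"):
    """Group records by the blocking key so that only within-block
    pairs are ever compared."""
    blocks = defaultdict(list)
    for r in records:
        blocks[r[block_key]].append(r)
    for block in blocks.values():
        yield from combinations(block, 2)

def likely_matches(records, threshold=0.8):
    """Flag candidate pairs whose year agrees and whose names are
    sufficiently similar; the threshold is an illustrative choice."""
    return [
        (a["id"], b["id"])
        for a, b in candidate_pairs(records)
        if a["year"] == b["year"]
        and similarity(a["name"], b["name"]) >= threshold
    ]

print(likely_matches(records))  # [(1, 2)]
```

Only the two Texas records are compared here; the California record never enters a comparison, which is the computational saving that blocking by state provides.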
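The recovery step for missing fields such as race can be illustrated as follows: once records are linked across repositories, a value observed in one source can fill a gap in another, and only cases with no observed value anywhere proceed to statistical imputation. The cluster structure, field names, and majority-vote resolution rule below are illustrative assumptions, not our actual method.

```python
from collections import Counter

# Hypothetical linked cluster: the same victim as recorded in three
# repositories, with "race" missing (None) in two of them.
cluster = [
    {"source": "A", "race": None},
    {"source": "B", "race": "Black"},
    {"source": "C", "race": None},
]

def recover_race(cluster):
    """Fill a missing field from linked records. Disagreements across
    sources need a resolution rule; majority vote is used here purely
    for illustration."""
    observed = [r["race"] for r in cluster if r["race"] is not None]
    if not observed:
        return None  # nothing to recover; left for statistical imputation
    return Counter(observed).most_common(1)[0][0]

print(recover_race(cluster))  # Black
```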