Machine learning is a powerful tool that allows us to use data to make predictions and decisions about the world, but it requires expensive centralized hardware and data, is prone to algorithmic bias, and raises privacy concerns around the data it requires. In contrast, Federated Learning (FL) allows users to collaboratively train a shared model, coordinated by a central server, while keeping personal data on their devices. FL can potentially address these problems of traditional machine learning by using widely available mobile devices to make machine learning accessible to mainstream users and by leveraging decentralized user data and computational resources to train models more efficiently. However, this emerging field still lacks established processes for training FL models on edge devices and measuring their efficiency. This research provides a comprehensive framework to federatively train models on Android devices and analyze their computational and energy efficiency. On each mobile device, I used a terminal application to install dependencies and train FL models natively. I then analyzed the device’s efficiency by measuring its computational, energy, and network resource usage through terminal applications. This flexible framework can deploy diverse machine learning models and datasets for training on Android devices. In preliminary experiments, I used the framework to measure the efficiency of a PyTorch obstacle-detection model and a TensorFlow abnormal-heartbeat-detection model. These experiments showed that federatively training machine learning models on mobile phones makes efficient use of CPU, memory, and network bandwidth and consumes minimal energy compared to centralized machine learning systems. With few existing examples of FL on Android devices, this framework provides a novel plug-and-play solution for native FL on mobile devices. Applications of this research will also demonstrate how FL techniques can address accessibility, privacy, algorithmic bias, and hardware limitations in the machine learning domain.
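
The abstract does not name the specific terminal application, FL framework, or measurement tools; as one possible concrete instantiation of the described workflow, the minimal sketch below assumes a Termux-style terminal, the Flower (flwr) framework, and the psutil library, none of which are confirmed by the text. It illustrates how an on-device client might train a PyTorch model federatively while logging CPU, memory, and network counters from the same terminal session.

# Hypothetical sketch of an on-device FL client. Tool choices (Termux,
# Flower, psutil) are assumptions not stated in the abstract.
import flwr as fl
import psutil
import torch
import torch.nn as nn


def log_resources(tag):
    """Record CPU, memory, and network counters from the terminal session."""
    net = psutil.net_io_counters()
    print(f"[{tag}] cpu={psutil.cpu_percent()}% "
          f"mem={psutil.virtual_memory().percent}% "
          f"tx={net.bytes_sent} rx={net.bytes_recv}")


class MobileClient(fl.client.NumPyClient):
    def __init__(self, model, train_loader):
        self.model = model
        self.train_loader = train_loader

    def get_parameters(self, config):
        return [p.detach().cpu().numpy() for p in self.model.parameters()]

    def set_parameters(self, parameters):
        with torch.no_grad():
            for p, new in zip(self.model.parameters(), parameters):
                p.copy_(torch.tensor(new))

    def fit(self, parameters, config):
        # One local training pass per federated round, with resource
        # snapshots before and after local work.
        self.set_parameters(parameters)
        log_resources("before-fit")
        optimizer = torch.optim.SGD(self.model.parameters(), lr=0.01)
        loss_fn = nn.CrossEntropyLoss()
        for x, y in self.train_loader:
            optimizer.zero_grad()
            loss_fn(self.model(x), y).backward()
            optimizer.step()
        log_resources("after-fit")
        return self.get_parameters(config), len(self.train_loader.dataset), {}

    def evaluate(self, parameters, config):
        self.set_parameters(parameters)
        return 0.0, len(self.train_loader.dataset), {}


if __name__ == "__main__":
    # Tiny synthetic model and data so the sketch is self-contained; real use
    # would load the obstacle-detection or heartbeat dataset instead.
    model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2))
    data = torch.utils.data.TensorDataset(torch.randn(64, 16),
                                          torch.randint(0, 2, (64,)))
    loader = torch.utils.data.DataLoader(data, batch_size=8)
    # Launched from the device's terminal after installing dependencies
    # (e.g. `pkg install python` and `pip install flwr torch psutil`).
    fl.client.start_numpy_client(server_address="127.0.0.1:8080",
                                 client=MobileClient(model, loader))

Energy consumption could similarly be sampled from the terminal session, for example via Android's battery statistics, though the abstract does not state which measurement tools were used.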