Causal inference is a large and long-standing field that seeks to answer a fundamental question: does X cause Y? This question arises in fields such as healthcare, genomics, biology, and econometrics. The complex systems analyzed in these fields often have many interacting components, mathematically represented as the nodes and edges (connectivity) of a network. Network inference studies the time-dependent behavior of these nodes in order to reverse-engineer the network connectivity. The theory of causal inference offers many competing definitions of causality, which underlie state-of-the-art techniques such as Granger Causality (GC) and Convergent Cross Mapping (CCM). These algorithms are computationally tractable and easy to use but rest on strong mathematical assumptions. We simulate networks of harmonic and Kuramoto oscillators and attempt to reconstruct their ground-truth network structure from observations of oscillator displacements over time. Our analysis investigates the performance of inference methods on Erdős–Rényi and scale-free random graphs. We show that the GC and CCM inference methods systematically fail to determine network structure, returning overly sparse or overly dense connectivity estimates. These findings challenge the application of such top-down inference approaches to physical and biological systems. Under a few basic assumptions, we demonstrate how networked systems of coupled oscillators can be successfully reconstructed if perturbations of the system are allowed. We propose a Perturbation Causal Inference (PCI) algorithm that applies systematic perturbations and tracks how the resulting cascades spread through the network. Using changepoint detection, correlation, and windowed variance statistics, we predict causal relationships between nodes in the graph. Our analysis shows that PCI works at scale and efficiently returns high-accuracy reconstructions of large networks with varying coupling strengths and connectivity structures.
We conclude by proposing future applications of perturbation-based inference methods in neuroscience and draw a connection to Hebbian learning rules.
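The perturbation-cascade idea can be sketched in a few lines. The following is a minimal, hypothetical illustration, not the PCI implementation described above: it kicks one Kuramoto oscillator at a time in a small noiseless simulation and flags the nodes whose trajectories first deviate from an unperturbed baseline, whereas the full algorithm applies changepoint detection, correlation, and windowed variance statistics to noisy responses. All function names, the toy chain graph, and the parameter values are assumptions made for this sketch.

```python
import numpy as np

def simulate_kuramoto(adj, omega, theta0, K=1.0, dt=0.01, steps=20,
                      kick_node=None, kick=2.0):
    """Euler-integrate the Kuramoto model
    d(theta_i)/dt = omega_i + K * sum_j A_ij * sin(theta_j - theta_i),
    optionally kicking one node's phase at t=0 to start a cascade."""
    theta = np.asarray(theta0, dtype=float).copy()
    if kick_node is not None:
        theta[kick_node] += kick
    traj = [theta.copy()]
    for _ in range(steps):
        # phase_diff[i, j] = theta_j - theta_i; adj masks non-edges.
        phase_diff = theta[None, :] - theta[:, None]
        coupling = (adj * np.sin(phase_diff)).sum(axis=1)
        theta = theta + dt * (omega + K * coupling)
        traj.append(theta.copy())
    return np.array(traj)          # shape (steps + 1, n)

def pci_sketch(adj, omega, theta0, eps=1e-6, steps=20):
    """Perturb each node in turn and mark as neighbors the nodes whose
    deviation from baseline sets in at the earliest possible step.
    (A stand-in for the real system being perturbed experimentally.)"""
    n = len(theta0)
    base = simulate_kuramoto(adj, omega, theta0, steps=steps)
    inferred = np.zeros((n, n), dtype=int)
    for i in range(n):
        pert = simulate_kuramoto(adj, omega, theta0, steps=steps, kick_node=i)
        dev = np.abs(pert - base)              # cascade of deviations
        onset = np.argmax(dev > eps, axis=0)   # first step each node reacts
        for j in range(n):
            # In one Euler step the kick can only have reached direct
            # neighbors, so onset == 1 identifies the edges of node i.
            if j != i and onset[j] == 1:
                inferred[i, j] = inferred[j, i] = 1
    return inferred

# Demo on a 4-node chain 0-1-2-3 with fixed phases and frequencies.
chain = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]])
omega = np.array([1.0, 1.1, 0.9, 1.05])
theta0 = np.array([0.0, 0.5, 1.0, 1.5])
recovered = pci_sketch(chain, omega, theta0)
```

On this deterministic toy chain the sketch recovers the adjacency exactly, because non-neighbors are bitwise unchanged one step after the kick; with measurement noise, the hard onset threshold `eps` would have to be replaced by the statistical changepoint tests the abstract names.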