Low-power, long-range wireless sensor networks (WSNs) serve many real-world applications, including disaster response, agriculture, and industrial monitoring. Nodes in these networks typically communicate via radio frequency (RF) modulation schemes that prioritize range and low power consumption at the expense of data rate. These low data rates can quickly saturate the network, since sensors often generate data faster than the network can carry it. To address this issue, remote estimation techniques have been proposed that reduce the load on the network while still transmitting enough data to accurately reconstruct the original signal; paired with the concept of Age of Information (AoI), this approach has proven effective. Compression algorithms, both lossless and lossy, improve throughput by packing more information into a given number of bytes. In a low-power WSN, compression can therefore raise the effective data rate of the network without downsampling the signal and discarding data points. Compression has drawbacks, however: algorithms with very high compression ratios can needlessly distort the signal, and can introduce excessive delay between transmissions while a node waits to fill a packet. In this work, a simulation environment and a real-world testbed were developed to study these effects, and adaptive compression-rate algorithms were developed to minimize the network's average AoI.
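For reference, the standard definition of AoI from the literature (the notation here is conventional, not specific to this work): if $u(t)$ denotes the generation time of the most recent sample received by the monitor, the instantaneous age and its time average over a horizon $T$ are

\[
\Delta(t) = t - u(t), \qquad \bar{\Delta} = \frac{1}{T} \int_0^T \Delta(t)\, dt .
\]

Minimizing $\bar{\Delta}$ thus penalizes both infrequent transmissions and long buffering delays, not just low throughput.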
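To make the packet-fill tradeoff concrete, the sketch below (illustrative only; the sampling rate, packet size, per-sample size, and airtime are hypothetical values, not parameters from this work) computes how long a node must buffer samples before a packet is full at a given compression ratio: higher ratios fit more samples per packet but delay transmission, which inflates the age of the oldest sample in the packet when it is finally delivered.

```python
# Illustrative sketch of the compression-ratio / packet-fill-delay tradeoff.
# All parameters are hypothetical, chosen only to show the shape of the effect.

SAMPLE_RATE_HZ = 10.0    # samples generated per second
RAW_SAMPLE_BYTES = 4.0   # uncompressed size of one sample
PAYLOAD_BYTES = 51.0     # usable payload per packet (e.g., a small LoRa frame)
AIRTIME_S = 0.5          # assumed time on air per packet

def packet_fill_delay(compression_ratio: float) -> float:
    """Seconds a node waits to collect enough samples to fill one packet."""
    compressed_sample_bytes = RAW_SAMPLE_BYTES / compression_ratio
    samples_per_packet = PAYLOAD_BYTES // compressed_sample_bytes
    return samples_per_packet / SAMPLE_RATE_HZ

def oldest_sample_age_at_delivery(compression_ratio: float) -> float:
    """Age of the packet's oldest sample when the packet arrives."""
    return packet_fill_delay(compression_ratio) + AIRTIME_S

for ratio in (1.0, 2.0, 4.0, 8.0):
    print(f"ratio {ratio}: fill delay {packet_fill_delay(ratio):5.2f} s, "
          f"oldest-sample age {oldest_sample_age_at_delivery(ratio):5.2f} s")
```

Under these example numbers, the fill delay grows from roughly 1.2 s at no compression to over 10 s at an 8:1 ratio, which is the effect an adaptive compression-rate policy must weigh against the throughput gain when minimizing average AoI.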