Using band-edge filters for carrier frequency recovery with an FLL is an interesting technique that has been studied by fred harris and others. Usually this technique is presented for root-raised cosine (RRC) waveforms, and in this post I will limit myself to this case. The intuitive idea of a band-edge FLL is to use two filters to measure the power in the band edges of the signal (the portion of the spectrum where the RRC frequency response rolls off). If there is zero frequency error, the powers will be equal. If there is some frequency error, the signal will have more “mass” in one of the two filters, so the power difference can be used as an error discriminant to drive an FLL.
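As a minimal illustration of this idea (and not of GNU Radio's implementation, which uses actual band-edge filters in the time domain), the following Python sketch builds an RRC-shaped QPSK signal and measures the power in the two roll-off regions with an FFT. The sign of the power difference then indicates the sign of the frequency error. The function names and the FFT-based power measurement are my own choices for the example:

```python
import numpy as np

def rrc_taps(sps, span, beta):
    """Root-raised cosine taps (standard closed form), unit energy."""
    n = np.arange(-span * sps, span * sps + 1)
    t = n / sps  # time in symbol periods
    taps = np.empty(t.size)
    for i, ti in enumerate(t):
        if ti == 0:
            taps[i] = 1 + beta * (4 / np.pi - 1)
        elif abs(abs(4 * beta * ti) - 1) < 1e-9:
            taps[i] = (beta / np.sqrt(2)) * (
                (1 + 2 / np.pi) * np.sin(np.pi / (4 * beta))
                + (1 - 2 / np.pi) * np.cos(np.pi / (4 * beta)))
        else:
            taps[i] = ((np.sin(np.pi * ti * (1 - beta))
                        + 4 * beta * ti * np.cos(np.pi * ti * (1 + beta)))
                       / (np.pi * ti * (1 - (4 * beta * ti)**2)))
    return taps / np.sqrt(np.sum(taps**2))

def band_edge_discriminant(x, sps, beta, nfft=4096):
    """Power in the upper band edge minus power in the lower band edge.

    Positive output means the spectrum is shifted up in frequency.
    """
    f = np.fft.fftfreq(nfft, d=1)  # normalized frequency, cycles/sample
    X = np.abs(np.fft.fft(x[:nfft]))**2
    inner = (1 - beta) / (2 * sps)  # start of the RRC roll-off
    outer = (1 + beta) / (2 * sps)  # end of the RRC roll-off
    upper = X[(f >= inner) & (f <= outer)].sum()
    lower = X[(f <= -inner) & (f >= -outer)].sum()
    return upper - lower
```

With a positive frequency offset, the upper roll-off region gets "filled" by the flat part of the spectrum while the lower region slides further down the skirt, so the discriminant comes out positive (and vice versa for a negative offset). A real FLL would feed this error signal through a loop filter into an NCO that derotates the input.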
The band-edge FLL is presented briefly in Section 13.4.2 of fred harris’ Multirate Signal Processing for Communication Systems book. fred also gave a talk at GRCon 2017 that was mainly focused on how band-edge filters can be used for symbol timing recovery, but it also goes through the basics of using them for carrier frequency recovery. Some papers that are referenced in this talk are fred harris, Elettra Venosa, Xiaofei Chen, Chris Dick, Band Edge Filters Perform Non Data-Aided Carrier and Timing Synchronization of Software Defined Radio QAM Receivers, and fred harris, Band Edge Filters: Characteristics and Performance in Carrier and Symbol Synchronization.
Recently I was looking into band-edge FLLs and noticed some problems with the implementation of the FLL Band-Edge block in GNU Radio. In this post I go through a self-contained analysis of some of the relevant math. The post is in part intended as background information for a pull request to get these problems fixed, but it can also be useful as a guideline for implementing a band-edge FLL outside of GNU Radio.