About FLLs with band-edge filters

Using band-edge filters for carrier frequency recovery with an FLL is an interesting technique that has been studied by fred harris and others. Usually this technique is presented for root-raised cosine waveforms, and in this post I will limit myself to this case. The intuitive idea of a band-edge FLL is to use two filters to measure the power in the band edges of the signal (the portion of the spectrum where the RRC frequency response rolls off). If there is zero frequency error, the powers will be equal. If there is some frequency error, the signal will have more “mass” in one of the two filters, so the power difference can be used as an error discriminant to drive an FLL.
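As a rough illustration of this error discriminant (not the GNU Radio implementation), the band-edge powers can be estimated with an FFT over a block of samples; a real FLL would instead run two band-edge filters sample by sample and close the loop on their output.

```python
import numpy as np

def band_edge_error(x, samp_rate, baud, rolloff):
    """Toy band-edge frequency error discriminant.

    Estimates the power in the upper and lower RRC roll-off regions
    with an FFT and returns their normalized difference. With zero
    frequency error the result is ~0; a positive frequency offset
    moves spectral mass into the upper band edge and gives a
    positive error.
    """
    f = np.fft.fftfreq(x.size, 1 / samp_rate)
    psd = np.abs(np.fft.fft(x))**2
    # RRC roll-off region: baud/2 * (1 - rolloff) .. baud/2 * (1 + rolloff)
    lo = 0.5 * baud * (1 - rolloff)
    hi = 0.5 * baud * (1 + rolloff)
    upper = psd[(f >= lo) & (f <= hi)].sum()
    lower = psd[(f <= -lo) & (f >= -hi)].sum()
    return (upper - lower) / (upper + lower)
```

The normalization by the total band-edge power makes the discriminant independent of the signal amplitude, which is also done (via AGC or normalization) in practical FLLs.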

The band-edge FLL is presented briefly in Section 13.4.2 of fred harris’ Multirate Signal Processing for Communication Systems book. Additionally, fred gave a talk at GRCon 2017 that focused mainly on how band-edge filters can also be used for symbol timing recovery, but which also goes through the basics of using them for carrier frequency recovery. Some papers referenced in this talk are fred harris, Elettra Venosa, Xiaofei Chen, Chris Dick, Band Edge Filters Perform Non Data-Aided Carrier and Timing Synchronization of Software Defined Radio QAM Receivers and fred harris, Band Edge Filters: Characteristics and Performance in Carrier and Symbol Synchronization.

Recently I was looking into band-edge FLLs and noticed some problems with the implementation of the FLL Band-Edge block in GNU Radio. In this post I go through a self-contained analysis of some of the relevant math. The post is in part intended as background information for a pull request to get these problems fixed, but it can also be useful as a guideline for implementing a band-edge FLL outside of GNU Radio.

Z-Sat VHF transmissions

Z-Sat is a microsatellite by Mitsubishi Heavy Industries that was launched in 2021. It is a demonstrator for multi-wavelength infrared Earth observation technologies. It carries an amateur radio payload that was coordinated by IARU and which consists of a BBS (bulletin board system) with a 145.875 MHz downlink and 435.480 MHz uplink. I have not been able to find more information about the amateur radio payload on this satellite.

Recently, Daniel Ekman SA2KNG asked me to analyze some transmissions from this satellite. Apparently it has recently begun to transmit a digital modulation, as shown in this SatNOGS observation, whereas in the past it typically transmitted CW telemetry. This appears to have started on 2025-06-20, as there is a SatNOGS observation of CW telemetry on that day followed by an observation of the digital modulation. In this post I analyze this digital modulation and explain what it is.

5G NR PDSCH

In my previous post in the 5G NR RAN series, I showed how to decode the PDCCH (physical downlink control channel), which is used to send control information from the gNB (base station) to the UEs (cellphones). In this series I am using as an example a short recording of the downlink of an srsRAN gNB. The PDCCH transmission that I decoded in the previous post was a DCI (downlink control information) containing the scheduling of the SIB1 PDSCH transmission. The PDSCH is the physical downlink shared channel, which is the channel where the gNB transmits data. The SIB1 is the system information block 1. It contains basic information about the cell, and it is decoded by the UE after decoding the MIB in the PBCH, as part of the cell attach procedure. In this post I will show how to decode this PDSCH SIB1 transmission.

5G NR PDCCH

This is a new post in my series about the 5G NR RAN. As in previous posts, I am analyzing a short recording of the downlink of an srsRAN gNB. There are no UEs connected to the cell during this recording, so there isn’t much interesting traffic, but the recording contains all the essential 5G signalling. In particular, there is a SIB1 transmission in the PDSCH, with its corresponding transmission in the PDCCH.

The PDCCH (physical downlink control channel) is used to transmit control information to the UEs in the form of DCI messages (downlink control information). The most common types of DCIs are those that specify the scheduling parameters of transmissions in the PDSCH (physical downlink shared channel), and the uplink grants for UEs in the PUSCH (physical uplink shared channel). The role that the 5G PDCCH plays is very similar to the role that it plays in LTE, so my post about the LTE PDCCH can be good for more context. However, in 5G the channel coding and physical layer of the PDCCH is substantially different from LTE.


Time-dependent delay in GNU Radio

A Doppler correction block has been available in gr-satellites for some years now. This block uses a text file that defines frequency versus time, and applies the appropriate frequency shift to each sample by linearly interpolating the frequency corresponding to the time of that particular sample. It can be used both to correct Doppler in a satellite propagation channel (and other similar channels) and to simulate Doppler.

For some time I have been wanting to implement a similar block that applies a time-dependent delay to a complex baseband signal. The delay versus time would be defined by a text file in the same way as for the Doppler correction block. With this block, the delay of a satellite propagation channel can be simulated. It can also be used to correct the delay-rate of a satellite channel, which causes waveforms to be expanded or compressed in time due to a changing time-of-flight. This effect is known as code Doppler in GNSS, and as symbol frequency offset in general digital communications.
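As a crude model of what such a block does, each output sample can read the input at a time-varying position in the past. Here I use linear interpolation between samples for simplicity; a practical implementation would use a proper fractional-delay filter (such as a polyphase interpolating FIR) to avoid the low-pass distortion of linear interpolation.

```python
import numpy as np

def time_dependent_delay(x, samp_rate, times, delays):
    """Delay a complex baseband signal by a time-varying amount.

    times/delays define delay (s) versus time (s), in the same spirit
    as the text file of the Doppler correction block. Each output
    sample reads the input at t - d(t), with linear interpolation
    between input samples.
    """
    n = np.arange(x.size)
    d = np.interp(n / samp_rate, times, delays)
    pos = n - d * samp_rate  # input read position, in samples
    return np.interp(pos, n, x.real) + 1j * np.interp(pos, n, x.imag)
```

A linearly increasing delay stretches the waveform in time, which is exactly the delay-rate (code Doppler) effect described above.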

For accurate simulation of a satellite propagation channel, both the Doppler block and the delay block are needed, since the Doppler block accounts for the effects of variable time-of-flight on the RF carrier, and the delay block accounts for the effects on the band-limited modulation of the signal, by delaying its complex baseband representation.

I have now added a Time-dependent Delay block to gr-satellites. In this post I give a few details about its implementation and usage.

Coding NEON kernels for the Cortex-A53

Some weeks ago, I presented at FOSDEM my work-in-progress high performance SDR runtime qsdr. I showed a hand-written NEON assembly implementation of a kernel that computes \(y[n] = ax[n] + b\), which I used as the basic math block for benchmarks on a Kria KV260 board (which has a quad-core ARM Cortex-A53 at 1.33 GHz). In that talk I glossed over the details of how I implemented this NEON kernel. There are enough tricks and considerations that I could make a full talk just out of explaining how to write this kernel. This will be the topic for this post.
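For reference, the kernel itself is trivial to state in numpy, which is handy as a correctness check and throughput baseline when developing a hand-written assembly version (the float32 element type here is my assumption about the sample format used in the benchmark).

```python
import numpy as np

def axpb(a, x, b):
    """Reference implementation of the benchmark kernel y[n] = a*x[n] + b."""
    return a * x + b

a = np.float32(2.0)
b = np.float32(1.0)
x = np.arange(8, dtype=np.float32)
y = axpb(a, x, b)
```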

Decoding IEEE 802.11ah

For some time, I’ve been thinking about doing something similar to my posts about LTE and 5G NR, but for WiFi (IEEE 802.11). In these posts, I take a signal recording and write a Jupyter notebook from scratch to analyze the signal and decode the data. I use these posts as a way of learning all the details of how these standards work, and I have seen that some people find them very useful.

Recently I was taking a look at a baby monitor camera system, composed of a camera and a monitor screen, since I was curious about how the camera transmits the video. Using Maia SDR, I located the signal at 866 MHz and realized that both the camera and the monitor screen were transmitting OFDM packets of approximately 2 MHz bandwidth on this frequency. With some cyclostationary analysis, I found that the subcarrier spacing was 31.25 kHz (which works out to 2 MHz / 64 FFT points), and that the cyclic prefix was 1/4 of the useful symbol duration. This pointed me straight to IEEE 802.11ah (WiFi HaLow), a variant of WiFi designed for the 800 MHz and 900 MHz license-exempt bands. After comparing the packet formats in the 802.11ah standard with the waterfall of my recording, I was sure that this was indeed 802.11ah. What started as a fun and short signal recording experiment has ended up going down the rabbit hole of implementing 802.11ah decoding from scratch in a Jupyter notebook. In this post I explain my implementation and the analysis of this recording.
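The cyclostationary check mentioned above essentially boils down to correlating the signal with a delayed copy of itself: the cyclic prefix makes this correlation large when the delay equals the useful symbol length (the FFT size in samples). A toy demonstration with a synthetic OFDM signal whose parameters mimic the 802.11ah numbers (64-point FFT, 1/4 cyclic prefix):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic OFDM burst: 64-point FFT, CP = 1/4 (16 samples), random
# QPSK on all subcarriers (a toy stand-in for the real signal).
nfft, ncp, nsym = 64, 16, 200
syms = np.exp(0.5j * np.pi * rng.integers(4, size=(nsym, nfft)))
ofdm = np.fft.ifft(syms, axis=1)
x = np.concatenate([np.concatenate((s[-ncp:], s)) for s in ofdm])

def lag_corr(x, lag):
    """Average correlation of the signal with a copy of itself delayed
    by `lag` samples. For OFDM, the cyclic prefix makes this stand out
    when the lag equals the FFT size."""
    return np.abs(np.mean(x[:-lag] * np.conj(x[lag:])))
```

Sweeping the lag and looking for the peak reveals the FFT size, and the spacing of the correlation peaks along the signal gives the total symbol duration, from which the CP fraction follows.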

5G NR PBCH

This post is a continuation of my series about the 5G NR RAN. In these posts, I’m analyzing a recording of the downlink of an srsRAN gNB in a Jupyter notebook written from scratch. In this post I will show how to decode the PBCH (physical broadcast channel). The PBCH contains the MIB (master information block). It is transmitted in the SSB (synchronization signals / PBCH block). After detecting and measuring the synchronization signals, a UE must decode the PBCH to obtain the MIB, which contains some parameters which are essential to decode other physical downlink channels, including the PDSCH (physical downlink shared channel), which transmits the SIBs (system information blocks).

In my first post in the series, I already demodulated the PBCH. Therefore, in this post I will continue from there and show how to decode the MIB from the PBCH symbols. First I will give a summary of the encoding process. Decoding involves undoing each of these steps. Then I will show in detail how the decoding procedure works.


Analysis of DME signals

You might remember that back in July I made a recording of the DME ground-to-air and air-to-ground frequencies for a nearby VOR-DME station. In that post, I performed a preliminary analysis of the recording. I mentioned that I was interested in measuring the delay between the signals received directly from the aircraft and the ground transponder replies, and in matching these to the aircraft trajectories. This post is focused on that kind of study. I will present a GNU Radio out-of-tree module gr-dme that I have written to detect and measure DME pulses, and show a Jupyter notebook where I match aircraft pulses with their corresponding ground transponder replies and compare the delays to those calculated from the aircraft positions given in ADS-B data.
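The geometry behind this comparison is simple: the interrogation reaches my receiver directly from the aircraft, while the reply takes the aircraft-transponder-receiver path plus the transponder's fixed reply delay (50 µs on X channels). A sketch of the expected delay, with positions already converted to distances:

```python
C = 299792458.0  # speed of light (m/s)

def expected_reply_delay(d_air_rx, d_air_gnd, d_gnd_rx, t_reply=50e-6):
    """Expected delay between receiving an aircraft interrogation pulse
    pair directly and receiving the ground transponder's reply.

    d_air_rx:  aircraft -> receiver distance (m)
    d_air_gnd: aircraft -> ground transponder distance (m)
    d_gnd_rx:  ground transponder -> receiver distance (m)
    t_reply:   fixed transponder reply delay (50 us on X channels)
    """
    return (d_air_gnd + d_gnd_rx - d_air_rx) / C + t_reply
```

The distances here would come from the ADS-B aircraft positions and the known coordinates of the ground station and receiver.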

Not-LoRa GRCon24 CTF challenge

This year I submitted a track of challenges called “Not-LoRa” to the GRCon 2024 Capture The Flag. The idea driving this challenge was to take some analog voice signals and apply chirp spread spectrum modulation to them. Solving the challenge would require the participants to identify the chirp parameters and dechirp the signal. This idea also provided the possibility of hiding weak signals that are below the noise floor until they are dechirped, which is a good way to add harder flags. This blog post is an in-depth explanation of the challenge. I have put the materials for this challenge in this GitHub repository.

To give participants a context they might already be familiar with, I took the chirp spread spectrum parameters from several common LoRa modulations. These ended up being 125 kHz SF9, SF11 and SF7. LoRa is somewhat popular within the open source SDR community, and often there are LoRa challenges or talks in GRCon. This year was no exception, with a Meshtastic packet in the Signal Identification 7 challenge, and talks about gr-lora_sdr and Meshtastic_SDR.
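Dechirping itself is just multiplication by the conjugate of a repeating up-chirp. A sketch, assuming (as in LoRa) that the signal is sampled at the chirp bandwidth, so that one chirp spans 2**sf samples:

```python
import numpy as np

def dechirp(x, sf):
    """Multiply by the conjugate of a repeating LoRa-style up-chirp.

    Assumes x is sampled at the chirp bandwidth (e.g. 125 kHz), so one
    chirp spans 2**sf samples sweeping from -bw/2 to bw/2. Each chirp
    in the input collapses into a constant-frequency tone.
    """
    n = 2**sf
    k = np.arange(n)
    upchirp = np.exp(1j * np.pi * (k**2 / n - k))  # quadratic phase
    nchirps = x.size // n
    return x[:nchirps * n] * np.tile(np.conj(upchirp), nchirps)
```

This collapsing of chirps into tones is what makes the processing gain possible: after dechirping, an FFT concentrates all the energy of each chirp into a single bin, lifting weak signals out of the noise floor.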