Update on the Galileo GST-UTC anomaly

At the beginning of September I wrote about an ongoing anomaly in the offset between GST (the Galileo System Time) and UTC. This short post is an update on this problem, with new data and plots.

For this post I have only used final solutions for the CODE precise clock biases, replacing the rapid solutions that I used in the last post with their final versions. The data under analysis now spans from day of year 225 (2023-08-13) to day of year 266 (2023-09-23).

The plot of the difference between the broadcast clock biases and the CODE precise clock biases now looks as follows. Compared with the previous post, we see that the difference stayed around -15 ns until 2023-09-08 and then started increasing towards zero, but it has overshot and sits at around 20 ns by 2023-09-23.
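To make the comparison concrete, this is roughly the computation involved: the broadcast clock bias is a polynomial in time, while the precise biases come from the CODE CLK product. This is only a minimal sketch with made-up values, not the code in the notebook (which also needs to handle details such as ephemeris changeovers and the conventions of each product):

```python
import numpy as np

def broadcast_clock_bias(t, toc, af0, af1, af2):
    """Evaluate the broadcast clock polynomial (af0 [s], af1 [s/s],
    af2 [s/s^2]) at times t [s], with toc the time of clock [s]."""
    dt = t - toc
    return af0 + af1 * dt + af2 * dt**2

t = np.arange(0.0, 86400.0, 300.0)  # epochs at 5 minute intervals
# Made-up values; the real ones come from the BRDC and CLK files.
broadcast = broadcast_clock_bias(t, 43200.0, -2.5e-6, 1e-12, 0.0)
precise = broadcast - 15e-9  # placeholder for the CODE precise biases

diff_ns = (broadcast - precise) * 1e9  # the quantity shown in the plot
```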

The plot of the system time differences in the broadcast messages is also quite interesting. For this I am using the IGS BRDC data until day of year 275 (2023-10-03). Recall that it appeared that the GST-UTC drift had the wrong sign, because the offset was clearly increasing but the drift had a negative sign. Now the situation looks more complicated. Though it appears that the drift has the wrong sign around 2023-08-28 and 2023-09-22, there is also a segment around 2023-09-08 where the sign looks correct. Additionally, around 2023-09-01 the drift should be close to zero but is not.

For comparison, here is the same plot with the sign of the GST-UTC drift flipped. Arguably, the flipped drift gives a better match most of the time, but certainly not around 2023-09-08. Therefore, the problem with the modelling of the GST-UTC drift in the Galileo broadcast message looks more complicated than just the sign bit being wrong.

The updated Jupyter notebook and data are in this repository.


Decoding Aditya-L1

Aditya-L1 is an Indian solar observer satellite that was launched to the Sun-Earth Lagrange point L1 on September 2. It transmits telemetry in X-band at 8497.475 MHz. This post is going to be just a quick note. Lately I’ve been writing very detailed posts about deep space satellites and I’ve had to skip some missions, such as Chandrayaan-3, for lack of time. There is a very active community of amateur observers posting frequent updates, but the information usually gets scattered on Twitter and elsewhere, and it is hard to find after some months. I think I’m going to start writing shorter posts about some missions to collect the information and be able to find it later.

DVB-S2 GRCon23 CTF challenge

This year I submitted a challenge track based around a DVB-S2 signal to the GRCon CTF (see this post for the challenges I sent in 2022). The challenge description was the following.

I was scanning some Astra satellites and found this interesting signal. I tried to receive it with my MiniTiouner, but it didn’t work great. Maybe you can do better?

Note: This challenge has multiple parts, which can be solved in any order. The flag format is flag{...}.

A single SigMF recording with a DVB-S2 signal sampled at 2 Msps at a carrier frequency of 11.723 GHz was given for the 10 flags of the track. The description and frequency were just some irrelevant fake backstory to introduce the challenge, although some people tried to look online for transponders on Astra satellites at this frequency.
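For those who haven't worked with SigMF before, getting the recording into Python only takes a few lines. Here is a sketch; the file names are made up, and I'm assuming the cf32_le datatype here (the metadata file states the actual datatype):

```python
import json
import numpy as np

# The metadata file is JSON; it gives the datatype, sample rate and
# carrier frequency of the recording.
with open('dvbs2.sigmf-meta') as f:
    meta = json.load(f)
print(meta['global']['core:datatype'], meta['global']['core:sample_rate'])

# Assuming cf32_le (little-endian complex64 samples).
samples = np.fromfile('dvbs2.sigmf-data', dtype='complex64')
```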

The challenge was inspired by last year’s NTSC challenge by Clayton Smith (argilo), which I described in last year’s post. This DVB-S2 challenge was intended as a challenge with many flags that can be solved in any order and that try to showcase many different places where one might sensibly put a flag in a DVB-S2 signal (by sensibly I mean a place that is actually intended to transmit some information; surely it is also possible to inject flags into headers, padding, and the like, but I didn’t want to do that). In this sense, the DVB-S2 challenge was a spiritual successor to the NTSC challenge, using a digital and modern TV signal.

Another source of motivation was one of last year’s Dune challenges by muad’dib, which used a DVB-S2 Blockstream Satellite signal. In that case demodulating the signal was mostly straightforward, and the fun began once you had the transport stream. In my challenge I wanted to have participants deal with the structure of a DVB-S2 signal and maybe need to pull out the ETSI documentation for help.

I wanted to have flags of progressively increasing difficulty, keeping some of the flags relatively easy and accessible to most people (in any case it’s a DVB-S2 signal, so at the very least you need to know which software tools can be used to decode it and take some time to set them up correctly). This makes a lot of sense for a challenge with many flags, and in hindsight most of the challenges I sent last year were quite difficult, so I wanted to have several easier flags this time. I think I managed well in this respect. The challenge had 10 flags, numbered more or less in order of increasing difficulty. Flags #1 through #8 were solved by between 7 and 11 teams, except for flag #4, which was somewhat more difficult and only got 6 solves. Flags #9 and #10 were significantly more difficult than the rest. Vlad, from the Caliola Engineering LLC team, was the only person who managed to get flag #10, using a couple of hints I gave, and no one solved flag #9.

When I started designing the challenge, I knew I wanted to use most of the DVB-S2 signal bandwidth to send a regular transport stream video. There are plenty of ways of putting flags in video (a steady image, single frames, audio tracks, subtitles…), so this would be close to the way that people normally deal with DVB-S2, and give opportunities for many easier flags. I also wanted to put in some GSE with IP packets to show that DVB-S2 can also be used to transmit network traffic instead of or in addition to video. Finally, I wanted to use some of the more fancy features of DVB-S2 such as adaptive modulation and coding for the harder flags.
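As a taste of what solvers would face, once a decoder outputs a transport stream, tallying the PIDs of the 188-byte TS packets is a quick way to see which elementary streams are present. A minimal sketch, assuming a clean byte-aligned stream in a hypothetical file ts.bin:

```python
from collections import Counter

pids = Counter()
with open('ts.bin', 'rb') as f:
    data = f.read()

# MPEG TS packets are 188 bytes and start with the sync byte 0x47.
# The 13-bit PID lives in the last 5 bits of byte 1 and all of byte 2.
for i in range(0, len(data) - 188 + 1, 188):
    packet = data[i:i+188]
    if packet[0] != 0x47:
        continue  # lost sync; a real tool would resynchronize here
    pid = ((packet[1] & 0x1F) << 8) | packet[2]
    pids[pid] += 1

print(pids.most_common())
```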

To ensure that the DVB-S2 signal that I prepared was not too hard to decode (since I was putting in more things in addition to a regular transport stream), I kept testing my signal constantly during the design. I mostly used a MiniTiouner, since I expected that some people would use hardware decoders. I heard that some people did last year’s NTSC challenge by playing back the signal into their hotel TVs with an SDR, and for DVB-S2 maybe the same would be possible. Hardware decoders tend to be more picky and less flexible. I also tested gr-dvbs2rx and leandvb to have an idea of how the challenge could be solved with the software tools that were more likely to be used.

What I found, as I will explain below in more detail, is that the way in which I initially intended to construct the signal (which was actually the proper way) was unfeasible for the challenge, because the MiniTiouner would completely choke on it. I also found important limitations in the software decoders. Basically, they would do some weird things when the signal was not 100% a regular transport stream video, because other cases were never considered or tested too thoroughly. Still, the problems didn’t get too much in the way of getting the video with the easier flags, and I found clever ways of working around the limitations of gr-dvbs2rx and leandvb to get the more difficult flags, so I decided that this was appropriate for the challenge.

In the rest of this post I explain how I put together the signal and the design choices I made, and sketch some possible ways to solve it.

Galileo GST-UTC anomaly

Each GNSS system uses its own timescale for all its calculations. What this means is slightly difficult to explain, since anything involving “paper clocks” is conceptually tricky. In essence, the system, as part of the orbit and clock determination, maintains a timescale (a paper clock) to which the ephemerides are referenced. This timescale is usually related to some atomic clocks in the countries that manage the GNSS system, and these clocks often participate in the realization of UTC, so that in the end this GNSS timescale is traceable to UTC. For instance, GPS time is related to UTC(USNO), and GST, the Galileo system time, is related to a few atomic clocks in Europe that are realizations of UTC.

Since all the information for a single GNSS system is self-consistent with respect to the timescale it uses, a receiver using a single system to compute position and velocity does not really care about the GNSS timescale. Everything will just work. However, when the receiver needs to compute UTC time accurately (for instance to produce a 1PPS output that is aligned to UTC) or to mix measurements from several GNSS systems, then it needs to handle the different GNSS timescales involved, and transfer all the measurements to a common timescale (and in the case of measuring UTC time or producing a 1PPS output, relating this timescale to UTC).

As part of their broadcast ephemeris, GNSS systems broadcast an estimate of the offset between the system timescale and UTC (this estimate is just a forecast; the true value of UTC is only known a posteriori, when the BIPM publishes its monthly Circular T). Additionally, some systems also broadcast the offset between their timescale and the timescale of another system. For example, Galileo broadcasts the offset GST-GPST, the difference between Galileo system time and GPS time. This is intended as a help to receivers that want to combine measurements from both systems. However, these values are somewhat inaccurate, so it is common for receivers to ignore them, and determine the offsets on their own (this is done by adding extra variables called intersystem biases to the system of equations solved to obtain the navigation solution). This trick only works for combining measurements of different systems. In the case of UTC, receivers do not have any way of directly measuring UTC, so they must trust the broadcast message. Alternatively, some receivers offer an option to align their 1PPS output to a GNSS timescale such as GPST.
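For context, the broadcast GST-UTC offset is applied as a first-order polynomial, in the same fashion as in GPS. Here is a minimal sketch of how the parameters are used, valid away from leap second events (the exact definitions, and the extra logic around a leap second insertion, are in the Galileo OS SIS ICD; the example values below are made up):

```python
def gst_minus_utc(tow, wn, a0, a1, dt_ls, t0t, wn_0t):
    """Broadcast estimate of GST - UTC [s] at GST time of week tow [s] and
    week number wn. a0 [s] and a1 [s/s] are the polynomial coefficients,
    dt_ls the integer leap seconds, and (t0t, wn_0t) the reference epoch."""
    return dt_ls + a0 + a1 * (tow - t0t + 604800 * (wn - wn_0t))

# Made-up parameter values: an a0 of 17e-9 would match the anomalous offset.
offset = gst_minus_utc(tow=345600, wn=1205, a0=17e-9, a1=0.0,
                       dt_ls=18, t0t=432000, wn_0t=1205)
print((offset - 18) * 1e9, 'ns')  # fractional part: the few-ns offset discussed here
```

The nanosecond-level GST-UTC offset discussed in this post is the a0/a1 part, on top of the integer leap seconds.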

These days, most GNSS systems, including GPS and Galileo, usually keep their timescales within a few nanoseconds of UTC. According to the specifications of each system, the relation to UTC is much looser. GPS time promises to be synchronized to UTC(USNO) with a maximum error of 1 microsecond. GLONASS time only promises 1 millisecond synchronization to UTC. GST is supposed to be within 50 ns of UTC, and Beidou time within 100 ns. However, if timekeeping is done correctly, there is no reason why a GNSS timescale should drift from UTC by more than a couple of nanoseconds (which is the typical drift of physical realizations of UTC). Receivers should assume that the offsets can be as large as the maximum errors in the specification, but they might occasionally freak out if they see a large offset, either because of bugs (this case may never have been tested, as in practice the offsets are typically small), or because they suspect an error in their own measurements.

A few days ago, some people in the time-nuts mailing list, and Bert Hubert from galmon.eu, noticed that GST-UTC, the difference between Galileo system time and UTC, had increased to 17 ns and remained around this value during the end of August. This is highly unusual, because GST-UTC is usually smaller than 2 or 3 ns.

galmon.eu provides a public Grafana dashboard where this anomaly can be seen, though the dashboard (or the web browser) tends to choke if we request a plot of more than one week’s worth of data.

Broadcast GST-UTC in the galmon.eu Grafana dashboard

When I heard about this on Friday, I got interested and commented on some ways in which this could be researched further. I also wondered: is the GST-UTC offset actually large, or is it just that the offset estimate in the navigation message is wrong? In this post I explore one of my ideas.


Demodulation of the 5G NR downlink

At the end of July, Benjamin Menkuec was commenting on Twitter about some issues he had while demodulating a 5G NR downlink recording. There was a lively discussion in which other people and I participated. I had never touched 5G, but I had done some work with LTE, so I decided to take the chance to learn more about the 5G physical layer. Compared to LTE, the changes are more evolutionary than revolutionary, but understanding what has changed, and how and why, is part of the puzzle.

I had to take an 11.5 hour flight a few days later, so I thought it would be a nice challenge to give this a shot: take with me the recordings that Benjamin was using and all the 3GPP documents, and write a demodulator in a Jupyter notebook from scratch during the flight, as I had done in the past with my LTE recordings. This turned out to be an enjoyable and interesting experience, quite different from working at home in shorter intervals scattered over multiple days or weeks, and with internet access. At the end of the flight I had a mostly working demodulator, but it had a few weird problems that turned out to be caused by some mistakes and misconceptions. I worked on cleaning this up and solving the problems over the next few days, and then on preparing this post.

Rather than trying to give an account of all my mistakes and dead ends (I spoke a little about these on Twitter), in this post I will show the final clean solution. There are some tricky new aspects in 5G NR (phase compensation, as we will see below) which can be quite confusing, so hopefully this post will do a good job of explaining them in a simple way.

The Jupyter notebook used in this post is here, and the recording in SigMF format can be found in this folder. Here I will only be using the first of Benjamin’s two recordings, since they are quite similar. It was done with an ADALM Pluto at 7.86 Msps and has a duration of 143 ms. The transmitter is an srsRAN 5 MHz cell. The recording was done off the air, or maybe with a cabled setup, but there are some other signals leaking in. The SNR is very good, so this is not a problem.

The first signal we find is at 9 ms. There is a transmission like this every 10 ms. As we will see, this is an SS/PBCH block. Something that stands out to those familiar with the LTE downlink spectrum is that the 5G NR spectrum is almost empty. In LTE, the cell-specific reference signals are transmitted all the time. In 5G this is not the case: downlink signals are mostly transmitted only when there is traffic. There is always a burst of one or several SS/PBCH blocks transmitted periodically (usually every 20 ms, but in this recording every 10 ms), as well as some other recurring signals (such as the SIB1 in the PDSCH), but this may be all if there is no traffic in the cell.

SS/PBCH block waterfall
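The natural first step of the demodulation is to find the SS/PBCH blocks by correlating against the three possible PSS sequences, which are length-127 BPSK m-sequences defined in TS 38.211 Section 7.4.2.2. Here is a rough sketch of the idea; the FFT size is my assumption for 15 kHz subcarrier spacing at 7.68 Msps, and the placement of the PSS subcarriers is simplified (in the real SS/PBCH block the PSS occupies subcarriers 56 to 182 of the 240-subcarrier block):

```python
import numpy as np

def pss_sequence(n_id_2):
    """PSS m-sequence for N_ID^(2) in {0, 1, 2} (TS 38.211 7.4.2.2.1)."""
    x = [0, 1, 1, 0, 1, 1, 1]
    for i in range(120):
        x.append((x[i + 4] + x[i]) % 2)
    m = (np.arange(127) + 43 * n_id_2) % 127
    return 1.0 - 2.0 * np.array(x)[m]  # BPSK mapping

FFT_SIZE = 512  # assumption: 15 kHz SCS sampled at 7.68 Msps

def pss_time_domain(n_id_2):
    """IFFT of the PSS placed on 127 subcarriers around DC (simplified)."""
    X = np.zeros(FFT_SIZE, dtype=complex)
    X[np.arange(-63, 64) % FFT_SIZE] = pss_sequence(n_id_2)
    return np.fft.ifft(X)

# Correlate the recording against the three candidates; the strongest
# peak gives the symbol timing and N_ID^(2). `samples` is a placeholder.
samples = np.zeros(10000, dtype=complex)
for n_id_2 in range(3):
    corr = np.abs(np.correlate(samples, pss_time_domain(n_id_2), mode='valid'))
    print(n_id_2, corr.max())
```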

A very quick RFI survey at the Allen Telescope Array

At the beginning of August I visited the Allen Telescope Array (ATA) for the Breakthrough Listen REU program field trip (I have been collaborating with the REU program for the past few years). One of the things I did during my visit was to use an ADALM Pluto running Maia SDR to do a very quick scan of the radio frequency interference (RFI) on-site. This was not intended as a proper study of any sort (for that we already have the Hat Creek Radio Observatory National Dynamic Radio Zone project), but rather as a spur-of-the-moment proof of concept.

The idea was to use equipment I had in my backpack (the Pluto, a small antenna, my phone and a USB cable), step just outside the main office, scroll quickly through the spectrum from 70 MHz to 6 GHz, stop when I saw any signals (hopefully not too many, since Hat Creek Radio Observatory is supposed to be a relatively quiet radio location), and make a SigMF recording of each of the signals I found. It took me about 15 minutes to do this, and I made 6 recordings in the process. A considerable amount of time was spent downloading the recordings to my phone, which takes about one minute per recording (since recordings are stored on the Pluto DDR, only one recording can be stored on the Pluto at a time, and it must be downloaded to the phone before making a new recording).

This shows that Maia SDR can be a very effective tool to get a quick idea of what the local RF environment looks like, and also to hunt for local RFI in the field, since it is quite easy to carry around a Pluto and a phone. The antenna I used was far from ideal: a short monopole for ~450 MHz, shown in the picture below. It does a fine job of receiving strong signals regardless of frequency, but its sensitivity is probably very poor outside of its intended frequency range.

ADALM Pluto and 450 MHz antenna

The following plot shows the impedance of the antenna measured with a NanoVNA V2. The impedance changes noticeably if I put my hand on the NanoVNA, with the resonant peak shifting down in frequency by 50 to 100 MHz. Therefore, this measurement should be taken just as a rough ballpark of how the antenna behaves on the Pluto, which I was holding with my hand.

Impedance of the 450 MHz antenna

As the antenna I used for this survey is pretty bad, the scan will only show signals that are actually very strong on the ATA dishes. The log-periodic feeds on the ATA dishes tend to pick up signals that do not bounce off the dish reflectors and instead arrive at the feed directly from a side. This is different from a waveguide type feed, in which signals need to enter through the waveguide opening. Therefore, besides having the main lobe corresponding to a 6.1 m reflector, the dishes also have relatively strong sidelobes in many directions, with a gain roughly comparable to an omnidirectional antenna. The system noise temperature of the dishes is around 100 to 150 K (including atmospheric noise and spillover), while the noise figure of the Pluto is probably around 5 dB. This all means that the dishes are more sensitive for detecting signals from any direction than the setup I was using. In fact, signals from GNSS satellites in all directions can easily be seen with the dishes several dB above the noise floor, but not with this antenna and the Pluto.
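A quick back-of-envelope calculation shows the gap in receiver noise alone (the 5 dB noise figure is just my rough guess for the Pluto):

```python
nf_db = 5.0  # assumed Pluto noise figure
t_pluto = (10**(nf_db / 10) - 1) * 290  # standard NF-to-temperature formula
print(f'{t_pluto:.0f} K')  # ~627 K, versus ~100-150 K for the dishes
```

That is roughly a 6 to 8 dB difference in noise temperature, before even considering antenna gain.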

Additionally, the scan only shows signals that are present all or most of the time, or that I just happened to come across by chance. This quick survey hasn’t turned up any new RFI signals: all the signals I found are signals we already knew about and had previously encountered with the dishes. Still, it gives a good indication of what the strongest sources of RFI on-site are. Something else to keep in mind is that the frequency range of the Allen Telescope Array is usually taken as 1 – 12 GHz. Although the feeds probably work with some reduced performance somewhat below 1 GHz, observations are done above 1 GHz. Therefore, none of the signals I have detected below 1 GHz are particularly important for the telescope observations, since they are not strong enough to cause out-of-band interference.

I have published the SigMF recordings in the Zenodo dataset Quick RFI Survey at the Allen Telescope Array. All the recordings have a sample rate of 61.44 Msps, since this is what I was using to view the largest possible amount of spectrum at the same time, and a duration of 2.274 seconds, which is what fits in 400 MiB of the Pluto DDR when recording at 61.44 Msps 12 bit IQ (this is the maximum recording size for Maia SDR). The published files are as produced by Maia SDR. At some point it could be interesting to add additional metadata and annotations.
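The 2.274 second duration follows directly from the size of the recording buffer: at 12 bit IQ, each complex sample packs into 3 bytes, so

```python
buffer_bytes = 400 * 2**20  # 400 MiB of Pluto DDR used by Maia SDR
bytes_per_sample = 3        # 12-bit I + 12-bit Q, packed
samp_rate = 61.44e6
print(buffer_bytes / (bytes_per_sample * samp_rate))  # ~2.275 s, in good agreement
```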

In the following, I give a quick description of each of the 6 recordings I made.

Analysis of the W-Cube 75 GHz beacon

W-Cube is a cubesat project from ESA with the goal of studying propagation in the W-band for satellite communications (71-76 GHz downlink, 81-86 GHz uplink). Current ITU propagation channel models for satellite communications only go as high as 30-40 GHz. W-Cube carries a 75 GHz CW beacon that is being used to make measurements to derive new propagation models. The prime contractor is Joanneum Research, in Graz (Austria). Currently, the 75 GHz beacon is active only when the satellite is above 10 degrees elevation in Graz.

Some months ago, a team from ESA led by Václav Valenta started to make observations of this 75 GHz beacon using a portable groundstation at ESTEC (Netherlands). The groundstation design is open source and the ESA team is quite open about this project. I have recently started to collaborate with them regarding opportunities to engage the amateur radio community in this and future W-band propagation projects (there is an amateur radio 4 mm band, which usually covers from 75.5 to 81.5 GHz, depending on the country). As I write this, some of my friends in the Spain and Portugal microwave group are running tests with their equipment to try to receive the beacon.

On 2023-06-16, the team made an IQ recording of the beacon using their portable groundstation. They have published some plots about this observation. Now they have shared the recording with me so that I can analyse it. One of the things they are interested in is evaluating the use of the gr-satellites Doppler correction block to perform real-time Doppler correction with a TLE. So far they have been doing Doppler correction in post-processing, but due to the high Doppler drift, it is not easy to see the uncorrected signal in a spectrum plot.
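The computation that such a Doppler correction needs to do is conceptually simple: obtain the range rate between the satellite and the groundstation from the TLE, and scale by the beacon frequency. Here is a sketch using Skyfield; this is not how the gr-satellites block is implemented, just the underlying computation, and the Celestrak catalog number and station coordinates are placeholders:

```python
from skyfield.api import load, wgs84

C = 299792458.0  # speed of light [m/s]
F0 = 75e9        # beacon frequency [Hz]

# Hypothetical Celestrak query; substitute the actual W-Cube catalog number.
sats = load.tle_file('https://celestrak.org/NORAD/elements/gp.php?CATNR=XXXXX')
sat = sats[0]
station = wgs84.latlon(52.22, 4.42)  # approximate ESTEC coordinates

ts = load.timescale()
t = ts.now()
topocentric = (sat - station).at(t)
alt, az, dist, _, _, range_rate = topocentric.frame_latlon_and_rates(station)

# Doppler shift is positive while the satellite approaches (range rate < 0).
doppler_hz = -range_rate.km_per_s * 1e3 / C * F0
print(doppler_hz)
```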

Tianwen-1 third anniversary

Today is the third anniversary of Tianwen-1, which was launched on 23 July 2020. During the first year of the mission I tracked it in great detail and wrote a lot of posts. A fundamental part of this work was the help of AMSAT-DL. Using their 20 metre antenna in Bochum, they tracked the spacecraft, decoded telemetry and provided live coverage of many of the key mission events in their YouTube livestream. At some point the reception of Tianwen-1’s telemetry at Bochum was completely automated, as we described in a paper in the GNU Radio Conference 2021 proceedings.

To this day, the reception continues in Bochum almost every day, whenever the dish is not busy with other tasks such as tracking STEREO-A. This means that we get good coverage of the spacecraft’s orbital data, via the state vectors transmitted in its telemetry.

To celebrate the anniversary, I have updated my plot of the orbital parameters, which I made back in March 2022. The plot covers the whole remote sensing orbit, from when it started on 8 November 2021 until the present day. To my knowledge, no manoeuvres have happened in this time (though perhaps small station-keeping burns would go unnoticed without more careful analysis), so the changes in the orbit are just due to perturbations by forces such as the Sun’s gravity and the oblateness of Mars.

The updated plot can be seen below. We see that the periapsis latitude changes at a steady rate of 0.598 deg/day. The remote sensing orbit was designed so that the periapsis precessed in this way, which allows the spacecraft to cover the whole surface of Mars from a low altitude in 301 days (the time it takes the periapsis latitude to sweep 180 degrees at this rate, since 180 / 0.598 ≈ 301). The surface has now been covered twice, since the periapsis has moved from near the north pole to the south pole and back again to the north pole.

There is also an interesting change in eccentricity, which seems to be correlated with the latitude of the periapsis. The eccentricity is largest when the periapsis is over the south pole. In this case, the altitude of the periapsis decreases by 60 km, compared to when the periapsis is over the north pole. The inclination has remained mostly steady, although there seems to be a small perturbation with an amplitude of 0.1 deg.

The updated Jupyter notebook in which this plot was made can be found here.

Decoding STEREO-A SECCHI images

STEREO-A is a NASA solar observation spacecraft that was launched in 2006 into a heliocentric orbit with a radius of about 1 au. One of its instruments, called SECCHI, takes images of the Sun using different telescopes and coronagraphs to study coronal mass ejections. Near real-time images from this instrument can be seen on this webpage. These images are certainly the most eye-catching data transmitted by STEREO-A in its X-band space weather beacon.

In September 2022, Wei Mingchuan BG2BHC made some recordings of the space weather beacon using a 13 metre antenna at Harbin Institute of Technology. I wrote a blog post showing a GNU Radio decoder and a Jupyter notebook analysing the decoded frames. I managed to figure out that some of the data corresponded to the S/WAVES instrument, which is an electric field sensor with a spectrometer covering 125 kHz to 16 MHz.

This summer, STEREO-A has been very close to Earth for the first time since it was launched. This has enabled amateur observers with small stations to receive and decode the space weather beacon. As part of these activities, Alan Antoine F4LAU and Scott Tilley VE7TIL have figured out how to decode the SECCHI images. Scott has published a summary of this in his blog, and Alan has added a decoding pipeline to SatDump. Using this information, I have now extended my Jupyter notebook to decode the SECCHI images. Eventually I want to turn this into a GNU Radio demo, with the Jupyter notebook serving as reference code.

Analysis of Euclid frames

In a previous post I looked at the modulation and coding of Euclid, the recently launched ESA space telescope. I processed a 4.5 hour recording that I did with two dishes from the Allen Telescope Array. This post is an analysis of the telemetry frames. As we will see, Euclid uses CCSDS TM Space Data Link frames and the PUS protocol version 1. The telemetry structure is quite similar to that of JUICE, which I described in an earlier post. I have re-used most of my analysis code for JUICE. However, it is interesting to look at the few differences between the two spacecraft.