Keep in mind that the description of the DSLWP-B camera system that I give here is based on what I hear from Wei Mingchuan BG2BHC, so some points might be slightly incorrect or incomplete.

The camera on DSLWP-B has a buffer that can hold 16 images. This buffer works as a circular buffer, so when the camera takes an image, it overwrites the next position in the buffer. The positions in the buffer are identified with ids from 0 to 15.

One of the images in the buffer can be transferred to the UHF transmitter for download. Usually, the B1 transmitter, which operates at 436.4MHz, is used to download images (DSLWP-B has another transmitter, called B0, which operates at 435.4MHz and usually transmits GMSK and JT4 telemetry). The id of the image held by the transmitter (given by its position in the buffer, from 0 to 15) is included in the `cam_memory_id` telemetry variable of the transmitter. When the image download starts, the image referred to by `cam_memory_id` is downloaded.

The B1 transmitter is currently configured to take an image when it powers up. The image is taken, overwriting the next position in the circular buffer of the camera, and also transferred automatically to the B1 transmitter. However, the download has to be commanded manually. Instead of transmitting this image, a command can be sent to load another image from the buffer into the UHF transmitter.

The latest values of `cam_memory_id` for the B1 transmitter can be seen here. This allows us to keep track of which image is currently being transmitted.

Keeping track of when an image with a particular `cam_memory_id` was taken can be trickier. If the UHF transmitter powers up and sends telemetry before another image is loaded by a manual command, the id of the new image can be seen in the telemetry. The problem is that sometimes an image is loaded manually before any telemetry is sent, or there are no receiving stations collecting the telemetry. In any case, each time the UHF transmitter powers up and takes an image, the id of the new image increments by one (eventually wrapping from 15 back to 0), so this can help us keep track.
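This bookkeeping is simple modular arithmetic. Here is a toy helper (the function is my own illustration, not flight software), assuming exactly one automatic image per power-up and no manually commanded captures in between:

```python
def predict_cam_memory_id(last_id: int, power_ups_since: int) -> int:
    """Predict cam_memory_id after a number of UHF transmitter power-ups.

    Assumes each power-up takes exactly one new image, so the buffer
    position advances by one each time, wrapping from 15 back to 0.
    """
    return (last_id + power_ups_since) % 16

print(predict_cam_memory_id(15, 1))   # wraps back to 0
```

Any manually commanded image captures in between would of course have to be counted as well.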

To check when the UHF transmitter has powered up, a very useful telemetry variable is `runtime`, which contains the time in seconds since the UHF transmitter was powered on. This variable can be plotted here. This allows us to calculate the time of power-up when there are receiving stations listening to the telemetry. If no stations listen during the activation period, the image id can be deduced, but the time of boot-up has to be taken from the schedules published by Wei.

Another piece of data that can be useful is the SSDV image id. This cannot be seen in the online telemetry dashboard, but it is the first byte of the SSDV packets and it is also shown when the SSDV decoder is run (the output of the SSDV decoder running on the telemetry server can be seen here). I think that the SSDV id increments by one each time a new image is taken and, in contrast to `cam_memory_id`, which only goes from 0 to 15, the SSDV id goes from 0 to 255, so it doesn’t wrap as often.

All the dates and times mentioned in this post are UTC. These activities have been coordinated by Wei Mingchuan BG2BHC. The uplink of commands in the 2m Amateur satellite band has been done by Reinhard Kühn DK5LA. The downloads have been received by the PI9CAM Dwingeloo radiotelescope during most of the sessions and also by Mike Rupprecht DK3WN, Imants Tukleris YL3CT and Dimitriy Borzenko 4Z5CP.

On 2018-10-14 at 12:16, the UHF transmitter was activated and an image was taken by the onboard Inory eye camera. This image was then downloaded.

During the same session, an image taken on 2018-10-10 which hadn’t been transmitted yet was downloaded. It was expected that the Earth would be visible in the image, and this was indeed the case.

For this week, the following activation times were listed by Wei on Twitter. This was a novelty, in that the UHF transmitter onboard DSLWP-B would be active every day of the week (normally it is only active a couple of days per week). The goal was to take a series of several images and to download all of them, so many hours of activity would be needed.

When the UHF transmitter came on at 08:50 on 2018-10-15, a series of 8 images was taken automatically. The images were spaced 10 minutes apart, so the series lasted from 08:50 to 10:10. This series would be downloaded over the next days of the week. The ids of the images of the series were 3 through 10.

On 2018-10-16 there was no one available to operate the Dwingeloo radiotelescope, so the receiving stations for the download were YL3CT, DK3WN and 4Z5CP. During this session, four of the images of the series were transmitted. Only one of them was received completely (except for the first chunk, which is not sent due to a bug) and the others had gaps. The images transmitted were 6, 4, 7 and 5, in this order.

On 2018-10-17, Dwingeloo could be operated during the download. A new image of the series of 8, with id number 3, was downloaded, and missing chunks of the 3 images of the day before were transmitted to try to complete them. The images transmitted were 3, 4, 5, 7 and 3 (again, to complete missing gaps). Image 4 was completed, but 5 and 7 still had gaps.

On 2018-10-18, three new images of the series, with ids 8, 9 and 10, were transmitted, as well as missing chunks of image 5. The order of transmission was 8, 9, 5, 10, 9. Images 8, 9 and 10 were received completely, and image 5 was completed. Only image 7 from the series of 8 remained incomplete.

On 2018-10-19, the remaining chunks of image 7 were transmitted. After completing this image, at 18:18, the B1 transmitter (the one which transmits the images and operates at 436.4MHz) was changed from 250baud to 500baud by a command from DK5LA. An attempt was also made to set transmitter B0 (which operates at 435.4MHz) to 500baud, but it didn’t accept the commands and the period of activity of the UHF transmitters ended.

Note that, although the B1 transmitter is now using 500baud, it uses an r=1/4 Turbo code instead of r=1/2, so the net data rate and the decoding threshold should be more or less as before. One of the advantages of 500baud is that it doubles the ranging resolution when doing VLBI or cross-correlations with the Moonbounce signal.

In the future perhaps the Turbo code will be changed to r=1/2 at 500baud, so the data rate will double, making it possible to transmit the images twice as fast. However, the decoding threshold will be 3dB worse. This is not a problem for Dwingeloo, which sees very high SNRs, but it will make reception by smaller stations more difficult.

After completing the series of 8 images, Wei converted them to grayscale to remove the typical purple hue and then made an animated GIF with the sequence. This can be seen below.

It is interesting to compare this animation with the animated GIF made by Janos HG5APZ of my GMAT simulations, where the Moon is seen to move from the upper left to the lower right. The comparison shows that DSLWP-B was rotated approximately 90º to the left in comparison to my GMAT simulation (recall that the camera always points away from the Sun, but rotates freely around this axis).

In this post I study the cross-correlation of the Moonbounce signal against the direct signal. This gives some information about how the radio signals behave when reflecting off the Moon. Essentially, we compute the Doppler spread and time delay of the Moonbounce channel.

In the previous post we saw that the reflected signal was weak, barely visible over the noise floor. Still, we can compute the cross-correlation between the reflected and direct signals. The calculations have been done in this Jupyter notebook.

The cross-correlation algorithm I have used can be summarised by the following formula:\[(x \star y)_k(\tau, \omega) = \sum_{j = 0}^{M-1} |\mathcal{F}_N^{-1}[z_{k+js,\omega}](\tau)|^2,\]where\[z_{k,\omega} = \mathcal{F}_N(x(n+k)w(n) e^{i\omega n} )\overline{\mathcal{F}_N(y(n+k) w(n))},\] and \(x,y\) are the direct and reflected signals respectively, \(\tau\) is the time delay, \(\omega\) is the frequency, \(k\) is the starting sample for the correlation calculation, \(\mathcal{F}_N\) denotes an \(N\)-point DFT, and \(w\) is a window function.

The parameter \(s\) determines the DFT overlap. A usual choice, which we will use here, is \(s = N/2\), meaning that half of the points of one DFT overlap with the next.

The parameters \(N\) and \(M\) give the so-called coherent and non-coherent integration lengths. We say that the coherent integration length is \(N\) samples and the non-coherent integration length is \((M-1)s + N\) samples (or that we do \(M\) non-coherent integrations). The choice of the coherent and non-coherent integration lengths is important because it determines the SNR of the correlation, the frequency resolution and the computational cost. It is also related to the frequency stability of the signal: unstable signals do not get an SNR gain from longer coherent integrations.

The recordings made at Dwingeloo use a sampling rate of 40ksps. After several trials I have decided to set a coherent length of \(N = 2^{16}\) samples, or 1.64 seconds. I have chosen \(M = 20\), which gives a non-coherent length of 17.2 seconds.
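These lengths are easy to check numerically, recalling that consecutive DFTs start \(s\) samples apart:

```python
# coherent and non-coherent integration lengths, in seconds,
# for the parameters used in this post
fs = 40e3        # sample rate of the Dwingeloo recordings, samples/s
N = 2**16        # coherent integration length, samples
s = N // 2       # DFT overlap step
M = 20           # number of non-coherent integrations

coherent_s = N / fs
noncoherent_s = ((M - 1) * s + N) / fs
print(f"{coherent_s:.2f} s, {noncoherent_s:.1f} s")   # 1.64 s, 17.2 s
```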

The window \(w\) I have chosen is the flattop window, which has minimal scalloping loss.
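To illustrate, here is a minimal numpy sketch of this delay-Doppler correlation (my own re-implementation of the formula above, not the code from the notebook). The trial frequencies \(\omega\) are restricted to integer DFT bins, so the modulation by \(e^{i\omega n}\) becomes a circular shift of the spectrum:

```python
import numpy as np
from scipy.signal import windows

def delay_doppler_map(x, y, k, N, M, s, w, freq_bins):
    """Sum M non-coherent integrations of the windowed cross-spectrum.

    Rows correspond to trial frequency shifts (in DFT bins), columns to
    circular delays in samples.
    """
    out = np.zeros((len(freq_bins), N))
    for j in range(M):
        xf = np.fft.fft(x[k + j*s : k + j*s + N] * w)
        yf = np.conj(np.fft.fft(y[k + j*s : k + j*s + N] * w))
        for i, m in enumerate(freq_bins):
            # multiplying x(n) by exp(2j*pi*m*n/N) shifts its DFT by m bins
            out[i] += np.abs(np.fft.ifft(np.roll(xf, m) * yf))**2
    return out

# quick check: y is x delayed by d samples and shifted by m0 bins, so the
# peak of the map should appear at frequency m0 and circular delay N - d
rng = np.random.default_rng(0)
N, M, s, d, m0 = 256, 4, 128, 7, 5
L = N + (M - 1)*s + d
x = rng.standard_normal(L) + 1j*rng.standard_normal(L)
y = np.roll(x, d) * np.exp(2j*np.pi*m0*np.arange(L)/N)
bins = list(range(-10, 11))
out = delay_doppler_map(x, y, d, N, M, s, windows.flattop(N), bins)
i, tau = np.unravel_index(np.argmax(out), out.shape)
print(bins[i], tau)
```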

I have decided to make a video with the values of the correlation \((x \star y)_k(\tau,\omega)\). The variable \(k\) is represented by the time of the video, and each video frame represents the values of \((x \star y)_k\) using a colour scale. The horizontal axis represents the delay \(\tau\) and the vertical axis represents the frequency \(\omega\). The index \(k\) is incremented by \(N/2\) samples for each frame, so the frames are \(k = 0, N/2, N, 3N/2, \ldots\)

To give a smoother response in the video, I have modified slightly the definition of \(x \star y\) to include a window \(v\) as\[(x \tilde{\star} y)_k(\tau, \omega) = \sum_{j=0}^{M-1} |\mathcal{F}_N^{-1}[z_{k+js,\omega}](\tau)|^2 v(j).\]The window I have chosen as \(v\) is the Hann window.

The video is played back at 12 frames per second to get a rate which is approximately 10 times faster than real time (actually we get 9.83 seconds of real time per second of video). Each of the videos takes approximately 40 minutes to generate on my computer.

If you recall from the previous post, there were two SSDV transmissions done at 436.4MHz which showed a Moonbounce signal. Each of the transmissions lasts approximately 10 minutes. One of the transmissions started at 11:17 UTC and the other at 11:38 UTC. I have done a separate analysis and a separate video for each transmission. The videos can be seen below.

The span of the horizontal axis of the video is from -600 to 600 samples. Taking into account the speed of light, this is \(\pm 4497\)km. The centre corresponds to no delay (the reflected signal arriving at the same time as the direct signal), so we expect the correlation to appear slightly to the right, corresponding to the extra length that the reflected signal travels (more on this later).
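The conversion from samples to distance is just the sample period times the speed of light. A quick check:

```python
c = 299792.458   # speed of light, km/s
fs = 40e3        # sample rate of the Dwingeloo recordings, samples/s

span_km = 600 / fs * c        # one-sided span of the delay axis
symbol_km = c / 250           # one symbol at 250baud, about 1200 km
print(f"±{span_km:.0f} km")   # ±4497 km
```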

The span of the vertical axis is from -50Hz (bottom) to 50Hz (top). We can see that the signal is spread in frequency more or less randomly over approximately 50Hz.

We can see in the videos that the frequency of the reflections has a chaotic behaviour as the satellite orbits and the reflection path changes slightly. Strong reflections sometimes appear up to \(\pm 25\)Hz from the centre frequency, which represents the Doppler predicted by the specular reflection on a sphere model (see the previous post). Other than the fact that the signal in the second video is slightly weaker than in the first one, we do not see any major difference between both.

Summing up all the correlations (which corresponds to taking \(M\) very large), we obtain the average pattern for the delay and Doppler spread. This is shown in the figures below.

In the figures below, we now sum along the frequency axis to obtain the correlations depending on delay only. We see that the delay of the second transmission is slightly larger than for the first one.

As we have mentioned above, the delay of the reflected signal corresponds to the excess path length travelled, in comparison with the direct signal. Assuming the specular reflection model, the excess path length is shown in the figure below (calculation in this Jupyter notebook).

The figure below shows the excess path length during the first transmission. We see that its value is centred approximately on 675km and that it only changes by approximately 200km during the whole 10 minute transmission. This change is small in comparison with the correlation peak (a symbol at 250baud is 1200km), so the correlation peak is smeared only very little by summing up over the whole transmission without correcting for the change in delay, as we have done above.

The figure below shows the excess path length during the second transmission. The average is around 1075km and the change is 250km.

The peaks of the correlations occur at a delay of 660 and 1042km, which match quite well the excess path length model shown above.

If instead of summing up over the frequency axis we sum up over the delay axis, we obtain the energy spectral density of the Doppler spread of the reflected signal. This is shown in the image below. I find it quite interesting that the shape of the spread fits quite well a triangle. It is also interesting that the spread is not centred on 0Hz (which represents the frequency predicted by the specular reflection model).

Finally, the two figures below show a comparison of the signal power of the direct and reflected signals. Power has been averaged over intervals of \(2^{14}\) samples, or 400ms. We note a difference of around 22dB in the first transmission and 24dB in the second. Also, the power of the reflection decreases as time progresses, while the power of the direct signal is more or less constant.

Getting good results in this analysis has not been so easy because the SNR of the reflected signal is low, as the two images above show. The large Doppler spread also makes long coherent integrations impractical. I have spent a lot of time fine tuning the parameters of the correlations, partly because it is not so clear how to obtain good results (the goal is to be able to discern some characteristics of the reflection from the video qualitatively, not to optimize some quantitative parameter), and partly because the computations are lengthy.

I am interested in any other alternative ways of processing the data that yield better or newer results or insight. Although I think that the SNR is not good enough to be able to model the channel well, I am also interested in seeing if the reflected signal can be modelled as a Rayleigh channel, but I think that I still need to understand fully some key concepts about fading channels to attempt this.

Finally, a good research idea would be to try to determine the area of the lunar surface where the reflection happens. I think that my data shows that this area is centred on the point predicted by the specular reflection model, but I have no clue about how large or small this area is. Ideally one would like to determine the energy density of the reflection over the lunar surface. I still haven’t thought much about how to do this, but I think that the profile of Doppler spread can be a good tool. The fact that the correlation peak is not smeared also sets an upper bound for the size of the reflection area (compare the size of one symbol, which is 1200km, with the lunar radius, which is 1737km).

As a small remark, after seeing the Doppler spread, I think that decoding the data from the reflected signal is very difficult or impossible. Not only is the signal weak, but the Doppler spread of 50Hz is also quite large in comparison with the symbol rate of 250baud. It would be a real challenge to estimate the phase of the signal and to design a PLL which doesn’t slip.

**Update 17:00 UTC:** Wei comments that the camera sensor is CMOS, not CCD, and it has 2592×1944 pixels. The image is resampled to 640×480 to save memory and bandwidth.

The orientation of the camera is fixed: the camera is mounted looking in the opposite direction of the solar panel, which is usually kept pointing directly to the Sun. Therefore, the camera is usually looking directly away from the Sun. The possibility of imaging celestial bodies such as the Moon and the Earth depends on the relative positions of these and the Sun.

During the first week of October there was a new Moon, which implied that it was possible to take images of the Moon and the Earth, as I have described in this post and this other post. This is a report of all the images taken and downloaded during the observation window.

On 2018-10-06 13:55 UTC, when a part of the Moon was expected to be in the field of view of the camera, an image was taken to be downloaded later. The image was downloaded the next day at 10:20 UTC (note that it takes roughly 10 minutes to download an image, so this time refers to the beginning of the transmission). Reinhard Kühn DK5LA commanded the download using the 2m Amateur satellite band and the transmission was received by the 25m radiotelescope at Dwingeloo.

The image is very over-exposed. An analysis of the image showed a camera pointing error of 3 degrees in comparison with the predictions. This error is small and tolerable.

The small gap in the middle of the image was caused by a frequency jump in the transmitter. This can be seen clearly in the waterfall of the signal. The beginning of the image is missing because the first chunk of the SSDV data was not transmitted due to an unknown software problem. The end of the image is also missing, as the transmission was cut short.

After receiving this image, Reinhard sent commands prepared by Wei Mingchuan BG2BHC to correct the exposure of the camera and take another image to validate the exposure. The image was taken at 11:10 UTC and transmitted at 11:20 UTC. Analysis of this image shows a pointing error of 1.8 degrees. As in the previous image, the first SSDV chunk was not transmitted, so the beginning of the image is missing.

The two images shown above have been decoded by myself using the recordings of the radio signal done at Dwingeloo. Thus, they can differ slightly from other versions of the image that you can find on Twitter or in the DSLWP-B camera webpage. If you click on the images, you will get a JPEG file that has been directly decoded from the SSDV data transmitted by the satellite (so ignoring missing chunks, it contains the same JPEG data as the image onboard the satellite). There has been no editing of the image.

The rest of the images shown in this post have been obtained either from the DSLWP-B camera webpage or Twitter, since the recordings for these images have not been published.

The next day, on October 8, the UHF radio in DSLWP-B was scheduled to come on at 08:21 UTC. During this 2 hour activation, an image capture and download was commanded by Reinhard, received by the radiotelescope in Dwingeloo and published on Twitter by Cees Bassa.

Unfortunately, the recordings of the radio signal for this download have not been published, so the image shown below is taken from the DSLWP-B camera webpage. I do not have an exact confirmation for the time when the image was taken, but I expect it to be around 08:21 UTC, when the UHF transmitter was activated and the Moon was nearest to the centre of the camera.

After seeing that the image was still a bit over-exposed, another image with a lower exposure was taken and downloaded. This image is still over-exposed, since the automatic exposure algorithm of the camera needs a larger part of the Moon in the image to correct the exposure.

I do not have the exact time when this image was made, but given the fact that the Moon is still in view (it would quickly disappear as DSLWP-B continued its orbit) and the timeline of taking and transmitting the previous image, I expect that the image was taken around 08:45 UTC. The image was published on Twitter by Cees at 09:52 UTC.

Also, the first chunk of the image taken on 2018-10-07 at 11:10 UTC was downloaded during this activation, completing that image.

**Update 16:15 UTC:** Cees Bassa has now published the rest of the recordings done at Dwingeloo. On 2018-10-08 08:21 UTC the image taken at 08:21 UTC was transmitted twice to get all the image chunks. Then the image taken around 08:45 UTC was transmitted at 09:20 UTC, but not all the chunks were received. The 2018-10-08 09:31 UTC recording only contains small chunks missing from previous images: the first chunk of the 2018-10-07 11:10 UTC image and a couple of chunks missing from the 2018-10-08 08:45 UTC image. I have also replaced the images above with images I have obtained directly from the Dwingeloo recordings.

On 2018-10-09 at 05:25 UTC the UHF radio was activated again and another image was taken automatically. This was the best time to take an image of the Moon and Earth according to the calculations I did in this post. Unfortunately, it was not possible to download the image at that time since the Moon was not visible from Dwingeloo.

The image was downloaded the next day, during a scheduled activation of the UHF transmitter on 2018-10-10 at 14:00 UTC. A partial image was published by Cees Bassa at 14:48 UTC and the missing chunks were downloaded later during the same activation.

The complete image was published on Twitter at 15:15 UTC together with a colour-corrected version made by Wei, which is shown below.

A large portion of this image was also received independently by Robert Mattaliano N6RFM in the United States using a modest station with a single long yagi.

Also, during these days, an image showing only a lens flare (or perhaps Earth terribly over-exposed) appeared in the DSLWP-B camera webpage. I do not know when this image was taken or downloaded. I would be grateful if anyone can provide some info about this image.

**Update 15:20 UTC:** Cees Bassa explains that the image below was the first image downloaded in the activation of the UHF transmitter on 2018-10-10 at 14:00 UTC. After downloading this image, Reinhard sent commands to transmit the image of the Moon and Earth taken at 2018-10-09 05:25 UTC. Cees does not know when the image below was taken, but he suspects that perhaps it was taken when the camera was activated at 2018-10-08 13:00 UTC. In this case, the bright spot could be the Earth (note that it occupies a similar area as the Earth in the other images).

The image of the Earth and Moon taken on October 9 is really impressive given the technology used to take it. Modifications of this image identifying the visible lunar craters have been shared. The event has been covered in the media with an article by Andrew Jones in gbtimes and some reports in Chinese media. **Update:** also in the New Scientist.

**Update 16:30 UTC:** Unfortunately, the 2018-10-10 recordings done at Dwingeloo show clock problems in the receiver. There are a lot of frequency jumps, which cause many packet losses. The images I have decoded from these Dwingeloo recordings are shown below. We see that the help from other stations was essential in obtaining complete images.

On October 6 an image of the Moon was taken to calibrate the exposure of the camera. This image was downlinked on the UTC morning of October 7. The download was commanded by Reinhard Kuehn DK5LA and received by the Dwingeloo radiotelescope.

Cees Bassa observed that in the waterfalls of the recordings made in Dwingeloo a weak Doppler-shifted signal of the DSLWP-B GMSK signal could be seen. This signal was a reflection off the Moon.

As far as I know, this is the first reported case of satellite-Moon-Earth (or SME) propagation, at least in Amateur radio. Here I do a Doppler analysis confirming that the signal is indeed reflected off the Moon’s surface and make some general remarks about the possibility of receiving the SME signal from DSLWP-B. Further analysis will be done in future posts.

Below you can see the tweet where Cees Bassa reported the Moonbounce signal. As you can see, it is an awesome result indeed, only achievable with large dishes such as Dwingeloo’s. He even “challenged” me to decode the Moonbounce signal. I think that the signal is too weak for the Turbo decoder to wipe all bit errors, but perhaps I will try something in a future post.

The figure below shows the direct path and Moonbounce Doppler for the 436.4MHz downlink signal of DSLWP-B, as seen in the groundstation at Dwingeloo, for the timespan of the recordings done on the morning of October 7.

The direct path Doppler is computed as usual by using GMAT to output the position and velocity of DSLWP-B in a topocentric frame of reference centred at Dwingeloo and calculating the Doppler as the projection of the velocity vector onto the line-of-sight vector (see this post). Computing the Moonbounce Doppler is not as easy as it might first seem. The details are included in an appendix below.
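In code, the direct-path computation just described amounts to a dot product. A minimal sketch (the function name and example numbers are my own; real position and velocity vectors would come from GMAT):

```python
import numpy as np

def direct_doppler_hz(r_topo_km, v_topo_km_s, f_hz):
    """Doppler of the direct path: project the topocentric velocity of
    DSLWP-B onto the line-of-sight unit vector.  A positive range rate
    (satellite receding) gives a negative Doppler shift."""
    c = 299792.458  # speed of light, km/s
    u = r_topo_km / np.linalg.norm(r_topo_km)   # line-of-sight unit vector
    range_rate = np.dot(v_topo_km_s, u)         # km/s, receding positive
    return -range_rate / c * f_hz

# a satellite receding at 1 km/s shifts 436.4 MHz down by about 1.46 kHz
print(direct_doppler_hz(np.array([3.8e5, 0.0, 0.0]),
                        np.array([1.0, 0.0, 0.0]), 436.4e6))
```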

The figures below show the waterfalls of the recordings of the 436.4MHz signal overlaid with the direct and Moonbounce Doppler curves. The waterfalls have been shifted by -750Hz to correct for the frequency offset in the local oscillators of DSLWP-B and the receiver in Dwingeloo, so that the direct Doppler curve matches the received signal.

In the figure above we see that there is a visible Moonbounce signal at the beginning of the recording. Then the difference between the direct and Moonbounce Dopplers is too small to tell the two signals apart.

We can also observe a frequency jump in the DSLWP-B TCXO in the long transmission, which contains the SSDV packets. These TCXO jumps have already been documented and they explain some of the losses of SSDV packets.

In the two figures above the difference between the direct and Moonbounce Dopplers is large and the Moonbounce signal can be seen almost all the time in the waterfall.

The three figures below contain the recordings of the 435.4MHz signal made at Dwingeloo. The Moonbounce signal is weaker, so Cees hadn’t noticed it, but with the help of the Doppler curve it can be seen sometimes.

The 435.4MHz signal alternates between GMSK and JT4G beacon transmissions. The lowest tone of the JT4G signal is 1kHz above the GMSK signal, and the tone separation is 312.5Hz. I have included lines marking the expected frequencies of each of the four JT4G tones, according to the direct or Moonbounce Doppler.

At the beginning of the recording shown in the figure above, we have a visible reflection of the GMSK signal. Also, a reflection of the first JT4G signal is barely visible. It seems to show some Doppler spread.

In the figure above, faint reflections of the GMSK signal can be seen near the centre of the recording.

No reflections can be seen in the last 435.4MHz recording, shown in the figure above.

For a rough comparison of the signal strengths, I have extracted the beginning of the SSDV transmission in the second recording of the 436.4MHz signal and used my GMSK detector with the direct and Moonbounce signal (I have used a low pass filter in Audacity to remove the direct signal to force the GMSK detector to choose the Moonbounce signal). The results are shown below.

Direct signal:

Start time: 36.86s Frequency: 346.8Hz CN0: 45.1dB, EbN0: 21.1dB, SNR (in 2500Hz): 11.1dB

Moonbounce signal:

Start time: 20.13s Frequency: 8.1Hz CN0: 19.3dB, EbN0: -4.6dB, SNR (in 2500Hz): -14.6dB

Even though the GMSK detector didn’t detect the same packets (note the different start times), we see that there is a difference of roughly 25dB between the direct and reflected signals.

The figures below show the correlations in time and frequency generated by the GMSK detector for the direct signal. The ASMs are clearly visible and the sync in frequency is very sharp. Note that the resolution in time is on the order of 1ms (or 300km) and the resolution in frequency is on the order of 5Hz.

The figures below show the correlations in time and frequency for the Moonbounce signal. Only the first and second ASM are clearly visible. The sync in frequency is still sharp. With the results of these correlations, there is no indication of Doppler spread, so it seems that the reflection happens in a specular manner over a relatively small area of the lunar surface.

In the future I would like to study the cross-correlation of the direct signal (which has a large SNR) against the Moonbounce signal. This might shed more light on whether there is any Doppler or time spread of the Moonbounce signal, indicating a weaker diffuse reflection over a larger area of the lunar surface. Another interesting by-product of this study would be the measurement of the excess path length of the reflected signal, allowing its comparison with the geometry described in the appendix below.

As a final remark, let us compare the path loss of the direct and Moonbounce paths. At a frequency of 436MHz and a distance of 380000km, the free space path loss is -196.8dB. This can be taken as a good estimate for the path loss of the direct path.
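This figure is easy to reproduce from the standard free-space path loss formula:

```python
import numpy as np

# free-space path loss of the direct path, using the distance from the text
c = 299792458.0          # speed of light, m/s
f = 436e6                # downlink frequency, Hz
d = 380000e3             # Moon-Earth distance, m
lam = c / f              # wavelength, about 0.69 m

fspl_db = 20 * np.log10(4 * np.pi * d / lam)
print(f"-{fspl_db:.1f} dB")   # -196.8 dB
```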

The bistatic radar equation shows that the path loss for the reflected path is\[L = \frac{\lambda^2 \sigma}{(4\pi)^3d_1^2d_2^2},\]where \(\lambda\) is the wavelength, \(\sigma\) is the radar cross-section of the reflector, and \(d_1\) and \(d_2\) are the distances between the transmitter and the reflector and between the receiver and the reflector respectively.

As a typical example of the Moonbounce geometry for DSLWP-B we can take \(d_1 = 10000\text{km}\) and \(d_2 = 380000\text{km}\). The radar cross-section of the Moon at a wavelength of 70cm, as seen from the Earth, is estimated as 0.07 times its physical area. The physical area of the side of the Moon visible from Earth is \(9.49\cdot10^{12} \text{m}^2\). Disregarding the fact that we are assuming that the reflection occurs only over a much smaller area of the surface, and using this value as physical area, we have a radar cross-section of \(\sigma = 6.64\cdot10^{11} \text{m}^2\).

Therefore, the estimated path loss of the Moonbounce path is -218.6dB. This means that the Moonbounce path is 21.8dB weaker than the direct path. Taking into account the fact that we have done a very rough estimate for the radar cross section of the reflection area, this fits nicely with the difference of 25dB that we have measured.

Thus, in theory, the Moonbounce signal of DSLWP-B should be receivable by large stations such as Dwingeloo that are able to receive the direct signal with an excess SNR of 25dB, given the right geometric conditions for the reflection. As a comparison, note that the EME path loss at 436MHz is -250.2dB. This is 31.6dB less than the path loss for SME in lunar orbit, and it explains why it is much easier to receive lunar orbit SME than EME. A transmitter on Earth with the same EIRP as DSLWP-B (around 1W) would be virtually impossible to receive in Dwingeloo via EME.

The calculations in this post have been done in this Jupyter notebook and using this GMAT script.

As a rough approximation to compute the Doppler of the Moonbounce path, one is tempted to take the path from DSLWP-B to the centre of the Moon and then from the centre of the Moon to Dwingeloo to compute the Doppler. Then the Moonbounce Doppler would be the sum of the DSLWP-B Doppler, as seen from the centre of the Moon, plus the Doppler of the centre of the Moon, as seen from Dwingeloo. Both of these quantities are easy to compute as we have done with the direct path Doppler.

However, one may now ask what happens when the centre of the Moon is replaced by a particular point on the lunar surface. It turns out that the Doppler curve depends a lot on which point is chosen. This is not so surprising. Since DSLWP-B is relatively close to the Moon, as it orbits, some points of the lunar surface see DSLWP-B approaching and others see it receding. Thus, different points of the lunar surface see DSLWP-B with rather different Dopplers.

Therefore, to have any chance of computing the Doppler of the Moonbounce signal that we have observed in Dwingeloo’s recordings, we first need to determine the point of the lunar surface where the reflection happens. To this end, we assume that the reflection is specular. Approximating the shape of the Moon by a sphere, we are led to the following interesting mathematical problem.

Let \(A\) and \(B\) be points in \(\mathbb{R}^3\) and \(S\) a sphere such that the line segment joining \(A\) and \(B\) lies outside of \(S\). Prove that there is a unique point \(P \in S\) such that a ray from \(A\) to \(P\) reflects off \(S\) and passes through \(B\). Compute the point \(P\).

This problem can be reduced to the plane by noting that the point \(P\) and the ray should lie in the plane which contains \(A\), \(B\) and the centre of \(S\). Then we can assume that \(A\) and \(B\) are in \(\mathbb{R}^2\) and \(S\) is a circle. Still, it is not so easy to compute the coordinates of a point \(P\) satisfying the needed conditions (one can write a system of two non-linear equations).

An alternative approach is to consider an ellipse \(E\) whose foci are \(A\) and \(B\). If \(P \in E\) and a ray from \(A\) to \(P\) reflects off the ellipse \(E\), then it is known that it passes through \(B\) because of the properties of ellipses. If we can arrange so that the ellipse \(E\) is tangent to \(S\) and define \(P\) as the point of tangency, then the point \(P\) would satisfy the needed conditions, because the reflection off \(S\) at \(P\) is the same as the reflection off \(E\) at \(P\), since these two curves are tangent at \(P\).

Recall that the ellipse \(E\) can be defined as the set of points \(Q\) such that \(|Q-A| + |Q-B| = k\), where \(k\) is a positive constant. If we consider the family of ellipses obtained by varying the parameter \(k\), we see that if \(k\) is small enough, then \(E\) will not intersect \(S\). If we make \(k\) larger, eventually \(E\) will intersect \(S\). The smallest such \(k\) for which \(E\) intersects \(S\) gives an ellipse tangent to \(S\).

Looking at this analysis in another way, we see that the point of tangency \(P\) is the point \(P \in S\) that minimizes the sum of distances \(|P-A|+|P-B|\). From this characterization, the uniqueness of \(P\) follows and it also tells us a way to compute the point of tangency \(P\).

Turning back to our problem about computing the Moonbounce Doppler, we note an additional difficulty. As time passes, the relative positions of the spacecraft, the Moon and the groundstation at Dwingeloo change, so the reflection point \(P\) moves along the Moon surface. Therefore, the velocity of the point \(P\) should also be taken into account when computing the Doppler.

Actually, it is easier to organize the computations by writing the Doppler in terms of the time derivative of the propagation path distance. If \(f\) denotes the downlink frequency and \(c\) the speed of light, the Doppler equals\[-\frac{f}{c}\frac{d}{dt}(|P-A|+|P-B|).\]

I have done the calculations in probably the least clever way, since I wanted to have this running quickly. We work with the list of positions of DSLWP-B and the Moon in a topocentric frame of reference centred in Dwingeloo. For each timestamp, the point \(P\) is found by searching the point that minimizes the sum of distances \(|P-A|+|P-B|\) where \(P\) runs along a fine grid of points on the Moon surface (actually the point \(P\) is not computed explicitly, since we only need to obtain the minimum sum of distances).

The grid of points on the lunar surface for the minimization is built by taking a grid of points equally spaced in latitude and longitude, so the grid is far from being uniformly distributed over the lunar surface. A grid size of 1000×1000 points seems to work well enough.
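A minimal sketch of this grid search, using a toy geometry instead of the real ephemerides (the coordinates below are made up for illustration):

```python
import numpy as np

def min_path_distance(A, B, R=1.0, n=1000):
    """Minimum of |P-A| + |P-B| over a lat/lon grid of points P
    on a sphere of radius R centred at the origin."""
    lat = np.linspace(-np.pi / 2, np.pi / 2, n)
    lon = np.linspace(-np.pi, np.pi, n)
    lat, lon = np.meshgrid(lat, lon)
    # Grid of points on the sphere (not uniformly distributed, as noted above)
    P = R * np.stack([np.cos(lat) * np.cos(lon),
                      np.cos(lat) * np.sin(lon),
                      np.sin(lat)], axis=-1)
    d = np.linalg.norm(P - A, axis=-1) + np.linalg.norm(P - B, axis=-1)
    return d.min()

# Toy geometry: A and B placed symmetrically above a unit sphere. By
# symmetry the specular point is the top of the sphere, so the minimum
# sum of distances is 2*sqrt(3^2 + 1^2) = 2*sqrt(10)
d = min_path_distance(np.array([-3.0, 0.0, 2.0]), np.array([3.0, 0.0, 2.0]))
```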

The time derivative of the propagation path distance is approximated as the difference quotient between two consecutive timestamps. To give a smooth result, the time step in GMAT is limited to a maximum of 10 seconds.
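This step can be sketched as follows, assuming the path distances have already been evaluated at the GMAT timestamps (the arrays below are illustrative):

```python
import numpy as np

c = 299792458.0   # speed of light (m/s)
f = 436.4e6       # downlink frequency (Hz)

def doppler(path_dist, t):
    """Doppler (Hz) from the difference quotient of the propagation
    path distance (m) between consecutive timestamps (s)."""
    return -f / c * np.diff(path_dist) / np.diff(t)

# A path receding at a constant 1 km/s gives a constant Doppler shift
t = np.arange(0.0, 100.0, 10.0)
dopp = doppler(4e8 + 1000.0 * t, t)
```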

This algorithm seems to work well, as it generates a smooth Doppler curve that matches the observations. It is interesting to note that the Doppler in this case tells us something about the geometry of the reflection. We see that most of the reflection follows the path that a specular reflection would take.

It is also possible that part of the energy is reflected in a non-specular manner, bouncing off other points on the lunar surface and causing a Doppler spread. It is difficult to judge whether this effect happens just by looking at the waterfalls of the recordings.

I asked Wei Mingchuan BG2BHC to compare his calculations with mine and shortly after this he emailed me his planning for observations between October 8 and 10. After measuring the field of view of the camera as 37×28 degrees, we can plot the angular distances between the Moon or Earth and the centre of the camera to check if the celestial body will be in the field of view of the camera.

The image below shows the angular distance between these celestial bodies and the centre of the camera. As we did in the photo planning post, we assume that the camera points precisely away from the Sun. Since the Moon and Earth (especially the Moon) have an angular size of several degrees, we plot the centre of these objects with a dashed line and the edges which are nearest and furthest from the camera centre with a solid line.

The field of view of the camera is represented with dotted red lines. Since the field of view is a rectangle, we have one mark for the minimum field of view, which is attained between the centre of the image and the centre of the top or bottom edge, and another mark for the maximum field of view, which is attained between the centre of the image and each of the corners.
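Using the 37×28 degree field of view measured above, these two marks work out to half the smaller side and half the diagonal (a quick check; the specific values are derived here, not quoted from elsewhere):

```python
import numpy as np

fov_x, fov_y = 37.0, 28.0  # field of view in degrees

# Minimum: from the image centre to the middle of the top/bottom edge
fov_min = fov_y / 2
# Maximum: from the image centre to a corner
fov_max = np.hypot(fov_x / 2, fov_y / 2)

print(fov_min)            # 14.0 degrees
print(round(fov_max, 1))  # 23.2 degrees
```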

The rotation of the spacecraft around the camera axis is not controlled precisely, so objects between the two red lines may or may not appear in the image depending on the rotation. Objects below the lower red line are guaranteed to appear if the pointing of the camera is correct and objects above the upper red line will not appear in the image, regardless of the rotation around the camera axis.

To calibrate the exposure of the camera, an image was taken yesterday, on 2018-10-06 at 13:55 UTC. This time is marked in the figure above with an orange line. The image was downloaded this morning (UTC). The download was commanded by Reinhard Kuehn DK5LA and received by Cees Bassa and the rest of the PI9CAM team in Dwingeloo. This image is shown below.

The image shows an overexposed Moon. Here we are interested in using the image to confirm the orientation of the camera. The distance between the centre of the image and the edge of the Moon is 240 pixels, which amounts to 14 degrees. The plot above gives a distance of 11 degrees between the edge of the Moon and the camera centre.

Thus, it seems that the camera is pointed off-axis by 3 degrees. This error is not important for scheduling camera photos, since an offset of a few degrees represents a small fraction of the total field of view and the largest error in predicting what will appear in the image is due to rotation of the spacecraft around the camera axis.

The observations planned by Wei for the upcoming days are shown in the plot above by green lines. The start of the observation is marked with a dashed line and the end of the observation (which is 2 hours later) is marked with a dotted line. The camera should take an image at the beginning of the observation and then we have 2 hours to download the image during the rest of the observation.

We see that Wei has taken care to schedule observations exactly on the next three times that the Moon will be closest to the camera centre. This gives the best chance of getting good images of the lunar surface (but the Moon will only fill the image partially, as in the picture shown above).

There are also two additional observations planned when the Moon is not in view. The first, on October 8, is guaranteed to give a good image of the Earth. The second, on October 10, will only give an image of the Earth if the rotation of the spacecraft is right.

The orbital calculations for the plot shown above have been done in GMAT. I have modified the photo_planning.script script to output a report with the coordinates of the Earth and the Moon in the Sun-pointing frame of reference (see the photo planning post).

The angle between the centre of the camera and the centre of the Earth or the Moon can be calculated as\[\arccos\left(\frac{x}{\sqrt{x^2+y^2+z^2}}\right),\]where \((x,y,z)\) are the coordinates of the celestial body in the Sun-pointing frame of reference. The apparent angular radius of the celestial body can be computed as\[\arcsin\left(\frac{r}{\sqrt{x^2+y^2+z^2}}\right),\]where \(r\) is the mean radius of the body.
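These two formulas translate directly into Python (the function names are mine; the inputs are the coordinates from the GMAT report):

```python
import numpy as np

def boresight_angle(x, y, z):
    """Angle (degrees) between the camera boresight (the +X axis of the
    Sun-pointing frame) and a body at coordinates (x, y, z)."""
    return np.degrees(np.arccos(x / np.sqrt(x**2 + y**2 + z**2)))

def angular_radius(r, x, y, z):
    """Apparent angular radius (degrees) of a body of mean radius r."""
    return np.degrees(np.arcsin(r / np.sqrt(x**2 + y**2 + z**2)))

print(round(boresight_angle(1.0, 1.0, 0.0), 6))       # 45.0
print(round(angular_radius(1.0, 2.0, 0.0, 0.0), 6))   # 30.0
```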

These calculations and the plot have been made in this Jupyter notebook.

Here I measure the field of view of the camera using the image of Mars and Capricornus taken on August 4, confirming that these numbers are measured from the centre of the image to one side, so the total field of view is 28×37 degrees.

The first image taken by the DSLWP-B Inory eye camera was a picture of the sky including Mars and the constellation of Capricornus. It was taken on 2018-08-04 15:30 UTC and downloaded later. This image is shown below in enhanced contrast with some of the stars identified.

The distance between Mars and β Aquarii in the image above is 454 pixels. Using Astropy we compute in this Jupyter notebook the separation between Mars and β Aquarii to be 26.5 degrees. Since the full image is 640×480 pixels, this amounts to 37.36×28.02 degrees, which is very close to the value of 37×28 degrees mentioned above. This measurement confirms the field of view of the camera.
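The arithmetic behind these figures is just a plate-scale computation (the 26.5 degree separation is the Astropy result quoted above):

```python
sep_deg = 26.5   # Mars - beta Aquarii separation (degrees, from Astropy)
sep_px = 454     # same separation measured on the image (pixels)

scale = sep_deg / sep_px            # plate scale, degrees per pixel
fov = (640 * scale, 480 * scale)    # full 640x480 image

print(round(fov[0], 2), round(fov[1], 2))   # 37.36 28.02
```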

The picture below shows a diagram of DSLWP-B. The solar panel is behind the main body, pointing towards the back right. The Inory eye camera points in the opposite direction, towards the front left of the diagram. The camera is located in the front panel of the Amateur radio module, which is one of the pink modules.

In the nominal flight configuration, the solar panel is pointed towards the Sun to gather the maximum amount of sunlight. Therefore, the camera should point opposite to the Sun. This is good, because objects in the field of view of the camera are guaranteed to be well illuminated and it also prevents the Sun from appearing in the image, causing lens flare.

In one of the images taken previously by the camera there is a very bright lens flare, possibly caused by the Sun hitting the camera directly. This seems to indicate that the spacecraft is not always in the nominal flight orientation. As the orientation is very important when doing photo planning, here we assume that the spacecraft is always oriented in the nominal flight configuration, with its camera pointing opposite to the Sun.

We use GMAT with the 20181006 tracking file from dslwp_dev for the orbital calculations. The GMAT script can be downloaded here.

To simulate the orientation of the camera in GMAT, I have used an ObjectReferenced coordinate system. The primary is the Sun, the secondary is DSLWP-B, and the axes are chosen as X = R, Z = N. Therefore, the X axis points away from the Sun, in the direction of the camera. The orbit view is centred on DSLWP-B and points towards +X, so it shows what the camera would see (except for the field of view, which is not simulated).

When propagating the orbit in GMAT we see that the Earth passes near the +X axis every Lunar month and that the Moon crosses the image every orbit.

The two images below show the dates when the Earth will pass near the +X axis. These dates are good candidates for taking images of the Earth. From the point of view of the camera, the Earth seems to orbit slowly from right to left in these images. Therefore, there should be a tolerance of a couple of days as to when the images of the Earth could be taken.

Thus, we see that the next good dates for attempting to take images of the Earth are October 9 and November 7 (plus/minus a couple days of margin).

The Moon is much larger from the point of view of DSLWP-B and we have seen already that it fills the field of view of the camera completely. Thus, even though in this GMAT simulation the Moon crosses the screen every orbit, we need to wait until the path taken by the Moon is near the centre of the screen.

In the image below we see a good moment to take an image of the Moon. From the point of view of the camera, the Moon crosses from top to bottom of the screen every orbit and its path moves slightly to the right every time, taking it closer to the centre as we progress towards December.

Therefore, a good moment to attempt to take an image of the Moon is late October and all of November. However, the time when the picture is taken is critical, because the Moon crosses the screen quickly. It is near the +X axis only for one or two hours. Therefore, starting in late October, there will be a window of a couple hours each orbit (or each day, since the orbital period is close to one day) where photos of the Moon can be attempted.

Judging by one of Wei Mingchuan BG2BHC’s latest tweets, he has also been thinking about dates to take good images with the camera. It would be interesting to know if his findings match what I have presented here.

A good question when doing this sort of planning is what is the field of view of the camera. Probably this can be estimated from some of the existing images.

]]>Since I know that many international people would be interested in this talk and the talk was in Spanish, I wanted to prepare something in English. As I would be overlaying the slides on the video (since the video quality is quite poor and it is impossible to see the projected slides), I thought of overlaying the slides also in English and adding English subtitles.

Doing the subtitles has been a nightmare. I have been working on this non-stop since Sunday morning (as my free time allows). I intended to use YouTube’s transcript and auto-sync feature, where you provide a transcript (in the video’s original language) and presumably an AI will auto-sync this to the video’s audio. I say presumably because in my case the auto-sync was a complete failure, so I had to resync everything by hand.

Also, since I have listened to the video over and over, I have gotten a bit bored of my own voice and noted some expressions and words that I tend to use a lot. I think this is good, because now I am more conscious when talking in public and can try to avoid using these expressions so much.

In any case, here is the final result. First you have the video with English slides:

And also you have the video with Spanish slides:

Remember that you can use either Spanish or English subtitles with any of the videos. Also, I have the translation contributions enabled, so feel free to provide subtitles in your own language if you wish.

The PDF slides for the talk can be downloaded in Spanish here and in English here.

By the way, in case anyone is interested (or I need to do this again), the procedure I use to overlay the slides on the video is as follows. I use ImageMagick to convert the PDF file to PNG images by running

`convert -density 600 -antialias dslwp.pdf -resize 1109x -quality 100 "slides-%02d.png"`

Here I chose the resolution that I want the slide to have on the video. In this case it is 1109 pixels wide, while the video resolution is 1920×1080.

Then I embed each PNG image in a 1920×1080 image by using a transparent background.

`for file in *.png; do convert $file -gravity Northeast -background transparent -extent 1920x1080 resized/$file; done`

Finally I use OpenShot to import the 1920×1080 images, sync them up with the recording, and produce the final video.

]]>It turns out that the cause of the decoding failures is that the DSLWP-B clock is running a few seconds late. Thus, the JT4G transmission starts several seconds after the start of the UTC minute and so the decoding fails, since WSJT-X only searches for a time offset of a few seconds.

Satou Tetsurou JA0CAW has shared a couple of recordings of the JT4G beacon made by JA5BLZ. The recordings can be downloaded here: 180912_0420.wav, 180912_0440.wav. These have been made with WSJT-X, so they start on the top of the UTC minute according to JA5BLZ’s clock.

I have used my JT4G detection algorithm to find the start of the DSLWP-B JT4G transmissions in JA5BLZ’s recordings. The results are shown below. We see that the start of the transmission is received between 8 and 9 seconds after the start of the UTC minute.

Comparing these results with the first JT4G test, where the transmission starts between seconds 2 and 3, we see that the DSLWP-B clock is running approximately 6 seconds behind.

During today’s tests, it seems that the clock problem persists. See for instance this report by Ferruccio IW1DTU. It will be interesting to see how this problem evolves, and whether the DSLWP-B clock can be set remotely.

]]>