If you have been following my latest posts, you will know that a series of observations with the DSLWP-B Inory eye camera have been scheduled over the last few days to try to take and download images of the Moon and Earth (see my last post). In a future post I will chronicle these observations.
On October 6 an image of the Moon was taken to calibrate the exposure of the camera. This image was downlinked on the UTC morning of October 7. The download was commanded by Reinhard Kuehn DK5LA and received by the Dwingeloo radiotelescope.
Cees Bassa observed that the waterfalls of the recordings made at Dwingeloo showed a weak Doppler-shifted copy of the DSLWP-B GMSK signal. This signal was a reflection off the Moon.
As far as I know, this is the first reported case of satellite-Moon-Earth (or SME) propagation, at least in Amateur radio. Here I do a Doppler analysis confirming that the signal is indeed reflected off the Moon's surface and make some general remarks about the possibility of receiving the SME signal from DSLWP-B. Further analysis will be done in future posts.
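For readers who want to reproduce this kind of check, the core of the analysis is to compare the measured Doppler against the Doppler predicted for the two-leg satellite-Moon-Earth path. Below is a minimal NumPy sketch of that computation; the position arrays and the downlink frequency are placeholders (the ephemerides would come from a GMAT propagation), and the Moon's centre is used instead of the true specular reflection point.

```python
import numpy as np

C = 299792458.0   # speed of light (m/s)
F0 = 435.4e6      # assumed downlink frequency (Hz); a placeholder value

def doppler_hz(path_m, t_s):
    """Doppler shift implied by the rate of change of a path length."""
    return -F0 / C * np.gradient(path_m, t_s)

def sme_doppler(r_sat, r_moon, r_gs, t):
    """Direct and Moon-bounce Doppler for (N, 3) position arrays (m)
    in a common inertial frame, sampled at times t (s). Using the
    Moon's centre instead of the specular point is an approximation."""
    direct = np.linalg.norm(r_sat - r_gs, axis=1)
    bounce = (np.linalg.norm(r_sat - r_moon, axis=1)
              + np.linalg.norm(r_moon - r_gs, axis=1))
    return doppler_hz(direct, t), doppler_hz(bounce, t)
```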
Yesterday I looked at the photo planning for DSLWP-B, studying the appropriate times to take images with the DSLWP-B Inory eye camera so that there is a chance of getting images of the Moon or Earth. As I remarked, the Earth will be in view of the camera over the next few days, so this is a good time to plan and take images.
I asked Wei Mingchuan BG2BHC to compare his calculations with mine and shortly after this he emailed me his planning for observations between October 8 and 10. After measuring the field of view of the camera as 37×28 degrees, we can plot the angular distances between the Moon or Earth and the centre of the camera to check if the celestial body will be in the field of view of the camera.
The image below shows the angular distance between these celestial bodies and the centre of the camera. As we did in the photo planning post, we assume that the camera points precisely away from the Sun. Since the Moon and Earth (especially the Moon) have an angular size of several degrees, we plot the centre of these objects with a dashed line and the edges which are nearest and furthest from the camera centre with a solid line.
The field of view of the camera is represented with dotted red lines. Since the field of view is a rectangle, we have one mark for the minimum field of view, which is attained between the centre of the image and the centre of the top or bottom edge, and another mark for the maximum field of view, which is attained between the centre of the image and each of the corners.
The rotation of the spacecraft around the camera axis is not controlled precisely, so objects between the two red lines may or may not appear in the image depending on the rotation. Objects below the lower red line are guaranteed to appear if the pointing of the camera is correct and objects above the upper red line will not appear in the image, regardless of the rotation around the camera axis.
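As a sanity check of where those two red marks fall, the two half-angles can be computed from the measured 37×28 degree field of view. A short sketch, assuming a simple pinhole camera model (note that the corner angle differs slightly from the naive Pythagorean value):

```python
import numpy as np

fov_h, fov_v = 37.0, 28.0  # total field of view (degrees)

# Minimum mark: from the image centre to the middle of the top/bottom edge.
min_half = fov_v / 2
# Maximum mark: from the image centre to a corner (pinhole camera model).
tan_h, tan_v = np.tan(np.radians([fov_h / 2, fov_v / 2]))
max_half = np.degrees(np.arctan(np.hypot(tan_h, tan_v)))

print(min_half, max_half)  # 14.0 and approximately 22.7 degrees
```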
To calibrate the exposure of the camera, an image was taken yesterday, on 2018-10-06 at 13:55 UTC. This time is marked in the figure above with an orange line. The image was downloaded this UTC morning. The download was commanded by Reinhard Kuehn DK5LA and received by Cees Bassa and the rest of the PI9CAM team at Dwingeloo. This image is shown below.
The image shows an over-exposed Moon. Here we are interested in using the image to confirm the orientation of the camera. The distance between the centre of the image and the edge of the Moon is 240 pixels, which amounts to 14 degrees. The plot above gives a distance of 11 degrees between the edge of the Moon and the camera centre.
Thus, it seems that the camera is pointed off-axis by 3 degrees. This error is not important for scheduling camera photos, since an offset of a few degrees represents a small fraction of the total field of view and the largest error in predicting what will appear in the image is due to rotation of the spacecraft around the camera axis.
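For reference, the 240 pixels to 14 degrees conversion comes from the plate scale of the camera. A quick sketch, assuming a 640×480 pixel image together with the 37×28 degree field of view measured earlier:

```python
# Plate scale sketch: assumes a 640x480 pixel image and the 37x28 degree
# total field of view measured from the Mars and Capricornus image.
deg_per_px = 37.0 / 640   # horizontal scale; 28/480 gives nearly the same
print(240 * deg_per_px)   # approximately 13.9 degrees
```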
The observations planned by Wei for the upcoming days are shown in the plot above by green lines. The start of the observation is marked with a dashed line and the end of the observation (which is 2 hours later) is marked with a dotted line. The camera should take an image at the beginning of the observation and then we have 2 hours to download the image during the rest of the observation.
We see that Wei has taken care to schedule observations exactly at the next three times that the Moon will be closest to the camera centre. This gives the best chance of getting good images of the lunar surface (but the Moon will only fill the image partially, as in the picture shown above).
There are also two additional observations planned when the Moon is not in view. The first, on October 8, is guaranteed to give a good image of the Earth. The second, on October 10, will only give an image of the Earth if the rotation of the spacecraft is right.
The orbital calculations for the plot shown above have been done in GMAT. I have modified the photo_planning.script script to output a report with the coordinates of the Earth and the Moon in the Sun-pointing frame of reference (see the photo planning post).
The angle between the centre of the camera and the centre of the Earth or the Moon can be calculated as\[\arccos\left(\frac{x}{\sqrt{x^2+y^2+z^2}}\right),\]where \((x,y,z)\) are the coordinates of the celestial body in the Sun-pointing frame of reference. The apparent angular radius of the celestial body can be computed as\[\arcsin\left(\frac{r}{\sqrt{x^2+y^2+z^2}}\right),\]where \(r\) is the mean radius of the body.
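These two formulas translate directly into code. A small sketch of how the report columns could be processed (the function names are mine, not part of the GMAT script):

```python
import numpy as np

def camera_angle_deg(x, y, z):
    """Angle between the camera boresight (+X, away from the Sun)
    and the centre of the celestial body, in degrees."""
    return np.degrees(np.arccos(x / np.sqrt(x**2 + y**2 + z**2)))

def angular_radius_deg(x, y, z, r):
    """Apparent angular radius of a body of mean radius r, in degrees."""
    return np.degrees(np.arcsin(r / np.sqrt(x**2 + y**2 + z**2)))

# Example with the Moon's mean radius (km), coordinates also in km:
# angular_radius_deg(xm, ym, zm, 1737.4)
```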
In my previous post, I wondered what the field of view of the Inory eye camera on DSLWP-B was. Wei Mingchuan BG2BHC answered me on Twitter that the field of view of the camera is 14×18.5 degrees. However, he wasn't clear about whether these figures are measured from the centre of the image to one side or between two opposite sides of the image. I guess that these values are measured from the centre to one side, since otherwise the total field of view of the camera seems too small.
Here I measure the field of view of the camera using the image of Mars and Capricornus taken on August 4, confirming that these numbers are measured from the centre of the image to one side, so the total field of view is 28×37 degrees.
As you may already know, most of the pictures taken so far by the DSLWP-B Inory eye camera have been over-exposed purple things. In order to take more interesting pictures, such as pictures of the Earth and the Moon, we have to plan ahead and know when these objects will be in view of the camera.
The picture below shows a diagram of DSLWP-B. The solar panel is behind the main body, pointing towards the back right. The Inory eye camera points in the opposite direction, towards the front left of the diagram. The camera is located in the front panel of the Amateur radio module, which is one of the pink modules.
In the nominal flight configuration, the solar panel is pointed towards the Sun to gather the maximum amount of sunlight. Therefore, the camera should point opposite to the Sun. This is good, because objects in the field of view of the camera are guaranteed to be well illuminated and it also prevents the Sun from appearing in the image, causing lens flare.
In one of the images taken previously by the camera there is a very bright lens flare, possibly caused by the Sun hitting the camera directly. This seems to indicate that the spacecraft is not always in the nominal flight orientation. As the orientation is very important when doing photo planning, here we assume that the spacecraft is always oriented in the nominal flight configuration, with its camera pointing opposite to the Sun.
We use GMAT with the 20181006 tracking file from dslwp_dev for the orbital calculations. The GMAT script can be downloaded here.
To simulate the orientation of the camera in GMAT, I have used an ObjectReferenced coordinate system. The primary is the Sun, the secondary is DSLWP-B, and the axes are chosen as X = R, Z = N. Therefore, the X axis points away from the Sun, in the direction of the camera. The orbit view is centred on DSLWP-B and points towards +X, so it shows what the camera would see (except for the field of view, which is not simulated).
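For reference, the same frame can be constructed by hand from the spacecraft state relative to the Sun. A sketch of the construction that GMAT performs internally:

```python
import numpy as np

def sun_pointing_frame(r, v):
    """Axes of the ObjectReferenced frame with X = R, Z = N, where
    r and v are the position and velocity of DSLWP-B relative to the Sun."""
    x = r / np.linalg.norm(r)      # away from the Sun: the camera boresight
    z = np.cross(r, v)
    z /= np.linalg.norm(z)         # normal to the orbit plane
    y = np.cross(z, x)             # completes the right-handed triad
    # The rows of this matrix convert vectors from the parent frame
    # into the Sun-pointing frame.
    return np.vstack((x, y, z))
```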
When propagating the orbit in GMAT we see that the Earth passes near the +X axis every lunar month and that the Moon crosses the image every orbit.
The two images below show the dates when the Earth will pass near the +X axis. These dates are good candidates for taking images of the Earth. From the point of view of the camera, the Earth seems to drift slowly from right to left in these images. Therefore, there should be a tolerance of a couple of days as to when the images of the Earth can be taken.
Thus, we see that the next good dates for attempting to take images of the Earth are October 9 and November 7 (plus/minus a couple days of margin).
The Moon is much larger from the point of view of DSLWP-B and we have seen already that it fills the field of view of the camera completely. Thus, even though in this GMAT simulation the Moon crosses the screen every orbit, we need to wait until the path taken by the Moon is near the centre of the screen.
In the image below we see a good moment to take an image of the Moon. From the point of view of the camera, the Moon crosses from top to bottom of the screen every orbit and its path moves slightly to the right every time, taking it closer to the centre as we progress towards December.
Therefore, a good moment to attempt to take an image of the Moon is late October and all of November. However, the time when the picture is taken is critical, because the Moon crosses the screen quickly. It is near the +X axis only for one or two hours. Therefore, starting in late October, there will be a window of a couple hours each orbit (or each day, since the orbital period is close to one day) where photos of the Moon can be attempted.
Judging by one of Wei Mingchuan BG2BHC's latest tweets, he has also been thinking about dates to take good images with the camera. It would be interesting to know if his findings match what I have presented here.
A good question when doing this sort of planning is what the field of view of the camera is. This can probably be estimated from some of the existing images.
As you may already know if you follow me on Twitter, on Saturday September 15 I was in Ávila at IberRadio, the Spanish Amateur radio fair, giving a talk about DSLWP-B. The talk was well received and I had quite a full room. I recorded my talk to upload it to YouTube later, as I did last year.
Since the talk was in Spanish and I know that many international people would be interested in it, I wanted to prepare something in English. As I would be overlaying the slides on the video anyway (the video quality is quite poor and it is impossible to see the projected slides), I thought of also making a version with the slides in English and adding English subtitles.
Doing the subtitles has been a nightmare. I have been working on this non-stop since Sunday morning (as my free time allows). I intended to use YouTube's transcript and auto-sync feature, where you provide a transcript (in the video's original language) and presumably an AI auto-syncs it to the video's audio. I say presumably because in my case the auto-sync was a complete failure, so I had to resync everything by hand.
Also, since I have listened to the video over and over, I have gotten a bit bored of my own voice and noticed some expressions and words that I tend to use a lot. I think this is good, because now I am more conscious of them when talking in public and can try to avoid overusing them.
In any case, here is the final result. First you have the video with English slides:
And also you have the video with Spanish slides:
Remember that you can use either Spanish or English subtitles with either of the videos. Also, I have translation contributions enabled, so feel free to provide subtitles in your own language if you wish.
This Wednesday, a DSLWP-B test was done between 04:00 and 06:00 UTC. During this test, a few stations reported on Twitter that they were able to receive the JT4G signal correctly (they saw the tones on the waterfall) but decoding failed.
It turns out that the cause of the decoding failures is that the DSLWP-B clock is running a few seconds late. Thus, the JT4G transmission starts several seconds after the start of the UTC minute, so the decoding fails, since WSJT-X only searches over a time offset of a few seconds.
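One possible workaround, sketched below (this is not an official fix), is to re-time a recording so that the late transmission lands at the nominal start of the UTC minute before handing it to WSJT-X. The offset value and file names here are hypothetical:

```python
from scipy.io import wavfile

OFFSET_S = 7.0  # hypothetical estimate of how late the DSLWP-B clock runs

rate, samples = wavfile.read('jt4g_recording.wav')  # hypothetical file
# Drop the first OFFSET_S seconds, so that in the trimmed recording the
# transmission starts at the nominal beginning of the UTC minute.
wavfile.write('jt4g_shifted.wav', rate, samples[int(OFFSET_S * rate):])
```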
Ever since simulating DSLWP-B's long-term orbit with GMAT, I have wanted to understand the cause of the periodic perturbations that occur in some Keplerian elements, such as the eccentricity. As a reminder from that post, the eccentricity of DSLWP-B's orbit shows two periodic perturbations (see the figure below). One of them has a period of half a sidereal lunar month, so it should be possible to explain this effect by the rotation of the Moon around the Earth. The other has a period on the order of 8 or 9 months, so explaining it could be more difficult.
In this post I look at how to model the perturbations of the orbit of a satellite in lunar orbit, explaining the behaviour of the long term orbit of DSLWP-B.
Even though the cubesat LilacSat-1 was launched more than a year ago, I haven’t played with it much, since I’ve been busy with many other things. I tested it briefly after it was launched, using its Codec2 downlink, but I hadn’t done anything else since then.
LilacSat-1 has an FM/Codec2 transponder (uplink is analog FM in the 2m band and downlink is Codec2 digital voice in the 70cm band) and a camera that can be remotely commanded to take and downlink JPEG images (see the instructions here). Thus, it offers very interesting possibilities.
Since I have some free time this weekend, I had planned on playing with LilacSat-1 again by using the Codec2 transponder. Wei Mingchuan BG2BHC persuaded me to try the camera as well, so I teamed up with Mike Rupprecht DK3WN to try the camera this morning. Mike would command the camera, since he has a fixed station with more power, and we would collaborate to receive the image. This is important because a single bit error or lost chunk in a JPEG file ruins the image from the point where it happens, and LilacSat-1 doesn't have much protection against these problems. By joining the data received by multiple stations, the chances of receiving the complete image correctly are higher.
The SiriusSats are using 4k8 FSK AX.25 packet radio at 435.570 MHz and 435.670 MHz respectively, using callsigns RS13S and RS14S. The Tanushas transmit at 437.050 MHz. Tanusha-3 normally transmits 1k2 AFSK AX.25 packet radio using the callsign RS8S, but the other day Mike Rupprecht sent me a recording of a transmission from Tanusha-3 that he could not decode.
It turns out that the packet in this recording uses a very peculiar modulation. The modulation is FM, but the data is carried in audio-frequency phase modulation with a deviation of approximately 1 radian. The baudrate is 1200 baud and the frequency of the phase modulation carrier is 2400 Hz. The coding is AX.25 packet radio.
Why this peculiar mode is used in addition to the standard 1k2 packet radio is a mystery. Mike believes that the satellite is somehow faulty, since the pre-recorded audio messages that it transmits are also garbled (see this recording). If this is the case, it would be very interesting to know which particular failure can turn an AFSK transmitter into a phase modulation transmitter.
I have added support to gr-satellites for decoding the Tanusha-3 phase modulation telemetry. To decode the standard 1k2 AFSK telemetry direwolf can be used. The decoder flowgraph can be seen in the figure below.
The FM-demodulated signal comes in from the UDP source. It is first converted down to baseband and then a PLL is used to recover the carrier. The Complex to Arg block recovers the phase, yielding an NRZ signal. This signal is lowpass filtered, and then clock recovery, bit slicing and AX.25 deframing are performed. Note that it is also possible to decode this kind of signal differentially, without doing carrier recovery, since the NRZI encoding used by AX.25 is differential. However, the carrier recovery works really well, because there is a lot of residual carrier and the carrier is at an audio frequency, so it should be very stable in frequency.
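Outside GNU Radio, the heart of this chain can be sketched in a few lines of NumPy. The version below replaces the PLL with a fixed-frequency mixer (so a constant phase offset remains in the NRZ signal) and uses a simple moving-average lowpass; it is a simplified illustration, not the gr-satellites decoder:

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import lfilter

FC = 2400.0     # phase modulation subcarrier (Hz)
BAUDRATE = 1200

rate, audio = wavfile.read('tanusha3_pm.wav')  # FM-demodulated audio (mono)
audio = audio.astype(np.float64)

# Downconvert the 2400 Hz subcarrier to baseband with a fixed-frequency
# mixer (a stand-in for the PLL; a constant phase offset will remain).
t = np.arange(audio.size) / rate
baseband = audio * np.exp(-2j * np.pi * FC * t)

# Lowpass with a one-symbol moving average and take the phase, which
# yields the NRZ waveform, since the deviation is about 1 radian.
sps = int(rate / BAUDRATE)
nrz = np.angle(lfilter(np.ones(sps) / sps, 1.0, baseband))
# Clock recovery, bit slicing and AX.25 deframing would follow from here.
```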
The recording that Mike sent me is in tanusha3_pm.wav. It contains a single AX.25 packet that, when analyzed with direwolf, yields the following.
RS8S>ALL:This is SWSU satellite TANUSHA-3 from Russia, Kursk<0x0d>
------
U frame UI: p/f=0, No layer 3 protocol implemented., length = 68
dest ALL 0 c/r=1 res=3 last=0
source RS8S 0 c/r=0 res=3 last=1
000: 82 98 98 40 40 40 e0 a4 a6 70 a6 40 40 61 03 f0 ...@@@...p.@@a..
010: 54 68 69 73 20 69 73 20 53 57 53 55 20 73 61 74 This is SWSU sat
020: 65 6c 6c 69 74 65 20 54 41 4e 55 53 48 41 2d 33 ellite TANUSHA-3
030: 20 66 72 6f 6d 20 52 75 73 73 69 61 2c 20 4b 75 from Russia, Ku
040: 72 73 6b 0d rsk.
------
The contents of the packet are a message in ASCII. The message is of the same kind as those transmitted in AFSK.
If you've been following my latest posts, you've probably seen that I'm taking great care to decode as much as possible from the SSDV transmissions of DSLWP-B using the recordings made at the Dwingeloo radiotelescope. Since Dwingeloo sees a very high SNR, the reception should be error free, with no bit errors even before Turbo decoding.
However, there are some occasional glitches that corrupt a packet, thus losing an SSDV frame. Some of these glitches have been attributed to a frequency jump in the DSLWP-B transmitter. This jump has to do with the onboard TCXO, which compensates frequency digitally, in discrete steps. When the frequency jump happens, the decoder’s PLL loses lock and this corrupts the packet that is being received (note that a carrier phase slip will render the packet undecodable unless it happens very near the end of the packet).
There are other glitches where the gr-dslwp decoder is at fault. The ones that I've identified have to do, in one way or another, with the detection of the ASM (attached sync marker). Here I describe some of these problems and my proposed solutions.