In my previous post, I wondered what the field of view of the Inory eye camera on DSLWP-B was. Wei Mingchuan BG2BHC has answered me on Twitter that the field of view of the camera is 14×18.5 degrees. However, he wasn’t clear about whether these figures are measured from the centre of the image to one side or between two opposite sides of the image. I guessed that these values are measured from the centre to one side, since otherwise the total field of view of the camera would seem too small.
Here I measure the field of view of the camera using the image of Mars and Capricornus taken on August 4, confirming that these numbers are measured from the centre of the image to one side, so the total field of view is 28×37 degrees.
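The kind of calculation involved is simple: measure the pixel distance between two identified objects, divide their known angular separation by it to obtain the plate scale, and multiply by the image dimensions. The sketch below only illustrates the procedure; the pixel coordinates are made up and the sky coordinates are rough values, so the numbers it prints are not the actual measurement.

import numpy as np

# Hypothetical pixel coordinates of Mars and a reference star on the image.
# These values are made up for illustration; the real ones come from
# identifying the objects on the photo.
mars_px = np.array([412.0, 187.0])
star_px = np.array([133.0, 295.0])

# Rough (RA, Dec) in degrees for early August 2018; in practice Mars comes
# from an ephemeris and the star from a catalogue.
mars_radec = np.radians([304.6, -25.9])
star_radec = np.radians([305.3, -14.8])   # beta Capricorni, approximately

def angular_separation(a, b):
    # Angular separation in degrees between two (RA, Dec) points given in
    # radians, using the Vincenty formula for numerical robustness.
    ra1, dec1 = a
    ra2, dec2 = b
    dra = ra2 - ra1
    num = np.hypot(np.cos(dec2) * np.sin(dra),
                   np.cos(dec1) * np.sin(dec2)
                   - np.sin(dec1) * np.cos(dec2) * np.cos(dra))
    den = np.sin(dec1) * np.sin(dec2) + np.cos(dec1) * np.cos(dec2) * np.cos(dra)
    return np.degrees(np.arctan2(num, den))

sep_deg = angular_separation(mars_radec, star_radec)
sep_px = np.linalg.norm(mars_px - star_px)
plate_scale = sep_deg / sep_px                   # degrees per pixel

# Total field of view, assuming a 640x480 image and neglecting lens distortion
fov = plate_scale * np.array([640, 480])
print('plate scale: {:.4f} deg/px, FoV: {:.1f} x {:.1f} deg'.format(plate_scale, *fov))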
As you may already know, most of the pictures taken so far by the DSLWP-B Inory eye camera have been over-exposed purple things. In order to take more interesting pictures, such as pictures of the Earth and the Moon, we have to plan ahead and know when these objects will be in view of the camera.
The picture below shows a diagram of DSLWP-B. The solar panel is behind the main body, pointing towards the back right. The Inory eye camera points in the opposite direction, towards the front left of the diagram. The camera is located in the front panel of the Amateur radio module, which is one of the pink modules.
In the nominal flight configuration, the solar panel points towards the Sun to gather the maximum amount of sunlight. Therefore, the camera should point opposite to the Sun. This is good, because objects in the field of view of the camera are guaranteed to be well illuminated, and it also prevents the Sun from appearing in the image and causing lens flare.
In one of the images taken previously by the camera there is a very bright lens flare, possibly caused by the Sun hitting the camera directly. This seems to indicate that the spacecraft is not always in the nominal flight orientation. As the orientation is very important when doing photo planning, here we assume that the spacecraft is always oriented in the nominal flight configuration, with its camera pointing opposite to the Sun.
We use GMAT with the 20181006 tracking file from dslwp_dev for the orbital calculations. The GMAT script can be downloaded here.
To simulate the orientation of the camera in GMAT, I have used an ObjectReferenced coordinate system. The primary is the Sun, the secondary is DSLWP-B, and the axes are chosen as X = R, Z = N. Therefore, the X axis points away from the Sun, in the direction of the camera. The orbit view is centred on DSLWP-B and points towards +X, so it shows what the camera would see (except for the field of view, which is not simulated).
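In case it helps to make the axes convention concrete, the numpy sketch below builds the same X = R, Z = N frame from a Sun-centred state vector of DSLWP-B. The position and velocity values are placeholders; in practice they would come from the GMAT propagation.

import numpy as np

# Sun-centred position (km) and velocity (km/s) of DSLWP-B in an inertial
# frame. These numbers are placeholders only.
r = np.array([1.2e8, -7.5e7, 3.2e6])
v = np.array([5.0, 28.0, 1.0])

x = r / np.linalg.norm(r)       # X = R: away from the Sun, along the camera boresight
z = np.cross(r, v)              # Z = N: normal of the orbit around the Sun
z /= np.linalg.norm(z)
y = np.cross(z, x)              # completes the right-handed triad

# Rows of this matrix are the camera frame axes expressed in the inertial frame
R_frame = np.vstack((x, y, z))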
When propagating the orbit in GMAT we see that the Earth passes near the +X axis every Lunar month and that the Moon crosses the image every orbit.
The two images below show the dates when the Earth will pass near the +X axis. These dates are good candidates for taking images of the Earth. From the point of view of the camera, the Earth drifts slowly from right to left in these images, so there is a tolerance of a couple of days in when the images of the Earth can be taken.
Thus, we see that the next good dates for attempting to take images of the Earth are October 9 and November 7 (plus/minus a couple days of margin).
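The same search can also be done numerically instead of by inspecting the orbit view. The sketch below computes the angle between the camera boresight (the anti-Sun direction as seen from DSLWP-B) and the direction towards the Earth for a series of epochs; the position arrays are assumed to come from the GMAT propagation or any other ephemeris source.

import numpy as np

def boresight_angle(r_dslwp, r_sun, r_target):
    # Angle in degrees between the camera boresight (anti-Sun direction as
    # seen from DSLWP-B) and the direction towards a target such as the Earth.
    # All inputs are (N, 3) arrays of positions in a common inertial frame.
    boresight = r_dslwp - r_sun          # points away from the Sun
    to_target = r_target - r_dslwp
    cosang = np.sum(boresight * to_target, axis=-1) / (
        np.linalg.norm(boresight, axis=-1) * np.linalg.norm(to_target, axis=-1))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Usage (epochs and position arrays taken from the propagation):
# angles = boresight_angle(r_dslwp, r_sun, r_earth)
# good_epochs = epochs[angles < 14.0]   # Earth within the narrow axis of the FoV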
The Moon is much larger from the point of view of DSLWP-B and we have seen already that it fills the field of view of the camera completely. Thus, even though in this GMAT simulation the Moon crosses the screen every orbit, we need to wait until the path taken by the Moon is near the centre of the screen.
In the image below we see a good moment to take an image of the Moon. From the point of view of the camera, the Moon crosses from top to bottom of the screen every orbit and its path moves slightly to the right every time, taking it closer to the centre as we progress towards December.
Therefore, a good moment to attempt to take an image of the Moon is late October and all of November. However, the time when the picture is taken is critical, because the Moon crosses the screen quickly. It is near the +X axis only for one or two hours. Therefore, starting in late October, there will be a window of a couple hours each orbit (or each day, since the orbital period is close to one day) where photos of the Moon can be attempted.
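Assuming the boresight angle to the Moon has been computed on a fine grid of epochs, as in the sketch above, these short windows can be found by grouping consecutive epochs that stay within the field of view. A possible helper (the 14 degree default is the smaller half-width of the field of view):

import numpy as np

def observation_windows(epochs, angles, max_angle=14.0):
    # Group consecutive epochs whose boresight angle stays below max_angle
    # into (start, end) windows. epochs is a sorted sequence of times and
    # angles the corresponding boresight angles in degrees.
    windows = []
    start = prev = None
    for t, good in zip(epochs, angles < max_angle):
        if good and start is None:
            start = t
        elif not good and start is not None:
            windows.append((start, prev))
            start = None
        prev = t
    if start is not None:
        windows.append((start, prev))
    return windows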
Judging by one of Wei Mingchuan BG2BHC’s latest tweets, he has also been thinking about good dates to take images with the camera. It would be interesting to know whether his findings match what I have presented here.
A good question when doing this sort of planning is what the field of view of the camera is. This can probably be estimated from some of the existing images.
As you may already know if you follow me on Twitter, on Saturday September 15 I was in Ávila at IberRadio, the Spanish Amateur Radio fair, giving a talk about DSLWP-B. The talk was well received and I had quite a full room. I recorded my talk to upload it later to YouTube, as I did last year.
Since the talk was in Spanish and I know that many international followers would be interested in it, I wanted to prepare something in English. As I would be overlaying the slides on the video anyway (the video quality is quite poor and it is impossible to see the projected slides), I thought of also overlaying the slides in English and adding English subtitles.
Doing the subtitles has been a nightmare. I have been working on this non-stop since Sunday morning (as my free time allows). I intended to use YouTube’s transcript and auto-sync feature, where you provide a transcript in the video’s original language and presumably an AI auto-syncs it to the video’s audio. I say presumably because in my case the auto-sync was a complete failure, so I had to resync everything by hand.
Also, since I have listened to the video over and over, I have gotten a bit bored of my own voice and noted some expressions and words that I tend to use a lot. I think this is good, because now I am more conscious when talking in public and can try to avoid using these expressions so much.
In any case, here is the final result. First you have the video with English slides:
And also you have the video with Spanish slides:
Remember that you can use either Spanish or English subtitles with either of the videos. Also, I have enabled translation contributions, so feel free to contribute subtitles in your own language if you wish.
This Wednesday, a DSLWP-B test was done between 04:00 and 06:00 UTC. During this test, a few stations reported on Twitter that they were able to receive the JT4G signal correctly (they saw the tones on the waterfall) but decoding failed.
It turns out that the cause of the decoding failures is that the DSLWP-B clock is running a few seconds late. Thus, the JT4G transmission starts several seconds after the start of the UTC minute and the decode fails, since WSJT-X only searches over a time offset of a few seconds.
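A possible workaround until the onboard clock is adjusted is to shift the recorded audio before running it through WSJT-X, so that the delayed transmission lines up with the start of the UTC minute. A rough sketch using the soundfile Python package (the offset is only an example value; the actual delay would have to be estimated from the recordings):

import soundfile as sf

offset_s = 7.0           # estimated DSLWP-B clock delay in seconds (example value only)
infile = 'dslwp_jt4g.wav'
outfile = 'dslwp_jt4g_shifted.wav'

data, samplerate = sf.read(infile)
# Drop the first offset_s seconds, so that the JT4G transmission appears to
# start at the beginning of the UTC minute as far as WSJT-X is concerned.
data = data[int(offset_s * samplerate):]
sf.write(outfile, data, samplerate)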
Ever since simulating DSLWP-B’s long-term orbit with GMAT, I have wanted to understand the cause of the periodic perturbations that appear in some Keplerian elements, such as the eccentricity. As a reminder from that post, the eccentricity of DSLWP-B’s orbit shows two periodic perturbations (see the figure below). One of them has a period of half a sidereal lunar month, so it should be possible to explain this effect by the rotation of the Moon around the Earth. The other has a period on the order of 8 or 9 months, so explaining it could be more difficult.
In this post I look at how to model the perturbations of the orbit of a satellite in lunar orbit, explaining the behaviour of the long term orbit of DSLWP-B.
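As a first step, the periods of these perturbations can be estimated from the spectrum of the eccentricity time series produced by GMAT. A quick sketch with scipy (the file name and the one-sample-per-day spacing are assumptions):

import numpy as np
from scipy.signal import periodogram

# Eccentricity time series sampled by GMAT once per day (file name and
# sampling interval are assumptions)
ecc = np.loadtxt('dslwp_eccentricity.txt')

f, pxx = periodogram(ecc - np.mean(ecc), fs=1.0)   # fs in samples per day
# Pick the strongest spectral lines and convert them to periods in days
strongest = f[np.argsort(pxx)[::-1][:5]]
print('dominant periods (days):', 1.0 / strongest[strongest > 0])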
Even though the cubesat LilacSat-1 was launched more than a year ago, I haven’t played with it much, since I’ve been busy with many other things. I tested it briefly after it was launched, using its Codec2 downlink, but I hadn’t done anything else since then.
LilacSat-1 has an FM/Codec2 transponder (uplink is analog FM in the 2m band and downlink is Codec2 digital voice in the 70cm band) and a camera that can be remotely commanded to take and downlink JPEG images (see the instructions here). Thus, it offers very interesting possibilities.
Since I have some free time this weekend, I had planned to play again with LilacSat-1 by using the Codec2 transponder. Wei Mingchuan BG2BHC persuaded me to try the camera as well, so I teamed up with Mike Rupprecht DK3WN to try the camera this morning. Mike would command the camera, since he has a fixed station with more power, and we would collaborate to receive the image. This is important because a single bit error or lost chunk in a JPEG file ruins the image from the point where it happens, and LilacSat-1 doesn’t have much protection against these problems. By combining the data received by multiple stations, the chances of receiving the complete image correctly are higher.
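The merging itself can be done very simply: any chunk missing from one station’s reception is filled in with the same chunk as received by another station. The toy sketch below only illustrates the idea, assuming that each correctly received chunk is keyed by its byte offset within the JPEG file; the real LilacSat-1 downlink format is not modelled here, and the station variable names are hypothetical.

def merge_receptions(*receptions):
    # Each reception is a dict mapping byte offset -> chunk (bytes) for the
    # chunks that a station decoded correctly. Chunks from later receptions
    # only fill the gaps left by earlier ones.
    merged = {}
    for rx in receptions:
        for offset, chunk in rx.items():
            merged.setdefault(offset, chunk)
    return merged

def assemble(merged, chunk_size, total_size):
    # Concatenate the chunks in order, padding any still-missing chunk with
    # zeros so that the gap is at least visible when opening the JPEG.
    out = bytearray()
    for offset in range(0, total_size, chunk_size):
        out += merged.get(offset, bytes(chunk_size))[:total_size - offset]
    return bytes(out)

# image = assemble(merge_receptions(rx_station_a, rx_station_b), 128, 64000)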
A couple of months ago, Andrés Calleja EB4FJV installed a 2.3GHz beacon at his home in Colmenar Viejo, Madrid. The beacon has 2W of power, radiates with an omnidirectional antenna in vertical polarization, and transmits a tone and CW identification on 2320.865MHz.
Since Colmenar Viejo is only 10km away from Tres Cantos, where I live, I can receive the beacon with a very strong signal from home. The Madrid-Barajas airport is also quite near (15km to the threshold of runway 18R) and several departure and approach aircraft routes pass nearby, particularly those flying over the Colmenar VOR. Therefore, it is quite easy to see reflections off aircraft when listening to the beacon.
On July 8 I did a recording of the beacon from 10:04 to 11:03 UTC from the countryside just outside Tres Cantos. In this post I will examine the aircraft reflections seen in the recording and match them with ADS-B aircraft position and velocity data obtained from adsbexchange.com. This will show the locations and trajectories which produce reflections strong enough to be detected.
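Matching a reflection with a particular aircraft amounts to predicting the Doppler of the reflected signal from the ADS-B data. Since the beacon and the receiver are static, the Doppler is given by the rate of change of the total path transmitter-aircraft-receiver. A sketch of this computation, assuming that the aircraft position and velocity have already been converted to ECEF coordinates:

import numpy as np

C = 299792458.0          # speed of light (m/s)
F_BEACON = 2320.865e6    # beacon frequency (Hz)

def bistatic_doppler(r_tx, r_rx, r_ac, v_ac):
    # Doppler shift in Hz of the beacon signal reflected off an aircraft.
    # r_tx, r_rx: ECEF positions (m) of the beacon and the receiver (static).
    # r_ac, v_ac: ECEF position (m) and velocity (m/s) of the aircraft.
    u_tx = (r_ac - r_tx) / np.linalg.norm(r_ac - r_tx)
    u_rx = (r_ac - r_rx) / np.linalg.norm(r_ac - r_rx)
    # Rate of change of the total path beacon -> aircraft -> receiver
    range_rate = np.dot(v_ac, u_tx) + np.dot(v_ac, u_rx)
    return -range_rate / C * F_BEACON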
In one of my latest posts I analysed the meteor scatter pings from GRAVES in a recording I did on August 11 (see that post for more details about the recording). The recording covered the frequency range from 142.5MHz to 146.5MHz and was 1 hour and 34 minutes long. Here I look at the Amateur stations that can be heard in the recording. Note that Amateur meteor scatter activity increases considerably during large meteor showers, due to the higher probability of making contacts.
I have a DF9NP 10MHz GPSDO that is based on a u-blox LEA-5S GPS receiver. Essentially, the LEA-5S outputs an 800Hz signal that is used to discipline a 10MHz VCTCXO with a PLL. The LEA-5S doesn’t have persistent storage, so an I2C EEPROM is used to store the settings across reboots.
Lately, reading the settings from the EEPROM seemed to have failed: the u-blox was always booting with the default settings. This prevents the GPSDO from working, since the default for the timepulse signal is 1Hz instead of 800Hz. Here is a summary of my troubleshooting session and the weird repair that I did.
The SiriusSats use 4k8 FSK AX.25 packet radio at 435.570MHz and 435.670MHz, using the callsigns RS13S and RS14S respectively. The Tanushas transmit at 437.050MHz. Tanusha-3 normally transmits 1k2 AFSK AX.25 packet radio using the callsign RS8S, but the other day Mike Rupprecht sent me a recording of a transmission from Tanusha-3 that he could not decode.
It turns out that the packet in this recording uses a very peculiar modulation. The modulation is FM, but the data is carried in audio frequency phase modulation with a deviation of approximately 1 radian. The baudrate is 1200baud and the frequency for the phase modulation carrier is 2400Hz. The coding is AX.25 packet radio.
Why this peculiar mode is used in addition to the standard 1k2 packet radio is a mystery. Mike believes that the satellite is somehow faulty, since the pre-recorded audio messages that it transmits are also garbled (see this recording). If this is the case, it would be very interesting to know which particular failure can turn an AFSK transmitter into a phase modulation transmitter.
I have added support to gr-satellites for decoding the Tanusha-3 phase modulation telemetry. To decode the standard 1k2 AFSK telemetry, direwolf can be used. The decoder flowgraph can be seen in the figure below.
The FM-demodulated signal comes in through the UDP source. It is first converted down to baseband and then a PLL is used to recover the carrier. The Complex to Arg block recovers the phase, yielding an NRZ signal. This signal is lowpass filtered, and then clock recovery, bit slicing and AX.25 deframing are performed. Note that it is also possible to decode this kind of signal differentially, without doing carrier recovery, since the NRZI encoding used by AX.25 is differential. However, the carrier recovery works really well, because there is a lot of residual carrier and, being an audio frequency carrier, it should be very stable in frequency.
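The same idea can be illustrated offline with numpy and scipy, leaving aside the PLL and the AX.25 deframer: mix the 2400Hz carrier down to baseband and demodulate the phase differentially at 1200baud. The sketch below is only an illustration of the demodulation chain (it uses a fixed symbol clock instead of proper clock recovery, unlike the gr-satellites decoder) and assumes the FM-demodulated audio is in the WAV file mentioned below.

import numpy as np
import scipy.signal
import soundfile as sf

x, fs = sf.read('tanusha3_pm.wav')      # FM-demodulated audio (assumed mono)

fc, baud = 2400.0, 1200.0
t = np.arange(len(x)) / fs

# Mix the 2400Hz audio carrier down to baseband and lowpass filter
bb = x * np.exp(-2j * np.pi * fc * t)
taps = scipy.signal.firwin(129, baud, fs=fs)
bb = scipy.signal.lfilter(taps, 1, bb)

# Crude fixed symbol clock: one complex sample per symbol, no timing recovery
idx = np.arange(0, len(bb), fs / baud).astype(int)
sym = bb[idx]

# Differential demodulation: the phase step between consecutive symbols is
# about 2 radians when there is a transition and about 0 when there is none.
# NRZI maps a transition to a 0 bit and no transition to a 1 bit.
dphi = np.angle(sym[1:] * np.conj(sym[:-1]))
bits = (np.abs(dphi) < 1.0).astype(int)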
The recording that Mike sent me is in tanusha3_pm.wav. It contains a single AX.25 packet that, when analysed in direwolf, yields the following.
RS8S>ALL:This is SWSU satellite TANUSHA-3 from Russia, Kursk<0x0d>
------
U frame UI: p/f=0, No layer 3 protocol implemented., length = 68
dest ALL 0 c/r=1 res=3 last=0
source RS8S 0 c/r=0 res=3 last=1
000: 82 98 98 40 40 40 e0 a4 a6 70 a6 40 40 61 03 f0 ...@@@...p.@@a..
010: 54 68 69 73 20 69 73 20 53 57 53 55 20 73 61 74 This is SWSU sat
020: 65 6c 6c 69 74 65 20 54 41 4e 55 53 48 41 2d 33 ellite TANUSHA-3
030: 20 66 72 6f 6d 20 52 75 73 73 69 61 2c 20 4b 75 from Russia, Ku
040: 72 73 6b 0d rsk.
------
The contents of the packet are a message in ASCII. The message is of the same kind as those transmitted in AFSK.