Knowledge Base

Timing difference of RxDone (DIO0) interrupt


I am a developer working with the SX1276 family and I would like to raise a problem that has been bothering me for some time. I am investigating the timing of the RxDone interrupt raised on DIO0 of the chip, and I would like to understand why it fluctuates so much between different receivers.

My experimental setup is as follows: one SX1276 is configured as a transmitter, with its RF output connected to a directional coupler via a coax cable. The low-loss (through) output of the coupler is connected to the RF input of a second SX1276 configured as a receiver, and the coupled output, with approximately 30 dB of loss, is connected to the RF input of a third SX1276, also configured as a receiver. I am monitoring the DIO0 RxDone interrupt of both receivers with an oscilloscope, and I do not understand the large, fluctuating timing difference between them.
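For reference, a quick sanity check of the numbers (the 30 dB figure above is approximate): a 30 dB coupled-port loss corresponds to a power ratio of roughly 1000:1, so the second receiver sees a much weaker, but still easily decodable, copy of the signal.

```python
# Convert the coupler's coupled-port loss (dB) to a linear power ratio.
# The 30 dB value is the approximate figure from the setup described above.
def db_to_power_ratio(loss_db: float) -> float:
    """Return the linear power attenuation factor for a loss given in dB."""
    return 10 ** (loss_db / 10)

print(db_to_power_ratio(30))  # 30 dB -> a factor of 1000 in power
```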

I am attaching an oscilloscope screenshot from one of the measurements, where source 3 is the DIO0 of the first receiver and source 4 is the DIO0 of the second receiver. Can you please tell me the reason for this difference of around 1.5 microseconds? How is the timing sampled on the RF front end of the SX1276? Is there a way to overcome this difference?
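As a rough plausibility check (assuming a typical LoRa configuration of 125 kHz bandwidth and SF7, neither of which is stated above): the LoRa chip period is 1/BW = 8 µs, so if the demodulator latches RxDone on an internal chip- or sample-clock tick, a skew of a few microseconds between two independently clocked receivers would be of the right order of magnitude. A minimal sketch of that arithmetic:

```python
# Hypothetical LoRa timing arithmetic for the SX1276.
# Assumptions (NOT from the post): bandwidth BW = 125 kHz, spreading factor SF = 7.
BW_HZ = 125_000
SF = 7

chip_period_us = 1e6 / BW_HZ                   # one chip at 125 kHz -> 8 us
symbol_period_us = (2 ** SF) * chip_period_us  # one SF7 symbol -> 1024 us

print(chip_period_us)    # 8.0
print(symbol_period_us)  # 1024.0
```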

To pre-empt some likely questions:
1. I have tried several different pieces of hardware to rule out a defective unit.
2. It is not always the same receiver that fires first; the order fluctuates randomly, as if the sampling occurs on a duty cycle rather than continuously.
3. I would expect this outcome in a real-world scenario, where, due to the low bandwidth of LoRa, the direct path cannot be resolved, or where multipath or shadowing influences the receiver interrupt. In this case, however, the RF lines are connected directly with coax cables, which should eliminate these effects.
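To back up point 3 with numbers: propagation in coax runs at roughly 5 ns per metre (assuming a typical velocity factor of about 0.66, and a hypothetical 2 m length mismatch between the two RF paths), so cabling accounts for at most tens of nanoseconds, three orders of magnitude below the observed 1.5 µs difference.

```python
# Propagation delay through coax. Assumptions (NOT from the post):
# velocity factor 0.66, and a 2 m length mismatch between the two RF paths.
C_M_PER_S = 299_792_458     # speed of light in vacuum
VELOCITY_FACTOR = 0.66      # typical for solid-PE coax

def coax_delay_ns(length_m: float) -> float:
    """One-way propagation delay through `length_m` metres of coax, in ns."""
    return length_m / (C_M_PER_S * VELOCITY_FACTOR) * 1e9

print(round(coax_delay_ns(2.0), 1))  # ~10 ns for a 2 m mismatch
```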

Please let me know if you can help me with this topic. Thanks.

Best regards,
[Attachment: oscilloscope screenshot of the two DIO0 signals]