
LoRaWAN: Fine tuning timing for delays and receive windows #7191


Merged: 5 commits, Jun 19, 2018

Conversation

hasnainvirk
Contributor

Description

This PR improves the timing behaviour for delays and receive windows.
In previous incarnations of the code base we got away with the imprecision because nothing else was using the main event queue, but problems can surface if the application performs expensive operations on that queue. We now take a timestamp at the moment the TX interrupt fires, after which the delays and the opening of the receive windows are due. We then subtract the time that passed between the TX interrupt and our processing of it, which gives us more precise timing overall.
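The compensation described above can be sketched as follows. This is a minimal illustration, not the actual mbed-os code; the function and parameter names (`corrected_rx_delay`, `tx_timestamp_ms`, etc.) are assumptions chosen for clarity.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch: compensate the RX window timer for the time
// already spent between the TX-done interrupt and our processing of it.
// All identifiers are illustrative, not the real mbed-os names.
uint32_t corrected_rx_delay(uint32_t configured_delay_ms,
                            uint32_t tx_timestamp_ms,
                            uint32_t now_ms)
{
    // Time lost to event-queue latency since the TX interrupt fired.
    uint32_t elapsed = now_ms - tx_timestamp_ms;
    if (elapsed >= configured_delay_ms) {
        return 0; // already late; open the receive window immediately
    }
    return configured_delay_ms - elapsed;
}
```

For example, with a 1000 ms RX1 delay and 30 ms of processing latency, the delay timer is armed for 970 ms instead of a full 1000 ms, so the window opens on time relative to the actual end of transmission.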

Automatic uplink used to be triggered directly within the reception sequence, which made the call stack very deep, especially in debug builds. To counter that, we now queue an event for the automatic uplink rather than performing it directly.
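The deferral pattern can be sketched like this. The `EventQueue` below is a toy stand-in for the real mbed event queue, and the handler names are hypothetical; it only illustrates why posting a callback keeps the RX call stack shallow.

```cpp
#include <cassert>
#include <functional>
#include <queue>

// Toy stand-in for the mbed event queue: callbacks posted with call()
// run later from dispatch(), on a fresh call stack.
struct EventQueue {
    std::queue<std::function<void()>> events;
    void call(std::function<void()> f) { events.push(std::move(f)); }
    void dispatch() {
        while (!events.empty()) {
            events.front()();
            events.pop();
        }
    }
};

int uplinks_sent = 0;
void send_automatic_uplink() { ++uplinks_sent; }

// RX handler: returns immediately instead of transmitting inline,
// so the reception call stack does not grow by the whole TX path.
void on_reception_complete(EventQueue &queue) {
    queue.call(send_automatic_uplink);
}
```

The uplink then happens from the dispatch loop, giving a respite between RX and TX instead of nesting the transmit path inside the receive path.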

Two new APIs are introduced in the LoRaMac class (an internal class) which provide the current timing and receive-slot information to the controller, i.e., LoRaWANStack.
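A rough shape of those two accessors, with assumed names and types rather than the exact mbed-os signatures:

```cpp
#include <cassert>
#include <cstdint>

// Illustrative sketch of the two accessors added to LoRaMac; the names,
// types, and the set_state() test hook are assumptions for this example.
enum rx_slot_t { RX_SLOT_WIN_1, RX_SLOT_WIN_2, RX_SLOT_NONE };

class LoRaMacSketch {
public:
    // Queried by the stack to compute precise delay-timer values.
    uint32_t get_current_time() const { return _now_ms; }
    // Tells the stack which receive window is currently active.
    rx_slot_t get_current_slot() const { return _slot; }

    // Stand-in for the real MAC state machine updating its state.
    void set_state(uint32_t now_ms, rx_slot_t slot) {
        _now_ms = now_ms;
        _slot = slot;
    }

private:
    uint32_t _now_ms = 0;
    rx_slot_t _slot = RX_SLOT_NONE;
};
```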

Target Release Version : 5.9.1

Pull request type

[X] Fix
[ ] Refactor
[ ] New target
[ ] Feature
[ ] Breaking change

RX1 and RX2 delays needed to be more precise, and the aggregate TX time was
drifting ever so slightly because of the timing difference between the actual
TX interrupt and our processing of that interrupt.

We now take a timestamp at the TX interrupt and compute a time diff when
instantiating the delay timers. The timestamp is also used to update the
aggregate TX time.

Two new methods are introduced in the LoRaMac class which provide current
timing and current receive slot. These functions are used by LoRaWANStack
for its processing.
@hasnainvirk
Contributor Author

@kjbracey-arm @AnttiKauppila @cmonr @0xc0170 Please review.

@0xc0170 0xc0170 requested a review from a team June 12, 2018 11:17
AnttiKauppila previously approved these changes Jun 12, 2018
@cmonr
Contributor

cmonr commented Jun 13, 2018

/morph build

@mbed-ci

mbed-ci commented Jun 13, 2018

Build : SUCCESS

Build number : 2349
Build artifacts/logs : http://mbed-os.s3-website-eu-west-1.amazonaws.com/?prefix=builds/7191/

Triggering tests

/morph test
/morph uvisor-test
/morph export-build
/morph mbed2-build

@mbed-ci

mbed-ci commented Jun 13, 2018

Hasnain Virk added 4 commits June 14, 2018 14:52
If the automatic uplink is sent directly, the call stack grows larger than 1K,
which may cause serious problems in debug builds. To give a respite between
RX and TX, we queue an event for the automatic uplink rather than performing
it directly.
We must check that the frequency sent by the network server is a valid value.
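A minimal sketch of such a check, assuming EU868 band edges purely for illustration; in mbed-os the actual validation is per-PHY and per-band, and the function name here is hypothetical:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch: reject an RX frequency from the network server
// that falls outside the supported band before retuning the radio.
// The EU868 limits below are assumptions for this example only.
bool is_valid_rx_frequency(uint32_t freq_hz)
{
    const uint32_t band_min_hz = 863000000; // assumed lower band edge
    const uint32_t band_max_hz = 870000000; // assumed upper band edge
    return freq_hz >= band_min_hz && freq_hz <= band_max_hz;
}
```

Guarding like this prevents a malformed or zeroed frequency field in a MAC command from being programmed into the radio.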
@hasnainvirk
Contributor Author

@kjbracey-arm @AnttiKauppila Please review.

@cmonr cmonr requested a review from kjbracey June 14, 2018 13:01
@mbed-ci

mbed-ci commented Jun 15, 2018

@cmonr
Contributor

cmonr commented Jun 19, 2018

/morph build

@mbed-ci

mbed-ci commented Jun 19, 2018

Build : SUCCESS

Build number : 2378
Build artifacts/logs : http://mbed-os.s3-website-eu-west-1.amazonaws.com/?prefix=builds/7191/

Triggering tests

/morph test
/morph uvisor-test
/morph export-build
/morph mbed2-build

@mbed-ci

mbed-ci commented Jun 19, 2018

@mbed-ci

mbed-ci commented Jun 19, 2018

@hasnainvirk
Contributor Author

hasnainvirk commented Jun 19, 2018

@cmonr @kjbracey-arm Please review and merge.

@cmonr cmonr merged commit ba5b5a3 into ARMmbed:master Jun 19, 2018

6 participants