Fix issue when TDBStore has varying erase sizes between areas. (Backport) #12653


Merged: 1 commit merged into ARMmbed:mbed-os-5.15 on Apr 2, 2020

Conversation

@jarlamsa (Contributor)

Summary of changes

This is a backport of #12558 to mbed-os-5.15

In some cases every erase unit within area 0 has the same size, yet that size
still differs from the erase units in area 1. Remove the flag for varying
erase sizes and instead query the flash for the erase size of the unit
currently being handled.
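
For illustration, a minimal sketch of the approach (assuming an mbed::BlockDevice underneath, whose get_erase_size(addr) overload reports the erase size at a given address; the helper name and parameters are invented for this example and are not the actual TDBStore code):

```cpp
// Sketch only: query the erase size per address instead of relying on a
// cached "varying erase sizes" flag. See #12558 / this backport for the
// real change.
#include "BlockDevice.h"

using namespace mbed;

// Round an offset inside one TDBStore area up to the next erase-unit
// boundary. The erase size is taken from the unit that contains the
// address, so area 0 and area 1 may legitimately report different sizes.
static bd_size_t align_up_to_erase_unit(BlockDevice *bd, bd_addr_t area_start, bd_size_t offset)
{
    bd_size_t erase_size = bd->get_erase_size(area_start + offset);
    return ((offset + erase_size - 1) / erase_size) * erase_size;
}
```

Per-address lookups like this are what allow the flag to be dropped entirely: each alignment or erase decision asks the underlying flash for the geometry at that exact location.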

Impact of changes

Migration actions required

Documentation

None


Pull request type

[X] Patch update (Bug fix / Target update / Docs update / Test update / Refactor)
[ ] Feature update (New feature / Functionality change / New API)
[ ] Major update (Breaking change E.g. Return code change / API behaviour change)

Test results

[ ] No Tests required for this change (E.g. docs-only update)
[X] Covered by existing mbed-os tests (Greentea or Unittest)
[ ] Tests / results supplied as part of this PR

Reviewers

@ARMmbed/mbed-os-storage
@soleilplanet
@adbridge


@adbridge (Contributor)

This has already been approved for 5.15.2.

mergify bot added the needs: CI label (Mar 19, 2020)
@0xc0170 (Contributor) commented Mar 19, 2020

CI started

Note: the dynamic memory test will most likely fail; it will be fixed today

@mbed-ci commented Mar 19, 2020

Test run: FAILED

Summary: 2 of 7 test jobs failed
Build number: 1
Build artifacts

Failed test jobs:

  • jenkins-ci/mbed-os-ci_dynamic-memory-usage
  • jenkins-ci/mbed-os-ci_greentea-test

mergify bot added the needs: work label and removed the needs: CI label (Mar 19, 2020)
@jarlamsa (Contributor, Author)

@0xc0170 could you retrigger the CI if the dynamic memory tests are now stable?

@0xc0170 (Contributor) commented Mar 25, 2020

CI restarted

@mbed-ci commented Mar 25, 2020

Test run: FAILED

Summary: 1 of 6 test jobs failed
Build number: 3
Build artifacts

Failed test jobs:

  • jenkins-ci/mbed-os-ci_greentea-test

@0xc0170 (Contributor) commented Mar 26, 2020

I restarted the tests; there were USB failures

@adbridge (Contributor)

Hmm, still seeing the USB failures...

@adbridge (Contributor)

This needs a fix to how the USB tests are run (@jamesbeyond will raise a fix shortly)

@0xc0170 (Contributor) commented Mar 31, 2020

Test restarted

@0xc0170 (Contributor) commented Apr 1, 2020

Started a new CI job

@mbed-ci commented Apr 1, 2020

Test run: SUCCESS

Summary: 9 of 9 test jobs passed
Build number: 2
Build artifacts

0xc0170 merged commit 41273e3 into ARMmbed:mbed-os-5.15 on Apr 2, 2020
mergify bot removed the ready for merge label (Apr 2, 2020)