
Prevent div-by-0 when producing test execution summary #21

Closed

Conversation

briancroom
Contributor

This is a fix for SR-334, in which a NaN value would appear in the final summary line at the end of test execution:

Total executed 0 tests, with 0 failures (0 unexpected) in -nan (0.0) seconds

This is a simple fix for the issue, but I have broader doubts about how the time values being printed are calculated. The current code prints ... in {average test duration} ({total test duration}) seconds; however, the two figures printed by the original XCTest seem to be something closer to ... in {total time spent executing tests} ({total elapsed time since XCTest began running}) seconds, or similar. Does anyone have more insight into what XCTest provides here?
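For reference, the guard amounts to something like this minimal sketch (the names and exact summary format here are illustrative, not the actual swift-corelibs-xctest code):

```swift
// Minimal sketch of the div-by-0 guard; names are illustrative, not the
// actual swift-corelibs-xctest implementation.
let testCount = 0
let totalDuration = 0.0

// 0.0 / 0.0 is NaN (printed as "-nan"), so only compute the average
// when at least one test actually ran.
let averageDuration = testCount > 0 ? totalDuration / Double(testCount) : 0.0

print("Total executed \(testCount) tests, with 0 failures (0 unexpected) " +
      "in \(averageDuration) (\(totalDuration)) seconds")
```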

@modocache

@briancroom briancroom changed the title Prevent div-by-0 when producing text execution summary Prevent div-by-0 when producing test execution summary Dec 22, 2015
@modocache
Contributor

Awesome, thanks @briancroom! 👍

...the two figures printed by the original XCTest seem to be something closer to ... in {total time spent executing tests} ({total elapsed time since XCTest began running}) seconds, or similar.

Here's some output from Apple XCTest run on an open-source project of mine:

Test Suite 'PendingTests_ObjC' started at 2015-12-20 16:29:09.975
Test Case '-[PendingTests_ObjC testAnOtherwiseFailingExampleWhenMarkedPendingDoesNotCauseTheSuiteToFail]' started.
Test Case '-[PendingTests_ObjC testAnOtherwiseFailingExampleWhenMarkedPendingDoesNotCauseTheSuiteToFail]' passed (0.007 seconds).
Test Case '-[PendingTests_ObjC testBeforeEachDoesNotRunForContextsWithOnlyPendingExamples]' started.
Test Case '-[PendingTests_ObjC testBeforeEachDoesNotRunForContextsWithOnlyPendingExamples]' passed (0.051 seconds).
Test Case '-[PendingTests_ObjC testBeforeEachOnlyRunForEnabledExamples]' started.
Test Case '-[PendingTests_ObjC testBeforeEachOnlyRunForEnabledExamples]' passed (0.022 seconds).
Test Suite 'PendingTests_ObjC' passed at 2015-12-20 16:29:10.116.
     Executed 3 tests, with 0 failures (0 unexpected) in 0.080 (0.140) seconds

The last line, Executed 3 tests, with 0 failures (0 unexpected) in 0.080 (0.140) seconds, outputs two time intervals:

  1. 0.080: the sum of the times between the start and stop of each XCTestCaseRun in the XCTestSuite. Notice the three test cases pass in 0.007, 0.051, and 0.022 seconds, and that 0.007 + 0.051 + 0.022 == 0.080.
  2. 0.140: the time between the start and stop of the XCTestSuiteRun.

The current code prints ... in {average test duration} ({total test duration}) seconds...

The outputs from Apple XCTest and swift-corelibs-xctest definitely appear to represent two different things.

My takeaways:

  1. It should be clear to users what these numbers represent. If they are indeed averages in swift-corelibs-xctest, we should state that fact in the output. For swift-corelibs-xctest, how about creating a JIRA issue or submitting a pull request? For Apple XCTest, how about submitting a radar?
  2. The goal of swift-corelibs-xctest is to "enable your project's tests to run on all Swift platforms without having to rewrite them". I consider different output to be contrary to that goal--if I write code to parse test output on OS X, it should work equally well on Linux. swift-corelibs-xctest doesn't have an XCTestSuite yet, but it could print ...in {sum of times for each test method} ({delta between test case start and stop}) seconds, which I think would be more in line with what Apple XCTest outputs (see the sketch after this list). Again, I'd create a JIRA issue or submit a pull request.

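Roughly what I have in mind, as a hypothetical sketch (TestCaseRun and the date bookkeeping below are illustrative, not the existing swift-corelibs-xctest API):

```swift
import Foundation

// Illustrative sketch only; this is not the current swift-corelibs-xctest API.
struct TestCaseRun {
    let startDate: Date
    let stopDate: Date
    var duration: TimeInterval { return stopDate.timeIntervalSince(startDate) }
}

let suiteStart = Date()
// ... run each test method, recording its start and stop dates ...
let runs: [TestCaseRun] = []
let suiteStop = Date()

// First figure: sum of the per-test-method durations (0.080 in the example above).
let testExecutionTime = runs.reduce(0.0) { $0 + $1.duration }

// Second figure: wall-clock time between the start and stop of the whole run (0.140).
let suiteElapsedTime = suiteStop.timeIntervalSince(suiteStart)

print("Executed \(runs.count) tests, with 0 failures (0 unexpected) " +
      "in \(testExecutionTime) (\(suiteElapsedTime)) seconds")
```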
@modocache
Contributor

Even though we're still printing averages, I think the proposed change is an improvement. LGTM! 🚢

@briancroom
Contributor Author

Closing in favor of #23!
