Resolve Performance test failures and add unit tests for proto3 data transport format #2503

Closed · wants to merge 1 commit
72 changes: 44 additions & 28 deletions packages/performance/src/services/cc_service.test.ts
@@ -15,7 +15,7 @@
  * limitations under the License.
  */

-import { stub, useFakeTimers, SinonStub } from 'sinon';
+import { stub, useFakeTimers, SinonStub, SinonFakeTimers } from 'sinon';
 import { use, expect } from 'chai';
 import { Logger, LogLevel, LogHandler } from '@firebase/logger';
 import * as sinonChai from 'sinon-chai';
@@ -27,12 +27,12 @@ use(sinonChai);
 // Starts date at timestamp 1 instead of 0, otherwise it causes validation errors.
 import { ccHandler } from './cc_service';
 // eslint-disable-next-line
-describe.only('Firebase Performance > cc_service', () => {
+describe('Firebase Performance > cc_service', () => {
   describe('ccHandler', () => {
     let fetchStub: SinonStub<[RequestInfo, RequestInit?], Promise<Response>>;
-    let clock: sinon.SinonFakeTimers;
     const INITIAL_SEND_TIME_DELAY_MS = 5.5 * 1000;
     const DEFAULT_SEND_INTERVAL_MS = 10 * 1000;
+    let clock: SinonFakeTimers;
     let testCCHandler: LogHandler;

     beforeEach(() => {
@@ -45,46 +45,62 @@ describe.only('Firebase Performance > cc_service', () => {

     afterEach(() => {
       fetchStub.restore();
+      clock.restore();
     });

+    // First setTimeout in cc_service#processQueue(timeOffset) happens before useFakeTimers() is
+    // called. Therefore all tests need to wait for first timeout to finish before starting the
+    // actual test logic.
+    function waitForFirstTimeout(callback: () => void): void {
+      setTimeout(() => {
+        callback();
+      }, 10 * DEFAULT_SEND_INTERVAL_MS);
+    }
+
     it('throws an error when logging an empty message', () => {
-      const logger = new Logger('@firebase/performance/cc');
-      expect(() => {
-        testCCHandler(logger, LogLevel.SILENT, '');
-      }).to.throw;
+      waitForFirstTimeout(() => {
Comment on lines 60 to +61

Contributor

So I tried this out and I don't think any of these 3 tests actually run, as written. When a test is async you have to handle it with a done() or by returning a promise, or else Mocha will just go on to the next test and not wait for the async part to finish, and count it as passed. I think as the code currently stands, you can put anything in the callback and it will pass.

If you wanted it to reach the expect line you'd want

    it('throws an error when logging an empty message', (done) => {
      waitForFirstTimeout(() => {
        const logger = new Logger('@firebase/performance/cc');
        expect(() => {
          testCCHandler(logger, LogLevel.SILENT, '');
        }).to.throw;
        done();
      });
    });

Or instead of using done(), return a promise.
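
For the promise-returning version, a rough sketch (waitForFirstTimeoutAsync here is just an illustrative helper, not something in this diff):

    // Resolve once the first processQueue timeout window has elapsed.
    function waitForFirstTimeoutAsync(): Promise<void> {
      return new Promise(resolve => {
        setTimeout(resolve, 10 * DEFAULT_SEND_INTERVAL_MS);
      });
    }

    it('throws an error when logging an empty message', () => {
      // Returning the promise makes Mocha wait for it, so the expect
      // actually runs before the test is counted as passed.
      return waitForFirstTimeoutAsync().then(() => {
        const logger = new Logger('@firebase/performance/cc');
        expect(() => {
          testCCHandler(logger, LogLevel.SILENT, '');
        }).to.throw;
      });
    });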

I'm able to get around the described problem (doing a real time offset before tests begin, which allows all imported instances of ccHandler to reach the initial setTimeout callback of processQueue) with this block, which runs before all tests.

    before((done) => {
      setTimeout(() => done(), INITIAL_SEND_TIME_DELAY_MS);
    });

You don't need to run this before each test, just once before all of them, because every imported instance's top-level call to processQueue runs before any test starts, which I don't think is what you expect or want. So can I suggest not calling processQueue automatically at the top level of cc_service.ts, and instead providing an init function to run it, which perf_logger.ts can call at the top? That way each test can call the init on its own schedule and doesn't need any kind of setTimeout workaround.
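
To sketch what I mean (the function and flag names here are placeholders, not the actual implementation):

    // cc_service.ts: expose an init function instead of a top-level side effect.
    let queueProcessingStarted = false;

    export function initQueueProcessing(): void {
      if (!queueProcessingStarted) {
        queueProcessingStarted = true;
        processQueue(INITIAL_SEND_TIME_DELAY_MS);
      }
    }

    // perf_logger.ts: kick off queue processing explicitly at module setup.
    initQueueProcessing();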

Also, another warning: if this code worked as intended (using done() or a returned promise to hold the async test open until the timeout fires and the expect runs), waitForFirstTimeout would be waiting on the real clock, and it's set to 100 seconds (10 * 10 * 1000 ms). On the real clock, that would be bad.

Contributor Author

Thank you Christina for the detailed feedback and for trying this out! I have updated the logic in cc_service.ts to provide an init function, and updated the tests to use a fake timer inside describe(). Since this test update is separate from the feature-branch merge in this PR (fireperf-update-log-format), I created a master-branch PR for it: #2506. Thank you!

+        const logger = new Logger('@firebase/performance/cc');
+        expect(() => {
+          testCCHandler(logger, LogLevel.SILENT, '');
+        }).to.throw;
+      });
     });

     it('does not attempt to log an event to clearcut after INITIAL_SEND_TIME_DELAY_MS if queue is empty', () => {
-      fetchStub.resolves(
-        new Response('', {
-          status: 200,
-          headers: { 'Content-type': 'application/json' }
-        })
-      );
+      waitForFirstTimeout(() => {
+        fetchStub.resolves(
+          new Response('', {
+            status: 200,
+            headers: { 'Content-type': 'application/json' }
+          })
+        );

-      clock.tick(INITIAL_SEND_TIME_DELAY_MS);
-      expect(fetchStub).to.not.have.been.called;
+        clock.tick(INITIAL_SEND_TIME_DELAY_MS);
+        expect(fetchStub).to.not.have.been.called;
+      });
     });

     it('attempts to log an event to clearcut after DEFAULT_SEND_INTERVAL_MS if queue not empty', () => {
-      const logger = new Logger('@firebase/performance/cc');
+      waitForFirstTimeout(() => {
+        const logger = new Logger('@firebase/performance/cc');

-      fetchStub.resolves(
-        new Response('', {
-          status: 200,
-          headers: { 'Content-type': 'application/json' }
-        })
-      );
+        fetchStub.resolves(
+          new Response('', {
+            status: 200,
+            headers: { 'Content-type': 'application/json' }
+          })
+        );

-      testCCHandler(logger, LogLevel.DEBUG, 'someEvent');
-      clock.tick(20 * DEFAULT_SEND_INTERVAL_MS);
+        testCCHandler(logger, LogLevel.DEBUG, 'someEvent');
+        clock.tick(100 * DEFAULT_SEND_INTERVAL_MS);

-      expect(fetchStub).to.have.been.calledOnce;
-      expect(fetchStub).to.have.been.calledWith({
-        body: `{"request_time_ms": ,"client_info":{"client_type":1,\
+        expect(fetchStub).to.have.been.calledOnce;
+        expect(fetchStub).to.have.been.calledWith({
+          body: `{"request_time_ms": ,"client_info":{"client_type":1,\
 "js_client_info":{}},"log_source":462,"log_event":[{"source_extension_json_proto3":"someEvent",\
 "event_time_ms":${1}}]}`,
-        method: 'POST'
+          method: 'POST'
+        });
       });
     });
   });
75 changes: 65 additions & 10 deletions packages/performance/src/services/perf_logger.test.ts
@@ -41,6 +41,12 @@ describe('Performance Monitoring > perf_logger', () => {
   const TRACE_NAME = 'testTrace';
   const START_TIME = 12345;
   const DURATION = 321;
+  // Perf event header, which is constant across tests in this file.
+  const webAppInfo = `"application_info":{"google_app_id":"${APP_ID}",\
+"app_instance_id":"${IID}","web_app_info":{"sdk_version":"${SDK_VERSION}",\
+"page_url":"${PAGE_URL}","service_worker_status":${SERVICE_WORKER_STATUS},\
+"visibility_state":${VISIBILITY_STATE},"effective_connection_type":${EFFECTIVE_CONNECTION_TYPE}},\
+"application_process_state":0}`;

   let addToQueueStub: SinonStub<
     Array<{ message: string; eventTime: number }>,
@@ -83,11 +89,10 @@

   describe('logTrace', () => {
     it('creates, serializes and sends a trace to cc service', () => {
-      const EXPECTED_TRACE_MESSAGE = `{"application_info":{"google_app_id":"${APP_ID}",\
-"app_instance_id":"${IID}","web_app_info":{"sdk_version":"${SDK_VERSION}",\
-"page_url":"${PAGE_URL}","service_worker_status":${SERVICE_WORKER_STATUS},\
-"visibility_state":${VISIBILITY_STATE},"effective_connection_type":${EFFECTIVE_CONNECTION_TYPE}},\
-"application_process_state":0},"trace_metric":{"name":"${TRACE_NAME}","is_auto":false,\
+      const EXPECTED_TRACE_MESSAGE =
+        `{` +
+        webAppInfo +
+        `,"trace_metric":{"name":"${TRACE_NAME}","is_auto":false,\
 "client_start_time_us":${START_TIME * 1000},"duration_us":${DURATION * 1000},\
 "counters":{"counter1":3},"custom_attributes":{"attr":"val"}}}`;
       getIidStub.returns(IID);
@@ -113,6 +118,57 @@

       expect(addToQueueStub).not.to.be.called;
     });
+
+    it('Amount of customMetric is 32.', () => {
+      const EXPECTED_TRACE_MESSAGE =
+        `{` +
+        webAppInfo +
+        `,"trace_metric":{"name":"${TRACE_NAME}","is_auto":false,\
+"client_start_time_us":${START_TIME * 1000},"duration_us":${DURATION * 1000},\
+"counters":{"counter1":1,"counter2":2,"counter3":3,"counter4":4,"counter5":5,"counter6":6,\
+"counter7":7,"counter8":8,"counter9":9,"counter10":10,"counter11":11,"counter12":12,\
+"counter13":13,"counter14":14,"counter15":15,"counter16":16,"counter17":17,"counter18":18,\
+"counter19":19,"counter20":20,"counter21":21,"counter22":22,"counter23":23,"counter24":24,\
+"counter25":25,"counter26":26,"counter27":27,"counter28":28,"counter29":29,"counter30":30,\
+"counter31":31,"counter32":32}}}`;
+      getIidStub.returns(IID);
+      SettingsService.getInstance().loggingEnabled = true;
+      SettingsService.getInstance().logTraceAfterSampling = true;
+      const trace = new Trace(TRACE_NAME);
+      for (let i = 1; i <= 32; i++) {
+        trace.putMetric('counter' + i, i);
+      }
+      trace.record(START_TIME, DURATION);
+      clock.tick(1);
+
+      expect(addToQueueStub).to.be.called;
+      expect(addToQueueStub.getCall(0).args[0].message).to.be.equal(
+        EXPECTED_TRACE_MESSAGE
+      );
+    });
+
+    it('Amount of customAttribute is 5.', () => {
+      const EXPECTED_TRACE_MESSAGE =
+        `{` +
+        webAppInfo +
+        `,"trace_metric":{"name":"${TRACE_NAME}","is_auto":false,\
+"client_start_time_us":${START_TIME * 1000},"duration_us":${DURATION * 1000},\
+"custom_attributes":{"attr1":"val1","attr2":"val2","attr3":"val3","attr4":"val4","attr5":"val5"}}}`;
+      getIidStub.returns(IID);
+      SettingsService.getInstance().loggingEnabled = true;
+      SettingsService.getInstance().logTraceAfterSampling = true;
+      const trace = new Trace(TRACE_NAME);
+      for (let i = 1; i <= 5; i++) {
+        trace.putAttribute('attr' + i, 'val' + i);
+      }
+      trace.record(START_TIME, DURATION);
+      clock.tick(1);
+
+      expect(addToQueueStub).to.be.called;
+      expect(addToQueueStub.getCall(0).args[0].message).to.be.equal(
+        EXPECTED_TRACE_MESSAGE
+      );
+    });
   });

   describe('logNetworkRequest', () => {
@@ -149,11 +205,10 @@
           RESOURCE_PERFORMANCE_ENTRY.startTime) *
         1000
       );
-      const EXPECTED_NETWORK_MESSAGE = `{"application_info":{"google_app_id":"${APP_ID}",\
-"app_instance_id":"${IID}","web_app_info":{"sdk_version":"${SDK_VERSION}",\
-"page_url":"${PAGE_URL}","service_worker_status":${SERVICE_WORKER_STATUS},\
-"visibility_state":${VISIBILITY_STATE},"effective_connection_type":${EFFECTIVE_CONNECTION_TYPE}},\
-"application_process_state":0},\
+      const EXPECTED_NETWORK_MESSAGE =
+        `{` +
+        webAppInfo +
+        `,\
 "network_request_metric":{"url":"${RESOURCE_PERFORMANCE_ENTRY.name}",\
 "http_method":0,"http_response_code":200,\
 "response_payload_bytes":${RESOURCE_PERFORMANCE_ENTRY.transferSize},\