Add test apps template and fireci commands to measure sdk startup times. #2611

Merged 6 commits on May 5, 2021
ci/fireci/fireci/dir_utils.py (32 additions, 0 deletions)
@@ -0,0 +1,32 @@
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import contextlib
import logging
import os

_logger = logging.getLogger('fireci.dir_utils')


@contextlib.contextmanager
def chdir(directory):
  """Change working dir to `directory` and restore to original afterwards."""
  _logger.debug(f'Changing directory to: {directory} ...')
  original_dir = os.getcwd()
  os.chdir(directory)
  try:
    yield
  finally:
    _logger.debug(f'Restoring directory to: {original_dir} ...')
    os.chdir(original_dir)
14 changes: 1 addition & 13 deletions ci/fireci/fireciplugins/fireperf.py
Original file line number Diff line number Diff line change
Expand Up @@ -20,6 +20,7 @@

from fireci import ci_command
from fireci import gradle
from fireci.dir_utils import chdir

_logger = logging.getLogger('fireci.fireperf')

@@ -60,19 +61,6 @@ def fireperf_e2e_test(target_environment, plugin_repo_dir):
  gradle.run(*fireperf_e2e_test_gradle_command)


@contextlib.contextmanager
def chdir(directory):
  """Change working dir to `directory` and restore to original afterwards."""
  _logger.debug(f'Changing directory to: {directory} ...')
  original_dir = os.getcwd()
  os.chdir(directory)
  try:
    yield
  finally:
    _logger.debug(f'Restoring directory to: {original_dir} ...')
    os.chdir(original_dir)


def _find_fireperf_plugin_version():
  local_maven_repo_dir = pathlib.Path.home().joinpath('.m2', 'repository')
  artifacts_path = local_maven_repo_dir.joinpath('com', 'google', 'firebase', 'perf-plugin')
ci/fireci/fireciplugins/macrobenchmark.py (221 additions, 0 deletions)
@@ -0,0 +1,221 @@
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import asyncio
import glob
import json
import logging
import os
import random
import shutil
import sys

import click
import pystache
import yaml

from fireci import ci_command
from fireci.dir_utils import chdir

_logger = logging.getLogger('fireci.macrobenchmark')


@ci_command()
def macrobenchmark():
  """Measures app startup times for Firebase SDKs."""
  asyncio.run(_launch_macrobenchmark_test())


async def _launch_macrobenchmark_test():
> **Review comment (Member):** What is the motivation for all the asyncs? Not sure it's warranted for this simple use case.
>
> **Contributor (author):** Here is how I came to all this async code:
>
> 1. gcloud calls take a long time, and I want to parallelize this part.
> 2. Now that test apps for different SDKs are isolated, to build them I need to run `./gradlew assemble assembleAndroidTest` one by one in each root project for every SDK (previously all test apps were subprojects of the same root project, so I only had to run one gradle command, and `org.gradle.parallel=true` took care of the parallelization). I want to parallelize this part as well.
> 3. With `Popen` and a blocking `wait`, I'm able to make apk assembling and FTL uploading concurrent across SDKs. But with async code, it's convenient to make the upload for one SDK depend only on its corresponding assembling.
> 4. It also seems easier to handle stdout/stderr streaming (with color and sdk name decoration) to the parent process from multiple subprocesses with async.
> 5. At this point, I feel like it's actually easier to write async code.
>
> I do agree that it brings a lot of (unnecessary) complexity. Maybe there are simpler ways to do it without async, and I just have not found them. What do you think? I'd like to have a quick GVC if you have time.
>
> **Member:** Actually, I take it back. It does seem to provide a lot of value, not least that it provides concurrency out of the box without having to deal with it explicitly, and keeps the code simple to read. Using asyncio SGTM.

  _logger.info('Starting macrobenchmark test...')

  artifact_versions, config, _ = await asyncio.gather(
    _parse_artifact_versions(),
    _parse_config_yaml(),
    _create_gradle_wrapper()
  )

  with chdir('macrobenchmark'):
    runners = [MacrobenchmarkTest(k, v, artifact_versions) for k, v in config.items()]
    results = await asyncio.gather(*[x.run() for x in runners], return_exceptions=True)

  if any(map(lambda x: isinstance(x, Exception), results)):
    _logger.error(f'Exceptions: {[x for x in results if isinstance(x, Exception)]}')
    raise click.ClickException('Macrobenchmark test failed with above errors.')

  _logger.info('Macrobenchmark test finished.')


async def _parse_artifact_versions():
  proc = await asyncio.subprocess.create_subprocess_exec('./gradlew', 'assembleAllForSmokeTests')
  await proc.wait()
> **Review comment (Collaborator):** Wouldn't the `await` on line 56 be enough? Not sure.
>
> **Contributor (author):** Sounds to me it is required. If I don't await on it, it gives me this warning:
>
> ```
> /usr/local/Cellar/python/3.7.7/Frameworks/Python.framework/Versions/3.7/lib/python3.7/asyncio/unix_events.py:878: RuntimeWarning: A loop is being detached from a child watcher with pending handlers
>   RuntimeWarning)
> ```
>
> My understanding is that the underlying Future object may not be settled before my code exits. Also, in the Python docs for asyncio subprocesses, the first and last examples await on `proc.communicate()` and `proc.wait()` respectively.
>
> This also reminds me that I forgot to await at other places in the code. The problem was masked previously. I fixed it in the new commit.


  with open('build/m2repository/changed-artifacts.json') as json_file:
    artifacts = json.load(json_file)
  return dict(_artifact_key_version(x) for x in artifacts['headGit'])


def _artifact_key_version(artifact):
  group_id, artifact_id, version = artifact.split(':')
  return f'{group_id}:{artifact_id}', version


async def _parse_config_yaml():
  with open('macrobenchmark/config.yaml') as yaml_file:
    return yaml.safe_load(yaml_file)


async def _create_gradle_wrapper():
  with open('macrobenchmark/settings.gradle', 'w'):
    pass
> **Review comment (Collaborator):** Is this intended?
>
> **Contributor (author):** Yes. Without a settings.gradle file, `./gradlew wrapper` failed with this error:
>
> ```
> FAILURE: Build failed with an exception.
>
> * What went wrong:
> Project directory '/Users/yifany/Documents/firebase/firebase-android-sdk/macrobenchmark' is not part of the build defined by settings file '/Users/yifany/Documents/firebase/firebase-android-sdk/settings.gradle'. If this is an unrelated build, it must have its own settings file.
> ```
>
> An empty settings.gradle is good enough.


  proc = await asyncio.subprocess.create_subprocess_exec(
    './gradlew',
    'wrapper',
    '--gradle-version',
    '7.0',
    '--project-dir',
    'macrobenchmark'
  )
  await proc.wait()


class MacrobenchmarkTest:
  """Builds the test based on configurations and runs the test on FTL."""

  def __init__(
      self,
      sdk_name,
      test_app_config,
      artifact_versions,
      logger=_logger
  ):
    self.sdk_name = sdk_name
    self.test_app_config = test_app_config
    self.artifact_versions = artifact_versions
    self.logger = MacrobenchmarkLoggerAdapter(logger, sdk_name)
    self.test_app_dir = os.path.join('test-apps', test_app_config['name'])

  async def run(self):
    """Starts the workflow of src creation, apk assembling, and FTL testing in order."""
    await self._create_test_src()
    await self._assemble_apks()
    await self._upload_apks_to_ftl()

  async def _create_test_src(self):
    app_name = self.test_app_config['name']
    app_id = self.test_app_config['application-id']
    self.logger.info(f'Creating test app "{app_name}" with application-id "{app_id}"...')

    mustache_context = {
      'application-id': app_id,
      'plugins': self.test_app_config.get('plugins', []),
      'dependencies': [
        {'key': x, 'version': self.artifact_versions[x]}
        for x in self.test_app_config.get('dependencies', [])
      ],
    }

    if app_name != 'baseline':
      mustache_context['plugins'].append('com.google.gms.google-services')

    shutil.copytree('template', self.test_app_dir)
    with chdir(self.test_app_dir):
      renderer = pystache.Renderer()
      mustaches = glob.glob('**/*.mustache', recursive=True)
      for mustache in mustaches:
        result = renderer.render_path(mustache, mustache_context)
        original_name = mustache[:-9]  # TODO(yifany): mustache.removesuffix('.mustache')
        with open(original_name, 'w') as file:
          file.write(result)

  async def _assemble_apks(self):
    executable = './gradlew'
    args = ['assemble', 'assembleAndroidTest', '--project-dir', self.test_app_dir]
    await self._exec_subprocess(executable, args)

  async def _upload_apks_to_ftl(self):
    app_apk_path = glob.glob(f'{self.test_app_dir}/app/**/*.apk', recursive=True)[0]
    test_apk_path = glob.glob(f'{self.test_app_dir}/benchmark/**/*.apk', recursive=True)[0]

    self.logger.info(f'App apk: {app_apk_path}')
    self.logger.info(f'Test apk: {test_apk_path}')

    ftl_environment_variables = [
      'clearPackageData=true',
      'additionalTestOutputDir=/sdcard/Download',
      'no-isolated-storage=true',
    ]
    executable = 'gcloud'
    args = ['firebase', 'test', 'android', 'run']
    args += ['--type', 'instrumentation']
    args += ['--app', app_apk_path]
    args += ['--test', test_apk_path]
    args += ['--device', 'model=flame,version=30,locale=en,orientation=portrait']
    args += ['--directories-to-pull', '/sdcard/Download']
    args += ['--results-bucket', 'gs://fireescape-macrobenchmark']
    args += ['--environment-variables', ','.join(ftl_environment_variables)]
    args += ['--timeout', '30m']
    args += ['--project', 'fireescape-c4819']

    await self._exec_subprocess(executable, args)

  async def _exec_subprocess(self, executable, args):
    command = " ".join([executable, *args])
    self.logger.info(f'Executing command: "{command}"...')

    proc = await asyncio.subprocess.create_subprocess_exec(
      executable,
      *args,
      stdout=asyncio.subprocess.PIPE,
      stderr=asyncio.subprocess.PIPE
    )
    await asyncio.gather(
      self._stream_output(executable, proc.stdout),
      self._stream_output(executable, proc.stderr)
    )

    await proc.communicate()
    if proc.returncode == 0:
      self.logger.info(f'"{command}" finished.')
    else:
      message = f'"{command}" exited with return code {proc.returncode}.'
      self.logger.error(message)
      raise click.ClickException(message)

  async def _stream_output(self, executable, stream: asyncio.StreamReader):
    async for line in stream:
      self.logger.info(f'[{executable}] {line.decode("utf-8").strip()}')


class MacrobenchmarkLoggerAdapter(logging.LoggerAdapter):
  """Decorates log messages for an SDK to make them more distinguishable."""

  reset_code = '\x1b[m'

  @staticmethod
  def random_color_code():
    code = random.randint(16, 231)  # https://en.wikipedia.org/wiki/ANSI_escape_code#8-bit
    return f'\x1b[38;5;{code}m'

  def __init__(self, logger, sdk_name, color_code=None):
    super().__init__(logger, {})
    self.sdk_name = sdk_name
    self.color_code = self.random_color_code() if color_code is None else color_code

  def process(self, msg, kwargs):
    colored = f'{self.color_code}[{self.sdk_name}]{self.reset_code} {msg}'
    uncolored = f'[{self.sdk_name}] {msg}'
    return colored if sys.stderr.isatty() else uncolored, kwargs
ci/fireci/setup.py (2 additions, 0 deletions)
@@ -27,7 +27,9 @@
  install_requires=[
    'click==7.0',
    'PyGithub==1.43.8',
    'pystache==0.5.4',
    'requests==2.23.0',
    'PyYAML==5.4.1',
  ],
  packages=find_packages(exclude=['tests']),
  entry_points={
macrobenchmark/config.yaml (79 additions, 0 deletions)
@@ -0,0 +1,79 @@
baseline:
  name: baseline
  application-id: com.google.firebase.benchmark.baseline

firebase-config:
  name: config
  application-id: com.google.firebase.benchmark.config
  dependencies:
    - com.google.firebase:firebase-config-ktx

firebase-crashlytics:
  name: crash
  application-id: com.google.firebase.benchmark.crash
  dependencies:
    - com.google.firebase:firebase-crashlytics-ktx
  plugins:
    - com.google.firebase.crashlytics

firebase-database:
  name: database
  application-id: com.google.firebase.benchmark.database
  dependencies:
    - com.google.firebase:firebase-database-ktx

firebase-dynamic-links:
  name: dynamiclinks
  application-id: com.google.firebase.benchmark.dynamiclinks
  dependencies:
    - com.google.firebase:firebase-dynamic-links-ktx

firebase-firestore:
  name: firestore
  application-id: com.google.firebase.benchmark.firestore
  dependencies:
    - com.google.firebase:firebase-firestore-ktx

firebase-functions:
  name: functions
  application-id: com.google.firebase.benchmark.functions
  dependencies:
    - com.google.firebase:firebase-functions-ktx

firebase-inappmessaging-display:
  name: inappmessaging
  application-id: com.google.firebase.benchmark.inappmessaging
  dependencies:
    - com.google.firebase:firebase-inappmessaging-ktx
    - com.google.firebase:firebase-inappmessaging-display-ktx

firebase-messaging:
  name: messaging
  application-id: com.google.firebase.benchmark.messaging
  dependencies:
    - com.google.firebase:firebase-messaging-ktx

firebase-perf:
  name: perf
  application-id: com.google.firebase.benchmark.perf
  dependencies:
    - com.google.firebase:firebase-perf-ktx
  plugins:
    - com.google.firebase.firebase-perf

firebase-storage:
  name: storage
  application-id: com.google.firebase.benchmark.storage
  dependencies:
    - com.google.firebase:firebase-storage-ktx


# TODO(yifany): google3 sdks, customizing FTL devices
# auth
# analytics
# combined
#   - crashlytics + analytics
#   - crashlytics + fireperf
#   - auth + firestore
#   - ...