Added a new integration for Google Cloud Functions and automation test case framework #785
Conversation
The integration looks fine; I have concerns about the test setup and test performance.
test-requirements.txt
Outdated
@@ -4,6 +4,10 @@ tox==3.7.0
Werkzeug==0.15.5
pytest-localserver==0.5.0
pytest-cov==2.8.1
google-api-python-client==1.10.0
Please put those in tox.ini like the boto3 dependency
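For illustration, scoping the dependency to the gcp factor in tox.ini (mirroring how boto3 is handled for aws_lambda) might look roughly like this; the factor name is an assumption, not the repo's exact config:

[testenv]
deps =
    gcp: google-api-python-client==1.10.0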
Okay sure, I'll update this.
with open(
    os.environ.get("SENTRY_PYTHON_TEST_GCP_CREDENTIALS_JSON"), "rb"
) as creds_file:
    for line in creds_file.readlines():
We will not be able to store any files in CI. Can we store the entire JSON in an envvar?
Also, do I read it correctly that you're expecting that file to contain multiple JSON objects, where you're only interested in the last one? Because you're splitting by line separator and overwriting creds_json in each iteration.
Actually, there is only a single JSON object in the file; I'm just reading the file line by line and fetching that one JSON object.
The thing is, these OAuth client credentials are downloaded as a .json file, and I was setting this environment variable to the path of that file.
It would be a minor overhead on the user to copy the JSON out of the file and set it in the environment variable. Do we want to do that? If so, I'll modify this implementation.
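For illustration, reading the JSON straight from an envvar could look like this sketch -- assuming the variable holds the JSON contents rather than a file path, and that the downloaded client secrets use the usual "installed" top-level key:

import json
import os

# Parse the credentials directly from the environment; no file needed in CI.
creds_json = json.loads(os.environ["SENTRY_PYTHON_TEST_GCP_CREDENTIALS_JSON"])
project_id = creds_json["installed"]["project_id"]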
My main point was that it's easy to set secrets in Travis using environment variables, but I may have misunderstood the auth flow.
Reading your other comment, is there even a way to run this in CI if there's manual user interaction involved?
I'm not completely sure about this, though what I can do is try to give all permissions from my Google account to the APIs which are using the credentials. But I think for the first run at least it still requires user interaction.
if not project_id:
    pytest.skip("Credentials json file is not valid")

if os.path.exists("token.pickle"):
Can we avoid using pickle? If possible I'd just stick to JSON.
The thing is, the pickle file is used so that the user doesn't have to re-authenticate their credentials in the browser for every test case.
So, do we want to do that, or keep this usage?
No, I would just replace pickle.dumps with json.dumps or something like that.
This .pickle file actually contains the authentication token, not JSON. Since we are using that authentication token to authenticate our requests, I feel this can't be avoided. If we avoid the .pickle file, we'll have to verify each test case run through the browser.
I understand the purpose of the file but I don't understand why we need to use pickle as a serialization format specifically. It seems to me that we could use any other format, like yaml or json.
Okay, I got your point. I'll try to do it with JSON serialization.
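A minimal sketch of such JSON-based token caching, assuming google-auth and google-auth-oauthlib (the file names, scope, and run_local_server flow are placeholders, not the PR's actual code):

import os

from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]
TOKEN_PATH = "token.json"  # replaces token.pickle

def load_credentials(client_secrets_path):
    # Reuse the cached token so the browser flow only runs once.
    if os.path.exists(TOKEN_PATH):
        return Credentials.from_authorized_user_file(TOKEN_PATH, SCOPES)
    flow = InstalledAppFlow.from_client_secrets_file(client_secrets_path, SCOPES)
    creds = flow.run_local_server(port=0)
    with open(TOKEN_PATH, "w") as token_file:
        token_file.write(creds.to_json())  # JSON instead of pickle.dumps
    return creds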
# STEP : Invoke the cloud function
# Add a delay of 60 seconds for the newly created function to get deployed.
time.sleep(60)
That is an awfully slow test for CI. Is there a way around this, such as polling the deployment status?
I need to check whether polling the deployment status is possible. GCP deployments do take an awfully long time, though; I've checked and found that this is a stable minimum time needed for the deployment to finish (the minimum I observed was 45 seconds, nothing less than that).
So, I'll check if it is possible to poll and verify the status of the deployment, but the test cases will still be slow.
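A hypothetical polling helper, for reference -- assuming service is the client returned by build("cloudfunctions", "v1", ...) and operation is the response of the deploy call (names and timings are illustrative, not the PR's actual code):

import time

def wait_for_deployment(service, operation, timeout=300, interval=5):
    # Poll the long-running operation instead of sleeping a fixed 60 seconds.
    deadline = time.time() + timeout
    while time.time() < deadline:
        op = service.operations().get(name=operation["name"]).execute()
        if op.get("done"):
            if "error" in op:
                raise RuntimeError("Deployment failed: %r" % (op["error"],))
            return op
        time.sleep(interval)
    raise RuntimeError("Function not deployed within %d seconds" % timeout)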
Is there a way we could reuse existing deployments then perhaps?
Well, that is something I haven't checked yet, and I'm not completely sure about it.
Although, even if there is a small change in a particular file, we need to redeploy the cloud function's code. That's the reason building this integration took so much more time compared to AWS. I'm not aware of a separate process other than this.
tox.ini
Outdated
@@ -44,6 +44,9 @@ envlist =
# The aws_lambda tests deploy to the real AWS and have their own matrix of Python versions.
py3.7-aws_lambda

# The gcp tests deploy to the real GCP and have their own matrix of Python versions.
py3.7-gcp
@untitaker I've pushed new changes to the tox.ini file for the GCP integration test cases. Does this look fine?
yes
Summary from call with @shantanu73:
- Fix linting error
- Fix missing test dependency (tox env config)
- Document limitation that the current solution only works on Python 3.7 and not on 3.8 / disable the integration if the environment is unsupported to avoid breaking user functions
For now tests will be skipped in CI. As a follow-up, convert the tests to use https://github.com/GoogleCloudPlatform/functions-framework-python to implement run_cloud_function, so that we don't need to deal with credentials or network calls.
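A rough sketch of what that follow-up could look like, assuming functions-framework is installed (the target name and source path are hypothetical):

from functions_framework import create_app

def run_cloud_function(path="/"):
    # create_app wraps the target function in a Flask app, so tests can
    # invoke it in-process: no credentials, no network, no deployment.
    app = create_app(target="handler", source="tests/integrations/gcp/main.py")
    return app.test_client().get(path)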
sentry_sdk/integrations/gcp.py
Outdated
# If an integration is there, a client has to be there.
client = hub.client  # type: Any

configured_time = int(environ.get("FUNCTION_TIMEOUT_SEC"))
This is failing in mypy because environ.get may return None if the env var is not set.
In [1]: import os
In [2]: os.environ.get("FUNCTION_TIMEOUT_SEC")
In [3]: int(os.environ.get("FUNCTION_TIMEOUT_SEC"))
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-3-d034f2d5d55a> in <module>
----> 1 int(os.environ.get("FUNCTION_TIMEOUT_SEC"))
TypeError: int() argument must be a string, a bytes-like object or a number, not 'NoneType'
We need to handle the case in which the variable is not set. If we don't know the configured timeout, we cannot start the TimeoutThread.
I suggest handling this by writing a debug log and then ignoring the rest of the wrapper, similar to when the integration is not in the current hub, as above.
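A minimal sketch of that guard, assuming the wrapper's logger and the wrapped func(*args, **kwargs) call from the surrounding code (names are taken from context, not the final patch):

from os import environ

timeout_str = environ.get("FUNCTION_TIMEOUT_SEC")
if timeout_str is None:
    # Without the configured timeout we cannot start the TimeoutThread,
    # so log a debug message and run the user function untouched.
    logger.debug("FUNCTION_TIMEOUT_SEC not set; skipping timeout handling.")
    return func(*args, **kwargs)
configured_time = int(timeout_str)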
Okay, sure. I'll make this change.
@rhcarvalho modified this.
@rhcarvalho Maybe the linting failed because of the last check-in. I'll use make format and make lint, though I do follow this as standard practice.
I'll check for the import errors in the test jobs.
@rhcarvalho I'm a little doubtful about the tox.ini update. I've updated it per the reference from aws_lambda, but it still gave the import error. Are my changes appropriate?
Also, I've tried running the commands you suggested; they took a lot of time to run but didn't complete.
You are missing a tests/integrations/aws_lambda/__init__.py, which would skip the tests if the dependency is not installed (in other jobs).
@untitaker Okay, so I need a similar file for the gcp tests, right?
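Presumably something like this, modeled on the aws_lambda pattern (skipping on the googleapiclient import is an assumption):

import pytest

# Skips every test in this package when the dependency is absent,
# e.g. in tox jobs that don't install the GCP requirements.
googleapiclient = pytest.importorskip("googleapiclient")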
@shantanu73 Now the linting job fails because of code formatting -- https://travis-ci.com/github/getsentry/sentry-python/jobs/375368281 -- you can run the lint job locally before submitting your commit. There are still import errors in the test jobs: https://travis-ci.com/github/getsentry/sentry-python/jobs/375368274 -- you can run them locally / partially as well.
tests/integrations/gcp/test_gcp.py
Outdated
import os.path
import os
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

requests = pytest.importorskip("requests")
pickle = pytest.importorskip("pickle")
pickle is stdlib, this should not be necessary
okay, I'll change this.
import os.path
import os

requests = pytest.importorskip("requests")
I would importorskip the google cloud sdk additionally
Sure I'll do that as well.
Changes: