Upgrade Tron to Python 3.10 - TRON-2435 #1071
base: master
Changes from 2 commits
@@ -1,4 +1,4 @@
-#!/usr/bin/env python3.8
+#!/usr/bin/env python3.10
 import ast
 import sys
@@ -2,13 +2,13 @@ Source: tron
 Section: admin
 Priority: optional
 Maintainer: Daniel Nephin <[email protected]>
-Build-Depends: debhelper (>= 7), python3.8-dev, libdb5.3-dev, libyaml-dev, libssl-dev, libffi-dev, dh-virtualenv
+Build-Depends: debhelper (>= 7), python3.10-dev, libdb5.3-dev, libyaml-dev, libssl-dev, libffi-dev, dh-virtualenv
 Standards-Version: 3.8.3

 Package: tron
 Architecture: all
 Homepage: http://github.com/yelp/Tron
-Depends: bsdutils, python3.8, libdb5.3, libyaml-0-2, ${shlibs:Depends}, ${misc:Depends}
+Depends: bsdutils, python3.10, libdb5.3, libyaml-0-2, ${shlibs:Depends}, ${misc:Depends}
 Description: Tron is a job scheduling, running and monitoring package.
  Designed to replace Cron for complex scheduling and dependencies.
  Provides:
@@ -1,3 +1,3 @@
 [tool.black]
 line-length = 120
-target_version = ['py38']
+target_version = ['py310']
@@ -5,7 +5,6 @@ flake8
 mock
 mypy
 pre-commit
-pylint
 pytest
 pytest-asyncio
 requirements-tools
@@ -4,8 +4,8 @@
 import boto3
 import pytest
 import staticconf.testing
-from moto import mock_dynamodb2
-from moto.dynamodb2.responses import dynamo_json_dump
+from moto import mock_dynamodb
+from moto.dynamodb.responses import dynamo_json_dump

 from testifycompat import assert_equal
 from tron.serialize.runstate.dynamodb_state_store import DynamoDBStateStore
@@ -61,10 +61,10 @@ def update_item(item):
 @pytest.fixture(autouse=True)
 def store():
     with mock.patch(
-        "moto.dynamodb2.responses.DynamoHandler.transact_write_items",
+        "moto.dynamodb.responses.DynamoHandler.transact_write_items",
         new=mock_transact_write_items,
         create=True,
-    ), mock_dynamodb2():
+    ), mock_dynamodb():
         dynamodb = boto3.resource("dynamodb", region_name="us-west-2")
         table_name = "tmp"
         store = DynamoDBStateStore(table_name, "us-west-2", stopping=True)
@@ -125,7 +125,7 @@ def large_object():
         "runs": [],
         "cleanup_run": None,
         "manual": False,
-        "large_data": [i for i in range(1_000_000)],
+        "large_data": [i for i in range(10000)],
Review discussion on this line:

- was this meant to be changed?
- yea, I was getting this error
- gotcha! i assume that since the tests are still passing, then we're still ending up splitting this object
- I suspect that by reducing the data to 10,000 integers, we're now creating an object that is smaller than our OBJECT_SIZE of 200KB. This would mean it no longer gets partitioned, and these tests are no longer verifying the multi-partition logic they were designed for. Maybe we should also add an assert num_partitions(key) > 1 in these tests to guarantee we're actually testing the partitioning logic? Also, outside the bounds of this PR, it might be a good time to rename these tests to something like test_save_object_requiring_partitioning.
- iirc from some manual tests, 50,000 gets us ~2 partitions
- I think what might be happening here is related to OBJECT_SIZE. We don't leave a lot of overhead with our current limit. You could try changing OBJECT_SIZE to 150,000. Keep in mind that we still want to make sure we're setting large_object in a way that both the pickle and the JSON get partitioned.
     }
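To ground the size discussion in the thread above, here is a quick standalone sketch (not part of the PR) that measures how large the large_data list is under the two serializations the reviewers mention. The threshold of 200KB (taken here as 200,000 bytes) and the pickle-vs-JSON comparison come from the review comments; the real OBJECT_SIZE constant and partitioning logic live in Tron's DynamoDB state store and may differ.

    import json
    import pickle

    OBJECT_SIZE = 200_000  # assumed partition threshold in bytes, per the thread above

    for n in (10_000, 50_000, 1_000_000):
        large_data = list(range(n))
        pickled_size = len(pickle.dumps(large_data))
        json_size = len(json.dumps(large_data).encode("utf-8"))
        # Per the discussion, payloads below OBJECT_SIZE fit in a single partition,
        # so the multi-partition path only runs when a serialization exceeds it.
        print(n, pickled_size, json_size, pickled_size > OBJECT_SIZE, json_size > OBJECT_SIZE)

If the chosen size turns out to be borderline, the reviewer's suggested assert num_partitions(key) > 1 (where num_partitions stands in for whatever partition-count helper the store exposes; the name is hypothetical here) would make these tests fail loudly instead of silently skipping the partitioning path.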