Auto merge of #29629 - servo:fix-nightly-job-upload-error, r=mukilan

Fix nightly upload to GH release logic.

The boto3 S3 client automatically [closes the given fileobj](https://github.com/boto/s3transfer/blob/develop/s3transfer/upload.py#L106) once the transfer completes. This prevents us from reusing `package_hash_fileobj` across the S3 and GitHub upload methods, and it causes the [upload to GitHub to fail](https://github.com/servo/servo/actions/runs/4685791796/jobs/8303739124) with:
```
ValueError: I/O operation on closed file.

  File "/home/runner/work/servo/servo/python/servo/package_commands.py", line 792, in upload_nightly
    upload_to_github_release(platform, package, package_hash_fileobj)
  File "/home/runner/work/servo/servo/python/servo/package_commands.py", line 635, in upload_to_github_release
    package_hash_fileobj.getbuffer().nbytes,
```
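
The failure is easy to reproduce without any S3 involvement; a minimal sketch of what happens once s3transfer has closed the buffer:

```python
import io

buf = io.BytesIO(b'abc123')  # stands in for package_hash_fileobj
buf.close()                  # what s3transfer does once the transfer completes
buf.getbuffer().nbytes       # ValueError: I/O operation on closed file.
```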

This PR fixes the issue by creating a fresh `io.BytesIO` instance inside each of the two `upload_to_*` methods.
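
Condensed, the new shape looks like this (a sketch only; the actual change is in the diff below):

```python
import io

def upload_to_github_release(platform, package, package_hash):
    # Each upload helper builds its own buffer from the hash string,
    # so it no longer matters if the transfer machinery closes it.
    package_hash_fileobj = io.BytesIO(package_hash.encode('utf-8'))
    ...

def upload_to_s3(platform, package, package_hash, timestamp):
    package_hash_fileobj = io.BytesIO(package_hash.encode('utf-8'))
    ...
```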

I've triggered a [manual nightly job](https://github.com/servo/servo/actions/runs/4686470246) based on this branch. This PR can be kept open until that build completes, in case it surfaces other issues.

---
- [x] `./mach build -d` does not report any errors
- [x] `./mach test-tidy` does not report any errors
- [ ] These changes fix #___ (GitHub issue number if applicable)

- [ ] There are tests for these changes OR
- [x] These changes do not require tests because they fix an error in the nightly CI job
Commit 45000be019 by bors-servo, 2023-04-13 12:54:23 +02:00 (committed via GitHub)

python/servo/package_commands.py:

```diff
@@ -618,24 +618,28 @@ class PackageCommands(CommandBase):
                 path.basename(package)
             )
-        def upload_to_github_release(platform, package, package_hash_fileobj):
+        def upload_to_github_release(platform, package, package_hash):
             if not github_release_id:
                 return
             extension = path.basename(package).partition('.')[2]
             g = Github(os.environ['NIGHTLY_REPO_TOKEN'])
             nightly_repo = g.get_repo(os.environ['NIGHTLY_REPO'])
             release = nightly_repo.get_release(github_release_id)
+            package_hash_fileobj = io.BytesIO(package_hash.encode('utf-8'))
             if '2020' in platform:
                 asset_name = f'servo-latest-layout-2020.{extension}'
             else:
                 asset_name = f'servo-latest.{extension}'
             release.upload_asset(package, name=asset_name)
             release.upload_asset_from_memory(
                 package_hash_fileobj,
                 package_hash_fileobj.getbuffer().nbytes,
                 name=f'{asset_name}.sha256')
-        def upload_to_s3(platform, package, package_hash_fileobj, timestamp):
+        def upload_to_s3(platform, package, package_hash, timestamp):
             (aws_access_key, aws_secret_access_key) = get_s3_secret()
             s3 = boto3.client(
                 's3',
@@ -658,6 +662,7 @@ class PackageCommands(CommandBase):
             extension = path.basename(package).partition('.')[2]
             latest_upload_key = '{}/servo-latest.{}'.format(nightly_dir, extension)
+            package_hash_fileobj = io.BytesIO(package_hash.encode('utf-8'))
             latest_hash_upload_key = f'{latest_upload_key}.sha256'
             s3.upload_file(package, BUCKET, package_upload_key)
@@ -786,10 +791,9 @@ class PackageCommands(CommandBase):
                     break
                 sha256_digest.update(data)
             package_hash = sha256_digest.hexdigest()
-            package_hash_fileobj = io.BytesIO(package_hash.encode('utf-8'))
-            upload_to_s3(platform, package, package_hash_fileobj, timestamp)
-            upload_to_github_release(platform, package, package_hash_fileobj)
+            upload_to_s3(platform, package, package_hash, timestamp)
+            upload_to_github_release(platform, package, package_hash)
         if platform == 'maven':
             for package in PACKAGES[platform]:
```
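
For context, the hashing loop excerpted at the top of the last hunk, as a self-contained sketch (the helper name and chunk size here are illustrative assumptions, not the file's actual constants):

```python
import hashlib

def hash_package(package_path, chunk_size=1024 * 1024):
    # Stream the package through SHA-256 without reading it all into memory.
    sha256_digest = hashlib.sha256()
    with open(package_path, 'rb') as package_file:
        while True:
            data = package_file.read(chunk_size)
            if not data:
                break
            sha256_digest.update(data)
    return sha256_digest.hexdigest()
```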