ERROR: cannot find the current segment/subsegment when segment is open and uploading file to s3. #377
Hello, I have a few questions that will hopefully help clarify where this issue is coming from:
Hello, to answer your questions.

Here's a snippet including more trace_id logs and the corresponding output:

```python
import logging

import boto3
from aws_xray_sdk.core import patch_all, xray_recorder

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

if __name__ == "__main__":
    patch_all()
    xray_recorder.configure(context_missing='LOG_ERROR', daemon_address='localhost:2000')
    segment = xray_recorder.begin_segment('test segment')
    logger.info(f'created segment trace_id: {segment.trace_id}')
    s3_resource = boto3.resource("s3")
    logger.info(f'current trace entity trace_id: {xray_recorder.get_trace_entity().trace_id}')

    # this call raises the error
    logger.info(f'trace_id before error: {xray_recorder.current_segment().trace_id}')
    s3_resource.Bucket('resultfiles-test').upload_file('hello_world.txt', 'hello_world.txt')
    logger.info(f'trace_id after error: {xray_recorder.current_segment().trace_id}')

    # this call works
    logger.info(f'trace_id before working: {xray_recorder.current_segment().trace_id}')
    s3_resource.Bucket('resultfiles-test').put_object(Key='hello_world.txt', Body=b'hello world')
    logger.info(f'trace_id after working: {xray_recorder.current_segment().trace_id}')

    xray_recorder.end_segment()
```
When I set the logging level to debug to see the segment sent, it includes the second (working) call to S3:

```json
{
  "id": "a69a1a6eb4e12c86",
  "name": "test segment",
  "start_time": 1675123698.393169,
  "in_progress": false,
  "aws": {
    "xray": {
      "sdk": "X-Ray for Python",
      "sdk_version": "2.11.0"
    }
  },
  "subsegments": [
    {
      "id": "1130954ac4c9383e",
      "name": "s3",
      "start_time": 1675123699.1555898,
      "parent_id": "a69a1a6eb4e12c86",
      "in_progress": false,
      "http": {
        "response": {
          "status": 200
        }
      },
      "aws": {
        "operation": "PutObject",
        "region": "us-east-1",
        "request_id": "CTQFC52JXK7BED7F",
        "id_2": "4lazYkItN+LCNrSfBTnved/tEByY50Od+KArL3cX5uWrx8BPHDXNMCnxOi/hhT4Tv9wd+aGbGdw=",
        "key": "hello_world.txt",
        "bucket_name": "resultfiles-test"
      },
      "trace_id": "1-63d85bf2-67e791f683029bb451728c4a",
      "type": "subsegment",
      "namespace": "aws",
      "end_time": 1675123699.3337479
    }
  ],
  "trace_id": "1-63d85bf2-67e791f683029bb451728c4a",
  "service": {
    "runtime": "CPython",
    "runtime_version": "3.9.8"
  },
  "end_time": 1675123699.335922
}
```
I discovered the issue can be resolved if I disable multi-threading on upload_file.
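That workaround fits the likely root cause: `upload_file` runs a managed, multi-threaded transfer by default, while the X-Ray SDK's default context keeps the current entity in thread-local storage, so worker threads can't see a segment opened on the main thread. A minimal stdlib sketch of that mechanism (no boto3 or aws_xray_sdk involved; the `begin_segment`/`current_segment` names here are illustrative stand-ins, not the SDK's internals):

```python
import threading

# Simulate the SDK's default context: the "current segment" lives in
# thread-local storage, visible only to the thread that set it.
_context = threading.local()

def begin_segment(name):
    _context.segment = name

def current_segment():
    # Mirrors the "cannot find the current segment" failure mode:
    # returns None when called from a thread that never set one.
    return getattr(_context, "segment", None)

begin_segment("test segment")
print(current_segment())  # main thread sees the segment: test segment

results = []
worker = threading.Thread(target=lambda: results.append(current_segment()))
worker.start()
worker.join()
print(results[0])  # worker thread sees None, like upload_file's transfer workers
```

Keeping the transfer on the calling thread (e.g. with `use_threads=False` in boto3's `TransferConfig`) avoids the lookup from a worker thread entirely, which is consistent with the workaround above.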
Currently using Python 3.9, botocore 1.29.50, and aws_xray_sdk 2.11.0.
When I run the code snippet below, I get an error saying there's no open segment for the calls that use upload_file. The subsequent calls using put_object have no issues and show up in the segment data. Logging the trace_id was also successful.