feat: add event metadata as json to publish error logs #188

Merged Aug 9, 2023 (3 commits)
6 changes: 6 additions & 0 deletions CHANGELOG.rst
@@ -14,6 +14,12 @@ Change Log
Unreleased
**********

+[5.3.0] - 2023-08-08
+********************
+Changed
+=======
+* Added event_metadata_as_json to ProducingContext for easier replay from logs
+
[5.2.0] - 2023-08-03
********************
Changed
2 changes: 1 addition & 1 deletion edx_event_bus_kafka/__init__.py
@@ -9,4 +9,4 @@
from edx_event_bus_kafka.internal.consumer import KafkaEventConsumer
from edx_event_bus_kafka.internal.producer import KafkaEventProducer, create_producer

-__version__ = '5.2.0'
+__version__ = '5.3.0'
2 changes: 2 additions & 0 deletions edx_event_bus_kafka/internal/producer.py
@@ -208,6 +208,7 @@ class ProducingContext:
    event_data = attr.ib(type=dict, default=None)
    event_metadata = attr.ib(type=EventsMetadata, default=None)
    event_data_as_json = attr.ib(type=str, default=None)
+    event_metadata_as_json = attr.ib(type=str, default=None)

    def __repr__(self):
        """Create a logging-friendly string"""
@@ -284,6 +285,7 @@ def send(
            event_data=event_data, event_metadata=event_metadata)
        try:
            context.event_data_as_json = json.dumps(get_signal_serializer(signal).to_dict(event_data))
+            context.event_metadata_as_json = event_metadata.to_json()
            full_topic = get_full_topic(topic)
            context.full_topic = full_topic

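For context on "easier replay from logs": once event_metadata_as_json appears in a publish-error log line, the metadata can be turned back into an object and handed to a fresh send() call. Below is a minimal sketch of that round trip, assuming an EventsMetadata.from_json counterpart to the to_json() call above in openedx-events 8.5.0; every logged value shown is illustrative, not taken from a real log.

from openedx_events.data import EventsMetadata

# Stand-in for an event_metadata_as_json value copied out of a
# publish-error log line; all field values here are made up.
logged_json = (
    '{"id": "11111111-2222-3333-8444-555555555555", '
    '"event_type": "org.openedx.test.event.v1", '
    '"minorversion": 0, '
    '"source": "openedx/lms/web", '
    '"sourcehost": "edx.devstack.lms", '
    '"time": "2023-08-08T16:00:00+00:00", '
    '"sourcelib": [8, 5, 0]}'
)

# Rebuild the metadata object the failed publish used...
metadata = EventsMetadata.from_json(logged_json)

# ...then re-publish with the original id, time, etc., e.g.:
#   producer = edx_event_bus_kafka.create_producer()
#   producer.send(signal=..., topic=..., event_key_field=...,
#                 event_data=..., event_metadata=metadata)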
2 changes: 2 additions & 0 deletions edx_event_bus_kafka/internal/tests/test_producer.py
@@ -247,6 +247,7 @@ def test_full_event_data_present_in_key_extraction_error(self, mock_logger, *args):
assert f"sourcehost='{metadata.sourcehost}'" in error_string
assert "event_data_as_json='{\"test_data\": {\"course_id\": \"id\", \"sub_name\": \"name\"}}'"\
in error_string
assert f"event_metadata_as_json='{metadata.to_json()}'" in error_string
Contributor

[observation] It seems some of the metadata is now duplicated. I guess that's not a big deal if this is just for errors.

Contributor Author

It is indeed duplicated. I thought we'd be able to use the columns from Splunk to make a metadata object, but the problem is there's a collision in the "source" field, which Splunk uses to store other data.

Contributor Author

I suppose on further reflection we could have just changed how the source shows up in the string representation, but that would mean making a change in openedx-events specifically for Splunk reasons, whereas the JSON representation seems like a more general solution. I don't think it's too much of a load on the logs to add this, especially because we only log on failure.


    @patch(
        'edx_event_bus_kafka.internal.producer.get_serializers', autospec=True,
@@ -287,6 +288,7 @@ def test_full_event_data_present_in_kafka_error(self, mock_logger, *args):
assert "error=bad!" in error_string
assert "event_data_as_json='{\"test_data\": {\"course_id\": \"ABCx\", \"sub_name\": \"name\"}}'"\
in error_string
assert f"event_metadata_as_json='{metadata.to_json()}'" in error_string

    @override_settings(EVENT_BUS_KAFKA_POLL_INTERVAL_SEC=0.05)
    def test_polling_loop_terminates(self):
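The new assertions pass because ProducingContext's logging string renders each attr field as name='value'. A standalone illustration of that pattern, using the default attrs-generated repr rather than the library's customized __repr__ (the Context class here is a hypothetical stand-in, not the real ProducingContext):

import attr

@attr.s
class Context:
    # Hypothetical two-field stand-in for ProducingContext.
    event_data_as_json = attr.ib(type=str, default=None)
    event_metadata_as_json = attr.ib(type=str, default=None)

ctx = Context(
    event_data_as_json='{"test_data": {"course_id": "id"}}',
    event_metadata_as_json='{"id": "abc123"}',
)

# attrs generates name='value' pairs, the same format the tests above
# search for in the captured error string:
print(repr(ctx))
# Context(event_data_as_json='{"test_data": {"course_id": "id"}}', event_metadata_as_json='{"id": "abc123"}')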
2 changes: 1 addition & 1 deletion requirements/base.txt
@@ -48,7 +48,7 @@ markupsafe==2.1.3
# via jinja2
newrelic==8.8.1
# via edx-django-utils
-openedx-events==8.3.0
+openedx-events==8.5.0
# via -r requirements/base.in
pbr==5.11.1
# via stevedore
2 changes: 1 addition & 1 deletion requirements/dev.txt
@@ -202,7 +202,7 @@ newrelic==8.8.1
# via
# -r requirements/quality.txt
# edx-django-utils
-openedx-events==8.3.0
+openedx-events==8.5.0
# via -r requirements/quality.txt
packaging==23.1
# via
2 changes: 1 addition & 1 deletion requirements/doc.txt
@@ -152,7 +152,7 @@ newrelic==8.8.1
# via
# -r requirements/test.txt
# edx-django-utils
-openedx-events==8.3.0
+openedx-events==8.5.0
# via -r requirements/test.txt
packaging==23.1
# via
2 changes: 1 addition & 1 deletion requirements/quality.txt
@@ -150,7 +150,7 @@ newrelic==8.8.1
# via
# -r requirements/test.txt
# edx-django-utils
-openedx-events==8.3.0
+openedx-events==8.5.0
# via -r requirements/test.txt
packaging==23.1
# via
2 changes: 1 addition & 1 deletion requirements/test.txt
@@ -88,7 +88,7 @@ newrelic==8.8.1
# via
# -r requirements/base.txt
# edx-django-utils
-openedx-events==8.3.0
+openedx-events==8.5.0
# via -r requirements/base.txt
packaging==23.1
# via pytest