Commit
Merge branch 'main' of github.com:juspay/hyperswitch into iatapay-through-hyperswitch-cypress

Commits merged from 'main' of github.com:juspay/hyperswitch:

- feat(router): add support for googlepay step up flow (#2744)
- fix(access_token): use `merchant_connector_id` in access token (#5106)
- feat: added kafka events for authentication create and update (#4991)
- feat(ci): add vector to handle logs pipeline (#5021)
- feat(users): Decision manager flow changes for SSO (#4995)
- ci(cypress): Fix payment method id for non supported connectors (#5075)
- refactor(core): introduce an interface to switch between old and new connector integration implementations on the connectors (#5013)
- refactor(events): populate object identifiers in outgoing webhooks analytics events during retries (#5067)
- Refactor: [Fiserv] Remove Default Case Handling (#4767)
- chore(version): 2024.06.24.0
- fix(router): avoid considering pre-routing results during `perform_session_token_routing` (#5076)
- refactor(redis): spawn one subscriber thread for handling all the published messages to different channels (#5064)
- feat(users): setup user authentication methods schema and apis (#4999)
- feat(payment_methods): Implement Process tracker workflow for Payment method Status update (#4668)
- chore(version): 2024.06.20.1
- chore(postman): update Postman collection files
- fix(payment_methods): support last used for off session token payments (#5039)
- ci(postman): add net_amount field test cases (#3286)
- refactor(connector): [Mifinity] dynamic fields for mifinity (#5056)
- refactor(payment_method): [Klarna] store and populate payment_type for klarna_sdk Paylater in response (#4956)
Showing 215 changed files with 6,208 additions and 1,100 deletions.
@@ -0,0 +1,134 @@ (new file)

acknowledgements:
  enabled: true

api:
  enabled: true
  address: 0.0.0.0:8686

sources:
  kafka_tx_events:
    type: kafka
    bootstrap_servers: kafka0:29092
    group_id: sessionizer
    topics:
      - hyperswitch-payment-attempt-events
      - hyperswitch-payment-intent-events
      - hyperswitch-refund-events
      - hyperswitch-dispute-events
    decoding:
      codec: json

  app_logs:
    type: docker_logs
    include_labels:
      - "logs=promtail"

  vector_metrics:
    type: internal_metrics

  node_metrics:
    type: host_metrics

transforms:
  plus_1_events:
    type: filter
    inputs:
      - kafka_tx_events
    condition: ".sign_flag == 1"

  hs_server_logs:
    type: filter
    inputs:
      - app_logs
    condition: '.labels."com.docker.compose.service" == "hyperswitch-server"'

  parsed_hs_server_logs:
    type: remap
    inputs:
      - app_logs
    source: |-
      .message = parse_json!(.message)

  events:
    type: remap
    inputs:
      - plus_1_events
    source: |-
      .timestamp = from_unix_timestamp!(.created_at, unit: "seconds")

sinks:
  opensearch_events:
    type: elasticsearch
    inputs:
      - events
    endpoints:
      - "https://opensearch:9200"
    id_key: message_key
    api_version: v7
    tls:
      verify_certificate: false
      verify_hostname: false
    auth:
      strategy: basic
      user: admin
      password: 0penS3arc#
    encoding:
      except_fields:
        - message_key
        - offset
        - partition
        - topic
    bulk:
      # Add a date prefixed index for better grouping
      # index: "vector-{{ .topic }}-%Y-%m-%d"
      index: "{{ .topic }}"

  opensearch_logs:
    type: elasticsearch
    inputs:
      - parsed_hs_server_logs
    endpoints:
      - "https://opensearch:9200"
    api_version: v7
    tls:
      verify_certificate: false
      verify_hostname: false
    auth:
      strategy: basic
      user: admin
      password: 0penS3arc#
    bulk:
      # Add a date prefixed index for better grouping
      # index: "vector-{{ .topic }}-%Y-%m-%d"
      index: "logs-{{ .container_name }}-%Y-%m-%d"

  log_events:
    type: loki
    inputs:
      - kafka_tx_events
    endpoint: http://loki:3100
    labels:
      source: vector
      topic: "{{ .topic }}"
      job: kafka
    encoding:
      codec: json

  log_app_loki:
    type: loki
    inputs:
      - parsed_hs_server_logs
    endpoint: http://loki:3100
    labels:
      source: vector
      job: app_logs
      container: "{{ .container_name }}"
      stream: "{{ .stream }}"
    encoding:
      codec: json

  metrics:
    type: prometheus_exporter
    inputs:
      - vector_metrics
      - node_metrics
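For intuition, the Kafka-event path of the pipeline above can be sketched in plain Python: `plus_1_events` keeps only records whose `sign_flag` is 1 (dropping retraction rows), and the `events` remap derives a timestamp from the epoch-seconds `created_at`, mirroring VRL's `from_unix_timestamp!(.created_at, unit: "seconds")`. The function names and the sample records are illustrative stand-ins, not part of the actual config.

```python
from datetime import datetime, timezone

def plus_1_events(records):
    """Filter stage: keep only records with sign_flag == 1."""
    return [r for r in records if r.get("sign_flag") == 1]

def events(records):
    """Remap stage: derive a UTC timestamp from the created_at epoch seconds."""
    out = []
    for r in records:
        r = dict(r)  # avoid mutating the input record
        r["timestamp"] = datetime.fromtimestamp(r["created_at"], tz=timezone.utc)
        out.append(r)
    return out

# Hypothetical sample batch: one live row, one retraction (sign_flag == -1).
batch = [
    {"sign_flag": 1, "created_at": 1718841600},
    {"sign_flag": -1, "created_at": 1718841601},  # dropped by the filter
]
processed = events(plus_1_events(batch))
print(processed[0]["timestamp"].isoformat())  # 2024-06-20T00:00:00+00:00
```

In the real pipeline this filtering happens before the Elasticsearch/OpenSearch sink, so only the surviving (+1) rows are indexed under the `{{ .topic }}` index.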