Create an adapter for DynamoDB event trigger #152
@ceilfors I would like to work on this issue.
Hi @sakthivel-tw and @ceilfors, I see a really good start has been made on this. Would it be alright if I pick it up?

There are a few nuances to this adapter, as a lot of people keep multiple entities in a single table. I think Laconia should aim to support that by providing more context in the records array. Perhaps this is what a record should look like:

```js
[{ item: { new: { pk: '1', sk: 'party', message: 'New Item!' }, old: undefined }, type: 'party-create' }]
```

aws-lambda-stream has quite a mature implementation handling plenty of the edge cases, so it could be used to inform this. A special adapter that supports batching and rate-limiting would be especially cool.
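To make the single-table nuance concrete, here is a minimal user-land sketch (not Laconia API) of how the suggested record shape could be derived from a raw stream record; `toRecord` and the `sk`-based type derivation are illustrative assumptions:

```js
const { DynamoDB } = require("aws-sdk");
const unmarshall = DynamoDB.Converter.unmarshall;

// Illustrative user-land mapper: raw DynamoDB stream record -> suggested shape.
const toRecord = streamRecord => {
  const { eventName, dynamodb } = streamRecord;
  // NewImage/OldImage arrive in DynamoDB AttributeValue format.
  const newImage = dynamodb.NewImage && unmarshall(dynamodb.NewImage);
  const oldImage = dynamodb.OldImage && unmarshall(dynamodb.OldImage);
  // Assume a single-table design where the sort key names the entity.
  const entity = (newImage || oldImage).sk;
  const action = { INSERT: "create", MODIFY: "update", REMOVE: "delete" }[eventName];
  return { item: { new: newImage, old: oldImage }, type: `${entity}-${action}` };
};
```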
@all-contributors please add @Ankcorn for ideas
I've put up a pull request to add @Ankcorn! 🎉
@Ankcorn There is no active pull request at the moment by @sakthivel-tw, so definitely happy for you to pick it up!

That's an interesting thought on supporting multiple entities. I presume you're talking about the single-table design? To be honest it's a pattern that I haven't explored yet, as I've been living with Faux SQL at the moment. But based on my understanding, most users will have their tables designed so differently that it'll be quite impossible for a framework to derive what type of entity is coming. So I think even for a simple table, the conversion from DDB items to entities should belong in user land (an entity factory). This means Laconia may simply need to be able to accept an entity-factory type of function. This function would then be called when we're adapting the event.

I don't actually know the best practice on batching and rate-limiting of DDB streams. Is this something that should be handled at the AWS level? I'm thinking that there'll be complexity around handling a large DDB stream where the user might hit a timeout (we could recurse, I guess, which is what Laconia is already doing, see: https://laconiajs.io/docs/guides/long-running-tasks). Is this a better pattern, for example: DDB stream -> lambda -> sqs/kinesis -> lambda -> legacy, to throttle the requests?

Something to bear in mind: the API for the DynamoDB adapter sadly has a design flaw; it only supports newImage at the moment. If I were to redesign it, I'd make sure that Laconia users know what requests they are handling, and even explicitly force them to be aware that they might need to handle different types of DynamoDB events. Something like (not well thought through yet..):
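A purely hypothetical sketch of such an API, following the Laconia adapter pattern; the `dynamodbStream` adapter name, its `entityFactory` option, and the record shape are all invented for illustration and are not actual Laconia API:

```js
const laconia = require("@laconia/core");
// `dynamodbStream` and its `entityFactory` option are invented names.
const dynamodbStream = require("@laconia/adapter").dynamodbStream({
  // User-land conversion from a raw DDB item to a domain entity.
  entityFactory: image => ({ id: image.pk, message: image.message })
});

// Every record carries its event type, so the user is forced to decide
// what to do for each kind of DynamoDB stream event.
const app = async records => {
  for (const record of records) {
    switch (record.type) {
      case "insert":
        console.log("created", record.new);
        break;
      case "modify":
        console.log("updated", record.old, "->", record.new);
        break;
      case "remove":
        console.log("deleted", record.old);
        break;
    }
  }
};

exports.handler = laconia(dynamodbStream(app));
```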
You are absolutely right about the fact that it would be impossible to serve everyone. My suggestion would have been to check the type field -> SK -> table name, but I agree that your suggestion of a factory function would be much better.

Because this is a polling handler like Kinesis, hitting the timeout is not a concern except for some very niche use cases. You just configure the batch size and the parallelization factor to the right values for your table's throughput and workload. If you have a thing that takes 15 mins, you can just set the batch size to 1 and then configure the parallelization so the stream doesn't lag behind too much. My suggestions of ways to rate-limit and batch in the stream are totally a case of excitement-driven over-engineering and should be ignored.

How would you recommend handling the different event formats? In my opinion, one of the things that I really like about Laconia is that it handles the places where AWS gets the API wrong in a lovely way:

```js
sns(event)
// vs
JSON.parse(event.Records[0].Sns.Message)
```

It just saves so many brain cells and reduces the chance of bugs. I once shipped a bug where I typed SNS 😓

My personal preference would be to do this:
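A sketch of that shape, following the description just below (the values are illustrative):

```js
{
  item: {
    new: { pk: '1', sk: 'party', message: 'New Item!' }, // undefined for delete
    old: undefined // undefined for insert, the previous image otherwise
  },
  type: 'insert' // or 'update' / 'delete'
}
```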
type would be a string with the value insert/update/delete. For insert, old would be undefined, and for delete, new would be undefined.

Obviously this has the drawback of being a breaking change. How much should this impact the decision?
Although, to disagree with my previous comment, all the code we have to handle DDB stream events starts with:

```
if (insert) {
  // do the insert thing
} else if (modify) {
  ...
} else {
  ...
}
```

so maybe the framework abstracting that away would be nice, but I am not sure if it would just add more complexity.
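For reference, that branching usually keys off the raw stream record's eventName. A DynamoDB stream event delivered to Lambda looks roughly like this, trimmed to the relevant fields:

```js
{
  Records: [
    {
      eventName: "INSERT", // or "MODIFY" / "REMOVE"
      dynamodb: {
        Keys: { pk: { S: "1" }, sk: { S: "party" } },
        // Both images use the DynamoDB AttributeValue format; OldImage shows
        // up for MODIFY/REMOVE when the stream view type includes old images.
        NewImage: { pk: { S: "1" }, sk: { S: "party" }, message: { S: "New Item!" } }
      }
    }
  ]
}
```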
This ticket was created from #111.
Is your feature request related to a problem? Please describe.
An adapter for the DynamoDB stream trigger, similar to what Laconia already has for Kinesis: https://laconiajs.io/docs/api/adapter#kinesis-app.
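For comparison, the existing Kinesis adapter from the linked docs is used along these lines (a sketch of the documented pattern; the docs above are authoritative):

```js
const laconia = require("@laconia/core");
const kinesis = require("@laconia/adapter").kinesis();

// `records` arrives with the Kinesis payloads already JSON-parsed.
const app = async records => {
  console.log(records);
};

exports.handler = laconia(kinesis(app));
```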
Describe the solution you'd like
Additional context