Conversation
Thank you for submitting it! I'll try to find time to review it over the weekend.
Sorry for the delay here, and thank you for submitting this PR! 🔥 What do you think about isolating this logic to GraphQL only? My current thinking:
Just as an idea,
To clarify, is it to avoid breaking changes if someone uses …?
Just chiming in: I just came looking in this repo for the ability to call `.then` on the loader object, and I'm not using GraphQL: I'm using Blueprinter (i.e. a classic REST API with serializers). My use case is the same: often I've got multiple fields in my serializer that each make use of the same underlying query, and I don't want to have to run those queries twice. For the record, I would personally prefer using `.then` so I can just pass a block that describes what I want to do with the value once it's synced. I don't care if the method is actually called `then` or not (I'd happily use `batch_loader_then` to avoid name collisions); I just want some way of specifying what I want to do next with the value. I suspect this would be simple to implement, and it would benefit both GraphQL and non-GraphQL users.
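For concreteness, here is a rough sketch of what that request could look like with Blueprinter. The `then` call is hypothetical (batch-loader has no such method today), and `User`, `Post#user_id`, and the serializer are invented stand-ins:

```ruby
require "batch_loader"
require "blueprinter"

class PostSerializer < Blueprinter::Base
  identifier :id

  # Both fields share one batched User lookup; the hypothetical `then`
  # block describes what to do with the user once the batch is synced.
  LAZY_USER = lambda do |post|
    BatchLoader.for(post.user_id).batch do |user_ids, loader|
      User.where(id: user_ids).each { |user| loader.call(user.id, user) }
    end
  end

  field :author_name do |post, _options|
    LAZY_USER.call(post).then { |user| user&.name }   # hypothetical API
  end

  field :author_email do |post, _options|
    LAZY_USER.call(post).then { |user| user&.email }  # hypothetical API
  end
end
```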
Follow-up to this PR: #89
Added a `lazy_eval` method to chain methods together. I called it `lazy_eval` instead of `lazy` to avoid breaking changes.

Follows Enumerator::Lazy, with `eager` and `force` methods that only become available after `lazy_eval` is called. `eager` tells the loader to stop accumulating methods in the lazy chain and force a real evaluation on the next call. `force` tells the loader to sync immediately (kind of redundant; you shouldn't use `lazy_eval` if you're gonna `force`).

In GraphQL, a sync is implicitly done at the correct time, so `eager` isn't required to break the lazy chain: I automatically disable lazy chaining after a sync. This allows a nice-looking chain in GraphQL, but in non-GraphQL contexts you'd have to break the chain explicitly (see the sketch below).
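To make the two cases concrete, here is a hedged sketch of how chains built with the described `lazy_eval`/`eager`/`force` methods might read. These are not the PR's actual snippets; `User`, `post`, and the chained methods are purely illustrative:

```ruby
# GraphQL: the gem syncs implicitly at the right time, so the lazy chain
# can simply be left open -- no `eager` needed.
def author_name
  BatchLoader::GraphQL.for(object.user_id).batch do |user_ids, loader|
    User.where(id: user_ids).each { |user| loader.call(user.id, user) }
  end.lazy_eval.name.upcase
end

# Outside GraphQL nothing syncs for you, so the chain has to be closed
# explicitly: `eager` stops accumulating and evaluates on the next call,
# while `force` would sync immediately instead.
lazy_name = BatchLoader.for(post.user_id).batch do |user_ids, loader|
  User.where(id: user_ids).each { |user| loader.call(user.id, user) }
end.lazy_eval.name.upcase.eager
```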
The code is relatively easy to understand, and the API is fairly analogous to Enumerator::Lazy, but I understand if you think it's too complex. Let me know if you have any thoughts; it was fun to work on regardless.
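For reference, Ruby's own Enumerator::Lazy uses the same vocabulary: `force` realizes a lazy chain immediately, and `eager` (Ruby 2.7+) turns the chain back into a non-lazy enumerator so later calls evaluate eagerly:

```ruby
# Plain Ruby stdlib, for comparison with the naming above.
doubled = (1..Float::INFINITY).lazy.map { |n| n * 2 }

doubled.first(3)       # => [2, 4, 6]   nothing beyond 3 items is computed
doubled.take(3).force  # => [2, 4, 6]   `force` (alias of to_a) evaluates now

eager_enum = (1..10).lazy.map { |n| n * 2 }.eager
eager_enum.map { |n| n + 1 } # => a plain Array; the chain is eager again
```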