Initial prototype for trio.Nursery per-task-scope-management via a user defined single-yield-generator function 😎
#363
base: ctx_cancel_semantics_and_overruns
Conversation
tractor/trionics/_supervisor.py
Outdated
return self.result


class TaskHandle(Struct):
TODO: show a second example where the `cancel_scope_manager` generator wraps the trio internals listed as attrs into this compound type and then that is what's returned from the `.start_soon()` call?
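A rough, purely hypothetical sketch of that idea (none of these names exist in the PR; the nursery-side driving protocol is assumed):

```python
# Hypothetical sketch only (not the PR's code): a single-yield generator which
# bundles the per-task `trio` internals into one compound struct, so that a
# `.start_soon()`-style call can hand back a single handle instead of a tuple.
from dataclasses import dataclass
from typing import Generator, Optional

import trio
import outcome


@dataclass
class TaskHandle:
    cancel_scope: trio.CancelScope
    lowlevel_task: Optional[trio.lowlevel.Task] = None
    final_outcome: Optional[outcome.Outcome] = None


def wrap_internals_as_handle(
    task: trio.lowlevel.Task,
) -> Generator[TaskHandle, outcome.Outcome, None]:
    # assumed driving protocol: the nursery primes this generator when the
    # task is spawned, returns the yielded handle from `.start_soon()`, then
    # `.send()`s the task's final `Outcome` back in once the task exits.
    handle = TaskHandle(
        cancel_scope=trio.CancelScope(),  # the task wrapper is assumed to enter this
        lowlevel_task=task,
    )
    handle.final_outcome = yield handle
```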
scope_manager: ContextManager | None = None

async def start_soon(
I don't really know what the best API would be for doing this with `.start()`?

- you don't really need `.start()` since you can always implement it by getting the child task to `Event.set()`, or even further use this `MaybeOutcome` thing from above.
- how do you get the `cancel_scope_manager` yielded output in that case, since `.start()` would normally return whatever the child passed to `status.started()`?
- i.e. you can't do `started = await n.start(blah)` and expect `started` to be the `outcome, cs` stuff
- maybe the child task could get access to its own cancel scope "stuff" from somewhere and then choose to `status.started((outcome, cs))` it back to the caller?
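A tiny runnable sketch of that last bullet, assuming the child simply reports its own cancel scope (paired here with its lowlevel task, since the final outcome obviously can't exist yet at `started()` time):

```python
# Hedged sketch: the child grabs its own per-task "stuff" and reports it back
# through the normal `task_status.started()` channel, so plain `await
# n.start()` still works. Names are illustrative only.
import trio


async def child(*, task_status=trio.TASK_STATUS_IGNORED):
    with trio.CancelScope() as cs:
        me = trio.lowlevel.current_task()
        task_status.started((cs, me))  # caller gets (cancel scope, task)
        await trio.sleep_forever()


async def main():
    async with trio.open_nursery() as n:
        cs, task = await n.start(child)
        print(f'started {task.name!r}')
        cs.cancel()  # per-task cancellation via the started value


trio.run(main)
```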
Huh, i guess we could just offer some kinda support for hooking into the started value returned by the task? that way the `cancel_scope_manager` could decide what to send back to the `.start()` caller, with the default being the value that the child `.started()`?
huh, likely you could implement that as a double yield generator then? I dunno what would be best for ensuring that other than decorator params and then `async_fn` checking inside `.start()` against that i guess?
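Purely as a thought experiment, a "double yield" manager protocol might look something like this (the whole calling sequence described in the comments is an assumption, nothing here is implemented in the PR):

```python
# Speculative sketch of a "double yield" task manager protocol.
import trio
import outcome


def double_yield_task_manager():
    # 1st yield: suspend until the nursery sends in whatever the child task
    # passed to `task_status.started()`.
    child_started_value = yield
    cs = trio.CancelScope()
    # 2nd yield: hand the (possibly transformed) started value plus any extra
    # per-task state back to the `.start()` caller; resume with the task's
    # final `Outcome` once it exits.
    final: outcome.Outcome = yield (child_started_value, cs)
    print(f'task completed with {final!r}')
```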
tractor/trionics/_supervisor.py
Outdated
val: str = 'yoyoyo'
val_outcome, cs = await sn.start_soon(sleep_then_return_val, val)
res = await val_outcome.unwrap()
As already mentioned above in the comment on the method, probably a better name is warranted for this since you're actually first waiting for the outcome and then unwrapping its underlying value/result.
tractor/trionics/_supervisor.py
Outdated
assert res == val
print(f'GOT EXPECTED TASK VALUE: {res}')

print('WAITING FOR CRASH..')
scheduling `sleep_then_err()` results in an assertion error that is caught and handled in the `pdbp` REPL thanks to the `cancel_scope_manager` implementation which does this on std `Exception`s 😎

Note, you could also catch `trio.Cancelled` or wtv else by changing the except line 🏄🏼
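For illustration, a manager doing that kind of crash handling might look roughly like this (hypothetical names; stdlib `pdb` stands in for `pdbp` here):

```python
# Hedged sketch: unwrap the task's outcome and drop into a post-mortem REPL
# on any std Exception; swap the `except` line to trap `trio.Cancelled` or
# whatever else instead. Not the PR's exact implementation.
import pdb

import trio
import outcome


def debug_on_crash_manager():
    cs = trio.CancelScope()
    # the yielded value is assumed to be handed back to the `.start_soon()` caller
    oc: outcome.Outcome = yield cs
    try:
        oc.unwrap()        # re-raises the task's error (if any)
    except Exception:
        pdb.post_mortem()  # crash REPL (the PR uses `pdbp` instead)
```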
async def _start_wrapped_in_scope(
    task_status: TaskStatus[
        tuple[CancelScope, Task]
wrong, this should be the type sig of whatever is yielded from the `self.scope_manager` obviously
tractor/trionics/_supervisor.py
Outdated
# TODO: better way to concat the values delivered by the user
# provided `.scope_manager` and the outcome?
return tuple([maybe_outcome] + to_return)
yeah as mentioned in the TODO, don't love this concat-ing of the results from the `scope_manager()`, but i think this goes away if we instead implement our own generator as a `@trio.cancel_scope_manager` protocol since then the actual `tuple` yielded can be returned directly?
force-pushed from bc2bd2d to a7af633
tractor/trionics/_supervisor.py
Outdated
# TODO: this was working before?!
# nonlocal to_return

with cs:
Ahh wait, right.. so do we need to activate this here, or can it be done in the user's `@task_scope_manager` thing?
tractor/trionics/_supervisor.py
Outdated
# task = new_tasks.pop()

n: Nursery = self._n
cs = CancelScope()
Do we need this allocated here?
See comment(s) below.
force-pushed from e4d4434 to 6c089c0
`trio.Nursery` per-task-scope-management via a user defined single-yield-generator function 😎
As was listed in the many todos, this changes the `.start_soon()` impl to instead (manually) `.send()` into the user defined `@task_scope_manager` an `Outcome` from the spawned task. In this case the task manager wraps that in a user defined (and renamed) `TaskOutcome` and delivers that + a containing `trio.CancelScope` to the `.start_soon()` caller. Here the user defined `TaskOutcome` defines a `.wait_for_result()` method that can be used to await the task's exit and handle its underlying returned value or raised error; the implementation could be different and subject to the user's own whims. Note that by default, if this was added to `trio`'s core, the `@task_scope_manager` would simply be implemented as either a `None`-yielding single-yield-generator or, more likely, entirely ignored by the runtime (as in no manual task outcome collecting, generator calling and sending done at all) if the user does not provide a `task_scope_manager` to the nursery at open time.
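To make that flow concrete, here is a small self-contained sketch of sending an `Outcome` into a single-yield generator. It is not the PR's `tractor.trionics._supervisor` code; all names (`TaskOutcome`, `task_scope_manager`, `ManagedNursery`, `open_managed_nursery`) and the wiring are illustrative assumptions, and error/cancellation handling is deliberately simplified:

```python
# Illustrative sketch only; not the PR's implementation.
from contextlib import asynccontextmanager

import trio
import outcome


class TaskOutcome:
    '''Handle for awaiting a spawned task's final `outcome.Outcome`.'''
    def __init__(self):
        self._done = trio.Event()
        self._outcome = None  # set to an `outcome.Outcome` when the task exits

    def _set(self, oc):
        self._outcome = oc
        self._done.set()

    async def wait_for_result(self):
        await self._done.wait()
        return self._outcome.unwrap()  # return the value or raise the error


def task_scope_manager(task_outcome: TaskOutcome):
    # single-yield generator: whatever is yielded here is handed back to the
    # `.start_soon()` caller; the task's `Outcome` is `.send()`-ed in after
    # the task exits.
    cs = trio.CancelScope()
    oc: outcome.Outcome = yield (task_outcome, cs)
    task_outcome._set(oc)


class ManagedNursery:
    def __init__(self, nursery: trio.Nursery):
        self._n = nursery

    async def start_soon(self, async_fn, *args):
        mngr = task_scope_manager(TaskOutcome())
        to_caller = mngr.send(None)  # prime the generator up to its yield
        _, cs = to_caller

        async def _wrapped():
            with cs:  # per-task cancellation scope, entered in the child task
                oc = await outcome.acapture(async_fn, *args)
            try:
                mngr.send(oc)  # deliver the task's outcome to the manager
            except StopIteration:
                pass

        self._n.start_soon(_wrapped)
        return to_caller


@asynccontextmanager
async def open_managed_nursery():
    async with trio.open_nursery() as n:
        yield ManagedNursery(n)


async def main():
    async with open_managed_nursery() as sn:
        task_outcome, cs = await sn.start_soon(trio.sleep, 0.1)
        print(await task_outcome.wait_for_result())  # prints `None` after 0.1s


if __name__ == '__main__':
    trio.run(main)
```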
Turns out the nursery doesn't have to care about allocating a per task `CancelScope` since the user can just do that in the `@task_scope_manager` if desired B) So just mask all the nursery cs allocating with the intention of removal. Also add a test for per-task-cancellation by starting the crash task as a `trio.sleep_forever()` but then cancelling it via the user allocated cs and ensuring the crash propagates as expected 💥
- drop unneeded (and commented) internal cs allocating bits.
- bypass all task manager stuff if no generator is provided by the caller; i.e. just call `.start_soon()` as normal.
- fix `Generator` typing.
- add some prints around task manager.
- wrap in `TaskOutcome.lowlevel_task: Task`.
force-pushed from 6c089c0 to 940e65f
tractor/trionics/_supervisor.py
Outdated
return self.result


class ScopePerTaskNursery(Struct):
Heh, guess we don't need much of a special name now? `PerTaskHookableNursery` 😂
tractor/trionics/_supervisor.py
Outdated
*args,

name=None,
scope_manager: ContextManager | None = None,
maybe `task_manager` is a better name?
name=None,
scope_manager: ContextManager | None = None,

) -> tuple[CancelScope, Task]:
needs to again be dynamically defined at runtime.. soo use the `ParamSpec` stuff?
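A rough idea of what that typing could look like with `ParamSpec` (class and type-var names here are made up; the real return type still depends on what the user's manager yields):

```python
# Typing sketch only (Python 3.10+ for `ParamSpec` from `typing`).
from typing import Awaitable, Callable, Generic, ParamSpec, TypeVar

P = ParamSpec('P')
YieldT = TypeVar('YieldT')  # whatever the user's task/scope manager yields per task


class TypedNursery(Generic[YieldT]):
    async def start_soon(
        self,
        async_fn: Callable[P, Awaitable[object]],
        *args: P.args,
        **kwargs: P.kwargs,
    ) -> YieldT:
        ...
```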
tractor/trionics/_supervisor.py
Outdated
await trio.sleep(0.6)
print(
    'Cancelling and waiting on {err_outcome.lowlevel_task} '
lul, missing f-string
force-pushed from cda85a3 to b24f0d5
tractor/trionics/_supervisor.py
Outdated
# TODO: define a decorator to runtime type check that this is a generator
# with a single yield that also delivers a value (of some std type) from
# the yield expression?
# @trio.task_scope_manager
maybe just `@trio.task_manager`?
tractor/trionics/_supervisor.py
Outdated
def add_task_handle_and_crash_handling(
    nursery: Nursery,

# TODO: is this the only way we can have a per-task scope
no, we don't need the nursery to allocate any cancel scope per task internally! need to remove this!
force-pushed from aed71dc to 0025e09
force-pushed from 0025e09 to e0c888f
Allows dynamically importing `pdbp` when enabled and provides a way to eventually link with `tractor`'s own debug mode flag.
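The import-when-enabled part might amount to something like the following (the helper and flag names are made up; only the optional-import pattern is the point):

```python
# Hypothetical helper: only pull in the nicer debugger when debugging is on.
def maybe_get_debugger(debug_mode: bool):
    if not debug_mode:
        return None
    try:
        import pdbp as debugger  # drop-in pdb replacement, if installed
    except ImportError:
        import pdb as debugger   # stdlib fallback
    return debugger
```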
force-pushed from 872dcd7 to c7e27ad
As per some discussion in the `trio` gitter and starting something toward #22 and supporting our debugger crash mode from within a `trio`-compat task nursery.

Idea: a `trio.Nursery` API for optionally allowing a user to define a per-task manager

But why?

Well if you wanted any of the following (some of which are implemented in the module script in this patch):

- access to `trio.Nursery` "internals" for any per-task oriented purpose.
- waiting on a task to finish and getting its result.
- a way of debugging crashes, (unexpected) cancels or just generally tracing the `trio` per task control flow.
- supervisor strategies ala `one_for_one` from erlang/elixir and friends.

ToDo:

- offer a `task_manager` context-manager-as-generator API such that the current `TaskOutcome` struct is not tied to the internals of the `ScopePerTaskNursery.start_soon()` implementation.
  - maybe an `outcome: Outcome = yield (cs, ..)` is delivered to the user defined mngr via a `Outcome.send(mngr)` call such that only raw `trio` (dependency) internal types are provided to the user for wrapping / adjusting.
  - though we can't just use `Outcome.send()` bc it would send the underlying value, not the outcome instance; not sure if that's best? (see the small demo at the very end of this description)
- way better naming since most of this was whipped up on the first try and getting good names is probably going to be tricky 😂
  - `@task_manager` is maybe better? `@task_scope_manager`?
  - `ScopePerTaskNursery` is terrible.. how about `TaskScopedNursery`? `TaskManagerNursery`?
  - `TaskOutcome.unwrap()` seems wrong, despite it being an example of a user defined handle.
    - now renamed to `.wait_for_result()` which is a lot more pythonic in terms of plain-ol'-simple-semantics 🏄
    - maybe `.unwind()` is more descriptive? as in unwinding the outcome from the task would imply you both have to wait on a function call and the subsequent popping of the stack ..?
      - though that's sorta describing an imperative lang in terms of lower level fp abstractions which is not only confusing but a bit moot: we don't need to hear math about stuff we can't change in (python and) the semantics of a method.. :joy:
- test the waters in `trio` core for interest in this idea
  - brought it up with the `anyio` author and njs but don't think they understood what i was proposing (hence this proto 😂)
  - the hook point inside `trio`'s scheduler would be about here: https://github.com/python-trio/trio/blob/master/trio/_core/_run.py#L1601 since this is where the nursery-global cancel scope's `CancelStatus` is activated on the currently spawning task.
  - likely we'd need to drop the plain ctx manager API and offer the generator-style `@cancel_scope_manager` thing because we'll need to call `__enter__`/`__exit__` from two different `Runner` methods -> `.spawn_impl()` and `.task_exited()`
  - likely they won't accept the PR (nor care?) and it'll get relegated to our `trionics` secret sauces pantry yet again 😂
- figure out if it's possible to swap in this `open_nursery()` for all uses of `trio.open_nursery()` when our users enable debug mode 😂

Hopefully much more to come!
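Regarding the `Outcome.send()` caveat in the ToDo list above, here is a tiny self-contained demo of the difference (the generator and names are made up for illustration):

```python
# `Outcome.send(gen)` delivers the *unwrapped value* into a generator, so to
# hand the manager the `Outcome` instance itself you must send it manually.
import outcome


def manager():
    got = yield
    print(f'manager received: {got!r}')


# 1) `Outcome.send(gen)` sends the underlying value..
g1 = manager()
next(g1)  # prime up to the bare `yield`
try:
    outcome.Value('task result').send(g1)   # manager sees 'task result'
except StopIteration:
    pass

# 2) ..so to deliver the `Outcome` instance itself, send it directly.
g2 = manager()
next(g2)
try:
    g2.send(outcome.Value('task result'))   # manager sees the Value instance
except StopIteration:
    pass
```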