hatchet/sdks/python/tests/test_rest_api.py
Matt Kaye 2f33dd4dbd Feat: Misc. Python improvements + Streaming Improvements (#1846)
* fix: contextvars explicit copy

* feat: fix a ton of ruff errors

* fix: couple more ruff rules

* fix: ignore unhelpful rule

* fix: exception group in newer Python versions for improved handling

* fix: workflow docs

* feat: context docs

* feat: simple task counter

* feat: config for setting max tasks

* feat: graceful exit once worker exceeds max tasks

* fix: optional

* fix: docs

* fix: events docs + gen

* chore: gen

* fix: one more dangling task

* feat: add xdist in ci

* fix: CI

* fix: xdist fails me once again

* fix: fix + extend some tests

* fix: test cleanup

* fix: exception group

* fix: ugh

* feat: changelog

* Add Ruff linter callout to post

* refactor: clean up runner error handling

* feat: improved errors

* fix: lint

* feat: hacky serde impl

* fix: improve serde + formatting

* fix: logging

* fix: lint

* fix: unexpected errors

* fix: naming, ruff

* fix: rm cruft

* Fix: Attempt to fix namespacing issue in event waits (#1885)

* feat: add xdist in ci

* fix: attempt to fix namespacing issue in event waits

* fix: namespaced worker names

* fix: applied namespace to the wrong thing

* fix: rm hack

* drive by: namespacing improvement

* fix: delay

* fix: changelog

* fix: initial log work

* fix: more logging work

* fix: rm print cruft

* feat: use a queue to send logs

* fix: sentinel value to stop the loop

* fix: use the log sender everywhere

* fix: make streaming blocking, remove more thread pools

* feat: changelog

* fix: linting issues

* fix: broken test

* chore: bunch more generated stuff

* fix: changelog

* fix: one more

* fix: mypy

* chore: gen

* Feat: Streaming Improvements (#1886)

* Fix: Filter list improvements (#1899)

* fix: uuid validation

* fix: improve filter filtering

* fix: inner join

* fix: bug in workflow cached prop

* chore: bump

* fix: lint

* chore: changelog

* fix: separate filter queries

* feat: improve filter filtering

* fix: queries and the like

* feat: add xdist in ci

* feat: streaming test + gen

* feat: add index to stream event

* fix: rm langfuse dep

* fix: lf

* chore: gen

* feat: impl index for stream on context

* feat: tweak protos

* feat: extend test

* feat: send event index through queue

* feat: first pass + debug logging

* debug: fixes

* debug: more possible issues

* feat: generate new stream event protos

* feat: first pass at using an alternate exchange for replaying incoming stream events

* fix: exchange create timing

* fix: rm unused protos

* chore: gen

* feat: python cleanup

* fix: revert rabbit changes

* fix: unwind a bunch of cruft

* fix: optional index

* chore: gen python

* fix: event index nil handling

* feat: improve test

* fix: stream impl in sdk

* fix: make test faster

* chore: gen a ton more stuff

* fix: test

* fix: sorting helper

* fix: bug

* fix: one more ordering bug

* feat: add some tests for buffering logic

* feat: hangup test

* feat: test no buffering if no index sent

* fix: regular mutex

* fix: pr feedback

* fix: conflicts
2025-06-25 10:11:01 -04:00
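The "exception group" commits above refer to Python 3.11's exception groups, which can bundle several failures raised from concurrent tasks. A minimal, hedged sketch of unwrapping one down to a single leaf exception; this is a general pattern, not the SDK's actual error-handling code:

```python
# Sketch only: unwrap an ExceptionGroup (Python 3.11+) down to one leaf error.
# On older interpreters BaseExceptionGroup does not exist, so guard the check.
import sys


def first_leaf_exception(exc: BaseException) -> BaseException:
    if sys.version_info >= (3, 11) and isinstance(exc, BaseExceptionGroup):
        # Recurse into nested groups and return the first concrete exception.
        return first_leaf_exception(exc.exceptions[0])
    return exc
```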
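The "simple task counter", "config for setting max tasks", and "graceful exit once worker exceeds max tasks" commits describe a worker that counts completed tasks and stops pulling new work once a configured limit is reached. A rough sketch of that pattern; the names (`max_tasks`, `serve`) are illustrative and not the SDK's actual configuration or API:

```python
# Illustrative only: count completed tasks and stop pulling new work once a
# configured maximum is reached. Not the SDK's real worker implementation.
import asyncio
from typing import Awaitable, Callable


class CountingWorker:
    def __init__(self, max_tasks: int | None = None) -> None:
        self.max_tasks = max_tasks      # None means no limit
        self.completed = 0              # simple task counter
        self._stop = asyncio.Event()    # set to trigger a graceful exit

    async def serve(
        self, queue: "asyncio.Queue[Callable[[], Awaitable[None]]]"
    ) -> None:
        while not self._stop.is_set():
            task = await queue.get()
            await task()
            self.completed += 1
            if self.max_tasks is not None and self.completed >= self.max_tasks:
                # The current task has finished; stop accepting anything new.
                self._stop.set()
```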
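The "use a queue to send logs" and "sentinel value to stop the loop" commits describe a single background sender that drains a queue until a sentinel arrives. A hedged sketch of that queue-plus-sentinel shape, with a stand-in transport rather than the SDK's real one:

```python
# Illustrative sketch: producers enqueue log lines, one consumer loop forwards
# them, and pushing a sentinel object tells the loop to exit. The names here
# are hypothetical and do not reflect the SDK's internals.
import asyncio

_STOP = object()  # sentinel value used to stop the loop


class LogSender:
    def __init__(self) -> None:
        self.queue: asyncio.Queue[object] = asyncio.Queue()

    def send(self, line: str) -> None:
        # Producers never block; they just enqueue the log line.
        self.queue.put_nowait(line)

    async def run(self) -> None:
        # Single consumer: forward log lines until the sentinel arrives.
        while True:
            item = await self.queue.get()
            if item is _STOP:
                break
            await self._forward(str(item))

    def shutdown(self) -> None:
        self.queue.put_nowait(_STOP)

    async def _forward(self, line: str) -> None:
        # Stand-in for the actual transport used by the real log sender.
        print(line)
```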
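Several commits ("add index to stream event", "send event index through queue", "test no buffering if no index sent", plus the ordering fixes) point at buffering out-of-order stream events by index and releasing them sequentially. A small sketch of that reordering logic, written from the commit messages rather than the actual implementation:

```python
# Illustrative sketch: hold out-of-order stream events until the next expected
# index arrives; events without an index pass through immediately (no buffering).
from typing import Iterable, Iterator


def ordered_stream(events: Iterable[tuple[int | None, str]]) -> Iterator[str]:
    next_index = 0
    buffered: dict[int, str] = {}

    for index, payload in events:
        if index is None:
            # No index sent: yield immediately, no buffering.
            yield payload
            continue

        buffered[index] = payload
        # Flush everything that is now contiguous with the last yielded index.
        while next_index in buffered:
            yield buffered.pop(next_index)
            next_index += 1


# Events arriving as 1, 0, 2 are yielded in order as 0, 1, 2.
assert list(ordered_stream([(1, "b"), (0, "a"), (2, "c")])) == ["a", "b", "c"]
```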

62 lines
1.6 KiB
Python

import asyncio

import pytest

from examples.dag.worker import dag_workflow
from hatchet_sdk import Hatchet


@pytest.mark.asyncio(loop_scope="session")
async def test_list_runs(hatchet: Hatchet) -> None:
    dag_result = await dag_workflow.aio_run()

    runs = await hatchet.runs.aio_list(
        limit=10_000,
        only_tasks=True,
    )

    # Every task output from the DAG run should appear among the listed runs.
    for v in dag_result.values():
        assert v in [r.output for r in runs.rows]


@pytest.mark.asyncio(loop_scope="session")
async def test_get_run(hatchet: Hatchet) -> None:
    dag_ref = await dag_workflow.aio_run_no_wait()

    # Give the run time to complete before fetching it.
    await asyncio.sleep(5)

    run = await hatchet.runs.aio_get(dag_ref.workflow_run_id)

    assert dag_workflow.config.name in run.run.display_name
    assert run.run.status.value == "COMPLETED"
    assert len(run.shape) == 4
    assert {t.name for t in dag_workflow.tasks} == {t.task_name for t in run.shape}


@pytest.mark.asyncio(loop_scope="session")
async def test_list_workflows(hatchet: Hatchet) -> None:
    workflows = await hatchet.workflows.aio_list(workflow_name=dag_workflow.config.name)

    assert workflows.rows
    assert len(workflows.rows) >= 1

    # Find the workflow whose (namespaced) name matches the DAG workflow.
    relevant_wf = next(
        iter(
            [
                wf
                for wf in workflows.rows
                if wf.name == hatchet.config.apply_namespace(dag_workflow.config.name)
            ]
        ),
        None,
    )

    assert relevant_wf is not None

    fetched_workflow = await hatchet.workflows.aio_get(relevant_wf.metadata.id)

    assert fetched_workflow.name == hatchet.config.apply_namespace(
        dag_workflow.config.name
    )
    assert fetched_workflow.metadata.id == relevant_wf.metadata.id