Author (contributor): Let's try to land #421 before this.
Author (contributor): Test suite TODO:
Set each quote-stream by matching the provider for each `Flume`, which results in some flumes mapping to the same (multiplexed) stream. Monkey-patch the equivalent `tractor.MsgStream._ctx: tractor.Context` on each broadcast-receiver subscription to allow use by feed bus methods as well as other internals which need to reference IPC channel/portal info. Start a `_FeedsBus` subscription management API:
- add `.get_subs()` which returns the list of tuples registered for the given key (normally the fqsn).
- add `.remove_sub()` which allows removing by key and tuple value and provides encapsulation for sampler task(s) which deal with dropped connections/subscribers.
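A minimal sketch of the registry API described above; the class name and the subscriber-tuple layout `(stream, throttle_rate)` are assumptions for illustration, not the actual implementation:

```python
class FeedsBus:
    '''Sketch of a per-broker feeds bus tracking subscriber tuples per fqsn.'''

    def __init__(self) -> None:
        # fqsn -> list of subscriber entries, assumed (stream, throttle_rate)
        self._subscribers: dict[str, list[tuple]] = {}

    def get_subs(self, key: str) -> list[tuple]:
        # all subscriber tuples registered for this key (normally the fqsn)
        return self._subscribers.setdefault(key, [])

    def remove_sub(self, key: str, sub: tuple) -> None:
        # encapsulated removal by key + tuple value so sampler tasks
        # handling dropped connections never touch the table directly
        subs = self._subscribers.get(key, [])
        if sub in subs:
            subs.remove(sub)
```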
Previously we would only detect overruns and drop subscriptions on non-throttled feed subs; however, you can get the same issue with a wrapping throttler task:
- the intermediate mem chan can be blocked by the throttler task being too slow, in which case we still want to warn about it, or
- the stream's IPC channel actually breaks, and then we still want to drop the connection and subscription so it doesn't become a source of stale backpressure.
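The two failure modes above can be sketched with a stdlib `queue` standing in for the trio mem chan; the function and exception names here are hypothetical, chosen only to show the policy (warn on slow consumer, drop on broken channel):

```python
import queue

class ChannelBroken(Exception):
    '''Stand-in for a closed/broken IPC stream.'''

def relay(sub_q, quote: dict, on_drop) -> bool:
    '''Push one quote to one subscriber; return True to keep the sub.'''
    try:
        sub_q.put_nowait(quote)
        return True
    except queue.Full:
        # intermediate chan blocked: throttler too slow, warn but keep it
        print('overrun: throttled sub is lagging')
        return True
    except ChannelBroken:
        # the IPC channel itself died: drop the sub so it can't
        # become a source of stale backpressure
        on_drop()
        return False
```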
Allows using `set` ops for subscription management and guarantees no duplicates per `brokerd` actor. The new API is simpler for dynamic pause/resume changes per `Feed`:
- `_FeedsBus.add_subs()`, `.get_subs()`, `.remove_subs()` all accept multi-sub `set` inputs.
- `Feed.pause()` / `.resume()` encapsulates management of *only* sending a msg on each unique underlying IPC msg stream.

Use the new api in the sampler task.
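The set-based variant might look roughly like this; again the class name, the `(stream, rate)` entry shape, and `unique_streams()` are illustrative assumptions, but they show why `set` ops give dedup for free and why pause/resume must collapse subs down to unique underlying streams:

```python
class FeedsBus:
    '''Sketch: set-based subs guarantee no duplicates per broker actor.'''

    def __init__(self) -> None:
        self._subscribers: dict[str, set] = {}

    def add_subs(self, key: str, subs: set) -> set:
        # set union silently ignores already-registered entries
        existing = self._subscribers.setdefault(key, set())
        existing |= subs
        return existing

    def get_subs(self, key: str) -> set:
        return self._subscribers.setdefault(key, set())

    def remove_subs(self, key: str, subs: set) -> set:
        existing = self._subscribers.setdefault(key, set())
        existing -= subs
        return existing

def unique_streams(subs: set) -> set:
    # a pause/resume msg must go to each *underlying* IPC stream only
    # once, even when one stream backs several throttled subs
    return {stream for stream, _rate in subs}
```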
This reverts commit 02fbc0a.
Instead of requiring any `-b`, try to import all built-in broker backend python modules by default and only load those detected from the input symbol list's fqsn values. In other words the `piker chart` cmd can now be run without `-b`, and that flag is only required if you want to load a subset of the built-ins or are trying to load a specific not-yet-builtin backend.
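The backend-detection step amounts to reading the provider suffix off each fqsn ("fully qualified symbol name"); a minimal sketch, with the function name assumed:

```python
def brokers_from_fqsns(fqsns: list) -> set:
    '''Detect which builtin backend modules to load from fqsn suffixes.

    An fqsn ends with the provider name, eg. 'btcusdt.binance' -> 'binance',
    so the set of suffixes is the set of backends to import.
    '''
    return {fqsn.rsplit('.', 1)[-1] for fqsn in fqsns}
```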
Hopefully helps resolve #435
Seems that by default their history indexing rounds down/back to the previous time step, so make sure we add a minute inside `Client.bars()` when the `end_dt=None`, indicating "get the latest bar". Add a breakpoint block that should trigger whenever the latest bar vs. the latest epoch time is mismatched; we'll remove this after some testing verifying the history bars issue is resolved. Further this drops the legacy `backfill_bars()` endpoint which has been deprecated and unused for a while.
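The "add a minute" workaround described above could be sketched as follows; `query_end` is a hypothetical helper (the real `Client.bars()` presumably works with the broker's own timestamp types):

```python
from __future__ import annotations
from datetime import datetime, timedelta, timezone

def query_end(end_dt: datetime | None) -> datetime:
    '''When no end is given ("get the latest bar"), nudge the query end
    one minute forward so the provider's round-down history indexing
    still includes the most recent bar.
    '''
    if end_dt is None:
        return datetime.now(timezone.utc) + timedelta(minutes=1)
    return end_dt
```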
guilledk approved these changes Jan 11, 2023
Enhances our `piker.open_feed()` real-time quotes and history management layer (`piker.data.feed`) to accept multi-fqsn inputs, delivering multi-symbol quote streams, and adds a new internal data streaming abstraction/API: `.data.Flume`, which provides the basis for real-time stream management, access and measure for the needs of real-time data-flow management and orchestration.

Synopsis
The final core-UX feature you always wanted as a chart trader is probably something like:

multi-instrument overlayed real-time and historical data feeds with simultaneous interaction and "current symbol" selectable order mode control..

well, this is finally within reach 😎 and this patch adds the "backend" work making it possible 🏄🏼
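A rough sketch of the `Feed`/`Flume` relationship this PR introduces: one `Flume` per fqsn, bundled under a multi-symbol `Feed`. Everything beyond the `.flumes: dict[str, Flume]` field is an assumed placeholder:

```python
from dataclasses import dataclass, field

@dataclass
class Flume:
    # one real-time data flow: a single fqsn's quote stream + history
    # handles (real fields omitted; only the fqsn key is shown here)
    fqsn: str

@dataclass
class Feed:
    # a multi-symbol feed bundles one Flume per requested fqsn
    flumes: dict = field(default_factory=dict)

    def add(self, fqsn: str) -> Flume:
        # hypothetical helper: register a per-fqsn flow under its key
        flume = self.flumes[fqsn] = Flume(fqsn=fqsn)
        return flume
```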
Notes for manual testing
Ideally reviewers run the new feeds test set with `pytest tests/test_feeds.py`. Note that you'll need to install the `piker_pin` branch of `tractor` in order for the test set to run green (the `piker_pin` branch of `tractor` if installed in dev mode locally).

to land
- [ ] fill out commit msg for 7abcb3e which was the initial (half-working) patch to get basic functionality
- [ ] port all consumer code in clearing, order mode, and the charting/graphics layer to expect this adjusted `Feed` api
- [ ] add basic per-`brokerd` multi-symbol real-time feeds working: `piker.open_feed(fqsns=['btcusdt.binance', 'ethusdt.binance']) as feed` where the delivered `Feed` now has a `.flumes: dict[str, Flume]` which enables per-fqsn data flow access, mgmt, measure (see the historical flume for the idea behind this abstraction terminology)
- [ ] `test_feeds.py`:
  - [ ] `binance` multi-symbol case
  - [ ] `kraken` multi-symbol case
    - [ ] `kraken` currently seems to depend on a `brokers.toml` existing? we should fix this..
- [ ] add cross-`brokerd` multi-feeds such that `piker.open_feed(fqsns=['btcusdt.binance', 'xbtusdt.kraken'])` will work with an aggregate receive channel delivering quotes from both backends?
  - [ ] multi-sym case (`kraken` + `binance`)

Test suite TODO: see two comments below
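The open question above (an aggregate receive channel across backends) could be sketched like this; piker itself is trio/tractor based, so this asyncio version is purely an illustration of the fan-in shape, with all names assumed:

```python
import asyncio

async def aggregate(*streams):
    '''Merge several per-backend quote streams into one async iterator.'''
    q = asyncio.Queue()
    DONE = object()  # sentinel marking one upstream as exhausted

    async def pump(stream):
        async for quote in stream:
            await q.put(quote)
        await q.put(DONE)

    # one pump task per backend stream, all feeding the shared queue
    tasks = [asyncio.create_task(pump(s)) for s in streams]
    pending = len(tasks)
    while pending:
        item = await q.get()
        if item is DONE:
            pending -= 1
        else:
            yield item
    for t in tasks:
        await t
```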