
switch to solana-tpu-client from solana_client::tpu_client for bench-tps, dos/, LocalCluster, gossip/ #310

Merged 2 commits on Mar 21, 2024

Conversation

@gregcusack commented Mar 19, 2024

A follow-up PR to #258: removes the dependency on solana_client::tpu_client and switches over to solana-tpu-client.

Problem

We may end up removing solana_client::tpu_client and the other associated wrappers in solana_client, so this PR switches bench-tps, dos/, LocalCluster, and gossip/ from solana_client::tpu_client to solana-tpu-client.
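The switch itself is largely a change of import paths, since solana_client::tpu_client re-exports the same types from the standalone crate. A sketch of the kind of diff involved (exact module paths assumed from the crate names above):

```diff
-use solana_client::tpu_client::{TpuClient, TpuClientConfig};
+use solana_tpu_client::tpu_client::{TpuClient, TpuClientConfig};
```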

This is the 9th PR on the way to removing ThinClient completely.

@gregcusack force-pushed the solana-tpu-client branch 6 times, most recently from adace3b to 09fadfc on March 20, 2024 at 00:02
@codecov-commenter commented Mar 20, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 81.9%. Comparing base (8df80d9) to head (28e09bb).
Report is 15 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff            @@
##           master     #310     +/-   ##
=========================================
- Coverage    81.9%    81.9%   -0.1%     
=========================================
  Files         837      837             
  Lines      226896   226896             
=========================================
- Hits       185901   185900      -1     
- Misses      40995    40996      +1     

@gregcusack force-pushed the solana-tpu-client branch 2 times, most recently from a65905f to 2550457 on March 20, 2024 at 16:54
@gregcusack gregcusack marked this pull request as ready for review March 20, 2024 17:41
@CriesofCarrots left a comment


Definitely support :) Just one objection and a couple questions about dependencies.

client/src/tpu_client.rs (outdated; resolved)
Comment on lines 24 to 28
solana-quic-client = { workspace = true }
solana-rpc-client = { workspace = true }
solana-rpc-client-api = { workspace = true }
solana-sdk = { workspace = true }
solana-udp-client = { workspace = true }
@CriesofCarrots Mar 20, 2024

I'd like to avoid adding either of these dependencies to solana-tpu-client, so that downstream users don't have to import both by default if they only need one. I know you're trying to dedupe things (based on a reviewer request), but I think defining QuicTpuClient and TpuClientWrapper at the call sites or at least further upstream would be better.

@gregcusack (Author)

QuicTpuClient can be rearranged easily; TpuClientWrapper is a little harder. We could define it in gossip_service and then have dos import it along with gossip/, although that requires gossip to import solana-udp-client and solana-quic-client. The other option I see is putting it back up in solana-client as before, since solana-client is imported by gossip_service and dos anyway. But then we are back to relying on solana_client::tpu_client...

@gregcusack (Author) Mar 21, 2024

So I put TpuClientWrapper back in solana_client, since gossip and dos are already importing solana_client anyway. I know it's not ideal. I also tried putting it in gossip, but didn't like that we had to import solana-udp-client and solana-quic-client in gossip just for that wrapper. If that's not a big deal, though, let's add it in gossip. I couldn't find another spot further upstream that made sense either.

@CriesofCarrots replied:

> but didn't like that we had to import solana-udp-client and solana-quic-client in gossip just for that wrapper

To be clear, the fact that solana-gossip imports solana-client means it is already importing solana-udp-client and solana-quic-client transitively. At some point it will actually be better to have it import one or both of those directly. But since it's still stuck on solana_client::connection_cache::ConnectionCache, this seems fine for now.
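A hypothetical sketch of the eventual direct-dependency cleanup this points to, once solana-gossip no longer needs solana_client::connection_cache (the exact crate set is an assumption, mirroring the snippet above):

```toml
# gossip/Cargo.toml (hypothetical future state): depend on the transport
# crates directly instead of pulling them in transitively via solana-client.
[dependencies]
solana-quic-client = { workspace = true }
solana-tpu-client = { workspace = true }
solana-udp-client = { workspace = true }
```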

@gregcusack (Author)

oh shoot ya that's a good point. I should have realized that 🤦‍♂️. ok sounds good. Thank you!

dos/Cargo.toml (two outdated review threads; resolved)
@gregcusack force-pushed the solana-tpu-client branch 2 times, most recently from 8de91f3 to 1dd4741 on March 20, 2024 at 20:03
@gregcusack force-pushed the solana-tpu-client branch 2 times, most recently from 2de5a1e to 2d12d8a on March 21, 2024 at 02:40
@CriesofCarrots left a comment

Thanks for the rearrangement! Sorry that each one of these points to more cleanup to be done 😅
Lgtm!

@gregcusack gregcusack merged commit 792d745 into anza-xyz:master Mar 21, 2024
48 checks passed
@gregcusack gregcusack deleted the solana-tpu-client branch March 21, 2024 16:27

3 participants