
start: Working w/ async jobs #987

Merged · 2 commits · Aug 20, 2021

Conversation

@tjdevries (Member) commented Jul 14, 2021

This should just work by checking this out only. No backwards-incompatible changes (hopefully).


Fixes #750
Fixes #777
Fixes #1018

Latest Clip: https://clips.twitch.tv/SincereQuaintNostrilWholeWheat-9VR2k_is7r1F01VN <-- LiveGrep from home directory

Example of some cool improvements :) https://clips.twitch.tv/GrossRenownedOryxKappaClaus-mTa6exkBdFuym-F5

Will keep on working on this cause I'm (obviously) quite excited 😆

@rockerBOO (Member)

https://clips.twitch.tv/FairCrowdedQuailTooSpicy-LwegLYuYF0aA8l5B

Try out this PR. It needs the plenary PR as well.
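For example, with a Lua plugin manager like packer.nvim (an assumption; adjust for your own setup), pulling both PR branches could look roughly like this, using the async_v2 branch names mentioned further down in the thread:

-- sketch only: packer.nvim spec pointing at the async_v2 branches of both plugins
use { 'nvim-lua/plenary.nvim', branch = 'async_v2' }
use { 'nvim-telescope/telescope.nvim', branch = 'async_v2', requires = { 'nvim-lua/plenary.nvim' } }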

@lynndylanhurley

@tjdevries the input lag when typing is 1000% better. This is an amazing improvement!

I am still seeing some lag when navigating through the selections before the results have finished processing. The issue is intermittent though, so I can't reliably reproduce it.

@JimmyCozza

This is incredible.

I tested this in the v8 codebase, which I think is pretty large.
Here's a side-by-side of live_grep using the master branches of telescope and plenary vs the async_v2 branches.

Master Branches

(I typed all of "sparse_array" right away; it just blocks at the first character)

master.mp4

Async_v2 Branches

🔥

async.mp4

@ecosse3 commented Jul 16, 2021

That is sick! I really appreciate your hard work ❤️ Will it work with telescope-fzf-native.nvim, or is it no longer needed?

@oberblastmeister (Contributor)

Yes, it will still work.

@nklmilojevic

This is a huge improvement, awesome work!

@tjdevries (Member, author)

Cool :) I will work on fixing some UI & other related items. I just wanted to confirm that this would work for other people and wasn't some crazy glitch on my machine :)

Thanks!

@n3wborn commented Jul 17, 2021

Huge fan of Telescope here. Once again, so many thanks for your work!

@kristijanhusak

I gave this a test. It is much faster, but the input is randomly swallowing some letters, and results are not always as expected (not in this video, but in my private project). Here's a video of the letters issue:

telescope.mp4

@numToStr commented Jul 17, 2021

I am getting a coroutine failed error whenever I attempt to delete the last letter of the search entry. Once the error pops up, live_grep doesn't respond to any further input. FWIW, this issue doesn't seem to appear when I delete the word with <C-w>.

Edit: I also noticed a tick icon at the end of the prompt which was also not present earlier. Is this intentional?

telescope_coroutine_failure.mp4

@kblcuk commented Jul 17, 2021

Nice! 💪

The cursor movement now seems to behave rather interestingly when you're scrolling up and a long-running search completes / gets updated results.

@nashio commented Jul 18, 2021

Just tried this in my whole directory of projects, after typing one word:
before: 17 sec
after: 2 sec

I did notice one issue: in the async_v2 branches, when typing in the search box, the characters sometimes get swallowed. For example, typing "pokemon" sometimes gives me "p kem n".

@tjdevries (Member, author)

My last commit messed up some prompt things -- please ignore anything with the prompt for now. I've gotten enough feedback to confirm a few things and track down some others.

Thanks everyone! I'll post again when I have new stuff to test

@JSchrtke commented Jul 19, 2021

I just tried this on Windows as well; it's ripping fast, very nice.

I did find a bug, though I'm not sure if that is related to the prompt thing you said you messed up. Anyway, to reproduce:

  • :Telescope live_grep
  • Type one char
  • Wait for all results to show
  • Backspace
  • Error executing lua callback: ...js\.vim\plugged\plenary.nvim\lua\plenary\async\async.lua:14: The coroutine failed with this message: ...js\.vim\plugged\telescope.nvim\lua\telescope\pickers.lua:389: bad argument #1 to 'last' (table expected, got nil)

Edit: Try this in a very large directory, otherwise it's kinda hard to trigger the error before all the results are in. You're clearly going too fast already 🚀

Edit 2: I should have read the entire PR discussion; this was already reported earlier. My bad.

@medwatt commented Jul 21, 2021

I ran Telescope find_files on my home directory, which has some 200,000 files. It was much faster than before. I then closed Telescope with ESC and opened it again with Telescope find_files on the same directory. This time, I was not able to close it with ESC and I had to kill the neovim process for it to stop.

@metalelf0

Ok, I've been trying this feature, and here's my feedback.

The UI is not blocking anymore, which is obviously a good thing.

However, in my use case, the "real results" still take a long time to display. My scenario is a Rails app with a total file count (in the git repo, so including assets etc.) of ~2300 files, and a repository size (as reported by git count-objects -vH) of ~50MB.

Say I open the live_grep window and start typing the string UserProfile; while typing, I immediately see some results that are obviously incorrect - I still haven't finished typing the string - but once I've finished typing, it takes 3-4 seconds to display the final results.

If I instead just paste UserProfile into the search box, the results are shown immediately, so I guess the plugin is still processing all the intermediate searches (U, Us, Use, User, UserP, etc.) before it gets around to displaying the final results.

My suggestion would be to add a (configurable?) delay parameter before actually performing the search - e.g., 200 or 300ms. I guess this would dramatically improve the user experience.

Thanks a lot for the amazing work!

@oberblastmeister (Contributor) commented Jul 23, 2021

We could add a configurable sleep call here, in addition to the schedule: https://github.com/nvim-telescope/telescope.nvim/blob/master/lua/telescope/pickers.lua#L370. It was actually there before but was removed. Another thing we need to do is cancellation on yield instead of just a while loop.
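A rough sketch of the idea (illustrative only, not the actual pickers.lua code; process_loop, add_entry, the _cancelled flag and sleep_ms are made-up names):

local async_util = require "plenary.async.util"

-- Illustrative result loop: besides yielding back to the scheduler, sleep a
-- configurable amount between entries, and bail out when this run has been
-- superseded instead of spinning in a plain while loop.
local function process_loop(picker, results, opts)
  for _, entry in ipairs(results) do
    if picker._cancelled then            -- hypothetical flag: a newer prompt superseded this run
      return
    end
    picker:add_entry(entry)              -- hypothetical insertion into the sorted results
    async_util.scheduler()               -- yield to the main loop
    if opts.sleep_ms and opts.sleep_ms > 0 then
      async_util.sleep(opts.sleep_ms)    -- the configurable sleep being discussed
    end
  end
end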

@lynndylanhurley

Cancellation on yield would be amazing

@oberblastmeister (Contributor)

There is a PR here: nvim-lua/plenary.nvim#171

@tjdevries (Member, author)

@metalelf0 can you paste your config or send a link to it so I can see exactly what you're running? I'm having a hard time reproducing that.

@metalelf0 commented Jul 23, 2021

@tjdevries my config is at https://github.com/metalelf0/dot-files/tree/master/.config/nvim . I'm sorry, but it's ugly! I need to show it a little more love :/

[EDIT]: After some trial and error, it seems the issue goes away if I gitignore a directory with some yml files, the biggest one being ~11MB, for a total of ~15MB. Is handling "large" files a known limitation?

@tjdevries (Member, author)

Hmm, I'm not sure.

I just added an option for debounce in pickers, so you can do:

require('telescope.builtin').live_grep { debounce = 100 }

to add a 100 ms debounce. You can check whether it makes things better for you.

@metalelf0

> Hmm, I'm not sure.
>
> I just added an option for debounce in pickers, so you can do:
>
> require('telescope.builtin').live_grep { debounce = 100 }
>
> to add a 100 ms debounce. You can check whether it makes things better for you.

Yes, it's much better now. Thanks!

rafaelleru added a commit to rafaelleru/dotfiles that referenced this pull request Jul 27, 2021
Some operations like grepping a big repository were slow.

see: nvim-telescope/telescope.nvim#987

@Conni2461 (Member)

For everyone testing this and getting an error with finders.new_table pickers like git_commits (and many more): I think this is the fix that needs to be added to plenary before merging this PR.

diff --git a/lua/plenary/async/control.lua b/lua/plenary/async/control.lua
index 16f963e..257ba71 100644
--- a/lua/plenary/async/control.lua
+++ b/lua/plenary/async/control.lua
@@ -209,7 +209,7 @@ M.channel.mpsc = function()
     end
     local val = deque:popright()
     deque:clear()
-    return unpack(val)
+    return unpack(val or {})
   end
 
   return Sender, Receiver

We should try not to forget that 😆
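For anyone wondering why the guard matters: popright() returns nil when the deque is empty, and unpack(nil) raises "bad argument #1 to 'unpack' (table expected, got nil)". A minimal illustration (not the actual plenary code):

-- recv() racing against an empty deque:
local function recv_last(deque)
  local val = deque:popright()   -- nil when nothing was queued
  deque:clear()
  return unpack(val or {})       -- the guard turns "nothing queued" into zero return values
end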


Review thread on the prompt padding code:

local padding = string.rep(" ", vim.api.nvim_win_get_width(prompt_win) - prompt_len - #text - 3)
vim.api.nvim_buf_clear_namespace(prompt_bufnr, ns_telescope_prompt, 0, 1)
local padding = string.rep(" ", vim.api.nvim_win_get_width(prompt_win) - prompt_len - #text)

Member:

Don't remember why we had a - 3 here, but right now the max count is "out of screen".
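As a rough worked example (numbers made up): with a 40-column prompt window, a 2-column prompt prefix (prompt_len), and a 9-character counter such as "10 / 2345", string.rep produces 40 - 2 - 9 = 29 spaces of padding without the - 3 versus 26 with it, i.e. the - 3 simply reserved a 3-column right margin for the counter.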

Review thread on the tie-breaking code:

return self:_insert_container_before(picker, index, node, new_container)
end

if score < 1 and container[2] == score and #entry.ordinal < #container[1].ordinal then

@fdschmidt93 (Member) commented Aug 11, 2021:

Would wrapping #entry.ordinal < #container[1].ordinal in a break_ties(current_entry, existing_entry) function that users can override and pass via opts to pickers:new() be something we want? It would default to preferring shorter ordinals, of course.

If my thinking is correct, that would address
#1080 (I guess that's what OP wanted)
#939

and I'm sure it allows for cool customization of other stuff down the line :)
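Something along these lines, maybe (a sketch of the idea only; break_ties is the proposed option, not an existing one, and the exact plumbing through pickers.new is assumed):

local pickers = require "telescope.pickers"
local finders = require "telescope.finders"
local conf = require("telescope.config").values

-- Hypothetical default tie-breaker, mirroring the current
-- "#entry.ordinal < #container[1].ordinal" check: prefer the shorter ordinal.
local function default_break_ties(current_entry, existing_entry)
  return #current_entry.ordinal < #existing_entry.ordinal
end

-- Users could then pass their own via opts when building a picker
-- (break_ties is the proposed hook, ignored by today's code):
pickers.new({ break_ties = default_break_ties }, {
  prompt_title = "tie-break demo",
  finder = finders.new_table { results = { "short", "a much longer entry" } },
  sorter = conf.generic_sorter {},
}):find()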

Member (PR author):

Yeah, let's do that in a follow-up PR.

@TC72 (Contributor) commented Oct 13, 2021:

I've arrived from #1080; took me a while, but I'm here now.

We could mimic fzf --tiebreak=index just by having a break_ties() that always returns false.

I think we should also have something like fzf --no-sort, which bypasses all the score checking for sorting and adds entries until we reach the maximum number of results. For current_buffer_fuzzy_find this should help with performance issues on very large files.
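If a break_ties hook lands, those presets would be tiny (sketch only, reusing the hypothetical hook from the comment above):

-- fzf --tiebreak=index: never displace an already-inserted entry on a tie,
-- so earlier (lower-index) entries win.
local tiebreak_index = function()
  return false
end

-- Current behaviour: prefer the shorter ordinal on a tie.
local tiebreak_length = function(current_entry, existing_entry)
  return #current_entry.ordinal < #existing_entry.ordinal
end

-- fzf --no-sort would be a separate switch: skip the scoring comparison
-- entirely and append entries until the result limit is reached.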

Contributor:

@fdschmidt93 where are we on this? Should a PR be linked to this, or to #1080 or #939?

@fdschmidt93 (Member) commented Oct 26, 2021:

> should a PR be linked to this or #1080 or #939

Why not close both? Any approach should be general enough. I'd say we can then have presets like breaking ties by index (it's just false I suppose, but it helps to put a label on it), length (the default, I guess), etc. for the most common use cases.

> where are we on this

I guess someone would have to come up with a PR. #1242 was the only attempt so far, and it was too opinionated.

I'm quite busy with what will hopefully turn into an open-source project, and with regard to telescope I have enough open PRs (file browser, vim.diagnostic, customizable filtering) and issues (my favorite: revert buffer previewer) to deal with for the foreseeable future. In other words, PRs welcome :)

Contributor:

Thanks @fdschmidt93, I'll give this a go.
