Empty Results #1823

Hi there, I added DocSearch to my site and it shows the autocompletion popup, but I don't see any suggestions or results at all.
Hey! Looking at your Crawler, we didn't manage to get any records in. Can you try restarting it? Edit: the sitemap returns a 404, so the Crawler doesn't find any pages. Try adding another URL to the …
Do I need a sitemap.xml? The DocSearch docs didn't mention anything about it.
Nope, you don't. It's nice to have, since it helps the Crawler find new pages. I took a look at your Crawler and it seems to be configured for a Docusaurus website, but your site isn't one. I fixed the config, it should work better now :)
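For reference, a generic (non-Docusaurus) Crawler config could look roughly like the sketch below. Every value here is a placeholder, not this site's actual config; it only shows where the optional `sitemaps` list and the heading selectors live.

```js
// Hypothetical Crawler config sketch — all URLs, names, and selectors are placeholders.
new Crawler({
  appId: 'YOUR_APP_ID',
  apiKey: 'YOUR_API_KEY',
  startUrls: ['https://example.com/'],
  // Optional, but helps the Crawler discover pages consistently:
  sitemaps: ['https://example.com/sitemap.xml'],
  actions: [
    {
      indexName: 'example_docs',
      pathsToMatch: ['https://example.com/**'],
      recordExtractor: ({ helpers }) =>
        // Generic heading-based selectors instead of Docusaurus-specific ones.
        helpers.docsearch({
          recordProps: {
            lvl0: { selectors: 'h1', defaultValue: 'Documentation' },
            lvl1: ['h2'],
            lvl2: ['h3'],
            content: ['p, li'],
          },
        }),
    },
  ],
});
```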
So without it the crawler wouldn't find new pages?
It helps with consistency, especially with client-side rendered websites. Here it seems that it doesn't manage to discover every page of your website, so it would really help.
Does your website apply some redirects? For example, when we crawl this page …
No
I've got no idea why it redirects there. I'm using the hash router (in my solid-router settings), does that affect anything?
I'd assume so, it seems that I can reproduce by going to the editor -> URL tester with …
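For anyone hitting the same wall: the reason the tester lands on the homepage is that the fragment of a hash-routed URL is never sent to the server, so every hash route resolves to `/`. A quick way to see this in plain JavaScript (the URL is a placeholder):

```js
// The fragment (everything after '#') stays client-side; the server only ever sees '/'.
const url = new URL('https://example.com/#/docs/getting-started');
console.log(url.pathname); // '/'
console.log(url.hash);     // '#/docs/getting-started'
```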
HUGE ahaha
Oh, what would be the solution to this then? I have to use the hash router method in solid-router, it's compulsory.
I've never implemented such routing, sorry! I know we had users with similar cases; I guess the repos of the tools you use would have the answers.
This could help: #648
Ah yes, for the rendered links, indeed! For discovering the pages, I'd say a sitemap would solve all problems.
I guess the plugin is not able to discover your URLs either, then :/ You should have a …
OK, I'll try manually first. Do I include the hashes? Wouldn't that redirect to the homepage too? :(
Yes, the sitemap should represent the final URL that you can access from a browser. Note that this will only help the Crawler properly discover URLs, but won't fix the render issue. Hash routing has a huge impact on SEO and discovery in general; I'd recommend switching to another history mode if that's possible for you.
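As a reference, a minimal sitemap listing the final, browser-reachable URLs could look like the sketch below (example.com and the paths are placeholders; per the advice above, with hash routing the `#/...` part would be included in each `<loc>`, which is exactly why it hurts discovery):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <loc> is the final URL a browser can open directly. -->
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/docs/getting-started</loc></url>
</urlset>
```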
I wasn't using hash routing before, but I got 404 errors whenever reloading a non-home route. I didn't know any method other than hash routing to fix it :( https://github.com/solidjs/solid-router
@shortcuts OK, I stopped using the hash router, now I'm using https://github.com/rafgraph/spa-github-pages, does the crawler work now?
Trying from the URL tester in the editor, it returns 404 on those new URLs 🤔 Are those client-side redirects? See https://crawler.algolia.com/admin/crawlers/e120626d-7695-43f1-b9bf-b19726525e9b/monitoring/list?reason=http_not_found for more details.
I was using the trick provided by the spa-github-pages repo: they use a 404.html, and whenever a non-home route loads, it probably loads the 404.html first and then redirects to the intended route.
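For context on why the Crawler still reports `http_not_found`: GitHub Pages serves the custom 404.html with an HTTP 404 status, and the "redirect" only happens client-side afterwards, so a crawler that checks status codes never gets past it. A simplified sketch of the two scripts involved, adapted from the spa-github-pages approach (details may differ from the real repo):

```js
// In 404.html — encode the requested path into a query string and bounce to index.html.
// This page is served with HTTP status 404, which is what the Crawler records.
var l = window.location;
l.replace(
  l.protocol + '//' + l.hostname + (l.port ? ':' + l.port : '') +
  '/?/' + l.pathname.slice(1).replace(/&/g, '~and~') +
  (l.search ? '&' + l.search.slice(1).replace(/&/g, '~and~') : '') +
  l.hash
);

// In index.html, before the SPA router boots — restore the original path.
(function (loc) {
  if (loc.search[1] === '/') {
    var decoded = loc.search
      .slice(1)
      .split('&')
      .map(function (s) { return s.replace(/~and~/g, '&'); })
      .join('?');
    window.history.replaceState(null, null, loc.pathname.slice(0, -1) + decoded + loc.hash);
  }
})(window.location);
```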
@shortcuts so the crawler won't work with this method either? 😞
Just confirming: if that's the case, then I can't use Algolia and have to look for an alternative or write my own thing.
Hey, sorry for the delay. I'll investigate internally if we can do something about the hash router problem.
Hi, are you sure about this? Crawling non-home routes redirects to the home page, I don't know how you'd work this out! Just tell me if supporting the hash router isn't possible. I was on my way to writing my own search functionality because of this, but put it on hold because of your message. If Algolia works, it'll save me lots of time!
OK, now the search function works, because my site is generated by solid-ssg! :D
Sick!! Congrats