+3 votes
836 views
in Q2A Core by

I'm suffering from very slow archiving.
I'm not a developer and I don't understand much about robots.txt.
Can you tell me which of the following two files is better, 1 or 2?
Thanks in advance.
1-
User-agent: Mediapartners-Google
Disallow: /login
Disallow: /ask
Disallow: /forgot
Disallow: /register
Disallow: /questions?sort
Disallow: /admin
Disallow: /*?show=
******************************
2-
User-agent: Googlebot
Allow: /
User-agent: *
Disallow: /
************************************

1 Answer

+1 vote
by

Slow archiving? Do you mean slow indexing?

If so, the Disallow: / rule in the second file will block every crawler except Googlebot from indexing your website.

Mediapartners-Google is the AdSense bot, not the search crawler.

So in your case the better solution is the first robots.txt, but change User-agent: Mediapartners-Google to User-agent: * (meaning any bot).

by
Yes, I meant indexing.

Done, thanks.

The final file looks like this:

User-agent: *
Disallow: /login
Disallow: /ask
Disallow: /forgot
Disallow: /register
Disallow: /questions?sort
Disallow: /admin
Disallow: /*?show=
Sitemap: https://www.yourwebsite.com/sitemap.xml
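If you want to sanity-check rules like these before deploying, Python's standard-library urllib.robotparser can simulate how a crawler reads them. A minimal sketch, assuming example.com as a placeholder domain; note the stdlib parser only handles plain path-prefix rules, not Google's * wildcard extension, so the /*?show= line is left out here:

```python
from urllib.robotparser import RobotFileParser

# Prefix rules from the robots.txt above; the "/*?show=" wildcard line is
# omitted because the stdlib parser does not implement Google's * extension.
rules = """\
User-agent: *
Disallow: /login
Disallow: /ask
Disallow: /forgot
Disallow: /register
Disallow: /admin
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed paths are reported as not fetchable for any user agent
print(rp.can_fetch("Googlebot", "https://example.com/login"))      # False
print(rp.can_fetch("Googlebot", "https://example.com/admin"))      # False
# Everything else stays crawlable
print(rp.can_fetch("Googlebot", "https://example.com/questions"))  # True
```

This confirms the file blocks only the utility pages while leaving question pages open to indexing.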
by
Sorry, one more question:
is there any code to speed up indexing?
by
No, there is no code that speeds up indexing.
First of all, add your website to Google Search Console, look for errors, and fix them.
by
I have more than 80 coverage problems on my website
and I haven't been able to solve them so far.
...