Thanks. But shouldn't I use robots.txt to stop crawling of those pages? I've seen in an SEO video that Googlebot visits a site and crawls for only a limited time (the crawl budget). So if Googlebot spends that time crawling not-useful pages like users, feed, tag, etc., it's wasted. That's why I think robots.txt is better than noindex.
See this robots.txt:
User-agent: *
Disallow: /*/login
Disallow: /*?qa-rewrite=*
Disallow: /*/forgot
Disallow: /*/register
Disallow: /*/questions/*?sort=views&start=*
Disallow: /*/questions?sort
Disallow: /*/chat
Disallow: /*/activity/*
Disallow: /*/cdn-cgi/*
Disallow: /*/questions?start=*
Disallow: /*/questions?sort=*
Disallow: /*/search?q=*
Disallow: /search/*
Disallow: /?s=*
Disallow: /search/?query=*
Disallow: /ip/*
Disallow: /login
Disallow: /ask
Disallow: /forgot
Disallow: /register
Disallow: /admin
Disallow: /feed
Disallow: /tag
Disallow: /tags
Disallow: /users
Disallow: /user
Disallow: /message/
# Allow Googlebot-Image to crawl all images
User-agent: Googlebot-Image
Allow: /*
Sitemap: https://yourdomain.com/sitemap.xml
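As a quick sanity check, you can test which URLs a robots.txt file blocks with Python's standard-library `urllib.robotparser`. One caveat: Python's parser does simple prefix matching and does not understand Google-style wildcards (`/*/login`), so this sketch only tests the literal-prefix rules; `yourdomain.com` is a placeholder, as in the file above.

```python
from urllib.robotparser import RobotFileParser

# A subset of the literal-prefix rules from the robots.txt above.
# (Wildcard rules like "Disallow: /*/login" are skipped because
# urllib.robotparser does not support Google-style wildcards.)
rules = """User-agent: *
Disallow: /login
Disallow: /feed
Disallow: /tag
Disallow: /users
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A normal question page should remain crawlable.
print(rp.can_fetch("*", "https://yourdomain.com/questions/123"))  # True

# The user-listing page should be blocked for all bots.
print(rp.can_fetch("*", "https://yourdomain.com/users"))  # False
```

For the wildcard rules, Google Search Console's robots.txt tester is the more reliable check, since it uses Google's own matching logic.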