When a client sends a request to a server, the server responds with an HTTP status code. Here are the main status code classes and how Googlebot handles them:
2xx (success)
Google can index pages that return a 2xx status code. However, if the page has a problem (such as an empty body or an error message served with a success code), Search Console will report it as a "soft 404."
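As a rough illustration of the soft-404 idea, the sketch below flags a response that claims success but carries no real content. The function name and the phrases it checks are simplified assumptions for this example, not Search Console's actual detection logic:

```python
def looks_like_soft_404(status, body):
    """Simplified heuristic: a 'successful' response with no real content.

    Search Console's real detection is far more sophisticated; this only
    illustrates the concept of a 200 response that behaves like a 404.
    """
    if status != 200:
        return False
    text = body.strip().lower()
    return text == "" or "not found" in text

print(looks_like_soft_404(200, ""))                      # empty page served with 200
print(looks_like_soft_404(200, "Welcome to our store"))  # real content
```

A real check would also consider thin or templated content, not just empty bodies and error phrases.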
3xx (redirects)
This status code class indicates a redirect. Googlebot generally follows up to 10 redirect hops in a chain; beyond that, it stops and reports a redirect error. Even so, you should try to keep chains to four redirects or fewer. If redirects are causing issues, make sure the http/https and www/non-www versions of your site all point to a single preferred version.
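To make the hop limit concrete, here is a minimal Python sketch that follows a redirect chain and gives up after ten hops, mirroring the behavior described above. The `follow_redirects` function and the `redirects` mapping are illustrative stand-ins, not a real crawler API:

```python
def follow_redirects(start_url, redirects, max_hops=10):
    """Follow a redirect chain until a final URL is reached.

    redirects: dict mapping a URL to the URL it redirects to.
    Returns the final URL, or None if the chain exceeds max_hops
    (including infinite loops), which a crawler would treat as a
    redirect error.
    """
    url = start_url
    hops = 0
    while url in redirects:
        if hops == max_hops:
            return None  # too many hops: redirect error
        url = redirects[url]
        hops += 1
    return url

# A short, well-behaved chain resolves in two hops:
chain = {
    "http://example.com/": "https://example.com/",
    "https://example.com/": "https://www.example.com/",
}
print(follow_redirects("http://example.com/", chain))
```

Note that a redirect loop (A → B → A) never reaches a final URL, so the hop limit is what stops the crawl.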
4xx (client errors)
Googlebot does not index pages that return a 4xx status code. If a previously indexed page starts returning a 4xx status code, Google removes it from the index.
5xx (server errors)
5xx errors cause Google's crawlers to slow down their crawl rate. Google keeps affected pages in its index for a period of time, but if the errors persist, Googlebot eventually drops them from the index.
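The class-by-class behavior above can be summarized in one small lookup. The function below is a simplified illustration for this article, not Google's actual logic:

```python
def googlebot_treatment(status):
    """Rough summary of how Googlebot treats each status code class,
    per the descriptions in this article (illustrative only)."""
    if 200 <= status < 300:
        return "eligible for indexing (watch for soft 404s)"
    if 300 <= status < 400:
        return "redirect followed (up to 10 hops)"
    if 400 <= status < 500:
        return "not indexed; removed from the index if previously indexed"
    if 500 <= status < 600:
        return "crawling slowed; page dropped from the index if errors persist"
    return "unknown status class"

print(googlebot_treatment(301))
print(googlebot_treatment(404))
```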
If the robots.txt file returns a 5xx error, Google uses the last cached copy of the file for up to 30 days. After that, if the file is still unavailable, Google assumes there are no crawl restrictions.
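The robots.txt fallback rule above has a simple branching shape, sketched below. The function name and return strings are illustrative for this article only; they are not Google's code or API:

```python
def robots_fallback(days_of_5xx_errors, has_cached_copy):
    """Illustrative sketch of the 30-day robots.txt fallback rule:
    within 30 days, a cached copy is used if one exists; otherwise
    Google behaves as if there were no crawl restrictions."""
    if days_of_5xx_errors <= 30 and has_cached_copy:
        return "use cached robots.txt"
    return "assume no crawl restrictions"

print(robots_fallback(10, True))
print(robots_fallback(45, True))
```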