In a revision, the URL structure is the one thing most webmasters do not want to change; when it does change, it is usually a switch between dynamic and static URLs. For a dynamic-to-static change, the best practice is a 301 redirect on every old page, so the weight transfers to the new URLs. In reality, though, it is often impossible to manage 301 redirects for that many pages, so we decided to use a robots.txt rule to block the dynamic URLs instead, treating the old dynamic pages as dead links. The following shows how the indexed-page count dropped while the block was in place:
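Because the old URLs were dynamic (they carry a query string), a single wildcard rule can block them all at once. A minimal sketch of such a robots.txt, assuming the target spiders support the `*` wildcard (both Google and Baidu do):

```
# Hypothetical robots.txt: block every parameterized (dynamic) URL
# left over from before the revision, so spiders stop crawling dead links.
User-agent: *
Disallow: /*?*
```

Note this only stops further crawling; pages already in the index drop out gradually, which is what the falling index count above reflects.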
three, keep content updates steady and use external links to guide the spider

The original content of the site did not change after the revision; only the URLs did, so the site still holds a lot of content. How do we get search engines to re-recognize the site and index the content under the new URLs? That is the problem we face. My approach here is to stick to original-content updates, and then use external links and a sitemap submission to guide the spider back into the site. The robots.txt block mentioned at the beginning was designed with this in mind: first cut the spider off from those dead links, then keep updating content so the spider trusts the site again, and finally resubmit the sitemap or use external links to effectively lead the spider in to index the new pages.
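Resubmitting the sitemap means regenerating it from the new URLs. A minimal sketch using only the Python standard library; the URL list is hypothetical and would be replaced with the site's real post-revision URLs:

```python
# Minimal sketch: build a sitemap.xml for the new URLs after a site
# revision, so it can be resubmitted to search engines.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # The xmlns attribute is required by the sitemap protocol.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

new_urls = [
    "http://www.example.com/",            # hypothetical post-revision URLs
    "http://www.example.com/news/1.html",
]
xml = build_sitemap(new_urls)
print(xml)
```

The resulting file would be uploaded to the site root and submitted through each search engine's webmaster tools.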
one, if the URLs change from dynamic to static, use robots.txt to block the dynamic URLs
That is the effect we all hope to achieve.
When a site is running normally, the 404 page does not matter much, but during a revision a good 404 page is indispensable. It not only keeps some visitors on the site, it also preserves some of the site's weight and can even improve metrics such as PV. When building the 404 page, note one thing: do not make it redirect straight back to the home page, because search engines can easily treat that as cheating. In my view the best setup is an automatic jump back to the home page after about 10 seconds, with the site navigation clearly visible so the reader can move on immediately. A 404 page can look like the one below:
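A hypothetical 404 page following those rules: the delayed jump is done with a 10-second meta refresh rather than an instant redirect, and navigation links are kept in view (the example.com URLs are placeholders):

```html
<!-- Hypothetical 404 page: tell visitors the page is gone, keep the
     navigation visible, and only auto-return to the home page after
     10 seconds; an instant redirect can look like cheating to engines. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta http-equiv="refresh" content="10;url=http://www.example.com/">
  <title>404 - Page Not Found</title>
</head>
<body>
  <h1>Sorry, this page no longer exists.</h1>
  <p>You will return to the home page in 10 seconds,
     or you can use the navigation right away:</p>
  <ul>
    <li><a href="http://www.example.com/">Home</a></li>
    <li><a href="http://www.example.com/sitemap.html">Site map</a></li>
  </ul>
</body>
</html>
```

The server must also return an actual HTTP 404 status code for this page, not 200, or engines will keep the dead URLs in the index.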
two, while blocking, a 404 page must be set up
As a site grows, with ever higher demands and ever more comprehensive functional requirements, a revision becomes a road every site must eventually travel. If the URL structure stays unchanged, there is not much to worry about in a revision, and recovery is convenient and simple. If the URL structure does change, things get troublesome and some weight loss is hard to avoid. In particular, the URL changes leave a large number of dead links on the site, so how should we deal with that problem?
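The textbook answer is a 301 redirect from each old URL to its new counterpart, which transfers the old page's weight. A hypothetical Apache .htaccess sketch for a dynamic-to-static change (the URL patterns are assumptions, not the author's actual site):

```apache
# Hypothetical rules: 301-redirect old dynamic URLs to the new static
# ones so the existing weight passes to the new pages.
RewriteEngine On
# e.g. /article.php?id=123  ->  /article/123.html
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^article\.php$ /article/%1.html? [R=301,L]
```

As the sections below explain, when a site has too many page patterns to cover with rules like these, blocking via robots.txt becomes the fallback.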