August 12

What to do when your site is banned by Baidu or Google

Some pages use JavaScript or meta refresh redirects, and search engines may treat these as suspicious practices. 302 redirects can also cause problems. Many owners run a large number of websites and cross-link them all, which is likely to cause trouble. One person owning four or five sites is understandable, but forty or fifty sites, each one small and low quality yet all interlinked, looks suspicious.
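As a quick sanity check, here is a minimal Python sketch (using the requests library and a hypothetical URL of my own choosing) that shows whether a page answers with a 301/302 redirect or hides a meta refresh inside its HTML:

```python
import requests

# Hypothetical URL to test; substitute a page from your own site.
url = "https://example.com/old-page"

# Fetch without following redirects so the raw status code is visible.
resp = requests.get(url, allow_redirects=False, timeout=10)

if resp.status_code in (301, 302):
    print(resp.status_code, "redirect ->", resp.headers.get("Location"))

# A meta refresh hides the redirect in the HTML body instead of the headers.
if 'http-equiv="refresh"' in resp.text.lower():
    print("Page contains a meta refresh tag")
```

If you must redirect, a 301 (permanent) redirect is generally the safer choice for search engines than a 302 or a meta refresh.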

Search for the domain name, or use the site: operator, in the search engine. If none of the site's pages appear in the search engine's index, you can draw a conclusion: either the server has a problem, robots.txt is blocking the spiders, or cheating has been detected. If the search engine still has the site's pages but rankings have declined, there are only two possibilities: either the site shows suspicious signs of cheating and is being punished, or the search engine's algorithm has changed. The decline that matters is not a drop from first place to second, but a dramatic fall; that is what should get a webmaster's attention.

Has the server been down recently? Are the server settings normal? When search engine spiders come to crawl, does the server return a 200 status code? If the server has had a problem, the disappearance should be only a temporary phenomenon.
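To see roughly what a spider receives, you can request your own page with a crawler-style User-Agent. A minimal sketch, again with a hypothetical URL (note that the real Googlebot is identified by IP as well, so this only approximates its view):

```python
import requests

# Hypothetical URL; point this at your own homepage.
url = "https://example.com/"

# Present a spider-like User-Agent to see what a crawler would receive.
headers = {
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"
}

resp = requests.get(url, headers=headers, timeout=10)
print("Status code:", resp.status_code)  # A healthy page returns 200
```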

Check your outbound links: do you link only to related sites? Do you link only to high-quality websites? Have any of the sites you link to been banned or punished?
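Auditing outbound links by hand is tedious, so here is a rough sketch (assuming requests and BeautifulSoup are installed, and using a hypothetical page URL) that lists every off-site link and whether it still responds. It only checks reachability; judging whether a linked site has been penalized still requires searching for that domain yourself:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

# Hypothetical page whose outbound links we want to audit.
page = "https://example.com/links.html"
own_host = urlparse(page).netloc

html = requests.get(page, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Collect every link that points off-site.
outbound = {
    a["href"]
    for a in soup.find_all("a", href=True)
    if a["href"].startswith("http") and urlparse(a["href"]).netloc != own_host
}

for link in sorted(outbound):
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "unreachable"
    print(status, link)
```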

I believe every webmaster doing search engine optimization has, in the course of running a website, more or less encountered a site being banned or punished, or heard fellow webmasters describe it. So what should you do?

There are many specific optimization techniques and methods, such as keyword selection, writing the title tag, keyword density and placement, website structure, and so on. But if you push every one of these techniques to its limit, trouble is not far away.

The robots.txt file is used to keep search engines out of certain directories or files. The function is very useful, but it is also easy to get wrong. If there is a problem with your robots.txt settings, search engine spiders may be unable to crawl your site at all. The difference between a correct setting and a wrong one can be tiny, so check it repeatedly to make sure it is right.
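For example, `Disallow:` with nothing after it allows everything, while `Disallow: /` blocks the entire site; the two differ by a single character. Python's standard library can verify what your file actually permits. A minimal sketch, assuming a hypothetical site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt location; point this at your own site.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Verify that the pages you want indexed are actually crawlable.
for path in ("https://example.com/", "https://example.com/blog/post.html"):
    for bot in ("Googlebot", "Baiduspider"):
        verdict = "allowed" if rp.can_fetch(bot, path) else "BLOCKED"
        print(bot, path, "->", verdict)
```

If a page you expect to rank shows up as BLOCKED here, fix robots.txt before suspecting a penalty.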

There are many ways to determine whether a website has been banned or punished. I would confirm it through these detection methods and rule out causes one by one: if the site was banned, immediately find the source of the problem and fix it; if it was punished, work out how to deal with the penalty.

Excessive optimization is now an important reason rankings get punished. This is a question of degree: how much optimization is appropriate, and how much crosses into over-optimization, can only be mastered through experience. Don't count on luck to get away with it; otherwise things will end badly.

