SEO recommendations every webmaster should know

   When you set out on the road of SEO, you have probably prepared your site in your own way. But not everyone knows whether that way is right, so today's post shares SEO recommendations every webmaster should know.
We hope it helps you shape a better path for your website.

    SEO recommendations are often highly technical and specific in nature. One common mistake businesses make when implementing changes recommended by their SEO agency is failing to clearly communicate why the changes are needed in the first place.
Recommendations often pass through two or more people before they get implemented – from the SEO expert, to a (non-technical) marketing exec, to their internal or external developer, for example. On some occasions this means the reasoning behind the recommendation is lost along the way, and that can have a devastating effect on your SEO.

   This article is compiled from a number of reputable articles about SEO and should give you a fresh look at the subject – including what can go wrong when recommendations are misinterpreted during the implementation process.
The tips in this article are considered “best practices” by Google and will increase the value of your site.

The following recommendations will help search engines find, index, and rank your site well. Even if you choose not to implement them, be aware that the quality recommendations below emphasize avoiding illicit practices that may lead to penalties, up to and including removal of the site from the Google index. If a site has been penalized, it will no longer be listed in searches on Google.com or on other search engines.

When the website is ready:
  • Add links to your site on relevant websites.
  • Submit your website to Google, Bing, and other search engines.
  • Add your sitemap to Google Webmaster Tools. Google uses sitemap submissions to “learn” your site structure and to increase “coverage” of internal pages.
  • Add your site to relevant directories such as the Open Directory Project and Yahoo!, as well as other sites in the same area of interest as yours.
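The sitemap you submit is an XML file that lists the pages you want crawled. A minimal sketch of one (the URL and dates below are placeholders, not values from this article) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2012-01-10</lastmod>
  </url>
</urlset>
```

Only `loc` is required per URL; `lastmod`, `changefreq`, and `priority` are optional hints to the crawler.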
Website design and content recommendations:
  • Build your website with a clear hierarchy of pages and text links. Each page should be reachable through at least one static text link.
  • Provide users with a site map that leads to the important sections of your site. If the site map contains more than 100 links, split it into separate sections.
  • It is very important that the website contains relevant information, presented in a manner that can be easily read by spiders.
  • Research the words you think users will type into search engines, and make sure those words are present on the pages of your site.
  • Use text instead of images to display important names, content, or links. Google crawlers do not recognize text contained in images.
  • Make sure that the page titles and ALT attributes on your website are relevant and correct.
  • Check for “broken links” (links that take you to a page that no longer exists).
  • If your site is built on technology that generates dynamic pages (the URL contains a “?”), be aware that not all search engine spiders read dynamic pages as well as static ones. Keep the parameters short and few in number, and consider URL-rewriting techniques as well.
  • Keep the number of links on a page reasonable (fewer than 100).
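Checking for broken links can be automated. The following is a minimal sketch in Python using only the standard library (the function names are our own, not from any SEO tool): it extracts the links from a page's HTML and flags any that are unreachable or answer with an error status.

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import Request, urlopen


class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    """Return every link found in an HTML document as an absolute URL."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def is_broken(url, timeout=10):
    """True if the URL is unreachable or answers with a 4xx/5xx status."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status >= 400
    except (HTTPError, URLError, ValueError, OSError):
        return True
```

Feed `extract_links` a page's HTML and its URL, then run `is_broken` over each result; note that `is_broken` makes a live HTTP request per link, so throttle it on large sites.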
Technical recommendations:
  • Use a text browser such as Lynx to examine your site, because most search engine spiders see your pages much as Lynx shows them. Technologies such as JavaScript, cookies, session IDs, frames, DHTML, or Flash can prevent search engine spiders from crawling your pages and reading their information.
  • Allow search engine spiders to crawl the site without session IDs or other parameters that track the path followed. These techniques are useful for tracing the routes users take, but search engine bots work entirely differently. Using them may result in incomplete indexing of your site, because bots cannot determine that URLs that look different actually lead to the same page.
  • Make sure that the website is on a server that supports the HTTP If-Modified-Since header. This lets the web server tell search engines whether the content has changed since the last crawl, saving you the unnecessary traffic of spiders re-reading information that has not changed.
  • Use a robots.txt file. It “tells” crawlers which directories may be crawled and which may not. To learn how to build this file, visit robotstxt.org. You can test the correctness of your robots.txt with the analysis tool available in Google Sitemaps.
  • If you use a CMS (Content Management System), ensure that the system can deliver content in a form that can be read by search engine spiders.
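For reference, a minimal robots.txt might look like the fragment below (the blocked directories and the domain are placeholders, only examples of the syntax):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```

`User-agent: *` applies the rules to all crawlers; each `Disallow` line keeps them out of one path, and the `Sitemap` line points them at your sitemap file.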
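You can check whether your server honors If-Modified-Since yourself. This Python sketch (standard library only; the helper names are our own) sends a conditional GET and treats a 304 Not Modified response as support:

```python
from email.utils import formatdate
from urllib.error import HTTPError
from urllib.request import Request, urlopen


def if_modified_since_header(timestamp):
    """Format a Unix timestamp as an HTTP-date, e.g. 'Thu, 01 Jan 1970 00:00:00 GMT'."""
    return formatdate(timestamp, usegmt=True)


def supports_conditional_get(url, last_crawl_timestamp):
    """True if the server answers a conditional GET with 304 Not Modified."""
    req = Request(url, headers={
        "If-Modified-Since": if_modified_since_header(last_crawl_timestamp)})
    try:
        with urlopen(req, timeout=10):
            return False  # 200 OK: the full body was sent anyway
    except HTTPError as err:
        return err.code == 304  # 304: the server honored the header
```

A server that answers 304 sends no body, which is exactly the bandwidth saving the recommendation above describes.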
Quality recommendations

These quality recommendations cover the most common forms of deceptive or manipulative behavior, but Google may also respond negatively to other misleading practices not listed here. Google additionally runs a system of reports on abusive techniques used by some sites; these reports can be used to improve its spam-recognition algorithms, tests, and so on.

Quality recommendations [basic principles]
  • The pages must be built for users, not search engines. It is not advisable to create content for search engines different from content presented to users, a traditional technique called “cloaking”.
  • Avoid tricks used to increase your rank in search engines. It is important to use techniques that target users and not search engines.
  • Do not take part in link schemes meant to increase your site's visibility or PageRank. Using this technique can have the opposite effect, downgrading your site and lowering your PageRank.
  • Do not use unauthorized software to submit pages, increase PageRank, etc.
Quality recommendations [specific recommendations]
  • Avoid hidden (invisible) text or links
  • Do not use techniques like “cloaking” or dubious redirects
  • Do not send automated queries to Google
  • Do not load pages with irrelevant keywords
  • Do not create multiple pages, subdomains, or domains with duplicate content
  • Do not create pages that contain viruses, trojans or other malicious software
  • If your site participates in an affiliate program, make sure you add value to the content. Add unique and relevant content that gives users reasons to visit your website.
     When giving recommendations to improve SEO, you have to be extremely specific with your instructions, and you must go the extra mile to educate the person actually implementing the changes about the reasons behind each change.

   Education is the key to this because it will encourage the Development team to think about the changes they are making from an SEO perspective and understand the sometimes very serious implications of getting it wrong.

  As SEOs, we need to be on our toes and provide as much information as possible to anticipate the problems that can occur if our recommendations are interpreted incorrectly.

   Hopefully this article will go some way toward ensuring people think that bit more about the potential outcomes that may arise if SEO recommendations are misinterpreted or not articulated in full to the person making the changes.


Good luck to you.
