More About On-Page SEO
Now we will discuss keyword density, HTML header tags, sitemaps, and the robots.txt file. Let's start.
- Keyword Density: Keyword density is the number of times a keyword or phrase appears on a web page, expressed as a percentage of the total number of words in the page's content. In search engine optimization, keyword density is one of the factors used to judge whether a page is relevant to a given keyword or phrase, and it is an important part of on-page SEO. For example, if a keyword appears 4 times in a 100-word text, its keyword density is 4%. From a search engine's point of view, an unusually high keyword density is a strong indicator of spam: if a keyword appears too often on a page, search engines will downgrade it and the page will appear lower in search results. As a rule of thumb, the density for each keyword you target should stay between 2% and 4.5%; this range is generally considered good practice in SEO. A minimal calculation sketch appears after this list.
- HTML Header Tags: HTML header tags, also known as heading tags, are used to distinguish the headings and sub-headings of a web page from the rest of its content. There are six levels of header tags: h1, h2, h3, h4, h5, and h6, with h1 being the most important and h6 the least. In SEO, using a sensible mix of header tags is good practice and improves the readability of a page's content; see the short example after this list.
- Sitemap (.xml and .html file): A sitemap is a list of the pages of a website, made accessible to crawlers or to users. It can be either an XML file that lists the site's URLs along with additional metadata about each URL, so that search engines can crawl the site more intelligently, or an HTML file meant for end users. Both matter for SEO, but we usually focus on sitemap.xml, since it is the file we submit to the various search engines through their webmaster tools to help them crawl our website or blog. To generate the .xml or .html file for your site, visit www.xml-sitemaps.com and enter your website's home page, e.g. "http://www.example.com". Set the change frequency as you prefer (monthly or weekly is recommended, depending on how often you update the site), leave Last Modification set to "Use Server's Response" and Priority set to "Automatically Set Priority", then click the 'Start' button. In a few seconds your sitemap.xml and sitemap.html files are generated for download. A sample of the XML format appears after this list.
- Robots.txt File: The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. It specifies how to tell a robot which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites. Not all robots cooperate with the standard: email harvesters, spambots, malware, and robots that scan for security vulnerabilities may even start with the portions of the site they have been told to stay out of. The standard is different from, but can be used together with, sitemaps, which are a robot inclusion standard for websites. List every page or URL that you do not want spiders, bots, or crawlers to visit under a Disallow directive; a short example follows this list.
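To make the keyword density calculation concrete, here is a minimal sketch in Python. The function name and the sample text are my own illustration, not part of any SEO tool, and it only counts single-word keywords:

```python
def keyword_density(text, keyword):
    """Return the keyword's density as a percentage of total words."""
    words = text.lower().split()
    total = len(words)
    # Count exact single-word matches; multi-word phrases would need a different approach.
    hits = words.count(keyword.lower())
    return 100.0 * hits / total if total else 0.0

# 4 occurrences of "seo" in a 100-word text give a density of 4%.
sample = ("seo " * 4) + ("filler " * 96)
print(keyword_density(sample, "seo"))  # -> 4.0
```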
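As an illustration of the header tags, a simple page outline might look like the following (the heading texts are placeholders):

```html
<h1>Main Topic of the Page</h1>   <!-- one h1 per page is common practice -->
<h2>First Major Section</h2>
  <h3>Sub-point of the first section</h3>
<h2>Second Major Section</h2>
  <h3>Sub-point of the second section</h3>
    <h4>Finer detail</h4>         <!-- h4 to h6 are used less often -->
```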
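For reference, a sitemap.xml file follows the standard sitemaps.org format. Here is a minimal sketch with a single URL; the address, date, frequency, and priority are placeholder values of my own, not output from the generator tool:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>    <!-- last modification date -->
    <changefreq>weekly</changefreq>  <!-- the update frequency you chose -->
    <priority>1.0</priority>         <!-- the home page usually gets the highest priority -->
  </url>
</urlset>
```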
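And here is a small robots.txt sketch showing the Disallow directive in practice; the paths are placeholders of my own. The file lives at the root of the site, e.g. http://www.example.com/robots.txt:

```
User-agent: *       # the rules below apply to all crawlers
Disallow: /admin/   # keep crawlers out of the admin area
Disallow: /private/ # and out of private pages

Sitemap: http://www.example.com/sitemap.xml   # optional pointer to your sitemap
```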