Transcribed

5 Aspects of Technical Search for Improved Website Health

1 ROBOTS.TXT

This is a public file that is used to block access to specified files and folders on your server. Blocked content will be disallowed from search engine results pages (SERPs). Blocked content can be things like search results, query parameters, or template files that have no SEO relevance. One additional use for the robots.txt file is to include an XML Sitemap reference, should you have an XML Sitemap. This gives the search engines a direct path to a special file that includes all relevant URLs in an easy-to-digest feed for the crawlers. (A sample file is sketched after this transcription.)

2 XML SITEMAP

An XML Sitemap is a feed that lists all relevant URLs for a given website in one file used by search engines. This is not the same as an HTML site map, which is for human consumption (it helps visitors find content on a website). The XML Sitemap is the best way to ensure that your site will have a good chance of being indexed properly. This file reduces the chances of a search spider aborting a crawl because of poor coding or inaccessibility on your site. (A minimal sitemap is sketched below.)

3 WEBMASTER TOOLS ACCOUNTS

If you don't already have a Google or Bing Webmaster Tools (WMT) account, create one and get your site(s) verified. These accounts provide reports related to indexation status, page errors, HTML recommendations, search queries, etc. Webmaster Tools accounts are key if you want to maintain a healthy website. Enabling alerts also gives you the peace of mind that if something isn't right on the site, like inaccessibility, you'll get notified.

4 WEBMASTER TOOLS MANUAL SUBMISSIONS

The ability to submit XML Sitemaps to WMT accounts provides the search crawlers with direct access to the file. Submit your XML Sitemap to help with indexation. You'll also want to submit your URLs when you make significant changes to page content on your website. This process is often referred to as performing a "fetch". In Google's case, one would do a "Fetch as Googlebot", which returns an exact instance of your source code as the crawler sees it. Once you have confirmed that the changes you made are present, you can submit the page(s) to the search engine for indexing. Results are not always immediate, but oftentimes you can see an instant result when reviewing keywords and terms within the SERPs. (A submission sketch appears below.)

5 TITLE AND META TAG OPTIMIZATION

Title Tags: Page titles are required on all HTML documents (web pages) and should always be unique to the page content. If your Home, About, Contact, and Services pages all have the same title tag, edit them to be more unique and relevant. A change like this will help improve your organic position.

Meta Tags: Similarly, description tags should also be unique and relevant to the page title as well as the content found on the page they reside on. While description tags aren't always used in search engine algorithms, the benefit of having an optimized description tag is that the right content presented to a human searcher will get the most attention and, hopefully, the most clicks. (An example head section is sketched below.)

Consider this: if your website were a book and each web page were a chapter, you'd want unique titles and descriptions for each chapter to show the reader that the content was different throughout the whole book. The same approach should be taken when writing and setting SEO titles and descriptions for your website's pages.

Designed by Hall Internet Marketing | © 2014 HALL INTERNET MARKETING | www.hallme.com
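
The sketches below expand on the transcription. For the robots.txt file in aspect 1, a minimal example might look like this; the blocked paths and the sitemap URL are placeholders for illustration, not recommendations from the original graphic:

    # robots.txt, served from the site root, e.g. https://www.example.com/robots.txt
    User-agent: *
    # Keep internal search results and low-value template files out of crawls
    Disallow: /search/
    Disallow: /templates/
    # Block any URL containing a query parameter (the * and ? wildcards are
    # extensions honored by Google and Bing, not part of the original standard)
    Disallow: /*?

    # Give crawlers a direct path to the XML Sitemap
    Sitemap: https://www.example.com/sitemap.xml

Keep in mind that robots.txt is a publicly readable request, not access control: it keeps compliant crawlers out of the listed paths, but it does not protect private files.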
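
The XML Sitemap from aspect 2 follows the sitemaps.org protocol. A minimal sketch, assuming a hypothetical site with two pages (all URLs and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- Only <loc> is required for each entry -->
        <loc>https://www.example.com/</loc>
        <!-- <lastmod>, <changefreq>, and <priority> are optional hints -->
        <lastmod>2014-03-19</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://www.example.com/services</loc>
        <lastmod>2014-03-01</lastmod>
      </url>
    </urlset>

The protocol caps a single file at 50,000 URLs; larger sites split the list and tie the pieces together with a sitemap index file.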
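
For the manual submissions in aspect 4, the normal route is the WMT interface itself. As a supplement, both Google and Bing documented a simple HTTP "ping" endpoint for sitemaps around the time this graphic was published; both endpoints have since been retired, so treat this Python sketch purely as a historical illustration (the sitemap URL is a placeholder):

    # Python 3 sketch of the historical sitemap ping, for illustration only
    from urllib.parse import quote
    from urllib.request import urlopen

    SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

    for endpoint in (
        "https://www.google.com/ping?sitemap=",
        "https://www.bing.com/ping?sitemap=",
    ):
        # A plain GET with the URL-encoded sitemap address was all it took
        with urlopen(endpoint + quote(SITEMAP_URL, safe="")) as resp:
            print(endpoint, resp.status)

"Fetch as Googlebot" itself had no equivalent endpoint; it was performed inside the Webmaster Tools interface.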
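
Finally, a sketch of what aspect 5's advice looks like in markup, using a hypothetical services page (the business name and copy are invented for illustration):

    <head>
      <!-- Title tag: unique per page, topic first, brand last -->
      <title>Web Design Services | Example Company</title>
      <!-- Description meta tag: unique per page and written for the human
           searcher scanning the results page, since engines may show it
           as the SERP snippet even when it is not a ranking factor -->
      <meta name="description"
            content="Custom web design services from Example Company, from responsive redesigns to full site builds.">
    </head>

If Home, About, Contact, and Services all carried the same generic title, such as "Example Company", each would need its own topic-specific version like the one above.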

Performing SEO on a website is extremely important if that website is expected to rank and perform well within search engine results pages. But SEO isn’t just about keywords and search terms. In thi...
