Tuesday, March 14, 2017

What does Google think?

Google seems to be constantly changing and upgrading, but here is one area where they have been consistent:

Google has been very clear: mobile pages are going to be more important than the desktop versions of your web pages. In the near future, Google's index will be a mobile-first index. Are your pages ready for Google's new index? How do your web pages look when they are crawled by a mobile bot?

More on this:

What is Google's mobile-first index?

Some weeks ago, Google announced that web pages will be ranked based on the mobile version of the pages:
"Today, most people are searching on Google using a mobile device. However, our ranking systems still typically look at the desktop version of a page’s content to evaluate its relevance to the user.
This can cause issues when the mobile page has less content than the desktop page because our algorithms are not evaluating the actual page that is seen by a mobile searcher.
To make our results more useful, we’ve begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results."
The mobile version of your website is going to be more important than the desktop version of your website soon. If your website cannot be displayed correctly on mobile devices, your rankings might drop dramatically.

More:

Can there be problems with the mobile version?
Some website owners want to make their mobile pages more streamlined. For that reason, not all of the content of the desktop version is included in the mobile version.
When Google uses a new index that is based on the mobile version of your pages, this missing content will not be evaluated and your rankings might change dramatically.
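One way to spot this problem is to compare the visible text of the desktop and mobile versions of a page. Here is a minimal sketch using only Python's standard library; the `desktop` and `mobile` strings are hypothetical sample pages, and a real check would fetch both versions of your own URLs.

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible words of a page, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.words = set()
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words.update(re.findall(r"[a-z0-9]+", data.lower()))

def words_in(html):
    parser = TextExtractor()
    parser.feed(html)
    return parser.words

# Hypothetical desktop and "streamlined" mobile versions of the same page:
desktop = "<p>Acme widgets are durable, waterproof and cheap.</p>"
mobile = "<p>Acme widgets are durable.</p>"

# Words a mobile-first crawler would never see:
missing_on_mobile = words_in(desktop) - words_in(mobile)
print(sorted(missing_on_mobile))  # → ['and', 'cheap', 'waterproof']
```

If this set is large, the mobile version is hiding content that Google will no longer take into account.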

Reasons why your web pages do not get ranked
There can be several "invisible" reasons why your web pages do not get high rankings on Google and other search engines:
1. The robots.txt file is not correct
If your content management system offers a development mode, chances are that the robots.txt file of your website blocked all search engines when you developed the website.
If you did not change the robots.txt file of your website, search engine robots will still be blocked. Remove the blanket "Disallow: /" rule from your robots.txt file to make sure that your web pages can be accessed by search engine robots.
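You can test what your robots.txt actually allows with Python's standard `urllib.robotparser` module. This sketch parses two rule sets directly, so it needs no network access; the example.com URL is just a placeholder for one of your own pages.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that a development mode typically leaves behind:
blocking = RobotFileParser()
blocking.parse(["User-agent: *", "Disallow: /"])

# The corrected file, with the blanket "Disallow: /" rule removed:
fixed = RobotFileParser()
fixed.parse(["User-agent: *", "Disallow:"])

url = "https://example.com/page.html"
print("blocking:", blocking.can_fetch("Googlebot", url))  # → False
print("fixed:   ", fixed.can_fetch("Googlebot", url))     # → True
```

In practice you would point `RobotFileParser` at your live robots.txt with `set_url()` and `read()`, then check the URLs you want indexed.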
2. The HTTP status code of your pages is not correct
When search engine robots and normal visitors request a page from your server, your server answers with a so-called HTTP status code. This status code cannot be seen by the visitor as it is targeted at the program that requests the page.
The status code for a normal page should be '200 OK'. All other status codes mean that there is something special with the page. For example, 4xx status codes mean that the page is broken, 5xx status codes mean that there is a problem with the server, etc.
Some servers have configuration errors and deliver the wrong HTTP status code. Human visitors usually do not notice this because the page still displays in the browser. Search engine robots, however, won't index your web pages if they get the wrong HTTP status code.
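The status code ranges described above can be summarized in a small Python helper. This is a simplified sketch of how a crawler interprets the code, not Google's actual logic:

```python
def describe_status(code):
    """Rough meaning of an HTTP status code, as a crawler would see it."""
    if 200 <= code < 300:
        return "OK - the page can be indexed"
    if 300 <= code < 400:
        return "redirect - the robot is sent to another URL"
    if 400 <= code < 500:
        return "client error - the page looks broken"
    if 500 <= code < 600:
        return "server error - the server has a problem"
    return "non-standard status code"

for code in (200, 301, 404, 500):
    print(code, describe_status(code))
```

You can see the status code your own server returns with a tool such as `curl -I https://example.com/` and compare it with what a normal page should send ('200 OK').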
3. There are other technical errors
Other technical errors can also have a negative influence on your rankings. For example, the HTTPS settings of your website might not be correct, or the pages might load too slowly.
In addition, websites accumulate errors over time. Some links on the pages become obsolete, old pages do not fit the new website design, etc. If a website contains too many of these errors, it will look like a low-quality website.
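Finding obsolete links is the kind of check that can be automated. The sketch below collects the link targets of a page with Python's standard `html.parser` module; the sample HTML is hypothetical, and checking each collected link would then mean requesting it and looking at its HTTP status code.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A hypothetical page with one current and one obsolete link:
page = """<a href="/products.html">Products</a>
<a href="/old-page.html">An obsolete link</a>"""

collector = LinkCollector()
collector.feed(page)
print(collector.links)  # → ['/products.html', '/old-page.html']
# Each collected link would then be requested; 4xx or 5xx
# answers mark it as broken.
```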

Rossini.com can help you find these problems! Call me at 913-244-6132 or e-mail me at jrossini@rossini.com.



