
This space holds helpful information about software and hardware and vendors that Dr. Hain has discovered by trial and error. Perhaps if enough of us do this, searches on these devices or companies will bring up more relevant information.

SEO -- Search engine optimization tools

Last edited: March 14, 2021

Changes to search engines over the last 10 years have made life more difficult for content creators: whether one's work appears to searchers now depends far more on presentation than on content. This is an unfortunate degradation in web quality, but I guess you really can't complain about free stuff like Google search. This page outlines my efforts to comply with Google's mandates to make web pages better suited for its crawler.

As a general comment, complying with this stuff could be a LOT easier. One could, for example, easily write a locally run tool (let's say in JavaScript or PHP) that directly fixes mobile readiness and missing meta tags. One could have a prototype .htaccess file that fixes the redundancies noted below. There are locally run tools, but they don't do this (e.g. Screaming Frog). Well, anyway -- let's proceed.
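To illustrate how simple such a locally run tool could be, here is a minimal sketch (in Python rather than JavaScript or PHP, and only a sketch -- the path and regex are assumptions, not a finished product) that scans a local copy of a site for pages with no meta description:

```python
# Sketch of a locally run checker for missing meta descriptions.
# SITE_ROOT is a hypothetical placeholder; point it at a local copy
# of your site. A real tool would also *insert* the missing tags.
import os
import re

SITE_ROOT = "."  # hypothetical: your local site directory

META_DESC = re.compile(r'<meta\s+name=["\']description["\']', re.IGNORECASE)

def pages_missing_description(root):
    """Return paths of HTML files that lack a <meta name="description"> tag."""
    missing = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.lower().endswith((".htm", ".html")):
                path = os.path.join(dirpath, name)
                with open(path, encoding="utf-8", errors="ignore") as fh:
                    if not META_DESC.search(fh.read()):
                        missing.append(path)
    return missing
```

Run against a site directory, this prints a to-do list of pages to fix -- the sort of thing one currently pays Screaming Frog to produce.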

The main steps are: make the site mobile ready, eliminate errors such as links that no longer work, add meta descriptions, and eliminate "redundant" ways to get to the same content.

Your site must be mobile ready

  1. This is an important first step -- see this page for details about making your site suitable for cell phones. In other words, if you write a 10-page discussion of some academic topic, Google forces you to make it suitable for display on a smartphone. Not always that easy.
  2. Check with Google Search Console. This is a little complicated, as you have to prove you own the website. It is still the best option for a big website.
  3. Use the Chrome developer tools, and click on the little mobile (device toolbar) icon at the top. Once you figure out how to access it, this is pretty easy and quick for a single page. It works better than Google when you are tweaking, as it shows the entire page rather than just a "snapshot" of the mobile crawler page.
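One quick local sanity check (my own rule of thumb, not Google's official test): a page with no viewport meta tag will almost never be judged mobile ready. A few lines of Python can flag such pages:

```python
# Check whether a page declares a viewport meta tag -- a necessary
# (though not sufficient) condition for passing mobile-friendly tests.
import re

def has_viewport_tag(html: str) -> bool:
    """True if the page contains <meta name="viewport" ...>."""
    return bool(re.search(r'<meta\s+name=["\']viewport["\']', html, re.IGNORECASE))

sample = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
print(has_viewport_tag(sample))  # True
```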

Use a tool to scrub your site for errors, including links that no longer work -- these are all website crawlers

Screaming Frog -- this is a paid tool (about $150/year) that is explicitly designed for SEO. It has an unfortunate bug: if left running overnight, it goes into a very high CPU use mode, and the next morning will find you with a greatly crippled PC, requiring the Task Manager to stop it. We use this program and put up with its bugginess. It is very useful for finding 404 errors and missing meta descriptions. It is pretty, but the interface is very busy.

Xenu -- this is a free tool that crawls your website and produces a very useful list of broken links. It is a little strange and not very pretty, but it works well. Highly recommended.
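The core of what these link checkers do is not magic. Here is a bare-bones sketch in the spirit of Xenu, using only the Python standard library; it checks a given URL rather than crawling (a real tool would also parse pages and follow links):

```python
# Minimal link checker: issue a HEAD request and classify the result.
# This is a sketch, not a replacement for Xenu or Screaming Frog.
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

def check_link(url, timeout=10):
    """Return (url, status) where status is an HTTP code or an error string."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-check"})
        with urlopen(req, timeout=timeout) as resp:
            return url, resp.status
    except HTTPError as err:        # server answered, but with 4xx/5xx
        return url, err.code
    except URLError as err:         # DNS failure, refused connection, etc.
        return url, str(err.reason)

def is_broken(status):
    """A link counts as broken on a 4xx/5xx code or any transport error."""
    return not (isinstance(status, int) and 200 <= status < 400)
```

Feeding every link on the site through check_link and printing the ones where is_broken is true reproduces, crudely, the broken-link report these tools sell.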

ahrefs -- this is a very expensive and buggy tool that has a one-month trial. After the trial, the monthly price (for the "Lite" plan) is $99. Whew -- $1200/year. It is easy to use, but frequently it is just wrong. It also doesn't work on Chrome (can't sign in). Not recommended, other than the free trial (why not). These folks need to have a less expensive tier -- perhaps $100/year.

Regarding the general subject of external links: while this is probably heresy, we don't really see why one needs to have them. Links from another site to yours are called "backlinks"; links from your site to another, called "external links", are trouble. External links are -- external. They may vanish, and they are often problematic for the crawler tools: you might be able to browse to them, but your website crawler tool can't. It is a whole lot easier to just get rid of them all. What we have generally done is to convert them into plain text. Of course, this breaks the general process of ranking sites by counting links, and if everyone did this, there would be no links. Still, for a small website without a team of web developers to keep it up to date, we think it is better to get rid of them.
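The "convert external links to text" cleanup can itself be automated. A sketch follows (the domain name and the regex are assumptions for illustration; an HTML-aware parser would be more robust than a regex on messy pages):

```python
# Replace off-site <a> tags with their visible text, keeping internal
# and relative links intact. OWN_DOMAIN is this site's domain.
import re

OWN_DOMAIN = "tchain.com"

ANCHOR = re.compile(
    r'<a\s+[^>]*href=["\'](?P<href>[^"\']+)["\'][^>]*>(?P<text>.*?)</a>',
    re.IGNORECASE | re.DOTALL)

def unlink_external(html: str) -> str:
    """Strip anchor tags that point off-site; keep only their link text."""
    def repl(m):
        href = m.group("href")
        if href.startswith(("http://", "https://")) and OWN_DOMAIN not in href:
            return m.group("text")      # external: drop the tag, keep the words
        return m.group(0)               # internal or relative link: untouched
    return ANCHOR.sub(repl, html)
```

Running this over each page turns every outbound anchor into plain text while leaving the site's internal navigation alone.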

Get rid of redundancy, as Google search is confused by it.

Force HTTPS so there is not an "http" and an "https" version of the same material. Two versions confuse Google, and HTTPS is also safer. This can be done using an .htaccess file:

# from https://www.34sp.com/kb/152/how-to-force-https-using-the-htaccess-file
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Force either the www version or the non-www version of your site, but not both. Having two versions confuses google. This again can be done using an .htaccess file. The code below is used on this website, to block the longer "www" versions, and convert them all into non-www versions.

# Rewrite www to non-www. https://www.siteground.com/kb/how_to_redirect_www_urls_to_nonwww/
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.tchain\.com [NC]
RewriteRule ^(.*)$ https://tchain.com/$1 [L,R=301]

(Note the dots in the host name are escaped, and the redirect target is https rather than http -- otherwise this rule and the HTTPS rule above would chain two redirects.)
Note that your .htaccess file does not need to be executable, and should be writeable only by the owner. This means permissions should be 644.

If you have two or more domain names that point to the same material, get rid of all but one. Having two domains again seems to confuse Google, as it decides there is duplicate content or perhaps plagiarism. The most straightforward way to do this is simply to disconnect the nameserver from the extra domain(s), or perhaps send them to a subdirectory of your main site that displays a message ("this domain has moved"). One can also use an .htaccess file to rewrite the duplicate domain to the preferred domain. This is messier than either of the methods above, as it adds overhead to your site and prevents users from "learning" to stop using the duplicate domain.
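For completeness, the .htaccess route would look something like the following. This is a sketch in the same style as the snippets above; "olddomain.com" is a hypothetical stand-in for the duplicate domain, not a real example from this site.

```
# Hypothetical: redirect all requests for the duplicate domain
# (with or without www) to the preferred domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ https://tchain.com/$1 [L,R=301]
```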


© Copyright May 3, 2021 , Timothy C. Hain, M.D. All rights reserved.