Website Best Practices; HTTPS, SEO, Sitemaps, Robots, Redirects & Analytics – ECT 14, Mike Hoyles


Now we get to the actual website itself and its reach – yours or your client’s footprint on the internet, and most people’s first interaction with the brand. What is the first impression? What was your first impression? First impressions matter. Were you able to find the site easily? Once you’re on the site, can you easily find the content that you’re looking for? Is navigation simple or is it a nightmare? Is the messaging and content clear and concise? Are there hurdles or unnecessary pages, popups, errors, broken links – things in your way as you are trying to accomplish something? Did you have the urge, even one time, to leave the page or leave the website? Was it loading fast enough? Was it loading slowly? Was it not loading at all?
There are several organizations online that annually release the top 200 ranking factors that Google considers when ranking your site in its search engine index, or its search engine results page. 200! The fact of the matter is that 200 is insane – Google really only puts emphasis or weighting on about 10 to maybe 15 things. However, I do encourage you to Google the full list, “Top 200 Google Ranking Factors,” as there may be some binary things that are simply missing from your site or from your client’s site that are beneficial to have in place. They’re either there or they’re not.
With all that said, I look at four critical factors whenever I look at a client’s website: Is it 100% indexed? Is it 100% accessible? What are they doing with on-page or on-site factors, the things they can control? And how are they performing with off-page or off-site factors, the things they can’t control and can only really influence at best? Those are usually things like third-party blogs, social media posts, and reviews – things pointing in with either a positive or negative signal. So that’s it. Four things, high level: getting found, being accessible or being indexed, what you control on-page, and what you don’t control off-page. However, each one of these things can then go a mile deep. OK. That’s what matters. Getting indexed, or getting ranked: this simply means that you want to tell Google that you are open for business.
Please crawl my website; please list all my pages, my content, my video, my imagery, my products, etc. – or perhaps you’re not? I had a client reach out to me once. They said that their website had been live for weeks, but they were not getting a single visit from organic search – hey, what’s going on? Caveat, right now: I didn’t create this website. They had their site created by a third party; I knew them through a colleague. They reached out to me and said, we have this brand new website, it lists all of our products and our services, we think we have the necessary content, etc. – why are we not getting any traffic through search?
I took one look at a tiny little file on their website called a robots text file, or robots.txt. It has three key directives – and I’m gonna put it up on the screen afterward in editing. Three directives: DISALLOW, ALLOW and SITEMAP. That’s it. DISALLOW tells Google what not to crawl on your site and where not to look for pages to index. These are things like your admin panel, your back-end analytics, or log files – things that don’t need to be on Google. ALLOW tells Google everywhere else to crawl on your site, which is usually everywhere, so it’s often marked with just an asterisk, which means all or everything. Old school, it meant wildcard – right, you nerds out there that did old-school programming. Then lastly, SITEMAP is simply where to find your sitemap. It’s just a location, a URL, typically something like website.com/sitemap.xml. That’s it. ALLOW. DISALLOW. SITEMAP. It’s technical but it is very important: it’s only a few lines in a plain text file, relatively straightforward, but if it is improperly implemented it has the power to tell Google to DISALLOW your whole website.
So the client who reached out to me had his entire website accidentally blocked from Google’s index by telling Google to disallow his entire site. That DISALLOW had the asterisk. So Google hits your robots file first, sees DISALLOW ALL, and just says OK and walks away, because it’s not going to crawl the rest of your site. That DISALLOW line is like the bouncer at the door that says no. Literally one keystroke – deleting one character, the asterisk, from the disallow field – permitted Google to see and crawl and index their entire site, and gradually they started to see the organic traffic flow in.
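To make that concrete, here is roughly what the before and after looked like – a reconstruction rather than the client’s actual file, and the domain is just a placeholder. The only difference between “blocked” and “open for business” is one character on the Disallow line:

# Broken – the asterisk tells Google to stay away from everything
User-agent: *
Disallow: *

# Fixed – an empty Disallow value blocks nothing
User-agent: *
Disallow:
Sitemap: https://website.com/sitemap.xml

The more common way you’ll see an entire site blocked is Disallow: / (a single slash), which has exactly the same effect.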
Go to any major site on the internet. Any site – any airline, hotel, Dell.com, HP, whatever. Any massive website. After the dot-com, put in /robots.txt and you’ll see exactly what one of these looks like. Now, to be fair, those websites are massive, and the file might go all the way down the page because they’re going to have a back-end portal and a thousand disallows, whereas your average ecommerce storefront, blog or third-party site is just gonna have a couple of little fields: disallow the WordPress admin back-end panel or your dashboard, and your analytics; then allow everything; and then the sitemap URL. It’s really just three or four lines. So now, check your website. Do you have one? Importantly, if you do have one, is it correct?
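For reference, a typical small WordPress or ecommerce site’s robots.txt really is just those three or four lines – something along these lines, with the domain and paths as placeholders:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://website.com/sitemap.xml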
So next is being found and accessibility. Ensuring that your site and your content are easily found is going to consist of things like that sitemap. The robots file points to the sitemap, but do you even have the sitemap? Sitemap.xml is the same idea – it lives off the root of your website: website.com/sitemap.xml. It’s basically what Google sees when it comes to your website. Google’s crawlers – Googlebot, its algorithm and its crawlers – don’t see the front end of a website the way you or I look at it and scroll down and see images and content, everything templated and nice in its little containers and margins and so on down the page. That is not at all what Google sees. If you use Chrome on a Mac, press Command + Option + U (Ctrl + U on Windows), or go to the View menu and choose the developer option to view source, and you’ll see the source code – that mess on the page. That is what Google sees. So a sitemap helps Google make some sense of your holistic website – basically, just what the hell is going on in the back end. Use an XML sitemap for search engines and an HTML sitemap for users. The HTML one is your front end; it lives in your footer or your navigation to help people actually find what they’re looking for on your site, assuming that your site has a bunch of different pages, content, categories, products or services, etc. The sitemap just tells Google where to find all of this content, and it’s categorized that way.
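A bare-bones XML sitemap, for example, is just a list of URLs in a standard wrapper – the URLs and date below are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://website.com/</loc>
    <lastmod>2019-06-01</lastmod>
  </url>
  <url>
    <loc>https://website.com/products/mens-running-shoes/</loc>
  </url>
</urlset>

Most modern platforms and SEO plugins will generate and update this file for you; the point is simply that it exists and is referenced from robots.txt.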
Being secure – so, HTTPS. Google has, in the last couple of years, indicated – and now they’re making good on it – that HTTPS matters. HTTPS is just the padlock in the address bar: an SSL Certificate (Secure Sockets Layer) that turns your HTTP into HTTPS. It tells Google that your website now encrypts sensitive data and is capable of secure transactions with personal information such as credit cards or passwords. So if someone has to log into your website or you accept any form of payment online, you’re going to want the SSL Certificate – the secure sockets layer, the padlock, the encryption – to make your HTTP, HTTPS.
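Once the certificate is installed, the usual last step is forcing every visitor onto the secure version of the site. On an Apache server, for instance, that can be done with a few lines in the .htaccess file – this is one common approach, not the only one, and it assumes Apache with mod_rewrite enabled:

# Send every plain-HTTP request to the HTTPS version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]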
Naturally, accessibility also consists of things like SEO – search engine optimization tactics and strategies, which are part of the on-page content – but also technical aspects. Accessibility is also your page load speed and your mobile accessibility. These are huge ranking factors. Earlier, when I mentioned that Google considers 200 things but only 10 to 15 matter, these are all the way up there: your page load speed, whether your website is secure, those types of things are massive, massive ranking factors. You can have better content than the next person, but if your site is not structurally and technically sound, a lot of the time that content just won’t get the same kind of traction or engagement or impact online as it would if the site were technically sound and then you posted that valuable content. Poor page load speed will not only hurt you in search engine ranking, but it will also have a significant impact on user experience and bounce rates. Your abandonment is going to go super high. Your bounce is going to go super high. And your conversion is ultimately going to go down. People don’t stick around to wait for a page to load. If they don’t find exactly what they are looking for, when they are looking for it, they will leave and find an alternative way of getting that information, learning about that product, or learning about that service. The average time that people are willing to wait gets shorter every year. Back in 2013 it was something like 3 or 4 seconds – 3.5 seconds. By 2015, just two years later, that was down to 3 seconds flat. The latest statistic I saw, in 2019, was closer to 1.5 or 1.6 seconds. So expectations are soon to be, if not already, instantaneous.
So make sure your content is loading quickly. Have your larger images optimized so the file size is not too big. Implement what’s called a lazy loader. Basically, a lazy loader handles the content on your site that’s considered below the fold: if this is your screen, the content down here that you haven’t gotten to yet doesn’t load in advance of the user scrolling down, because loading it all up front would slow down your site speed even more. If your website is long – say four screens’ worth of scrolling – you don’t need screens two, three and four to load until the user actually scrolls down to reach that content. That way the very first screen loads much faster with a lazy loader, so look that up and check into that.
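One simple way to get that behavior on images, assuming a reasonably modern browser, is the native loading attribute in HTML – the file name and dimensions here are just placeholders:

<!-- Below-the-fold image: the browser defers fetching it until the user scrolls near it -->
<img src="/images/mens-running-shoes.jpg" alt="Men's running shoes" width="800" height="600" loading="lazy">

Combining that with properly compressed images and explicit width/height attributes keeps the first screen light and prevents the layout from jumping around as things load.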
A huge part of accessibility is also redirects: broken pages and proper webpage decommissioning. I have encountered this one not just many times – too many times. Too, too, too many times. I’ve had people contact me saying, what’s going on with our website, what’s going on online – their pages used to rank for all these terms, and now they don’t. And I go into their back end, take a look at what’s going on, and I see that they had pages that were essentially outdated or no longer applicable, so the client, their internal team or their external team, their agency partners, whomever – I’ve seen it all – flat out deleted the pages. Just flat out deleted the category or the page or the content entirely.
Here is why that’s a terrible idea. Once that page or its content has been live for, let’s say, 6 to 12 months, it will have been indexed in Google’s search results for a number of different search terms, keyword groups or phrases. Now you are providing your potential customers with a broken experience. Yes, Google will correct it once it learns that the page is no longer there – it will stop being indexed in Google’s search results – but the impact is immediate. Google won’t de-index it instantly; for some smaller sites it could take a couple of weeks for that link to come out of search results. So in the meantime, everyone searching for those keywords or phrases who sees your result and clicks on it gets a broken page: they get 404’d and they bounce back to the Google search results listings page.
Secondly, that URL, having been live for 6 or 12 months, has very likely acquired backlinks – other sites online pointing into that page. Social media posts, third-party websites, blogs, image and video links: all very valuable things to have pointing into your website, and all things that Google takes into consideration when ranking your overall website for specific search terms or keyword groups. Say you sell running shoes. You have 500 other sites online pointing into yours saying, these guys sell the “best running shoes” – and that’s the link; the words “best running shoes” are the actual clickable text that drops people onto your page. Google sees that 500 times on 500 different websites, and it’s going to start associating your website – or that particular page, the page that sells running shoes – with “best running shoes”.
That same page you just deleted – because you updated your site, or you migrated, or you moved the content or the website elsewhere – is now a broken experience. That’s what redirects are for, and there are two types of redirects. Permanent – a “301” – means the page has permanently moved and that old URL is not coming back; that’s a 301 redirect. Temporary – a “302” – is typically for something seasonal or promotional, where you plan to use that exact same URL again in the future; that’s a 302 redirect.
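On an Apache server, for instance, the two look almost identical in .htaccess – only the status code changes. The paths and domain below are placeholders, not anything from the episode:

# Permanent: the old URL has moved for good
Redirect 301 /old-collection/ https://website.com/new-collection/

# Temporary: a seasonal or promotional URL you plan to reuse
Redirect 302 /summer-sale/ https://website.com/current-promotions/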
Smart migration of any website or its content consists of one-to-one page mapping. What that means is this: the page that was dedicated to men’s running shoes, which could be website.com/products/mens-running-shoes/, now points to newwebsite.com/products/mens-running-shoes/. That’s a one-to-one correlation. When I click on that link from any one of those 500 sites pointing in – when I click on “best running shoes” – I’m expecting to be dropped onto a running shoes product page. One-to-one is pertinent.
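In practice, a one-to-one migration is just a list of old paths pointing at their exact counterparts on the new domain. Here is a hedged sketch in Apache .htaccess terms, using the example URLs from above as placeholders; the commented-out line shows the catch-all pattern described next, which is the thing to avoid:

# One-to-one: each old path points at the same path on the new domain
Redirect 301 /products/mens-running-shoes/ https://newwebsite.com/products/mens-running-shoes/
Redirect 301 /products/womens-running-shoes/ https://newwebsite.com/products/womens-running-shoes/

# Catch-all (lazy migration): every old URL dumped on the new home page
# RedirectMatch 301 ^/(.*)$ https://newwebsite.com/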
Lazy migration is what’s known as a catch-all redirect, or a root-level redirect: every single page, every single path on the old website – men’s running shoes, men’s tennis shoes, women’s running shoes, women’s tennis shoes, every single product path on the old site – now just points to the newwebsite.com home page, and you make your guest or customer, the user, find what they were looking for all over again. People want what they were looking for, when they were looking for it. There’s no question about it. If that third-party site pointing in says “best running shoes” and it points to the product page – that’s called a deep link – and now that product page has moved or changed or was deleted without pointing that URL at the NEW product page, it’s going to break, or bounce, or point to the homepage. That essentially makes your potential customer click on Navigation, Mens, Shoes, Runners – they’ve now got three or four additional steps or hurdles put in their way – and they’re going to bail. They’re just going to leave.
Most people aren’t going to put in the effort to do it all over again. I’ve already done my research, I found this thing, it pointed me to you, I thought I was going to be dropped on a product page where I could just purchase or browse – instead it’s the homepage, it’s the full experience all over again. You lost me; I’m not willing to do this. An additional three or four steps and unnecessary hurdles in the conversion process is going to kill your conversion rates. Make the path to purchase as frictionless as possible. Do not delete pages. It hurts your conversion rate. It hurts your user experience. It hurts your search rankings. Now all those 500 sites that were linking in are useless; they provide no value. They weren’t captured, they weren’t redirected, so your holistic domain suffers from deleting pages. Your bounce rate will go higher. Your bottom-line revenue will be impacted. Stop deleting pages.
