
An Inside Look Into Google: Low Page Quality


We all know what makes up a good web page, but how many of us know what a poor web page consists of? Google routinely publishes its Search Quality Evaluator Guidelines, which contain key information for SEO, including an interesting section on Page Quality rating. The document provides a unique glimpse into how the search giant ranks and indexes web pages.

By understanding page quality we can optimize our sites for success.

Introduction to Page Quality Rating

Essentially, the goal of Page Quality rating is to evaluate how well a page achieves its purpose. Since different websites serve different purposes, the expectations differ for various types of sites.

A sequence of steps is followed to determine the Page Quality (PQ) rating of a web page. These include:

  • Understanding the Purpose of the Webpage
  • Understanding Web Page Content
  • Main Content Quality and Amount
  • Finding the Webmaster
  • Website Reputation
  • Expertise, Authoritativeness, Trustworthiness

What Comprises A Low Quality Website

According to the Search Quality Evaluator Guidelines, low quality pages are unsatisfying or lacking in some element that prevents them from adequately achieving their purpose.

It is important to note that there are very different standards for pages on professionally-produced business websites and personal hobbyist websites. Google utilizes a dynamic ranking scale.

Unsatisfying Amount of Main Content

Important to note: an unsatisfying amount of main content (MC) is a sufficient reason to give a page a Low quality rating. This could include the lack of a company bio, mission statement, or Contact Us page, as well as thin service or product pages.

Distracting/Disruptive Ads and Supplementary Content

Ads Which Disrupt the Use of the MC

This includes pop-up ads which make the mobile experience difficult, or ads that float over the main content as searchers scroll.

Prominent Presence of Distracting SC or Ads

Some webpages have ads and content designed to lure searchers to highly monetized web pages. An overabundance of ads provides a poor overall user experience.

Negative Reputation

Google uses third-party data to compile information on a business's reputation. Negative reputation is a sufficient reason to give a page a Low quality rating.

Lacking Expertise, Authoritativeness, or Trustworthiness

Topics centered on medical, legal, or financial advice should come from authoritative sources in those fields. Google takes into consideration who is responsible for the content of a website or page.

Key Takeaways:

Build out your content. Since the Fred algorithm update, Google has been proactively targeting websites with poor or inadequate amounts of content.

Keep ads to a minimum. Google has been known to levy penalties against websites that use a disproportionate amount of ads compared to the content on their site. If you need an example, look no further than Forbes.

Monitor & improve your reputation. It is critical not only to monitor and respond to negative reviews, but also to be proactive in acquiring positive reviews on Google and other major syndicators.

Establish Expertise, Authoritativeness, & Trustworthiness

Build your website's reputation as an authority through regular blogging. Build out appropriate supplementary content such as an About section, company profile, etc.




Don’t Forget Your 301 Redirects When Doing a Redesign

SEO is tricky, and web redesign projects can add an additional layer of confusion. Fortunately, I have compiled a checklist to ensure you correctly implement the appropriate 301 redirects and preserve the SEO integrity of your site.

Step 1: Obtain a list of all URLs that will be changing.

I am well aware this isn't always possible when a client is undergoing a website redesign. However, it doesn't hurt to ask them to provide you with a list of all the URLs their development team plans on changing. From here you can crawl their development/test site and find appropriate redirect targets for the pages that are changing.

If they have no idea which pages are changing, or do not give you a list, we go on to the next step.

Step 2: Log into Google Analytics and export a list of the top landing pages that receive the most traffic.

Take the size of the website into account and pull landing pages that have decent traffic volume. Don't waste time redirecting a page with minimal traffic.

Step 3: Find their equivalent match within the test or development site.

Step 4: Do a find & replace within Excel to switch out the development domain for the production domain. Essentially, all that should change is the URI.

Step 5: Correctly compile and implement your redirect list.

As a side note: if the website I am working on belongs to an ongoing digital marketing client, I like to add tracking parameters to the 301 redirects.
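As an illustration, on an Apache server a tagged redirect rule might look like the line below (the paths and parameter values are hypothetical; adapt them to your own redirect map):

Redirect 301 /old-services-page https://www.example.com/services?utm_source=redirect&utm_medium=301&utm_campaign=site-redesign

Any traffic that lands through the rule then shows up under that campaign in Google Analytics, which tells you the redirect is still being used.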

Over time, too many redirects can wreak havoc on page speed. Therefore, it's important to only use a 301 redirect when it's necessary, which is why we add the tracking parameters. It is a good idea to double back after a few months and remove any redirects that aren't being used.

There you have it: five simple steps to implement 301 redirects during a site redesign and preserve SEO equity.

Let me know about some of your best practices when it comes to website redesigns!


Must Have SEO Tools

I was recently inspired by an email to write an article on the software tools I recommend. Like most SEOs and digital marketers, I find myself using a variety of tools throughout the day to accomplish a variety of tasks. Without further ado, here are some of my favorite tools for keyword research, content strategy, on-page optimization, and link building.

Keyword Research Tools

First off, I have a few tools I enjoy using for keyword research, depending on the types of keywords I am targeting. My go-to tool is Moz's Keyword Explorer. It lets you research a variety of keywords and segment by type: you can easily sort for informational keywords, closely related topics, or a mix of types. Unfortunately, this is a paid tool, but it is well worth it.

Moz’s Keyword Explorer Tool – Paid

However, if you are strapped for cash, there are a few FREE keyword research tools I like to use as well. SEOBOOK has a great keyword research tool that lets you do some pretty extensive keyword research for free. It rivals Google's Keyword Planner and also gives search volume within Bing's search engine. Finally, for more semantic and long-tail keywords, I occasionally use Twinword Ideas to fill in any keyword gaps.

Backlink Analysis

There are a few ways I like to conduct backlink analysis. First, I like to use Moz's Open Site Explorer to quickly check the spam score of a site. This tool gives you a high-level overview of any inbound links pointing to your site at a glance.


For additional clarity, I like to dig into Google's Webmaster Tools. To do your own backlink analysis within Webmaster Tools, log into your account, click "Search Traffic", and then "Links to Your Site". From there you will get a complete breakdown of all the backlinks pointing to your website.

On Page SEO Optimizations

I don't have a specific piece of software I use for on-page optimization. I tend to rely on a custom keyword map that I create for each client to keep up with all the keywords I am targeting for each page. For additional granularity, though, I sometimes use Moz's On-Page Grader. With this tool, you can associate a keyword with a specific URL and get on-page optimization recommendations.


Rank Tracking

For rank tracking I tend to use two online tools to track keyword fluctuations, and I prefer a platform called SEOClarity. SEOClarity is an enterprise-level rank tracking tool; although the user interface isn't the greatest, the tool is extremely powerful. You have access to mobile and desktop rankings and organic competitor content gaps, and as an added bonus you can do your own keyword ranking forecasts.

Bonus Round Free SEO Tools

  • Webmaster Forum has a host of free tools listed to get the job done
  • Saijo George maintains a master list of any SEO/digital marketing tool you will ever need, including a swarm of free tools.

3 Ways to Use Google Tag Manager for SEO

For those living under a rock, Google Tag Manager is a free tool that lets marketers implement website tags for conversion tracking, enhanced site analytics, remarketing, and more. Tag Manager makes it easy for marketing teams to implement enhanced tracking without having to get a development team involved. It runs on a tag management system which allows marketers to integrate additional tracking into websites dynamically or for individual pages. Some of the benefits of using Tag Manager include:

  • Agility
  • Performance
  • Cost Savings
  • Built In Debugger Tool

If you aren't convinced, learn more about Google Tag Manager. Otherwise, let's rock-n-roll!

If I haven't lost you yet, there are several advantages Tag Manager offers for SEOs. For example, you can implement schema markup for local clients, insert meta noindex tags, and implement rel canonical tags in a breeze.

Implementing Structured Data through Tag Manager

Structured data in its base form is information formatted in a way that can be universally understood. Schema Markup is a vocabulary for structured data developed by Google, Microsoft, Yahoo, and Yandex. The goal is to create structured data markup that all search engines can understand. You can start to see how structured data can provide an edge when trying to gain organic visibility.

The benefits of Schema Markup really come into play if you are managing local SEO for a client or a personal site. However, you can also leverage schema markup and microdata to mark up blogs, product information, aggregate reviews, key contact information, and more.

To implement Schema Markup for a local business with Tag Manager, we first need to generate our code. Fortunately, there is no need to code the Schema manually; we can make use of a JSON-LD generator. For local businesses and organizations I prefer J.D. Flynn's JSON-LD generator: it is straightforward and includes a variety of additional schema elements I have yet to find in other generators.

Simply follow the instructions and fill out all the necessary fields. The tool will generate the JSON-LD Schema Markup so you can implement it via Tag Manager.
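The output will look roughly like the sketch below; every value here is a placeholder, and yours will come straight from the generator:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "telephone": "+1-336-555-0100",
  "url": "https://www.example.com",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Greensboro",
    "addressRegion": "NC",
    "postalCode": "27401"
  }
}
</script>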

Next, you should already have Tag Manager properly configured and set up on your site. From there you will go to your workspace and click “NEW”.

[Screenshot: the Tag Manager workspace]

From here you will want to name your tag accordingly in case you need to reference back to it for updates.

[Screenshot: creating a Custom HTML tag]

The tag type we will create is a Custom HTML tag. Then you will want to set it to fire on your desired page. For our example, we will choose the contact page to fire the Schema Markup. We want to give search engines additional structured data on our website's location, key phone numbers, business type, and address.

Once you have your HTML tag built out in Tag Manager, you want to build out the trigger so it fires on the Contact page specifically. There is no point in firing the Schema tag site-wide.


You can extrapolate this out further and implement Schema on additional pages depending on your website’s structure. You can add Schema on specific product pages, testimonials pages, and the blog section of your website.

As a side note: some CMSs may already inject Schema Markup into your website automatically. To check whether your website has Schema Markup, you can use Google's structured data testing tool and see what, if anything, is missing.

Implementing Meta No Index Tags
Next, you can implement meta noindex tags through Google Tag Manager. The noindex tag is a special HTML <meta> tag that tells robots not to index the content of a page and/or not to scan it for links to follow.

To make this work, it helps to have some familiarity with JavaScript; however, it isn't necessary. It is important to note that adding a meta noindex tag does not mean Google will immediately remove that particular page from its index. It generally takes anywhere from two weeks to a month or so.

Our script will look like the snippet below. By default, Google Tag Manager injects code within the body section of a website. However, this is not ideal for implementing canonical tags or other types of metadata, so we have to use JavaScript to place this code in the head section.

To accomplish this, we create the meta element ourselves and append it to the head. The breakdown looks like this:

<script>
  var meta = document.createElement('meta'); // build the meta robots tag
  meta.name = 'robots';
  meta.content = 'noindex';
  document.head.appendChild(meta);           // place it in the <head>
</script>

The next step is to specify which page this code will fire on. It should go without saying: you do not want to accidentally fire this tag site-wide and remove your website from all major search engines.

After you have your custom HTML Tag built it’s time to specify what page this code should fire on.


As you can see above, I am telling Tag Manager to fire this tag on one specific page. The workflow looks like: "Page URL" -> "Contains" -> "".

We can check that this tag is firing properly in a number of different ways. First, I like to make use of Tag Manager's built-in debugger tool.

Hit the 'Preview' button next to the blue Submit button to enter preview mode. From there, open up the specific web page where the container script is located.


From here we get a breakdown of which Tag Manager tags are firing and which are not. We can see that our meta noindex tag is, in fact, firing on the correct page. You can take this one step further and check with the MozBar to see if the noindex tag is recognized.

Implementing Canonicals Through Tag Manager
It is typically best practice to implement self-referring canonical tags site-wide for SEO purposes. If you are kind of fuzzy on why we would do this, it is probably a good idea to do a bit more research into canonical tags.

Moz recently published an in-depth article on how to implement dynamic canonical tags, so I will not be going into that here; you can learn how to implement dynamic canonical tags from Moz. It follows a similar format to the meta tag in our previous example.

Implementing One-Off Canonical Tags
To implement one-off canonical tags, we will insert a custom HTML tag into the specific page we want to canonicalize. First, you need to create your HTML tag in Google Tag Manager.
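As a rough sketch (the URL is a placeholder), the entire Custom HTML tag for a homepage canonical can be a single line:

<link rel="canonical" href="https://www.example.com/" />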

This one is a bit easier since we don't have to use any JavaScript to get the canonical tag to fire. The next step, once you have written your canonical tag, is to specify when to fire the code. For our example, we want to tell Tag Manager to fire it on the home page. Once you set up the trigger you are done, and we can move on to testing.

Testing Canonical Tags
We have a few methods to test whether the tag is firing properly. First, you can use Google's debugger tool for Tag Manager. Or you can make use of the MozBar to get a quick read on whether or not it is recognizable; chances are, if Moz can find it, then so will Google, and probably Bing. The MozBar is a free browser add-on that allows you to quickly get critical web page metrics.

[Screenshot: MozBar verifying the canonical tag]

Once you have it installed you can click on the ‘Page Analysis’ tab.


From there, click the 'General Attributes' tab, and you can see whether or not the canonical tag is recognized.

Key Takeaways
Whenever possible, it's best to hard-code Schema Markup, noindex tags, and canonical tags. However, there are times when that's not possible. With that being said, Tag Manager makes it a breeze to implement Schema Markup, meta noindex tags, and canonical tags on the fly.


3 Ways To Improve Site Speed

Believe it or not, site speed can have a serious impact on your business's website performance. If a website loads too slowly, chances are people will leave and restart their search within Google. With the introduction of AMP pages and the growing focus on a singular search algorithm, now is the time to focus on site speed.

If you still need a reason to care: Gary Illyes has been quoted as saying that page speed is a ranking factor.
I don't know about you, but that's all I need to hear before I start making changes quick, fast, and in a hurry.
To see if you have a potential site speed issue, log into Google Analytics and look under Behavior -> Site Speed -> Overview. This view will give you the average load time of your website.
[Screenshot: Site Speed overview in Google Analytics]

The Issue:

I had a client site in the past with page speed issues, and it became a concern. This got me thinking: there has to be a quick and easy solution that I can implement. There is, and I used my blog as the test subject.
The Solution:

If you aren't familiar with the tool, Google PageSpeed Insights will be your best friend by the end of this project. It grades your web page according to specific attributes search engines look for. The top issues you will need to fix initially are optimizing images, leveraging browser caching, and minifying HTML and CSS.

This is not a guide on how to get a perfect PageSpeed Insights score, but rather on how to get maximum results from the least amount of effort.

Step 1: Optimizing Images

Images are one of the main culprits of slow load times. To fix this, we need to compress images and reduce their file size. There are a few ways to achieve this. Don't be deceived by WP Smush: while it provides a relatively quick and easy way to optimize images for WordPress, it does a poor job. According to Google's PageSpeed Insights test, my site was still kicking up a red flag in the image department.

The solution I stumbled across was to use an online compression tool for the large homepage images. This reduced my homepage images by roughly 60%, whereas WP Smush reduced them by maybe 15%. For compressing multiple images across a site, doing them one-off doesn't quite make sense, so I use PNGGauntlet. This simple standalone piece of software can be downloaded and set up to compress multiple images simultaneously.

Compressing images alone will work to improve load time significantly.

[Screenshot: image compression results]

Key Takeaway: Continually look for ways to reduce image weight. Avoid large sliders unless they serve a specific purpose. Avoid Flash like the plague.

Step 2: Leverage Browser Caching

The next step is to leverage browser caching. This is handy because if your site has a lot of static resources that do not change often, you can cache those assets on a searcher's computer. When they visit a second time, the page will load significantly faster since those resources have already been cached.

Now, I did this a few different ways, and the easiest by far is to utilize a CDN, or content delivery network; otherwise, the changes to caching have to happen in your .htaccess file. If you do not feel comfortable making those changes, then a CDN is your safest option.
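If you do want to edit the file yourself, a minimal sketch using Apache's mod_expires module looks like this (the file types and lifetimes are only examples; match them to your own update cadence):

<IfModule mod_expires.c>
  ExpiresActive On
  # Static images rarely change, so cache them longer
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  # CSS and JavaScript tend to change more often
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>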

I chose Cloudflare as my CDN. It is free and relatively easy to set up, though you will need to change your DNS nameservers. If you host with GoDaddy, log into your account and click "Manage My Domains". From there, find the domain you want to make changes to, go to nameservers, and click "Manage". You will need to change your nameservers to whatever Cloudflare prompts you to use.

Once you're within Cloudflare's dashboard you will see a wide range of icons. It might be overwhelming, but for now focus on the Caching section, and adjust your browser caching depending on your business objectives. Personally, for my blog, I do not publish new content or make changes even on a bi-weekly basis, so I set the caching to 16 days. Adjust this depending on your goals and content creation cadence.

Key Takeaway: You want to enable browser caching for resources that do not change frequently. While Cloudflare does not make much of a distinction, it is still a good idea to set caching based on how frequently you update your website.

Step 3: Minify HTML and CSS
Next, we want to minify HTML and CSS to further improve load time. By reducing the file size of your source code, your web pages will load quicker. Fortunately, we do not have to do this manually; Cloudflare will take care of that as well. I know what you're thinking: I don't want a program potentially messing up my source code. Well, Cloudflare makes use of a PHP script to dynamically minify the source code as the page loads.
To enable this, click the Speed icon at the top of the dashboard. At the top of the page, under Auto Minify, check HTML, CSS, and JS. This will enable the script to fire and dynamically reduce the file size of your source code throughout your site.

Key Takeaway: Source code with a lot of extra white space causes web pages to load slower. Sometimes we have to balance web development best practices with user experience in mind; minification via a dynamic script allows us to achieve this.


How To Create A Nested XML Sitemap

Why Do I need an XML Sitemap?

You might be asking yourself: why do I need a sitemap? Well, a sitemap is like a table of contents for your website. It provides search engines a road map of your web pages and tells them how your site content is organized. You can also include valuable metadata associated with the pages listed in the sitemap. A sitemap is also a way to alert search engines to new or changed content quickly.
With that out of the way, let's get started.

The Purpose of a Nested XML Sitemap

The purpose is twofold: quickly identify pages within an XML sitemap that are not currently being indexed, and break up large XML sitemaps into their respective groups.
Oftentimes when working with e-commerce sites, products get added and dropped frequently. Landing pages get added, and... hopefully, blogs get written. It can be difficult to look into Webmaster Tools and see you have 2,400 URLs submitted but only 1,200 indexed; it becomes mind-numbing combing through them to determine what is and is not indexed. This does no one any good.
A raw count also provides no indication as to why a page might not be indexed. Have you used too much of your crawl budget? Did your developer leave meta noindex tags on the site? With a nested XML sitemap we can get a much more granular view into the indexation health of our site.

Preface to Creating Your Nested XML Sitemap

This guide assumes you have already downloaded Screaming Frog and have a basic understanding of the tool. Be sure to download the newest version if you haven't already. The tools you will need are:

  • Regex tester
  • Screaming Frog
  • Webmaster Tools
  • Notepad++

Phase 1

First, power up Screaming Frog and pop your URL into the search bar. If it is a new site, I like to crawl it without any excludes or includes present.
Once you do your initial crawl, look for common groups of folders and categories you can segment.
As you are crawling, make note of any odd parameters coming through; you can open up Notepad and jot them down so you can exclude those URLs moving forward. If you work on an e-commerce site, you want to identify product-level URLs and their URL taxonomy. Once you begin the crawl, you should be able to identify categories fairly quickly.

The end goal here is to identify the categories and segments you want to group moving forward.

[Screenshot: Screaming Frog crawl]

As you can see from the crawl above, once I have done my initial crawl it becomes fairly easy to identify common groups. I can explore these further and see if segmenting them is justified.

In this particular example, Colfax Furniture's site, I identified both category- and product-level folders I want to segment, and I will ultimately create sitemaps out of them. The folder structure looks like this:

Product-level URLs
Category-level URLs

I have also identified odd parameters, 301s, and other anomalies I want to exclude from my nested sitemap as well.

Phase 2

Next, once you have identified the folders you want to segment, it is time to add the proper excludes and includes to get a more optimal crawl for your XML sitemap. You can use a free regex tester to debug your regex for Screaming Frog, and Screaming Frog has some good documentation on excludes and includes.

[Screenshot: regex tester]

As an example, I am using the regex tester to exclude all URLs that fall under /folder/. This tool allows me to quickly test a variety of combinations without starting and stopping crawls in Screaming Frog to see if my regex is working. You can access your current excludes and includes in Screaming Frog by clicking Configuration; under the drop-down menu you will see "Exclude" and "Include".

Crash Course in Screaming Frog Regex Syntax   

In case you are not familiar with Screaming Frog, you can exclude and include certain parameters and folders using regex.
Let's say you want to exclude a certain folder from being crawled; the exclude regex would look like this:
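(Reconstructing the example from the missing screenshot; /folder/ is a placeholder for the directory you want to skip:)

.*/folder/.*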


This will exclude all URLs that are within that particular folder. This comes in handy when you want to segment certain groups of URLs.
Let's say you found some parameters when crawling your site that you don't want to include, and they look like this:
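(Again reconstructing the missing screenshot; these parameterized URLs are purely hypothetical:)

https://www.example.com/shoes?page=2
https://www.example.com/shoes?sort=price-desc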


A quick solution is to use regex to exclude all URLs containing a common character string, like this:
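(A sketch keyed to the hypothetical ?page= parameter above:)

.*\?page=.*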


This syntax will exclude all URLs that contain that particular character string. It comes in handy when you are trying to avoid including paginated URLs and other odd parameters your CMS may generate. Again, if you are unsure, play around with the regex tester a bit.

This is just a crash course in the basic syntax to get you started. For a more complete overview of Screaming Frog, visit Seer Interactive's guide to Screaming Frog.

Continuing on with our example, I have identified the folders I want to segment and build dedicated sitemaps for. Here is the sample include regex for each section:
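(Reconstructing the missing screenshot using the product- and category-level folders identified earlier; your actual folder names will differ:)

.*/product/.*
.*/category/.*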


I can also circle back and exclude each of these sections to build a sitemap with the remaining URLs as well.

You also want to exclude images, JavaScript, CSS, etc. while building your nested sitemap; you can always come back and build a dedicated image sitemap later. You can do this by clicking "Configuration" and then "Spider". Once there, you can uncheck the options to include images, CSS, JavaScript, etc. within your crawl.

[Screenshot: unchecking images, CSS, and JavaScript in the Spider configuration]

Phase 3 

Once you have accounted for all parameters and identified your folders, it's time to run the crawl and save the clean result. You can either save the crawl in Screaming Frog or go ahead and export it as an XML sitemap.

[Screenshot: exporting the XML sitemap]

Now, once you have a clean crawl of the first set of URLs you want to lump together in one XML sitemap, you can save it. For product-level sitemaps I tend to use the name "p-sitemap", and for category-level URLs "c-sitemap", respectively. You will later reference these in your master XML sitemap.

Next, repeat that process for the remainder of the categories you have identified for your nested XML sitemap.  

***As a side note: you want to make sure you are not polluting your XML sitemap with junk URLs. If you are unsure, brush up on what to include and avoid in an XML sitemap.

Once you have all your separate XML sitemaps crawled and generated, it's time to piece them together. You can do this by using Notepad++ and referencing your other sitemaps. Just follow the basic framework here:
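(Reconstructing the missing snippet: this is the standard sitemaps.org sitemap index format, shown with a placeholder domain and dates, referencing the p-sitemap and c-sitemap files saved earlier.)

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/p-sitemap.xml</loc>
    <lastmod>2017-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/c-sitemap.xml</loc>
    <lastmod>2017-06-01</lastmod>
  </sitemap>
</sitemapindex>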


The end result should be an XML sitemap referencing multiple sitemaps within it; hence the term "nested XML sitemap".


Not only are nested sitemaps pretty neat, they help you cut down on time spent crawling and generating massive sitemaps every month. They also allow for greater granularity when troubleshooting indexation problems. By nesting your sitemap you can quickly pinpoint trouble spots and make adjustments accordingly.

If you are having trouble creating your own nested XML sitemap, shoot me an email and let's troubleshoot the issue!



Free Social Media Graphics Pack

[Graphic: gray social media graphics]

Download Now


[Graphic: color social media graphics]

Download Now


[Graphic: social media Lego graphic]

Download Now

These social media graphics are free to a good home. Simply click the link and a PDF will open in a new page. From there you can click the download button and use the graphics however you see fit. All I ask is that if you found them useful, give me a link back. Enjoy your free social media graphics; there are more free designs on the way.


Why Google Hasn’t Indexed Your Website

There is nothing more frustrating than working hard on a website just to find out that Google isn't playing nice with it. It is disheartening to run into indexation issues after you have put in all the hard work to optimize and create the perfect website. So, we are going to explore why Google hates your website.

Set Up Search Console

Before we begin, I am assuming you have set up Google's Search Console to submit your XML sitemap and check your indexation status. If you haven't done this, check out the basics of what to include in an XML sitemap and set up Search Console.
Assuming you have Search Console set up, navigate to "Google Index" -> "Index Status". This will give you an overview of what pages Google has discovered so far.
[Screenshot: Index Status report]
However, if you notice that Search Console says you have 100 URLs submitted and only 1 indexed, fear not!

Before we begin, check your XML sitemap and make sure it is in the correct format. Most sitemaps are located at yourdomain.com/sitemap.xml. It should look similar to the example below.
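(Reconstructing the missing example: a well-formed sitemap follows the standard sitemaps.org urlset format; the URLs and date below are placeholders.)

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>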
If yours looks nothing like this, you may just have a formatting issue, and you can adjust it accordingly. However, if your sitemap is in the correct format, then we must dig deeper.

Meta No Index

The first step in figuring out why Google hasn't indexed your website is to make sure you have not accidentally left a meta noindex tag on your site. The easiest way to check is with Screaming Frog: simply crawl your website and check under "Directives" in the side panel. You will be able to tell pretty quickly whether or not you have left the meta tag on.
The other option, if you do not have a crawler at your disposal, is to view the source code of a web page. Granted, this will be a manual process: navigate to your homepage, right click, and press "View Page Source". Then search for this line of code: <meta name="robots" content="noindex, nofollow">. If you don't see it, then you are in the clear... sort of. You can cross-reference the pages that have not been indexed by Google and check each one using this method.


The next step is to determine if you have blocked Google from crawling your website in your robots.txt. It is fairly common to block all major search engines while developing a website. To see if you have anything blocked, simply access your robots.txt file by typing yourdomain.com/robots.txt into your browser; this will pull up your current robots.txt configuration. Here is what you want to avoid:
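(Reconstructing the missing snippet: this is the classic development-site robots.txt that blocks every crawler from the entire site.)

User-agent: *
Disallow: /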

If everything looks good, we can move on to the next step; if not, learn more about how a robots.txt file functions and update it accordingly.

Are your pages correctly linked?

Next, this might go without saying, but if your pages aren't all linked together, search engines will have a difficult time accessing them. Generally, when a web crawler or search engine hits a page, it crawls all the links on that particular page and indexes them accordingly; think of a spider crawling through its web. Therefore, if you have created a bunch of pages that are not linked, or "orphaned", Google will have no way to discover them. Fortunately, there are a few solutions:

  • Include orphaned pages in the main nav
  • Include orphaned pages in a sub nav
  • Include orphaned pages in an HTML/XML sitemap
  • Link to the pages internally

The solutions above will help search engines discover your content once they visit your website and begin crawling and indexing.
Once you have checked that your sitemap is in the correct format, that there are no meta noindex tags present, and that you aren't blocking search engines, it is time to resubmit your website. First, go to your Search Console dashboard, click "Crawl", and from the drop-down select "Fetch as Google". This will let Google fetch the current page, let you view it as Google would, and give you a breakdown of any issues. Copy and paste a specific page in question, or leave the field blank to "fetch and render" the home page.

[Screenshot: Fetch as Google]


If everything checks out, click "resubmit to index" and select "crawl this URL and its direct links". This will notify Google's web crawler to revisit your website, crawl its associated links, and submit them to its index. When building a website, always remember to check for noindex tags, review your robots.txt file, and correctly format your XML sitemap.

Until next time.


What Weightlifting Has Taught Me About SEO

[Image: SEO and fitness]

I have been working out in some form or another for a long time now, whether it was tearing up the streets skateboarding, tossing people around in jiu-jitsu, or lifting weights. Recently, though, when I am weightlifting I find myself thinking about SEO. Not sure that's healthy, but oh well, here we go. The common theme I keep coming back to is consistency in your SEO strategy, patience when anticipating results, and integrating a holistic approach into your SEO campaigns.
In weightlifting, much like SEO, you need consistency to see results. As they say, Rome wasn't built in a day. While it is natural to want to see results from our SEO campaigns in real time, sadly this is not possible; the same is true of working out or strength training. It takes consistency, and it is important to embrace the slow grind that comes with SEO.
Google, like many other search engines, crawls the web to rebuild its index. Its crawl process is algorithmic: computer programs determine when to crawl a site and how frequently a website should be updated in its results. Google's search results are a snapshot of its massive index of web pages. You can view cached pages of your website by using a basic search query like site:yourdomain.com; among the search results, you can click "view cached results" and see when Google's crawlers indexed your web pages. If you don't see results immediately, don't get disheartened; SEO does in fact work, it is just more of a long-term strategy. You can utilize your blog for quick wins and bake in social media to amplify your content too.
It's important to keep these factors in mind when structuring SEO campaigns. It's also critical to educate clients that the search results do not update in real time.
Holistic Approach 
When weightlifting, you get the most bang for your buck with an integrated approach. You might get some results by eating Cheetos, drinking beer, and regularly ordering late-night pizza while lifting. However, you will get the best results by having a proper diet and maintaining a regular sleep schedule... when possible.
SEO is very similar. When performed as part of an integrated digital marketing strategy, it can achieve some great results.
Leverage Social Media 
When developing optimized content and blog posts, think about how you can share your work through social channels. Reach out to social influencers and niche bloggers to amplify your content as well. Leveraging tools like Epicbeat and Buzzsumo makes it easier than ever to achieve this.
Utilize Email Marketing 
Assuming you are currently leveraging email marketing, think about creative ways you can integrate SEO. For example, say you publish a new whitepaper or informational blog post: you can email your subscriber lists and entice them back to your website. Be sure to test various email formats, and always remember to track your links with the Google URL builder to see what is working and what isn't.
You won't be able to bench press 250 lbs your first day, or your first month. Hell, you probably won't be able to max out at that weight within your first year. The same principle applies in SEO. There are no magic techniques or shortcuts in SEO anymore; the era of quick tricks to get traffic is mostly gone.
SEO takes patience; search engines do not operate in real time. It is important to be patient and build a strong foundation to begin with.
Conduct keyword research and understand the searcher intent behind your target keywords. Build appropriate landing pages with your keywords and relevant calls to action. Optimize landing pages around your target keywords and include a call to action on every page to encourage engagement. Don't let your blog get stale. Build content to target informational searches and increase brand awareness.


Optimize meta descriptions and title tags to drive click-through rates, but don't be too focused on stuffing all your keywords into those fields. Utilize your H1 tags and limit them to one per page.
By having a consistent strategy, approaching SEO holistically, and being patient, you can run a successful SEO campaign.


Social Media Strategies For Small Businesses

Social media can be a great equalizer for any small business looking to gain a competitive edge in its industry. It is a great tool for small businesses; however, it is often used poorly or improperly. There are a few key areas you need to take into consideration when starting any campaign.

Define Your Goals

[Image: social media goals]

Businesses often do a poor job of defining their goals and tracking overall performance. The first step is to determine what you want to be known for on social media: your company's overall objective or goal. A sample goal could be "I want to be the go-to source for X, Y & Z." Once you have decided on your objective, the next step is to determine which key performance indicators you will measure to see if you are achieving it. It is important to track social engagement such as likes, follower growth, shares, etc. I also highly recommend tagging all links posted on social media with the Google URL builder so you can see where traffic is coming from in your analytics dashboard, gauge effectiveness, and measure the ROI of your social campaigns. Google tends to generate some very ugly tracking URLs, so you will want to run them through a neat program called Bitly to get a smaller, condensed URL for social profiles where space is at a premium.
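For example, a tagged link built with the URL builder might look like this before shortening (all parameter values here are hypothetical):

https://www.example.com/blog/new-post?utm_source=facebook&utm_medium=social&utm_campaign=spring-promo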

Produce Engaging Content

[Image: engaging content]

Yeah, I admit it can be fairly difficult to produce engaging content on a regular basis, even with the plethora of available tools and all the creativity at my disposal. Fortunately, there are some pretty good resources to work around that issue. Before you begin, it is important to have a good mix of original content and reshared content, and to tag whatever URLs you are sharing through your social platforms; this will help you track what traffic is coming through to your website in your analytics platform. When it comes to sharing or repurposing content, you can design engaging graphics in the Adobe Creative Suite or, if you are not very design savvy, with a semi-new platform like Canva. Canva will allow you to produce fairly aesthetic designs with ease, so you can make posts on your social platforms with a high-quality image attached; just remember to always link back to your site or blog. The next step is to hunt down some content to reshare to supplement your social strategy. For this, I like to use Buzz Sumo, Klout, or Epic Beat to do some in-depth research on what people are sharing and engaging with on social platforms within my niche. These programs are great for finding content to re-share and for researching which types of content are popular, so you can structure blog posts around the information you find.

Create A Dialogue With Social Media

[Image: social media dialogue]

The beautiful thing about social media platforms is that they give a voice to individuals who may not have had one otherwise. Social media marketing does not work if you are simply broadcasting your company's activities or using hard sales tactics. This is why I believe "social selling" is ineffective and very annoying; personally, I hate it when people follow me on LinkedIn or Twitter just to send me a message about some product or service offering that has no benefit to me. Social media is wonderful for reaching potential customers in the early stages of the buying process and funneling them back to your business's website for a micro-conversion, such as an email signup, a newsletter signup, or registering for a product demo, rather than going straight for some sort of hard sell. Generally, social media has a very low conversion rate when it comes to making sales; however, it can have wonderful engagement metrics and be used to attract customers who are higher up in the sales funnel.

Good Luck and Happy Sharing!