seo - Katakuru

What makes Google Search Console so important?


When I first started with SEO, I was taught to begin with on-site and on-page optimization long before even considering anything off-page. A website that lacks solid on-site optimization won’t respond well to any off-site work anyway. It makes sense to ensure that your website’s engine (on-page) is in good condition before pouring in the rocket fuel (off-page factors).

Every single time I start SEO on a new site for a new Cape Town SEO client, the first step I take is to configure Webmaster Tools (recently renamed Google Search Console). Search Console contains a wealth of information on the health of your website as Google sees it. Since Google is the number one search engine, and exactly where we want to appear in the SERPs, this is an invaluable resource.

To power up Search Console even further, set up sharing with Google Analytics inside the Analytics dashboard. This ensures that the valuable search query data is displayed inside Analytics, helping us make informed decisions.

What does Google Webmaster Tools do?

WMT/Search Console assists webmasters in fixing issues that might be holding their sites back in the search engines. When you log in, you will see information like:

Site Health Information

  • Messages or warnings about the health of your site
  • Crawl Errors – this tells you whether you have pages on your site that can’t be crawled, and which pages are the problem, so you can fix them
  • Crawl Stats – how many pages have been crawled on your site and when the search engines crawled them
  • Fetch as Google or Bing is a neat little tool that allows you to submit a URL and see your site the way the search engines see it
  • Index Status – the total number of URLs that have been added to the search engine index
  • Malware – the webmaster tools will let you know if any malware has been detected on your site

Site Traffic Data

  • Search Queries – the top keywords and pages for your site
  • Links to your site – how many links you have, who’s linking to you, and what pages they are linking to
  • Internal Links – how your pages link to each other

Site Optimization Information:

  • HTML Improvements – this alerts you to issues with the code on your site that may be adversely affecting your SEO. It also helps you identify duplicate content issues with meta descriptions and page titles.
  • Content Keywords – a list of the most significant keywords and variations Google found on your site. (Note: if you see words like “viagra” or “payday”, it’s possible that your site has been hacked.)

Special thanks to Beth Browning for her original article here

Once you have set up WMT, you can also submit your sitemap. This helps Google’s crawlers find their way around your website and improves your indexing, which leads to better search visibility.
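If you have never seen one, a sitemap is just an XML file listing the pages you want indexed. A minimal example (the domain and dates below are placeholders, not real values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-08-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2015-07-15</lastmod>
  </url>
</urlset>
```

You submit it under the Sitemaps section of Search Console, and you can also point crawlers at it from your robots.txt with a line like `Sitemap: https://www.example.com/sitemap.xml`.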

If you do not currently have WMT set up for your website, you are missing out on a whole lot of useful information that Google provides free of charge.

Not sure how to set up WMT? Drop me an email here, and I would love to assist you.

New client on 2 week trial: Pretoria East Gate Motors



I have been in talks with Sias from Pretoria East Gate Motors. During this SEO services trial, I will aim to move their current website (http://www.pretoriaeastgatemotors.co.za/) up in the search rankings. Currently Sias’ website is ranking on page 2 for “garage doors pretoria east”, and I hope to reach page 1 within the 14 days.

This keyword can bring in a good deal of targeted traffic for Sias’ company specializing in garage door automation in the eastern suburbs of Pretoria, and can result in an increase in phone calls and hopefully more business.

Keep an eye out for a follow-up blog post in two weeks from now, to see if I was able to meet the deadline.

In the meanwhile, if you live in Pretoria East, and have trouble with your garage door, give Sias a shout. He would love to help you whether you live in Garsfontein, Moreleta Park, Constantia Park, Faerie Glen, or anywhere else in the eastern suburbs of Pretoria.

New client on 2 week trial: Drain Ratz


I am delighted to announce that Drain Ratz Plumbing have signed on for a new 2-week trial of local SEO in the Pretoria East area. During this trial, I will attempt to move their website from page 2 onto page 1 for the keyword “plumber pretoria east”.

Drain Ratz offers a variety of different services in the Pretoria East suburbs (amongst these are Elardus Park, Wingate Park, Erasmuskloof, Erasmusrand, Monument Park and Waterkloof). Their services include the following:

  • Emergency 24 Hour Plumber
  • General Plumbing issues
  • Commercial and Industrial Plumbers
  • Geyser Repair
  • Geyser Installation
  • Leak detection in Pretoria East
  • Leaking Radiators and Heaters
  • Burst and Leaking Pipes
  • Leaking Taps
  • Solar Geyser Repair and Installers
  • Central Heating system repair
  • Water pressure testing
  • Drain Unblocking
  • Unblocking toilets and sinks
  • Burst Water Pipes

If you live in Garsfontein, Moreleta Park, Faerie Glen, Boardwalk, or anywhere else in Pretoria East, Drain Ratz comes highly recommended as a plumbing company. They will take care of your plumbing needs.

35 Ways To Make Your Site Search Friendly Before You Hire An SEO


Credit: SearchEngineLand.com

For all of the web designers and web developers out there, this is a great checklist to ensure that any new website is search friendly before you launch it.

Some of these tasks are technical, and you might struggle with them just a little bit.

Pay attention to these factors, and you will make my job so much easier 😉

Just kidding, of course.

I could never do what you designers do, and I wouldn’t even try.

Let me know if you have trouble implementing any of these, and your friendly Pretoria SEO expert would love to lend a hand.

Below I have handpicked a few of the factors I see neglected far too often.


3. Allow spidering of site via robots.txt. Every now and then when a new site rolls out, the developer forgets to change the robots.txt file to allow the search engines to crawl the pages. If your Web marketer doesn’t think about checking this file, you could spend months wondering why you’re not getting the traffic you should be. Double-check your robots.txt file to make sure it does not “disallow” search engines from crawling your site.
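If you'd rather check this programmatically than by eye, Python's standard library ships a robots.txt parser. A small sketch (the rules and URL below are made-up examples for illustration):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt accidentally left over from a staging environment,
# blocking the entire site:
bad_rules = ["User-agent: *", "Disallow: /"]

# A robots.txt that allows crawling everywhere:
good_rules = ["User-agent: *", "Disallow:"]

def can_google_fetch(rules, url):
    """Return True if Googlebot may fetch the given URL under these rules."""
    parser = RobotFileParser()
    parser.parse(rules)
    return parser.can_fetch("Googlebot", url)

print(can_google_fetch(bad_rules, "https://www.example.com/page"))   # False
print(can_google_fetch(good_rules, "https://www.example.com/page"))  # True
```

In practice you would fetch your live robots.txt and run the same check against your most important pages.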

18. Write unique title tags. Every page of the site should start with its own unique title tag. You don’t have to go all SEO on it if time doesn’t permit, but having a title that represents the content of the page is a must for rolling the site out. Keep each one between 35 and 55 characters.

19. Write unique meta descriptions. See above. A good description should be between 100 and 155 characters.
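Those two length guidelines are easy to automate across a site. A minimal sketch, using the character ranges from points 18 and 19 (the sample page data is invented):

```python
def check_lengths(title, description):
    """Flag titles and meta descriptions outside the suggested ranges."""
    problems = []
    if not 35 <= len(title) <= 55:
        problems.append("title is %d chars (aim for 35-55)" % len(title))
    if not 100 <= len(description) <= 155:
        problems.append("description is %d chars (aim for 100-155)" % len(description))
    return problems

# Hypothetical page data for illustration:
print(check_lengths(
    "Plumber Pretoria East | Drain Ratz Plumbing Services",
    "Drain Ratz offers 24 hour emergency plumbing, geyser repair "
    "and drain unblocking across the Pretoria East suburbs.",
))  # [] -- both fall within the suggested ranges
```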

29. Find ways to increase page load speed. There are always things you can do to improve site speed. Look for even the smallest of opportunities to make your pages load even faster.

34. Use search-engine-friendly links. Make sure all your links (except those you deliberately want to keep away from search engines) are using search-engine-friendly link code. Using the wrong link code can inadvertently keep search engines away from very valuable content.


What you should know about the new local map pack/snack pack


Google’s new snack pack in action

Over the last month Google has been rolling out a more compact version of the map pack. In the past, users were often presented with map packs consisting of 3, 5 or even 7 listings. The new snack pack, as some call it, has been in testing in a few different markets in the United States.

Apart from having fewer listings, the new pack is consistently ranking much higher than the 7 pack did. Where the old 7 pack used to rank in the first spot in only 25% of searches according to this report, the new snack pack is reported to take the top spot in 93% of searches.

That is an astounding number!

This could have a significant impact on businesses that were previously shown in the 7 pack. It also means SEO companies will have their work cut out for them.

The good news is that the new pack isn’t showing up in first spot as much in South Africa. I did a few searches, and in only a small number of them did the new pack show up first.

In some of the searches the map pack was beaten out by 3 organic listings (perhaps the new map pack hasn’t rolled out to those industries in SA yet).

Is your business verified on Google My Business? Is your business listed in the 3 pack?

If not, give me a call on 0766406339 or drop me an email. I would love to help you!

Should you be afraid of negative SEO?


Credit: Adam Whitaker //adamwhitaker.com

I came across this article on Search Engine Watch a few days ago, and wanted to give my take on negative SEO. While negative SEO is definitely a very real thing, and happens on a daily basis, it is not something most business owners will ever encounter.

I have been doing SEO in a bunch of different industries over the last two years, and have never had any of my own or my clients’ sites hit by negative SEO. That said, I steer clear of certain industries that are known for negative SEO.

Read the article below, but keep in mind that negative SEO is not as prevalent as the article makes it seem. Local business owners are very unlikely to ever deal with this tactic, except perhaps in industries like insurance.

The article also mentions ways to protect your website from negative SEO, but there aren’t any real proactive measures you can take to prevent negative SEO from happening to your site. Dealing with it entails your search engine optimization company making use of the Disavow Tool. With this tool, webmasters and SEOs can ask Google to disregard certain links.

We’ve been talking about Negative SEO since the advent of Penguin and many still wonder whether it’s a real threat. The lore is that Google will protect websites from losing traffic due to negative SEO, but is this the truth, or a just myth propagated by Google?

You can find dozens of these gigs on Fiverr – as well as many individuals on sites, like freelancer.com and upwork.com – who offer these services. And for those who used to be in the black hat game, firing up software to drop blog comments, forum and profile links doesn’t take much and costs next to nothing.

There are entire companies that have survived by relying on these tactics. Instead of trying to compete in the “content marketing” game, where the ante has been seriously upped, why not stay in their game by obliterating the competition?

Think about it: what happened to the thousands of individuals who used to make a living offering cheap link-building services with 10,000 directory submissions and 500 article submissions? Since this black hat approach simply doesn’t work to rank sites anymore – in fact, the effect is the opposite – why not use the same skills for a different product or service?

Yes, you read it right: large volumes of spam and fake links to destroy your competitor’s rankings. Why spend a fortune on things like content marketing and link attraction, when you can surgically remove your clients’ competitors with Negative SEO?


What to do when you get the “Googlebot Cannot Access Your JavaScript and CSS Files” warning



Google sent out mass warnings to users of Search Console (previously known as Webmaster Tools). Many webmasters were alerted that: “Googlebot Cannot Access Your JavaScript and CSS Files”.

This warning was issued inside of Search Console, as well as sent via email to webmasters. The email and Search Console notifications also informed webmasters that Googlebot’s inability to crawl these files might result in “suboptimal rankings”.

They make it sound very dramatic, but I am here to bring some good news. A quick edit to your robots.txt file is likely to resolve this issue for you and get your site back to optimal rankings.

The full warning reads like this:

Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.


This is what the email notification looks like

In October of last year, Google made it widely known that blocking CSS and JavaScript from being crawled would be a transgression of their Webmaster Guidelines. The warnings, however, were only issued very recently.

If you received this warning because of your site blocking robots, be happy that you were notified. You know about the issue, and you can take the necessary action to resolve it.

The easy way to fix this is to open up your robots.txt file. (PLEASE NOTE: if you are even just a little uncomfortable working on your robots.txt file, I would suggest you ask your web designer or developer to make these changes for you.)

Search for the below text:

Disallow: /*.js$

Disallow: /*.inc$

Disallow: /*.css$

Disallow: /*.php$

If any of these lines are present in your robots.txt, you can go ahead and completely remove them. These lines are blocking Googlebot from crawling your CSS and JavaScript.
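If you are curious how wildcard rules like these actually match paths: Python's built-in robots.txt parser does not understand Google's `*` and `$` extensions, but you can sketch the matching yourself by translating each rule to a regular expression (this is my own illustration, not a Google tool):

```python
import re

def rule_to_regex(rule):
    """Translate a Google-style robots.txt path pattern into a regex.

    '*' matches any run of characters; a trailing '$' anchors the
    match to the end of the path.
    """
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    body = re.escape(rule).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored else ""))

# The kinds of rules that trigger the warning:
blocked_patterns = [rule_to_regex(r) for r in ("/*.js$", "/*.css$")]

def is_blocked(path):
    """Return True if any blocking pattern matches this path."""
    return any(p.match(path) for p in blocked_patterns)

print(is_blocked("/wp-content/themes/style.css"))  # True: stylesheets are blocked
print(is_blocked("/index.html"))                   # False: regular pages still crawl
```

This makes it clear why Googlebot cannot fetch any `.css` or `.js` file while those rules remain in place.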

Once you have removed these lines, head on over to Google’s Fetch and Render tool. This tool will confirm whether Googlebot can now crawl your site without hindrance. If there is something else blocking Googlebot, the utility will advise you of the required steps to be taken.

If all of this sounds G(r)eek to you, give me a call on 076 640 6339 or drop me an email here and I would gladly assist you in fixing this crawl error.

Give your feedback on Google Search Console messages and notifications


I came across this post on Search Engine Land this morning. Google is asking end users for feedback on their Search Console messages and notifications.

Google only recently renamed the service that was well known as Google Webmaster Tools to Google Search Console. This tool is incredibly important for any search engine optimization company, as well as for webmasters.

With the help of Google Search Console, you can learn about crawl errors, whether certain pages are indexed or not, which backlinks are pointing to your site, which search queries are being used to find your site, as well as a number of other very useful things.

Whenever I speak with a potential client, my first question is always whether the client is using Google Search Console, closely followed by the same question about Google Analytics.

Search Console gives me a great idea of whether the client’s site has technical difficulties that must be overcome before the SEO work can start. It also shows me which types of searches are already bringing in traffic.

If you are already using Search Console, go to the survey here and let’s make this tool even better!

The survey can be accessed over here, and it starts off asking if your role is webmaster, SEO, marketer, etc. It then goes into more detail, asking how many sites you manage, how many messages you get per week, if you read them, where you read those messages, which ones you act on versus which ones you ignore, and why you might skip acting on them.

Clearly, Google is looking for ways to make the messaging through the Google Search Console better. Recently, Google announced they will reduce the notifications one sees based on which sites they technically own within the Google Search Console.

Google’s Mariya Moeva is seeking feedback on how the Google Search Console can do better at sending out messages and notifications to webmasters and those who verify their websites in the Google Search Console.