For many businesses, especially within the property sector, the majority of leads are generated online, be it through property portals such as Rightmove or Zoopla, directly through Google, or through a social network such as Facebook or Twitter.
Google, as we all know, is an extremely powerful platform for lead generation. For prospects searching for your services it offers two options:
The cost will depend on how broad your marketing campaign is – the wider the net, the higher your costs will be. You can keep costs low by creating targeted remarketing campaigns; remarketing is, in fact, one of the most cost-efficient ways to improve the performance of PPC.
You can segment your visitors. For example, the visitors who view your sales pages and do not convert can be retargeted separately and encouraged to take action.
For a very long time, SEO was the core digital strategy that most businesses in need of leads would invest in. That has changed drastically over the past couple of years, largely because of several algorithm updates Google launched. Larger companies with bigger marketing budgets were the ones able to hold on to and dominate the first page of Google for the more generic, higher-volume search phrases, while smaller agencies had to adapt quickly by introducing more creative SEO methods and focusing on local marketing.
It’s all about market share. The higher your market share, the greater your brand exposure, traffic and leads. Although SEO is no longer seen as the main driver of a digital strategy, it still contributes to overall market share, exposure and lead generation, and should not be disregarded.
It starts by convincing Google that your website will answer the query or question that a potential vendor, buyer or landlord has entered into an online search bar.
The good news is that Google has taken another giant step towards rewarding websites that deliver a good user experience by updating what it calls its Core Algorithm.
This complex tool works by relying on more than 200 unique signals that make it possible to surface what anyone carrying out an online search might be looking for.
These signals include specific words that appear on websites, how fresh the content is, where your business is based and PageRank – a tool named after Google co-founder Larry Page that counts the number and quality of links to a page to determine how useful a website is. The underlying assumption is that the best websites are likely to receive more links from other websites.
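To make the PageRank idea above concrete, here is a minimal power-iteration sketch. The tiny link graph, the 0.85 damping factor and the iteration count are illustrative assumptions for this example, not Google's actual implementation, which is far more sophisticated.

```python
# Minimal sketch of the PageRank principle: a page's score depends on
# the number and quality of pages linking to it. Illustrative only.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page in pages:
            targets = links.get(page, [])
            if targets:
                # Each page passes its rank evenly to the pages it links to.
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # A page with no outbound links spreads its rank everywhere.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# "home" attracts the most inbound links, so it earns the highest score.
ranks = pagerank({
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home"],
    "external": ["home"],
})
```

Notice that the score flows through links: a link from a page that is itself well linked-to is worth more than one from an obscure page, which is exactly why link quality matters as much as link quantity.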
But PageRank is no longer the only part of Google’s Core Algorithm that measures the quality of links your website might have.
Google has long been aware that some search engine optimisation practitioners have tried to trick it into ranking a website higher than it deserves on a search results page by engaging in unfair link building practices, also known as Black Hat SEO.
That’s why in 2012 Google introduced Penguin as a standalone algorithm update. This was designed to better catch sites deemed to be spamming its search results, in particular those buying links or obtaining them through link networks designed primarily to boost Google rankings.
Penguin 4.0 – which went live last month – is now part of Google’s Core Algorithm and is one of the 200-plus signals the search giant uses to decide your ranking on its search results page.
Not only that, Penguin is now what Google describes as granular. This means Google devalues non-genuine links by adjusting the ranking of the offending page, rather than affecting the whole site.
More importantly, Penguin data is now refreshed in real time. Any change to a web page’s ranking is now made as soon as the affected page has been recrawled and reindexed.
It’s too early to draw any firm conclusions but Google reminds us: “The web has significantly changed over the years, but… webmasters should be free to focus on creating amazing, compelling websites. It’s also important to remember that updates like Penguin are just one of more than 200 signals we use to determine rank.”
The most obvious sign that your site has suffered a Google penalty is a reduction in organic traffic. To check, log in to your Google Analytics account and compare your organic traffic of, say, the last six months with that of the six months before.
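The comparison described above can be sketched in a few lines. The monthly session figures and the 30% alert threshold here are illustrative assumptions; in practice you would export the organic-sessions figures from your own Analytics account.

```python
# Rough sketch of the six-months-versus-six-months comparison described
# above. Figures and threshold are illustrative, not real Analytics data.

def traffic_drop(monthly_sessions, threshold=0.3):
    """monthly_sessions: last 12 months of organic sessions, oldest first.
    Returns (change, flagged); flagged is True when the most recent six
    months fell by more than the threshold versus the six months before."""
    previous = sum(monthly_sessions[:6])
    recent = sum(monthly_sessions[6:])
    change = (recent - previous) / previous
    return change, change <= -threshold

# Example: a site whose organic traffic roughly halved mid-year.
sessions = [900, 950, 1000, 980, 1020, 990, 600, 550, 500, 480, 470, 460]
change, flagged = traffic_drop(sessions)
```

A sudden cliff-edge drop that coincides with a known algorithm update date is a much stronger penalty signal than a gradual seasonal decline, so always sanity-check the date of the fall before assuming the worst.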
If you see an obvious drop in traffic, you should open Google Webmaster Tools (now known as Search Console), which can help you analyse your link profile and work out whether there are any issues that could result in a Google penalty.
In extreme situations, and especially in the event of a manual penalty, Google itself will notify the site administrators that the website has been penalised.
There are several other tools on the market that allow additional website analysis, but the information that these tools provide must be interpreted correctly.
We at Art Division use several tools to help us assess a client’s website link portfolio and technical SEO, and below we have listed the two we use most often:
This tool allows you to understand the link profile of a particular website: its anchor texts, as well as the trust flow, citation flow and authority of the site itself and of all those linking to it.
This tool gives us an in-depth analysis of the technical SEO which is implemented on a client’s site and allows us to better understand not only what we can do better but also what our competitors are doing.
Both of those tools allow some level of free use, after which you will need a paid subscription, although sometimes even a basic test can reveal that something is not right and needs addressing.
Art Division has helped several clients whose sites have been penalised as a result of the Penguin algorithm update. However, it can take as long as 12 months to lift any penalty Google imposes on a website.
If you have identified that your website has been hit by a Penguin update, the best way to rectify this is to call in a website specialist who can identify the offending links and advise you on how to remove them.
Tags: Data Analytics, SEO