The winner of best guest post at the UnAwards17 is back with more valuable insights, this time joined by a colleague, with lessons on how to get the most from Google.
by Lauren Kelly
Having won best blog at the 2017 comms2point0 UnAwards, I'll admit I've hovered over the send button on this one, but hopefully others will find it useful!
A few weeks ago, we found out that our Cheshire East Council application to Google News had been successful. It was an effort encouraged by our communications team, but it relied heavily on our web team's expertise to actually make it happen.
The benefits of being on Google News have been written up in quite a few blogs. They range from signifying that your website is endorsed by Google to increased authenticity and web traffic. It also means that we have raised the bar on our organic SEO on Google. We are now listed in Google's main index, meaning we are more likely to show at the top of a search without having to pay a penny, so it's great value for money for our taxpayers!
What has it meant so far?
Six weeks on, when we compare our Google Analytics stats with this time last year, views of our press releases are up 97.3% and our bounce rate has dropped by 13%. Basically, we are attracting more people and keeping them on our new-look Media Hub for longer.
There are a couple of factors that we think contributed to our being approved:
1. The technical
This is the (very) technical side of what our webmaster Steve Bennett managed to do. Some of you may want to copy and paste this section over to your web team and then scroll straight to point 2(!)
After reading the general Google News guidelines, it was apparent that because we have mixed content (non-news and news), we had two options: create a subdomain (www.news.cheshireeast.gov.uk) or edit our robots.txt file.
The subdomain route would have meant creating a separate site, so we decided on the robots.txt route.
The robots.txt file sits on the domain root and lets us tell bots such as Google's what not to crawl or include. There is no 'include' rule in the robots.txt file, so I had to disallow everything apart from the folder where the media pages are contained.
Our robots.txt file first has a general disallow rule for any bot: ‘User-agent: *’. Here we place the path of pages/files that we wouldn’t want any bot crawling. Example: ‘Disallow: /pdf/’.
Care must be taken to ensure that the next set of 'Disallow' rules applies only to the Google News bot. This was achieved by specifying 'User-agent: Googlebot-News' next. Below this, the path to every folder structure or page that is not a press release must be listed. Example: 'Disallow: /benefits_housing_council_tax/'. One gotcha: crawlers follow only the most specific user-agent group that matches them, so once Googlebot-News has its own group it ignores the wildcard rules above, and anything the news bot must also avoid needs repeating in its group.
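Putting that together, here is a minimal sketch of the kind of robots.txt this produces. The folder names are illustrative examples, not our actual site structure:

```
# Rules for every bot: block paths no crawler should index
User-agent: *
Disallow: /pdf/

# Rules for the Google News bot only: it ignores the wildcard
# group above, so list everything that is not a press release
# (illustrative folder names)
User-agent: Googlebot-News
Disallow: /pdf/
Disallow: /benefits_housing_council_tax/
Disallow: /bins_and_recycling/
```

With only the media folder left un-disallowed, Googlebot-News can crawl nothing but the press releases.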
After publishing the updated robots.txt file, we logged on to the Google News Publisher Center and requested inclusion in the Google News index. Our application was reviewed by Google staff and approved within 48 hours.
In the Publisher Center, we are able to troubleshoot articles by entering their URL. Google has strict rules for inclusion in its news section and, should anything be declined, the issue will be listed here. It even gives you an opportunity to fix the issue and mark it as resolved. An important note: Google will currently only collect articles that are two days old or less.
2. Additional Google work
Prior to the technical work, Steve and I had been doing as much with Google as possible. Whilst we are not sure if this led to our success with Google News, it can't have hurt our attempt to be recognised by Google.
We…
- Post press releases to Google+
- Post other content to Google+ too (when we remember!)
- Upload videos to YouTube as and when we have them
- Registered and updated the Google profiles of most of our buildings (including cemeteries)
- Added photos to some of our registered places and linked them to the account that is now linked to the Google News (and our Google+ account)
- Use the Google URL shortener for most of our links
- Use Google Campaigns and the campaign URL builder where possible (see the example below)
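For anyone who hasn't used the campaign URL builder, it simply tags a link with tracking parameters so visits show up against a named campaign in Google Analytics. A hypothetical tagged link for one of our press releases might look like this (the page and campaign names are made up for illustration):

```
https://www.cheshireeast.gov.uk/example-press-release.aspx?utm_source=twitter&utm_medium=social&utm_campaign=example_campaign
```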
Next steps
We continue to learn about what this will mean for us but so far the results look positive.
After an initial hiccup working out how to get our thumbnail images to show on Google News, we are now keen to develop, learn and increase the reach of our council news content, attracting more traffic to our Media Hub.
We think we may be the first public sector organisation to get onto Google News. If you are on there too, we would love to hear from you and share our metrics.
Lauren Kelly is a Senior Marketing and Social Media Officer at Cheshire East Council: @Comms_Kelly
Steve Bennett is a Webmaster at Cheshire East Council: @SteveBennett100
image via ediluke