Understanding Your “Invisible” Audience – Bots!

Your law firm’s website has a secret, secondary audience: bots! While the opinions of your site visitors matter most, search engine crawlers may actually have a greater impact on the reach and ranking of your firm’s website.

We frequently remind firms that they write for multiple audiences, including current clients, potential clients, and colleagues. But it’s important to understand that you’re also writing for an invisible audience that isn’t human at all.

This article explains the basics of search engine crawlers, how (and whether) you can write to attract them, and how to communicate with Google’s crawlers if necessary.

What are bots and why do search engines use them?

A bot is a web crawler: a piece of code that simulates a user and collects information about your website. Bots collect data to help search engines properly index your website with the most up-to-date information.

During this process, Google sends several bots to “crawl” the pages of your site and learn more about the quality of your content. Essentially, bots follow links from page to page and build a map of your site.
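To make that follow-the-links idea concrete, here is a highly simplified Python sketch of what a crawler does. Real crawlers are vastly more sophisticated; the example.com URL and the max_pages limit are placeholders for illustration only.

    # A toy crawler: fetch a page, collect its links, repeat.
    # Assumes the `requests` and `beautifulsoup4` packages are installed.
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    def crawl(start_url, max_pages=10):
        seen = set()            # pages already visited
        queue = [start_url]     # pages waiting to be visited
        site_map = {}           # page URL -> list of links found on it

        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)

            # Download the page and pull out every <a href="..."> link.
            html = requests.get(url, timeout=10).text
            soup = BeautifulSoup(html, "html.parser")
            links = [urljoin(url, a["href"]) for a in soup.find_all("a", href=True)]

            # Record this page's links and follow them next.
            # (A real crawler would also filter out external domains.)
            site_map[url] = links
            queue.extend(links)

        return site_map

    if __name__ == "__main__":
        print(crawl("https://example.com"))  # placeholder URL

The site_map it returns is exactly the kind of “map of your site” described above: every page, and which pages it links to.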

Whenever a bot finds new information on your site, it notifies the search engine so that your site’s entry on the search engine results page is accurate.

Bots search for keywords, but they also look at factors like freshness and loading speed. Why? A recently updated website signals greater value, and a fast loading time suggests healthy servers. It’s important to impress the bots so that your website ranks high on the search engine results page.

Writing for bots: to do or not to do?

It’s a sneaky trick question – let’s find out why.

Your primary audience is real humans, and your secondary audience is bots. But the real purpose of bots is to determine whether your site has good content for humans!

For example, if someone stays on your site for a long time or shares a link, these are human data points that matter to search engines. The best way to write for bots is to write high-quality content for humans.

Here are some best practices for producing bot-friendly and human-friendly content:

  • Use simple language that an average 10-year-old could understand (no jargon!).
  • Create useful content with user intent in mind.
  • Balance readability with SEO and keyword placement.
  • Write short sentences and short paragraphs.
  • Link to external sites with high domain authority when it makes sense.

How to Communicate with Google Bots

You are not at the mercy of search engines! You can tell Google how often you want its bots to crawl your website (if at all). This is called a crawl budget. You can’t increase it (because that wouldn’t be fair), but you can decrease it.

Why would you want to reduce or disable crawling?

  • You are worried that bot visits will interfere with the user experience of real humans.
  • You want to hide less useful content (like duplicate content) from Google.

Here is a good overview of how to maximize a crawl budget. (It’s important to note that most law firms have relatively “small” websites, so there is no need to manage your crawl budget. However, if you are part of an Am Law 100 firm or work in-house for a large company, this can be useful.) If you want to block crawling entirely, look into the robots.txt file. You can also limit crawling via Google Search Console.
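To give you a sense of what that file looks like, here is a minimal robots.txt sketch. The /print-versions/ path is a hypothetical example of duplicate content you might hide; a real file would list your own site’s paths.

    # Applies to all crawlers
    User-agent: *

    # Hide a hypothetical duplicate-content section from crawlers
    Disallow: /print-versions/

    # To block crawling of the entire site instead, you would use:
    # Disallow: /

The file lives at the root of your domain (for example, yourfirm.com/robots.txt), where crawlers check for it before visiting your pages.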

Note that there is a difference between preventing crawling and preventing indexing. A page blocked from crawling can still show up in search results if other sites link to it. To prevent indexing, you can use something called a noindex directive.
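In its most common form, the noindex directive is simply a meta tag placed in a page’s HTML head:

    <!-- Tells search engines not to include this page in their results -->
    <meta name="robots" content="noindex">

One caveat: Google has to be able to crawl the page to see this tag, so don’t block a noindexed page in robots.txt at the same time.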

Ask your developer for help strategizing how to present the best possible online experience to Google’s bots.

Review and next steps

Bots are an essential part of running a website. But once you know how bots work, you can use them to your advantage!

  • Bots collect information about your website and share it with search engines.
  • To impress the bots, write high-quality, human-friendly content.
  • You can reduce bot crawling if needed.

Understanding your invisible bot audience is key to the success of your law firm’s website. Talk to your online marketing agency to come up with a plan that takes bots into account.
