Website Marketing for Attorneys: How to Prevent Bots From Crawling Your Website

Generally speaking, you want your website content to be as “crawlable” as possible. It’s important for spiders – such as those from Google – to be able to view your site quickly and easily. However, there might be times you want to block bots. Keep reading to learn why you would want to block some bots and how to do so in order to improve website marketing for attorneys.


What is a Bot? 

Many people aren’t entirely sure what a bot is, which makes bots difficult to manage or block. Short for “robot,” a bot is a software application designed to repeat a particular task over and over. SEO professionals can use bots to scale their SEO campaigns by automating as many tasks as possible. Bots help digital teams work smarter instead of harder, for example by scraping useful data from search engines.

Are Bots and Spiders Harmless? 

For the most part, both spiders and bots are harmless. In fact, you need them in many cases: Google’s bots must crawl and index your site for it to appear in search results. Occasionally, though, bots can pose problems and generate unwanted traffic. This matters because:

  • They can obscure where your traffic is actually coming from. 
  • They can muddle reports and make them hard to understand (and less useful). 
  • You may encounter misattribution in Google Analytics. 
  • Bot traffic consumes extra bandwidth, which can add to your hosting costs. 
  • Unwanted traffic can lead to other small nuisances that take up resources to deal with. 

Essentially, there are good bots and bad bots. Good bots run quietly in the background and don’t attack other users or websites. Bad bots, on the other hand, break through a website’s security and can be conscripted into large-scale botnets used to launch DDoS attacks against targeted organizations. In these cases, a botnet can do what a single machine could not. 

By preventing certain bots from visiting your site, you can protect your data and see other benefits such as: 

  • Securing sensitive client data and other information from forms  
  • Preventing software from exploiting security vulnerabilities to add bad links to your site
  • Limiting bandwidth costs by preventing an influx of traffic you don’t want

How to Prevent Bad Bots from Crawling Your Site

Fortunately, there are things you can do to decrease the chances of bad bots getting into your website. It’s not easy to discover every bot that can crawl your site, but you can usually identify the malicious ones you wouldn’t want visiting. 

One method is through robots.txt. This is a plain-text file that lives at the root of your website (for example, www.yourfirm.com/robots.txt). Sometimes it is there by default, but usually it needs to be created. Here are some directives you might find useful. 

1. To disallow Googlebot from your server 

Note: don’t use this one lightly. This is for instances when you want to stop Googlebot from crawling your server at all, such as preventing it from crawling your staging site. 

  • User-agent: Googlebot
  • Disallow: /
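
If you only need to keep Googlebot away from one section rather than the entire site, you can scope the rule to a single folder instead (the folder name below is just a placeholder):

  • User-agent: Googlebot
  • Disallow: /folder-name/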

2. To disallow all bots from your server

To keep all bots off your site, use this code. You might use this when you want to keep your site out of search results for a while before a broad launch. 

  • User-agent: *
  • Disallow: /

3. To keep bots from crawling a specific folder

You may want to keep bots from crawling a certain folder. To do so, use this code: 

  • User-agent: *
  • Disallow: /folder-name/
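
4. To block a specific unwanted bot by name

If your server logs show repeated visits from a crawler you don’t want, you can disallow it by its user-agent while leaving every other bot alone. The name “BadBot” below is only a placeholder; substitute the actual user-agent string you see in your logs. (Of course, this only deters bots that respect robots.txt.)

  • User-agent: BadBot
  • Disallow: /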

It’s also important to avoid a few common mistakes. The top errors include combining a Disallow rule in robots.txt with a noindex tag on the same page (if crawlers can’t access the page, they never see the noindex instruction), not including the correct path, and not testing the robots.txt file after making changes. 
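
On the path point, a Disallow rule needs to start from the site root with a leading slash. Here is a quick sketch of the difference, using a placeholder folder name:

  • # Missing the leading slash: crawlers may ignore this rule or read it differently than intended
  • Disallow: folder-name/
  • # Correct: the path starts at the site root
  • Disallow: /folder-name/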

There are other methods of blocking bots, such as server-level rules, that let you be more specific about which bots to block and how, but they can get fairly technical. They’re also worth knowing about because robots.txt is only a request: reputable crawlers honor it, while truly malicious bots often ignore it. Unless you have a developer on staff, you may want to ask your web design partner for some guidance. 

Takeaway: 

Blocking bots and spiders does require some extra steps, but it’s worth the time. Doing so keeps your site safer and ensures that you don’t fall into certain traps. By controlling certain bots, you have a better ability to automate your SEO processes and improve website marketing for attorneys. All of these things will enable a much stronger site that will be useful and optimized for years to come. 

Written by:

Good2bSocial