
The good, bad and ugly of ‘bots’ online



For many government agencies, traffic from bots (aggregators, scrapers, crawlers) can account for a significant share of overall website traffic. This includes both good bots performing essential business tasks and bad bots carrying out harmful activities. Agency CIOs may know how much of their traffic comes from bots, but they may not know the impact those bots are having on their agency's site. Even the good bots can have a negative effect on an agency's website.

The good

Constituents today engage with government agencies more and more through digital channels, and bots are a big part of that. When people are looking for something, they typically start with an online search. To power those searches, search engines send bots to crawl websites and index their content, which helps determine how those sites are ranked in search results. This is central to Search Engine Optimization (SEO), and the crawling can be done by bots an agency runs itself or by a third-party service. While this crawling is necessary and important, agencies must ensure that these bots get the information they need without degrading constituents' digital experiences. Agencies must maintain high website performance for search-engine crawlers as well as users, because slow page load times can hurt both search rankings and citizen service response times.
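As a rough illustration of how agencies can steer well-behaved crawlers (the paths and delay value below are assumptions, not any agency's actual configuration), a robots.txt file is one common way to tell compliant bots what to index and how aggressively to crawl:

    # robots.txt - illustrative only; paths and values are assumptions
    User-agent: *
    Disallow: /internal/    # keep non-public sections out of search indexes
    Crawl-delay: 10         # ask compliant crawlers to pause between requests

    User-agent: Googlebot
    Disallow: /internal/
    # Note: Googlebot does not honor Crawl-delay, so other controls are needed for it

Crawl-delay is only a request, and some major crawlers ignore it, so this complements rather than replaces performance management at the web or edge layer.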


The bad

Much of the data that agencies possess is sensitive and therefore attractive to bad actors, who can use bots to regularly and automatically crawl agency websites and "scrape" that data. Scraping of this kind underlies some of the largest and most damaging cybersecurity attacks, and the stakes are even higher for government agencies because the outcome affects reputation and public trust. Unfortunately, a wide range of malicious tools makes this very easy for anyone with bad intentions.

The ugly

Today's constituents are shaped by their daily interactions on social media and other platforms. They expect every digital experience to be high performing and lightning fast, and they have little tolerance for slow page load times. Too many bots operating too freely, regardless of type or intention, will degrade website performance, giving legitimate human visitors a poor experience that can lead to site abandonment.

In addition, agencies use data from their websites to gain greater insight and to provide constituents with relevant information tailored to their specific needs. This, in turn, enables agencies to deliver a more customized user experience, leading to more satisfied citizens. A by-product of the proliferation of bot traffic, however, is that this data, which drives key tactical and strategic decisions, becomes polluted: it captures bot activity alongside legitimate human activity. Bots skew the data, misrepresent the true behavior of constituents, and invalidate conclusions drawn from the data set.
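As a minimal sketch of one way to keep bot traffic from polluting analytics (the log format, field names, and bot patterns below are assumptions for illustration, not a specific product's behavior), self-identified crawlers can be filtered out before metrics are computed:

    import re

    # Illustrative patterns only; real bot detection also uses IP reputation
    # and behavioral signals, not just the user-agent string.
    KNOWN_BOT_PATTERNS = re.compile(r"bot|crawler|spider|scraper", re.IGNORECASE)

    def human_requests(log_entries):
        """Yield only entries that do not self-identify as bots.
        Assumes each entry is a dict with a 'user_agent' key (hypothetical format)."""
        for entry in log_entries:
            if not KNOWN_BOT_PATTERNS.search(entry.get("user_agent", "")):
                yield entry

    # Example: count page views from (apparent) humans only
    logs = [
        {"user_agent": "Mozilla/5.0 (Windows NT 10.0)", "path": "/services"},
        {"user_agent": "Googlebot/2.1 (+http://www.google.com/bot.html)", "path": "/services"},
    ]
    print(sum(1 for _ in human_requests(logs)))  # -> 1

Self-reported user agents are easy to spoof, so this only removes the honest bots; it illustrates the idea rather than a complete defense.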

The answer

Agencies need to manage bot traffic to ensure the best possible outcome, maximizing the positive results and minimizing the negative, depending on the types of bots they see. For example, while agencies want to allow good bots to do their job, there are circumstances where those bots may need to be ratcheted back so that human web traffic can reach agency websites without performance problems.
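One sketch of what "ratcheting back" can look like in practice, assuming requests have already been classified upstream as human or crawler (the limits and the classification are assumptions, not a prescribed implementation), is a simple token-bucket rate limiter that gives crawlers a smaller request budget than human visitors:

    import time

    class TokenBucket:
        """Minimal token-bucket rate limiter; rates and capacities are illustrative."""
        def __init__(self, rate_per_sec, capacity):
            self.rate = rate_per_sec
            self.capacity = capacity
            self.tokens = capacity
            self.last = time.monotonic()

        def allow(self):
            now = time.monotonic()
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return True
            return False

    # Hypothetical policy: humans get a generous budget, crawlers a modest one.
    buckets = {
        "human":   TokenBucket(rate_per_sec=20, capacity=40),
        "crawler": TokenBucket(rate_per_sec=2,  capacity=5),
    }

    def should_serve(client_class):
        """client_class is assumed to come from upstream bot classification."""
        return buckets.get(client_class, buckets["human"]).allow()

The point of the design is that good bots are throttled rather than blocked: they still get the content they need, just not at a pace that crowds out human visitors.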

When an agency encounters bad bots, blocking them is only a temporary fix and can be ineffective in the long run; blocked bots simply return smarter and faster. By managing how those bots are allowed to interact with the site, an agency can minimize their negative impact without tipping off the operator that it is on to them. Two common tactics are slowing the bots down, which reduces the value and timeliness of the information they gather, or serving them alternative information, such as steering them to a page with intentionally inaccurate content.
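A minimal sketch of those two tactics, assuming a generic request handler and an is_bad_bot flag set by upstream detection (both are assumptions for illustration, not any vendor's actual product behavior):

    import time

    # Intentionally low-value decoy content served to suspected scrapers.
    DECOY_PAGE = "<html><body>Records are updated quarterly.</body></html>"

    def render_real_page(request):
        # Stand-in for the agency's normal page rendering (hypothetical).
        return f"<html><body>Real content for {request}</body></html>"

    def handle_request(request, is_bad_bot):
        """Illustrative handler: is_bad_bot is assumed to come from upstream detection."""
        if is_bad_bot:
            time.sleep(5)           # tarpit: slow the scraper without blocking it outright
            return 200, DECOY_PAGE  # serve alternative content instead of the real page
        return 200, render_real_page(request)

Because the bot still receives a normal-looking response, its operator has little signal that the defense is in place, which is the advantage of managing bots over simply blocking them.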

Bots are an essential part of the Internet ecosystem and of improving citizen services. At the same time, bot traffic can reduce website performance for legitimate users and increase IT costs. To address this, agencies need a flexible framework for managing their interactions with different categories of bots and the impact those bots have on the agency and its IT infrastructure.

Tom Ruff is the public sector vice president at Akamai Technologies.
