Aliya Fatima
9 min read · Apr 18, 2022


Bots — The excellent automation

Technological advancements have relieved human beings of many repetitive tasks. These tasks are programmed to run independently with minimal or no human interference, and this programming of repetitive tasks is known as automation. Automation is the creation and application of technologies to minimize labor or to substitute for humans in the most menial or repetitive tasks. It is present in virtually every vertical and niche, such as manufacturing, security, and transportation. So where does a bot fit into the picture?

What are Bots?

A bot is a software application that is programmed to do certain tasks.

Bots are automated, which means they run according to a set of instructions without a human user needing to manually start them up every time.

Bots typically imitate or replace human user behavior. Because they are automated, they operate much faster than human users.

It is estimated that up to half of internet traffic today is made up of computer bots carrying out tasks such as automating customer service, interacting with web pages, simulating human communication on social networks, helping companies search online for content, and assisting with search engine optimization.

Most bots are harmless and crucial for making the internet valuable and useful, but bots can also be malicious and destructive when deployed by cybercriminals.

What are the different types of bots?

In the context of web security, there are two types of bots: good bots and bad bots. Both attempt to access web resources (pages, web applications, APIs, etc.) or perform other typical web activities of a human user, but they do so for different purposes.

Good bots: These carry out useful tasks. Search engine bots, for example, index content for search; site owners generally welcome them because they keep sites visible in search engines and, ideally, bring in more users and customers.

Customer service bots that help users on different web pages are another example.

Another common “good” bot is a data aggregator, which updates a directory or other content listing with information about the sites it visits.

Bad bots or hostile bots: These are deployed for malicious purposes. Their effects on the targeted sites and applications range from mildly harmful to potentially catastrophic attacks.

Any automated action by a bot that violates a website owner’s intentions, the site’s Terms of Service, or the site’s robots.txt rules for bot behavior can be considered malicious.
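As a concrete illustration, a well-behaved bot can check a site’s robots.txt before fetching a page. Here is a minimal sketch using Python’s standard urllib.robotparser module; the site URL and user-agent string are placeholders invented for the example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site, for illustration only.
robots_url = "https://example.com/robots.txt"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # download and parse the robots.txt file

# Ask whether our bot (identified by its user-agent) may fetch a page.
if parser.can_fetch("MyFriendlyBot", "https://example.com/some/page"):
    print("Allowed to crawl this page.")
else:
    print("robots.txt disallows this page; a good bot skips it.")
```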

Bad bots are programmed to break into user accounts, scan the web for contact information for sending spam, or perform other malicious activities.

Bots that attempt to carry out cybercrime, such as identity theft or account takeover, are also “bad” bots.

WHAT’S A BOTNET?

To carry out malicious attacks and disguise the source of the attack traffic, attackers may distribute bad bots in a botnet — i.e., a bot network.

A botnet is a number of internet-connected devices, each running one or more bots, often without the device owners’ knowledge.

Because each device has its own IP address, botnet traffic comes from numerous IP addresses, making it harder to identify and block the source of the malicious bot traffic.

Botnets can often grow themselves by using devices to send out spam emails, which can infect more machines.

HOW CAN BAD BOTS AFFECT A COMPUTER?

Downloads are the most common way bots can infect our computers.

Malware is often delivered as a download via social media or email messages that urge the recipient to click a link.

The link is frequently embedded in a picture or video, which may contain viruses or other malware. If our computer is infected with malware, it may become part of a botnet.

A bot can also appear as a warning saying that our computer will get a virus if we do not click on the associated link. Clicking the link subsequently infects our computer with a virus.

Such bots can carry out data and identity theft, keylog sensitive information such as passwords, bank details, and addresses, and launch phishing attacks, all of which threaten organizations and consumers alike.

MALICIOUS BOT ACTIVITY INCLUDES:

  • Credential stuffing and brute-force password cracking: Hackers steal credential sets (personal identification data, account logins, passwords, contact data, etc.) in massive data breaches.

Or, they discover credentials by sending out bots to wage brute-force attacks; the bots attempt to gain access to a web application by trying every possible combination of letters, numbers, and symbols to see which combinations work.

Valid credentials can then be used in a variety of cyberattacks, and can also be sold in illicit marketplaces for others to use.

In credential stuffing, attackers “stuff” the stolen credentials into the login pages of many other web applications (especially high-value targets like bank websites, payment providers, and so on).

Credential stuffing allows an attacker to leverage a single data breach into the successful takeover of multiple accounts across different websites.
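To make the brute-force idea concrete, here is a toy Python sketch, for illustration only, against a hardcoded dummy password rather than a real system. It shows how a bot can enumerate every character combination; the combinatorial explosion is exactly why attackers automate this with bots:

```python
import itertools
import string

# Dummy target for illustration; real attacks hit login endpoints.
SECRET = "ab1"

charset = string.ascii_lowercase + string.digits  # 36 characters

def brute_force(max_length):
    """Try every combination up to max_length; return the match and attempt count."""
    attempts = 0
    for length in range(1, max_length + 1):
        for candidate in itertools.product(charset, repeat=length):
            attempts += 1
            if "".join(candidate) == SECRET:
                return "".join(candidate), attempts
    return None, attempts

found, tries = brute_force(3)
print(f"Found {found!r} after {tries} attempts")
# Even with only 36 characters, a 3-character space already holds
# 36 + 36**2 + 36**3 = 47,988 candidates; each extra character
# multiplies the work by 36.
```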

  • Distributed Denial of Service (DDoS): It is considered one of the most dramatic and most feared botnet attacks.

It uses large networks of bots to create coordinated attacks on a massive scale.

This leads to the disruption of services of the targeted organization by overwhelming their web applications or APIs with incoming requests, making them unavailable for normal use.

If the victim cannot filter out the attack traffic, the disruption will last for as long as the attacker wishes.

Bots used for this purpose are termed DDoS bots.

  • Inventory hoarding: Inventory hoarding (or inventory denial) is when a user selects and holds an item, usually one limited in availability, in a basket. Because that stock is held in the basket, it becomes unavailable for others to purchase.

Web applications that offer online purchasing or reservations are vulnerable to inventory hoarding when hostile bots make inventory unavailable to legitimate customers.

Malicious bots attack eCommerce sites by adding products to shopping carts but never completing the purchases.

Bots are used to hoard inventory in various areas of the travel industry. For instance, bots are programmed to carry out a flight reservation up until the point of payment. At this point, the seat is reserved for up to 20 minutes and real customers perceive there to be no availability. While the seat is being “hoarded”, the threat actor is attempting to sell the seat for a profit.

If they don’t get a buyer, the seat drops out of their basket and becomes available once again, at which point a new bot can pick up that available stock and repeat the process until the inventory is successfully sold.

  • Scraping and data theft: Data scraping and web scraping are two different automated techniques that achieve the same end.

They harvest data from systems owned by third parties.

They extract the data, collate it, and store it in ways that facilitate its reuse.

Typically this means putting it into a database or into a portable format like CSV.

The harvested information can then be used for crimes such as phishing attacks, spear-phishing attacks, social engineering attacks, and other financial fraud.

This can be seen on e-commerce sites, whose pages contain prices and other product data; possession of this data can be a competitive advantage, so scrapers are used to steal it.
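As a minimal sketch of how such a scraper works (the URL, HTML structure, and CSS class names here are hypothetical), a bot fetches a page, extracts product data, and stores it in a portable CSV file, as described above:

```python
import csv

import requests
from bs4 import BeautifulSoup

# Hypothetical target page, for illustration only.
URL = "https://example.com/products"

resp = requests.get(URL, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

rows = []
for item in soup.select(".product"):  # assumed CSS class on product cards
    name = item.select_one(".name").get_text(strip=True)
    price = item.select_one(".price").get_text(strip=True)
    rows.append((name, price))

# Collate the harvested data into a CSV file for reuse.
with open("products.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "price"])
    writer.writerows(rows)
```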

  • Spam: Many websites accept user-submitted content like posts on forums, reviews on e-commerce sites and marketplaces, and so on.

These sites usually experience large numbers of bots continually posting spam comments, links, etc.

Spambots may also harvest email addresses from contact or guestbook pages.

Alternatively, they may post promotional content in forums or comment sections to drive traffic to specific websites.

The bots involved in carrying out spam are known as spambots.

  • Click fraud: Advertising bot attacks are quite serious and can cause a lot of damage. Click fraud produces a huge amount of malicious bot traffic specifically targeting paid ads to engage in ad fraud.

The bots involved are called click fraud bots; they fraudulently click paid ads, and this non-human traffic costs advertisers billions every year while often being disguised as legitimate traffic.

This causes advertisers to invest poorly and spend their ad budgets in the wrong places.

Fortunately, bot management solutions are able to sort harmful bot activity from helpful bot activity via machine learning.

There are solutions that stop malicious behavior without impacting the user experience or blocking good bots.

Bot management solutions identify and block malicious bots based on behavioral analysis that detects anomalies, and still allows helpful bots to access web properties.
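As a simplified illustration of the behavioral approach (the log format and threshold below are invented for the example, and real systems weigh many more signals than request rate), a defender might flag clients whose behavior is wildly outside the norm:

```python
from collections import Counter

# Toy request log of (client_ip, path) pairs.
requests_log = [
    ("203.0.113.5", "/login"), ("203.0.113.5", "/login"),
    ("203.0.113.5", "/login"), ("203.0.113.5", "/login"),
    ("198.51.100.7", "/home"), ("198.51.100.9", "/products"),
]

RATE_THRESHOLD = 3  # assumed: max requests per window for a "human" client

counts = Counter(ip for ip, _ in requests_log)
for ip, count in counts.items():
    if count > RATE_THRESHOLD:
        # Anomalous rate: block the client or serve it a challenge,
        # while normal clients pass through untouched.
        print(f"Anomaly: {ip} made {count} requests; likely a bot.")
```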

Apart from malware bots, there are other bots that are immensely helpful in our day-to-day lives.

  • Chatbots: A chatbot is a type of bot designed to interact with humans conversationally, based on its programming.

They automate the process of interacting with website visitors and social media followers in an attempt to create the best user experience.

Ideally, this helps the site maintain the presence of a helping hand, even when the team can’t respond.

Different types of chatbots include:

Rule-Based Chatbots — In general, these are simple chatbots that depend heavily on user input. If customer queries fall outside the pre-defined rules, these chatbots fail to recognize the conversational context and cannot handle advanced scenarios, as the sketch below illustrates.

AI Chatbots — Artificial intelligence (or machine learning) chatbots, on the other hand, use Natural Language Processing (NLP) technologies to understand the intent behind the question and solve the customer’s problem without any human assistance.
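A minimal sketch of a rule-based chatbot (the rules here are invented examples) makes the limitation obvious: anything outside the keyword table falls through to a fallback reply:

```python
# Keyword -> canned reply; queries outside these rules get the fallback.
RULES = {
    "hours": "We are open 9am-5pm, Monday to Friday.",
    "refund": "You can request a refund within 30 days of purchase.",
    "shipping": "Standard shipping takes 3-5 business days.",
}

FALLBACK = "Sorry, I didn't understand that. A human agent will follow up."

def reply(message: str) -> str:
    """Return the first matching canned answer, else the fallback."""
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    return FALLBACK

print(reply("What are your hours?"))        # matches the "hours" rule
print(reply("My package arrived broken!"))  # outside the rules -> fallback
```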

  • Spider bots or web crawlers: A web crawler, spider, or search engine bot downloads and indexes content from all over the Internet.

These bots learn what (almost) every webpage on the web is about so that the information can be retrieved when needed.

Spider bots or web crawlers are operated by search engines.

When a search algorithm is applied to the data collected by web crawlers, search engines can provide relevant links in response to user search queries, generating the list of webpages that show up after a user types a search into a search engine like Google or Bing.

Because it is not possible to know how many total web pages there are on the Internet, web crawler bots start from a “seed”: a list of known URLs.

They crawl the web pages at those URLs first. As they crawl those web pages, they will find hyperlinks to other URLs, and they add those to the list of pages to crawl next.

Given the vast number of web pages on the Internet that could be indexed for search, this process could go on almost indefinitely.

However, a web crawler will follow certain policies that make it more selective about which pages to crawl, in what order to crawl them, and how often to crawl them again to check for content updates. That’s a whole topic of its own.
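To illustrate the seed-and-frontier process described above, here is a minimal Python crawler sketch. The seed URL and page limit are placeholders, and politeness is simplified; a real crawler would also honor robots.txt and the selection policies just mentioned:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    frontier = deque([seed_url])  # the list of pages to crawl next
    seen = {seed_url}
    crawled = 0
    while frontier and crawled < max_pages:
        url = frontier.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue  # skip pages that fail to download
        crawled += 1
        print("Indexed:", url)
        parser = LinkExtractor()
        parser.feed(html)
        # Add newly discovered hyperlinks to the frontier.
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)

crawl("https://example.com")  # hypothetical seed URL
```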

  • Social Bots: These bots are programmed to generate messages on social media.

They are automated to act as followers of other users, generate messages, and advocate ideas, and they also operate as fake accounts to gain followers of their own.

It has been found that around 9 to 15% of Twitter accounts are social bots.

  • Ticketing Bots: These bots are programmed to buy tickets to popular events.

The intention is to resell the tickets for a margin of profit.

The bots are designed to imitate the behavior of human ticket buyers.

An estimated 40 to 95% of ticket purchases are made by such automated bots.

  • Download Bots: Download bots are programmed to automatically download software or mobile applications, and they are used to inflate download statistics.

These bots are used to generate a huge number of downloads on well-known app stores to help new applications climb to the top of the charts.

The bots are also used to create numerous fake downloads as the initial phase of a DoS (Denial of Service) attack.

Advantages of Bots:

  • Faster than humans in managing repetitive tasks.
  • Enhanced user experience and improved customer satisfaction.
  • Multipurpose.
  • Customizable.
  • Reduced labor costs for organizations.
  • Available 24/7.
  • Can handle multiple users at once.
  • Organizations can reach a larger audience through messaging platforms.
  • Saves time for customers and clients.
  • Gather customer insights.

To conclude, I must say the internet wouldn’t be possible without bots. From web crawlers like Googlebot, which let us find useful information by searching through millions of web pages in seconds, to chatbots that have become a vital dialog window for all kinds of websites, bots have made the internet a powerful and valuable tool. Yet they also present a great threat when created by criminals. Nevertheless, I look forward to an IT industry that develops smarter bots and continually makes the internet a better place.
