Rising threats to Hosting Providers

Hosting servers have become a prime target for unethical hackers looking to spread malware. One in three websites is either hacked or waiting to be hacked, and site owners often remain unaware until their site gets blacklisted. One report claims that 30,000 sites are hacked every day. Site owners have unknowingly become accomplices in the cyber criminals' malware campaigns. Let's take a look at recent breaches targeting hosting providers.

Web.com

Web.com is a US-based web hosting company with around 93,000 customers. Credit card information of its clients was reported to have been stolen.

000Webhost

000Webhost, established in 2007, offers free and premium hosting services. In October this year, 13.5 million user account credentials belonging to 000Webhost were reported to have leaked. Apparently, the forum site was running an old, vulnerable version of vBulletin. It was also found that usernames and passwords were stored in plain text, and the signup page was not served over an encrypted connection either.

What was the impact of this breach?

  • Customers were asked to move to the premium service, which was not affected by this breach.
  • Though the service is free, it is the clients who lost their confidential data, including credentials they may also use to log in elsewhere.

Hard lessons were learned from a reactive approach, when a proactive approach could have saved all the fuss.

Easily

Easily is a UK-based web hosting provider with over 100,000 customers worldwide. A news report today said that an unspecified number of customer domain names were exposed. There were also reports of phishing attempts in bad English, and of confusion when customers received emails asking them to reset their passwords.

WP Engine

WordPress hosting service WP Engine informed customers this week that their credentials may have been compromised. The provider serves 40,000 customers worldwide. The details have not been disclosed yet.

WordPress is a very popular CMS (Content Management System) and powers a major share of websites on the internet. Hosting providers have become a very lucrative target, purely because of their reach into a surfing population of 3.3 billion.

Most site breaches are caused by a lack of security awareness among clients, not by the hosting provider's mistakes. These breaches don't make it to the news because they happen in small numbers across many different places, and covering them all would simply flood the news.

Having said this, hosting providers will have to evolve to tackle these situations and offer security as part of the hosting service. At the very least, the ones who do will have an edge over the others. At the end of the day, there is no such thing as a completely secure service; it is just a matter of making things very difficult for unethical hackers. Evolving security measures will always be a cat and mouse game with them.

Here is my view of what can be done to build a safer and more reliable hosting environment; I will stick to Linux. For each problem, I will state a remedy.

Overview

Conventional (actually old) hosting architectures run the web servers and the database together on a single server.

Problem : One data breach exposes not only other sites' contents but also all the databases. One infected site affects other sites.

Remedy : CloudLinux offers CageFS, which isolates each site's files from the others.

Problem : One site eats server resources, leaving other sites slow and unresponsive

Remedy : By routing web traffic via a load balancer, it is possible to throttle inbound connections as needed. Also, by caching images at the load balancer, a significant amount of IO can be reduced.
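
As a rough sketch of the caching point (not any particular product's implementation), here is how a load balancer could keep recently requested images in memory so repeated hits never reach the backend disk; the cache size and the `read_from_backend` helper are made up for the illustration.

```python
from collections import OrderedDict

class ImageCache:
    """Tiny in-memory LRU cache for static assets served at the load balancer.

    Repeated requests for the same image are answered from memory, so the
    backend web server's disk is not touched again. The size limit is illustrative.
    """

    def __init__(self, max_items=512):
        self._items = OrderedDict()
        self._max_items = max_items

    def get(self, url, fetch_from_backend):
        if url in self._items:
            self._items.move_to_end(url)      # mark as recently used
            return self._items[url]           # cache hit: no backend IO
        data = fetch_from_backend(url)        # cache miss: one backend read
        self._items[url] = data
        if len(self._items) > self._max_items:
            self._items.popitem(last=False)   # evict the least recently used entry
        return data


# Hypothetical usage: a real `read_from_backend` would proxy the request upstream.
cache = ImageCache()

def read_from_backend(url):
    return b"...image bytes fetched from the web server..."

body = cache.get("/images/logo.png", read_from_backend)
```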

Problem : The webserver is a single point of failure

Remedy :

  • Using a distributed file system, it is possible to replicate data to multiple servers, and hence distribute sites across a set of servers, and across multiple such sets.
  • Using a load balancer, it is easy to customize traffic routing when a server fails (see the sketch after this list).
  • Sharing a single storage backend for all web servers may not be a good idea, as it can become a bottleneck affecting all sites. It is better to break storage into smaller chunks to distribute the risk.
  • Application-level load balancers can scale linearly.
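
To illustrate the routing bullet above, here is a minimal sketch of a load balancer skipping failed web servers; the backend addresses and the `/health` endpoint are assumptions, not part of any particular product.

```python
import itertools
import urllib.request

# Hypothetical pool of web servers sitting behind the load balancer.
BACKENDS = ["http://10.0.0.11", "http://10.0.0.12", "http://10.0.0.13"]
_round_robin = itertools.cycle(BACKENDS)

def is_healthy(backend, timeout=1.0):
    """Consider a backend healthy if its (assumed) /health endpoint answers 200."""
    try:
        with urllib.request.urlopen(backend + "/health", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

def pick_backend():
    """Round-robin over the pool, skipping servers that fail the health check."""
    for _ in range(len(BACKENDS)):
        candidate = next(_round_robin)
        if is_healthy(candidate):
            return candidate
    raise RuntimeError("no healthy backend available")
```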

Problem : Unethical hackers are attacking my site

Remedy : There are multiple ways to mitigate this:

  • Third-party web application firewall services typically work by changing DNS, which can be bypassed. Having an application firewall sitting directly in front of the web servers ensures that it cannot be bypassed.
  • Logs can be of great help in detecting attacks. For example, lots of 404 (file not found) or 50x (server error) responses can reveal suspicious activity. Unethical hackers typically take one vulnerability and scan across domains for it. Using a centralized log system, with intelligence that alerts the load balancer, can mitigate such attacks (a sketch follows this list).
  • IDS tools like Bro can be highly customized for this purpose.
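
As a sketch of the log-analysis idea, the following assumes access logs in the common combined format and an arbitrary threshold; the log path and the alerting step are placeholders for whatever the centralized log system actually provides.

```python
import re
from collections import Counter

# Assumes the common/combined access log format; the threshold is arbitrary.
LOG_LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (?P<status>\d{3}) ')
SUSPICIOUS = re.compile(r"^(404|50\d)$")
THRESHOLD = 100   # suspicious responses per source before we raise an alert

def suspicious_sources(log_lines):
    """Count 404/50x responses per client IP across the centralized logs."""
    counts = Counter()
    for line in log_lines:
        match = LOG_LINE.match(line)
        if match and SUSPICIOUS.match(match.group("status")):
            counts[match.group("ip")] += 1
    return [ip for ip, hits in counts.items() if hits >= THRESHOLD]

# A real setup would feed this from the centralized log system and push the
# offending IPs to the load balancer's block or throttle list.
with open("/var/log/nginx/access.log") as fh:   # path is an assumption
    for ip in suspicious_sources(fh):
        print("alert load balancer about", ip)
```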

Problem : The site is being DDoSed

Remedy : To a certain extent, DDoS can be mitigated at the load balancer level.
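
A minimal sketch of what per-source throttling at the load balancer could look like, with made-up limits; real DDoS mitigation would also involve network-level and upstream defences.

```python
import time
from collections import defaultdict, deque

# Illustrative limits only.
WINDOW_SECONDS = 10
MAX_REQUESTS_PER_WINDOW = 200

_recent = defaultdict(deque)   # client IP -> timestamps of recent requests

def allow_request(client_ip, now=None):
    """Return False when a single source exceeds the per-window request budget."""
    now = now or time.monotonic()
    window = _recent[client_ip]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()                     # drop requests outside the window
    if len(window) >= MAX_REQUESTS_PER_WINDOW:
        return False                         # candidate for blocking or tarpitting
    window.append(now)
    return True
```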

Problem : The site is blacklisted because of malware

Remedy :

  • How did the malware get uploaded in the first place?

    • A vulnerability in the site allowed the malware to be uploaded - using several open source vulnerability scanners, it is mostly possible to detect this, since we are very likely to be looking for common, well-known vulnerabilities.
    • FTP credentials were stolen and the malware was uploaded - FTP traffic can be sniffed, so FTP should be replaced by FTPS.
    • Another site was compromised and used to upload malware onto this one - CloudLinux's isolation takes care of this.
  • How quickly can malware be caught?

    • Scanning new files, and frequently scanning modified files, with a malware engine can detect it before it is too late (see the sketch after this list).
    • Running a malware engine can be a challenge, and certain files (exe, pdf, swf, zip, etc.) may need manual analysis, which can be queued up.
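
Here is a sketch of that scanning idea, assuming ClamAV's clamscan is installed and that a periodic job looks back over recently changed files; the paths, the interval, and the manual-review extension list are illustrative.

```python
import os
import subprocess
import time

SITE_ROOT = "/var/www"                      # illustrative path
MANUAL_REVIEW = {".exe", ".pdf", ".swf", ".zip"}
LOOK_BACK = 15 * 60                         # look back 15 minutes, as an example

def recently_changed(root, since_seconds):
    """Yield files created or modified within the last `since_seconds`."""
    cutoff = time.time() - since_seconds
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) >= cutoff:
                    yield path
            except OSError:
                continue                    # file vanished between listing and stat

manual_queue = []
for path in recently_changed(SITE_ROOT, LOOK_BACK):
    if os.path.splitext(path)[1].lower() in MANUAL_REVIEW:
        manual_queue.append(path)           # held back for human analysis
        continue
    # Assumes ClamAV is installed; clamscan exits with code 1 when it finds an infection.
    result = subprocess.run(["clamscan", "--no-summary", path],
                            capture_output=True, text=True)
    if result.returncode == 1:
        print("possible malware:", path)

print("queued for manual analysis:", manual_queue)
```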

Problem : Site has been defaced

Remedy : Maintaining versions of the site content elsewhere (not on the same server, since versioning some directories and files in place may not be a good idea) can help quickly revert to the last clean state. Databases, though, may have to be manually restored from backups.
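
One way to keep such versions is a plain git repository pushed to another machine. This is only a sketch with an assumed site path and an assumed off-server remote, not a prescribed tool.

```python
import subprocess

SITE_DIR = "/var/www/example.com"                         # illustrative path
BACKUP_REMOTE = "backup:/srv/versions/example.com.git"    # assumed off-server remote

def run_git(*args):
    subprocess.run(["git", "-C", SITE_DIR, *args], check=True)

def snapshot(message="periodic snapshot"):
    """Record the current site content so a clean state can be restored later."""
    run_git("add", "--all")
    run_git("commit", "--allow-empty", "-m", message)
    run_git("push", BACKUP_REMOTE, "master")

def revert_to_last_clean(clean_commit):
    """Throw away defaced files and return the working tree to a known-good commit."""
    run_git("fetch", BACKUP_REMOTE)
    run_git("reset", "--hard", clean_commit)
    # Databases are not covered here; they still need a manual restore from backups.
```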

Problem : Is daily site scanning really necessary?

Remedy : How often does a site need to be scanned?

  • I reckon sites with a blog or comment section, where anyone can submit content, are the ones that need scanning every day.
  • Commonly used CMSes need to be scanned only when a new vulnerability is published for a specific application version. Vulnerability scanners will have to keep track of application versions (a sketch follows this list).
  • Vulnerability scanners can also be triggered when a file changes, alongside invoking the malware engines.
  • Other sites only need one-time testing.
  • The scanners don't have to run against the live environment, so clients will never find anything in their logs.
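
As a sketch of that version-tracking bullet, the vulnerable-version feed and the per-site records below are hypothetical; a real scanner would pull them from a vulnerability database and from CMS fingerprinting.

```python
# Hypothetical feed of (cms, version) pairs with known vulnerabilities.
VULNERABLE = {
    ("wordpress", "4.2.1"),
    ("vbulletin", "4.2.2"),
}

# Hypothetical records of each hosted site's detected CMS and version.
SITES = [
    {"domain": "blog.example.com",  "cms": "wordpress", "version": "4.2.1"},
    {"domain": "forum.example.com", "cms": "vbulletin", "version": "5.1.9"},
]

def sites_needing_scan(sites):
    """Only queue a scan when the site's exact CMS version has a known vulnerability."""
    return [s["domain"] for s in sites if (s["cms"], s["version"]) in VULNERABLE]

print(sites_needing_scan(SITES))   # -> ['blog.example.com']
```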

This is a minimalistic and cost-effective approach, and debatable in a few aspects. I believe that many hosting providers will start thinking about security from within, because that is where the root of the problem lies.

Thank you for reading!

Dinesh Gunasekar | Tags: Hosting, Cloud