Three New Threats To Your Web Assets (and how to avoid them)
Introduction
Today’s Internet-connected business is facing a bewildering diversity of threats.
If your business has web assets, it is not a question of whether you will be attacked; it is merely a
question of when. A recent survey of data center operators showed that 94 percent experience at least
one Distributed Denial of Service (DDoS) attack per month, and 22 percent experience 11-100 attacks
per month.1
The costs associated with a successful attack are
substantial. According to a Ponemon Institute survey, the
average cost per data breach in the United States is $5.4
million. On average, an organization suffers a cost of
$277 for each customer record that is lost or stolen by a
malicious attack.2 Victims of DDoS attacks lose, on
average, $22,000 per minute of downtime. 3
Internet threats come in many forms, but three in
particular are growing rapidly worse: network intrusions,
DDoS attacks, and data scraping.
These threats are not new. What is new is the rapidly
escalating scale and sophistication of these attacks—and
the growing inadequacy of traditional defenses to protect
against them.
Executives need to understand:
• The nature of each threat.
• Why each threat is getting worse.
• Why traditional countermeasures have grown inadequate.
To fully protect against contemporary Internet threats, a robust next-generation solution is required. One
will be discussed at the end of this paper.
1 Worldwide Infrastructure Security Report 2012, Arbor Networks, p. 47
2 2013 Cost of Data Breach Study: United States, Symantec/Ponemon Institute, pp. 1, 3
3 Cyber Security on the Offense: A Study of IT Security Experts, Radware/Ponemon Institute, November 2012
Threat 1: System Intrusion
What It Is
A system intrusion is any event that involves malicious entry into your network. It includes a wide range of
attack techniques, including SQL injection, cross-site scripting, and more.
The motive for such intrusions can vary widely. Hackers often break into web servers merely to deface the
sites for political purposes. On the opposite side of the spectrum is the company recently described in a
speech by F.B.I. Executive Assistant Director Shawn Henry, which was the victim of an intrusion and “lost
10 years worth of research and development—valued at $1 billion—virtually overnight.” 4
Why It’s Getting Worse
As cybercrime has grown more lucrative, its practitioners have grown professional. (According to a
Washington Post article, organized crime now represents the majority of data breaches. 5) Today’s criminal
hackers are highly skilled, and often well financed. Their attacks are systematic and thorough.
There are three major forces driving the increasing technical sophistication of hacker groups: the growing presence of organized crime, the rise of government-sponsored groups, and new attack techniques introduced by national ‘weaponized software’ packages.
• Organized crime groups (especially in Eastern Europe) now treat computer crime as a profession, and a mature industry has arisen around criminal hacking. There are underground marketplaces where every possible resource or skill is available for hire. Botnets and other attack resources can be rented by the hour, day, week, or month. Hackers with specific attack skills offer their services as freelancers. Customized malware is available directly from its authors, and the malware itself is technologically advanced.

  Example of services offered on Russian hacker sites:

  Programming service (Perl, PHP, C, Java, etc.)               $250
  Trojan for bank account stealing                             $1,300
  Trojan for web page data replacement in a client’s browser   $850
  WebMoney Keeper Trojan                                       $450
  Credit card checker                                          $70
  Backdoor                                                     $400
  LiveJournal spammer                                          $70
  Fakes of different programs                                  $15–20
  Eleonore Exploit Pack v. 1.6.2                               $2,500–3,000

  Source: Russian Underground 101, Trend Micro Inc., 2012
• Government-sponsored cybercrime is the second force driving the increased professionalization of illicit hacking. China in particular has a large program devoted to plundering technology from the West. Media reports about foreign hacking tend to focus on high-profile weapons systems like the U.S. F-35 advanced fighter jet (the secrets of which were stolen by China), but the Chinese are equally interested in stealing from private businesses. According to a New York Times article, “They have stolen product blueprints, manufacturing plans, clinical trial results, pricing documents, negotiation strategies and other proprietary information” from dozens of major corporations, including Coca-Cola, RSA, Schneider Electric, and Lockheed Martin.6
4 www.fbi.gov/news/speeches/responding-to-the-cyber-threat
5 Brian Krebs, “Organized Crime Behind a Majority of Data Breaches,” Washington Post, April 15, 2009
6 David Sanger and Nicole Perlroth, “Hackers From China Resume Attacks on U.S. Targets,” The New York Times, May 19, 2013
• Weaponized software is the third driver of increasing malware sophistication. In the last few years, there have been several instances of governments attacking rival nations with software, such as the Stuxnet computer worm (which sabotaged Iran’s uranium-enrichment program), and other related applications.7 Many experts believe that these applications were supposed to delete themselves after their work was done, but for various reasons this failed to happen. In any case, these programs were discovered, and copies have now been distributed around the world.
As former U.S. cyberterrorism czar Richard Clarke explained:
“If you’re a computer whiz you can take it apart and you can say, ‘Oh, let’s change this over here,
let’s change that over there.’ Now I’ve got a really sophisticated weapon. So thousands of people
around the world have it and are playing with it... The best cyberweapon the United States has
ever developed, it then gave the world for free.” 8
These ‘weaponized software’ packages are very advanced, incorporating multiple attack techniques never seen before. Now they’re being studied closely and adapted by criminal gangs for use in private attacks against businesses. This is a very ominous trend.
Why Traditional Defenses Are Inadequate
Traditional countermeasures against intrusion usually involve on-site hardware and/or software solutions.
The problems with these are numerous:
• The wide variety of threats requires a multitude of products. It’s difficult to choose an optimal
combination that adequately defends against all possible threats. It’s also challenging for staff to
properly operate and maintain a diverse suite of software and hardware products from multiple
vendors.
• The threat environment is constantly evolving. Staying current on the newest forms of attacks is
mandatory to maintain adequate defenses, but organizational staff often find it difficult to do this.
• The solutions themselves can have vulnerabilities. Security devices and software packages are complicated, and are (by their nature) in constant direct contact with attacking hackers, who are always probing for weaknesses. Not infrequently, the hackers succeed in finding some. Vendors must then issue revisions, but these are not always released right away. Even when patches are issued promptly, individual organizations might not deploy them immediately, whether because of an inattentive IT staff, poor communication with the vendor, or even an executive decision to delay the updates (typically from reluctance to modify the web infrastructure during a busy, high-revenue period, such as holiday shopping seasons for e-commerce sites). Whatever the reason, as long as unpatched problems exist in your defenses, your web assets are vulnerable.
• Continuous education is required for on-site staff. As security vendors update their products,
organizational staff must receive continuous education to keep up with the new capabilities.
All considered, these deficiencies make robust security extremely difficult today.
7 Related applications include Flame, Gauss, and DuQu.
8 “Richard Clarke on Who Was Behind the Stuxnet Attack,” Smithsonian Magazine, April 2012.
Threat 2: Massive Distributed Denial of Service (DDoS) Attacks
What It Is
A DDoS attack is an attempt to make some or all of your web resources unavailable to their intended users.
This is the most straightforward form of Internet attack. The assailant is not trying to break into the
targeted system; he merely intends to overwhelm it and make it unresponsive to legitimate traffic.
Why It’s Getting Worse
The scale and sophistication of DDoS attacks are growing to unprecedented levels. According to eWeek,
the average size of the largest DDoS attacks in early 2013 increased sevenfold since the previous year.9
There are three primary reasons for the increasing scale and success of DDoS assaults:
• Hacking software and tactics have grown increasingly powerful. New tools like High Orbit Ion
Cannon can target up to 256 web addresses simultaneously, leveraging the attacker’s efforts.
Meanwhile, attacks themselves are getting more sophisticated; for example, an application-layer
DDoS can exhaust the target’s resources merely by exploiting logical weaknesses in its software,
rather than using the cruder traditional method of overwhelming the server with incoming network
traffic.
• Attack resources are abundant and cheap. A DDoS attacker will frequently use a botnet (a
network of malware-infected computers which can be controlled remotely) to flood the target with a
deluge of traffic from a variety of locations. In the past, hackers had to build their own botnets,
which required considerable time and resources. Today, botnets can be rented very cheaply. For
example, DDoS services are available from the Russian underground for just $10 per hour, or $150
per week. 10
• “Hacktivism” has made DDoS accessible even to non-hackers. Hacktivist groups such as Anonymous—hackers who mount cyber-assaults to promote a political or social cause—often consist of a few technically skilled individuals who recruit and supervise a much larger number of non-skilled attackers. To equip these attackers, hacktivist programmers have created simple but powerful software that enables a mob of ordinary Internet users, with no technical skill whatsoever, to overwhelm a target. In some cases, all an attacker needs to do is browse to a web page; the page then hits the target with 200 requests per second for as long as the attacker keeps it open in his browser.11 Thus, groups like Anonymous are no longer restrained by technical limitations; for hacktivists, mounting a large-scale assault has become more of a sociological exercise (i.e., how large a mob can be recruited via the Internet to browse to a specified page and keep it open for hours or even days).
DDoS attacks are cheaper and easier to mount than ever before, and they’re growing in ferocity and
sophistication.
9 “Top-End DDoS Attack Bandwidth Surges Sevenfold: Report,” www.eweek.com/security/top-end-ddos-attack-bandwidth-surges-sevenfold-report/
10 Russian Underground 101, Trend Micro Inc., 2012, p. 8
11 Imperva’s Hacker Intelligence Summary Report; The Anatomy of an Anonymous Attack, Imperva, 2012, p. 15
Why Current Defenses Are Inadequate
The most popular current solutions to DDoS threats can be sorted into two categories: on-site hardware
appliances and offsite services offered by vendors such as telcos and ISPs (Internet Service Providers).
Both are inadequate to fully protect an organization against a large-scale contemporary DDoS.
On-site hardware appliances can process incoming packets, to filter and discard illegitimate traffic. This
“bandwidth management” will provide some resistance, but not immunity, to DDoS attacks. Even if some
appliances can increase your effective bandwidth, your bandwidth will still have a limit, which can be
exceeded by a large-enough attack. Thus, even after investing in an expensive hardware platform, your
site can still be brought down.
Also, an on-site appliance only addresses the last stage of the problem. When a large DDoS attack
occurs against your site, the traffic flows through your upstream bandwidth provider, which must defend
itself against the onslaught. The easiest solution is often to ‘black hole’ (discard) all your traffic. Obviously,
your appliances will do no good if your service provider cuts them off from the Internet.
Lastly, hardware solutions can be challenging to deploy. They require a well-trained IT staff as operators,
who in turn require substantial education with frequent refresher training to stay current as the threat
environment evolves.
Off-site vendors such as telcos and ISPs avoid many of the downsides of onsite hardware solutions.
However, they have two major weaknesses of their own.
First, ISPs and telcos are in the business of selling bandwidth. Their staff personnel are generally not
DDoS specialists. (Indeed, for an ISP such specialized capabilities would be a cost center rather than a
profit center.) However, modern DDoS attacks are frequently very difficult to diagnose.
A skilled DDoS assailant will use techniques that masquerade as legitimate traffic. Although it might be
obvious that a massive DDoS is in progress, an ISP might still be helpless and unable to distinguish the
garbage traffic from the legitimate visitors.
For example, a Slowloris attack (named after the software tool written by security researcher Robert
Hansen) sends partial HTTP GET requests to a Web server, adding to these requests slowly over time,
but never completing them. Eventually, all available connections are occupied with Slowloris activity, and
the server becomes unavailable to any other traffic. A similar attack is SlowPost, which sends a very slow
series of HTTP POST requests to accomplish the same goal.
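The mechanics described above can be sketched in a few lines. This is an illustration only, not a working attack tool: it shows the byte-level trick (a request that is deliberately never completed), with no network code included.

```python
# Illustration of the Slowloris mechanics: a well-formed HTTP request must
# end with a blank line ("\r\n\r\n"). Slowloris never sends that terminator,
# so the server holds the connection open, waiting for a request that will
# never complete.

def partial_get(host: str, path: str = "/") -> bytes:
    """Build a request line and headers WITHOUT the final blank line."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"User-Agent: Mozilla/5.0\r\n"
    ).encode()

def trickle_header(n: int) -> bytes:
    """One harmless-looking extra header, sent every few seconds to reset
    the server's read timeout while still never finishing the request."""
    return f"X-a: {n}\r\n".encode()
```

Because every byte sent is valid HTTP, nothing in the traffic itself looks malformed; only the pacing and the missing terminator give the attack away, which is why it is so hard for non-specialists to filter.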
Slowloris and SlowPost are simple examples of a much broader trend: application-layer attacks that target vulnerabilities in how specific server applications work. Attacks like these are very difficult for non-specialist vendors to mitigate, because the requests being sent to the server appear to be legitimate. As a result, they're becoming more popular among attackers.
According to Gartner Research, application-based attacks like these are expected to rise to 25 percent of
all DDoS activity in 2013.12 Organizations which rely on ISPs and other non-specialist vendors for
protection will be increasingly vulnerable.
12 Arming Financial and E-Commerce Services Against Top 2013 Cyberthreats, Gartner Research, January 29, 2013
The second major weakness of relying on ISPs for protection is the ballooning scale of current DDoS
assaults. According to a report from Arbor Networks, more than 21 percent of all DDoS attacks now
range from 2,000 Mbps (Megabits per second) to 10,000 Mbps. 13 Compare this to the bandwidth of
some commonly-used Internet pipes:
Connection   Mbps     Connection   Mbps     Connection   Mbps
T1           1.5      OC3          155.5    OC48         2,488
T3           43.2     OC12         622.1    OC192        9,953
OC1          51.8     OC24         1,244    OC768        39,813
In other words, if your organization has any connection below OC48, and you are hit with a DDoS, there's
a one in five chance that the DDoS will exceed the capacity of your entire pipe.
Consider also that if the assailant is truly motivated, even the largest pipes can be overwhelmed. The
record so far is a March 2013 attack on Spamhaus.org, which exceeded 120 Gbps (120,000 Mbps).
According to Prolexic Technologies, more than 10 percent of DDoS attacks now exceed 60 Gbps
(60,000 Mbps). 14
Few organizations have enough available bandwidth to withstand these levels. Again, if such an attack
occurs, the ISP will often be forced to ‘solve’ it by null routing (discarding) all the traffic. This cuts the
organization off from the Internet—which is exactly what the attacker wants to accomplish.
The growing scale and sophistication of contemporary DDoS attacks require a new approach for
mitigation.
Threat 3: Scraping
What It Is
Scraping is the automated copying of information from a web site. Some forms of scraping are benevolent, such as Google’s indexing bots, which constantly crawl the web. Malicious scraping bots, by contrast, harvest information for purposes that are harmful to the interests of the web site’s owner.
Scraping can be tremendously costly to the victim. The damage can be as subtle as a long-term loss of
sales, as when competitors scrape your site and set their prices to undercut yours. Or it can be as blatant
as the theft and publishing of your information on other websites, which can make your site less
important or even irrelevant in your market.
Scrapers account for a substantial percentage of web traffic today. These bots usually masquerade as
benevolent traffic, such as Googlebot. (In one study, 16 percent of the surveyed websites had been
targeted by Google impersonation attacks. Among these sites, 21 percent of the bots claiming to be
Googlebot were imposters.15)
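One well-known way to unmask a Googlebot impersonator is the reverse-then-forward DNS check: resolve the client IP to a hostname, confirm the hostname falls under Google’s crawler domains, then resolve that hostname forward and confirm it maps back to the same IP. The sketch below assumes Python’s standard socket API and requires live DNS to run end-to-end.

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def claimed_host_is_google(hostname: str) -> bool:
    """Pure check: does a reverse-DNS name fall under Google's domains?"""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def is_real_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS verification (needs network access)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)      # reverse lookup
        if not claimed_host_is_google(hostname):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP,
        # otherwise the reverse record itself may be forged.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False
```

Note the suffix check matters: a bot at `googlebot.com.evil.net` claims the right name but fails the `.googlebot.com` suffix test, and a forged reverse record fails the forward confirmation.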
13 Q1 Key Findings from ATLAS, Arbor Networks, Inc., www.arbornetworks.com/corporate/blog/4855-q1-key-findings-from-atlas, April 22, 2013
14 Q1 2013 Global DDoS Attack Report, Prolexic Technologies,
www.prolexic.com/news-events-pr-giant-ddos-attacks-overwheliming-appliances-isps-carriers-content-delivery-networks-q1-2013-report.html, April 17, 2013
15 www.incapsula.com/the-incapsula-blog/item/369-was-that-really-a-google-bot-crawling-my-site
Why It’s Getting Worse
By their nature, scrapers are difficult to exclude, because the information they harvest is usually meant to
be accessible to public web traffic. Therefore, your site must be able to distinguish an actual human
visitor from a malevolent scraper bot, so that one can be granted access, while the other is not.
This has grown very problematic, as hackers have created bots that masquerade not just as benevolent bots, but as human visitors. These bots mimic genuine browser traffic by sending keystroke, mouse-movement, and mouse-click events to the web site.
Why Current Defenses Are Inadequate
Traditional defenses against scraping include challenges such as CAPTCHA16 tests embedded in the
site’s code: tests that humans can pass, but bots cannot. Unfortunately, despite the ubiquity of
CAPTCHAs on the Internet, hackers have cracked many of these schemes.
A Stanford University study found that 13 out of 15 CAPTCHA schemes used by popular web sites were vulnerable to automated attacks. The research team demonstrated a tool called Decaptcha, which can crack not only many text-based CAPTCHAs, but audio ones as well. Even prominent sites are vulnerable: for example, Decaptcha can break the CAPTCHAs for Authorize.net (part of the Visa credit card network) two-thirds of the time.17

[Image: A typical CAPTCHA test]
CAPTCHA vendors have responded by trying to make the tests harder for bots to read. Unfortunately,
this has resulted in tests that even humans have difficulty decoding. Web site owners who use these new
schemes risk alienating their users and even preventing people from using their web sites at all.
Ultimately, CAPTCHA tests and similar efforts are doomed to failure. In addition to automated tools,
scrapers and hackers always have the ultimate trump card: paying live humans to pass the tests.
Abusive Internet practices like scraping and (especially) spamming are so lucrative that a thriving market
has arisen for CAPTCHA-passing services, including brokers and middlemen. (The going rate is anywhere from $0.80 to $1.20 per 1,000 deciphered boxes.)18
The traditional defenses against scraping have failed. A new approach is needed.
16 “Completely Automated Public Turing test to tell Computers and Humans Apart,” a trademark of Carnegie Mellon University
17 Bursztein et al, “Text-based CAPTCHA Strengths and Weaknesses,” ACM Computer and Communication Security 2011. Available at
cdn.ly.tl/publications/text-based-captcha-strengths-and-weaknesses.pdf. See also news.stanford.edu/news/2011/may/captcha-security-flaw-052311.html
18 Vikas Bajaj, “Spammers Pay Others to Answer Security Tests,” The New York Times, April 25, 2010. Accessible at
www.nytimes.com/2010/04/26/technology/26captcha.html
A Next-Generation Solution to These Problems
As has been shown, the current generation of security solutions is inadequate to provide full protection.
Much of the problem originates in the typical network topology and data flow for a website:

[Diagram: all incoming traffic flows directly to the datacenter hosting the web site]
In this configuration, all traffic—whether it’s legitimate or malevolent—flows directly to the datacenter
hosting the web site. The site must then defend itself against whatever threats are present in that traffic.
Reblaze is a next-generation cloud-based platform that restructures the traffic flow like this:

[Diagram: traffic is routed through Reblaze’s worldwide security gateways before reaching the web site]
Here, traffic does not go directly to the web site. Instead, it is routed to Reblaze’s security gateways,
which are found around the world. Legitimate traffic is allowed to pass through the gateways to the web
site, while hostile traffic is blocked.
Reblaze blocks system intruders because:
• Dynamic address allocation hides the target’s network from the attackers.
• Would-be intruders must first make it through a Reblaze Security Gateway, a hardened portal
operated and monitored by a team of security experts. Even as attackers get more sophisticated
and new attack techniques are used, immediate countermeasures are deployed across the Reblaze
network.
• Protection is 24x7 and automatic. Reblaze’s goal is to make clients “secure by default.” On-site
organizational expertise is no longer needed. Nor is there a need for clients to keep on top of the
constantly changing threat environment on the web, because Reblaze’s experts do this already.
DDoS attacks are defeated because Reblaze provides:
• Instant bandwidth scaling. The cloud is used to absorb even the largest DDoS attacks, with
additional resources coming online as needed, instantly and automatically.
• Elastic load balancing, distributing attacks across Reblaze’s global clusters. This eliminates the
stress on both the targeted site and the site’s upstream provider, so that the provider has no reason
to throttle the traffic even during a large-scale assault.
• A team of specialists monitoring Reblaze’s gateways, detecting and defeating DDoS attacks as
they occur. Client organizations and businesses no longer need to maintain this expertise in-house,
nor do they need to worry about potentially unreliable support from telcos and ISPs.
• Automatic protection. Client organizations do not need to monitor their web assets 24/7, waiting
to respond to an attack. DDoS protection is continuously active—built into the design of Reblaze.
• Content Delivery Network (CDN) integration, providing the bandwidth of Amazon’s CloudFront to
client sites on-demand as needed. This is far larger than the bandwidth being offered to site owners
by telcos and ISPs.
• A private cloud network for each client. Every Reblaze client network operates in a secured
private environment. Thus, each client is unaffected by whatever attacks (DDoS or otherwise) other
clients might be experiencing.
Scraper bots and data thieves are excluded with:
• Next-generation bot identification algorithms, preventing malevolent bots from accessing
protected data.
• Advanced traffic analysis, using visitor behavior, challenges, and honeypots to detect even advanced bots powered by full-stack browsers such as WebKit, Chromium/V8, and the IE WebBrowser Control, which bypass traditional bot detection methods. (This also defends against automated penetration tools and attempted DDoS attacks.)
• Unique learning capabilities that allow Reblaze to identify and respond to scraping and other
threats even as they become more sophisticated.
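To illustrate the honeypot idea mentioned above: a site can expose trap URLs that are linked invisibly in the page markup (and disallowed in robots.txt), so no human visitor and no well-behaved crawler should ever request them. Any client that does is a bot. The path names and logic below are invented for illustration.

```python
# Hypothetical honeypot-trap sketch. Trap paths are never shown to humans;
# requesting one immediately identifies the client as a bot.

TRAP_PATHS = {"/internal/feed-export", "/internal/price-dump"}

def classify_request(path: str, flagged_ips: set, client_ip: str) -> str:
    """Flag any client that requests a trap path; serve others normally."""
    if path in TRAP_PATHS:
        flagged_ips.add(client_ip)   # remember this client as a bot
        return "blocked"
    if client_ip in flagged_ips:     # previously trapped clients stay blocked
        return "blocked"
    return "allowed"
```

A real platform combines signals like this with behavioral analysis rather than relying on any single trap, but the sketch shows why honeypots are cheap to operate and hard for a scraper to avoid.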
In addition to the above, Reblaze offers many more benefits that traditional approaches cannot provide,
including:
• Advanced human detection and behavior detection algorithms, to exclude non-human traffic.
This alone greatly increases security, because it forces would-be assailants to personally conduct
each attack—something few hackers are willing to do (since large-scale automated attacks are
much more profitable).
• Site acceleration, making client sites more responsive to visitors.
• Real-time email and text message alerts and notifications, with alert levels configurable by the
client.
• Unmatched fine-tuning and control of Internet traffic to each web site. Every Reblaze client can
allow or deny access from specific countries, cities, networks, companies, anonymizer networks,
and more. Clients can watch, control, and re-route their traffic—all in real time.
• An affordable all-in-one solution. Organizations no longer need to choose among multiple
vendors and the countless security products being offered. Reblaze does it all.
• Easy deployment, with no installation required. Reblaze protection for an entire domain is enabled
merely with a DNS change. Clients can even keep all their existing security measures in place, and
merely add Reblaze as an extra layer of protection between their assets and the Internet.
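As an illustration of the DNS-based deployment described above, a client might repoint its web hostname at a security gateway with a single record change. The IP address and gateway hostname below are hypothetical placeholders, not actual Reblaze endpoints:

```text
; Before: traffic goes straight to the origin server
www.example.com.   IN  A      203.0.113.10

; After: a single CNAME routes all web traffic through the security gateway
www.example.com.   IN  CNAME  example-com.gateway.reblaze.example.
```

Because only DNS changes, the origin infrastructure stays untouched, and reverting the record restores the original traffic flow.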
Traditional web security approaches are inadequate and are rapidly growing obsolete. Reblaze is a next-generation solution, not only for today’s problems, but also for tomorrow’s.
To learn more about protecting your web assets with Reblaze, visit reblaze.com
© All Rights Reserved