breachapalooza

We're halfway through 2011 and the breachapalooza* continues unabated!

Sony have been hit so many times, in fact, that there's a new term for it: "Sownage". Add to the ever-growing list senate.gov, Citibank, Honda Canada and the IMF.

Although it isn't really news to Security folk, the mainstream media has picked up on it (largely thanks to the scale of Sony's woes) and are continuing to report on the never ending tide of high profile defacements and smash-and-grabs. A quick look at datalossdb shows the number of incidents so far this year (322) is only slightly up on this time last year (300) and behind 2009 (376); while Sony's 77 million records lost is still well behind Heartland's 130 million back in 2008.

With mainstream media interest undoubtedly leading to increased interest in boardrooms, with executives asking "Can it happen to us?" and "What do we need to do to stop it happening to us?", the question has to be asked: are the actions of LulzSec good or bad for the industry? Patrick Gray ruffled a few feathers with his thought-provoking "Why we secretly love LulzSec":

LulzSec is running around pummelling some of the world's most powerful organisations into the ground... for laughs! For lulz! For shits and giggles! Surely that tells you what you need to know about computer security: there isn't any.
Which led to an equally interesting response from Adam over at the Newschool site.

I think the answer may be a little from column A and a little from column B. In Patrick's defence, he's probably right to some degree. Every security guy or gal who has ever been overruled or just plain ignored when explaining the need for better security testing, implementation, tools, monitoring, etc. probably has a little voice somewhere saying 'I told you so'.
Adam is right too when he says:
We’re being out-communicated by folks who can’t spell.
Why are we being out-communicated? Because we expect management to learn to understand us, rather than framing problems in terms that matter to them. We come in talking about 0days, whale pharts, cross-site request jacking and a whole alphabet soup of things whose impact to the business are so crystal clear obvious that they go without saying.

Although I would point out that sometimes even framing the problem in the right language to the right audience still doesn't result in the desired outcome. The old 'you can lead a horse to water, but you can't make him drink' problem exists if a mentality of 'it can't happen to us' rules. The only plus out of LulzSec's actions is that they may be breaking down some of that mentality.

However, the most disappointing, or possibly telling, thing is that, from what has been reported, very little of what LulzSec has accomplished has been particularly difficult or sophisticated. This is not really surprising, as it matches what Verizon revealed earlier in the year [pdf] when they reported that 92% of the breaches investigated were 'not particularly sophisticated'. SQL injection may be old school, but it's more popular than ever.
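To see why SQL injection endures, here's a minimal sketch (my own illustration, using Python's sqlite3 with a hypothetical users table, not anything from the breaches above) contrasting a query built by string concatenation with a parameterized one:

```python
import sqlite3

# Toy in-memory database to illustrate the attack (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(name, password):
    # SQL built by string formatting: attacker-controlled input
    # becomes part of the query text itself.
    query = ("SELECT COUNT(*) FROM users WHERE name = '%s' "
             "AND password = '%s'" % (name, password))
    return conn.execute(query).fetchone()[0] > 0

def login_safe(name, password):
    # Parameterized query: the driver treats the input strictly as data.
    query = "SELECT COUNT(*) FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone()[0] > 0

# The classic tautology payload bypasses the vulnerable check...
payload = "' OR '1'='1"
print(login_vulnerable("alice", payload))  # True - in with no password
# ...but not the parameterized one.
print(login_safe("alice", payload))        # False
```

The fix has been known for as long as the attack has existed, which is rather the point: parameterized queries are cheap, yet string-built SQL keeps shipping.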

In the meantime, Paul Ducklin from Sophos issued a challenge to the LulzSec group to use their skills, and their obvious spare time, to do something worthwhile like supporting Johnny Long's Hackers for Charity.

That may have to wait until after LulzSec are done warring with 4chan/anonymous, which at the very least may provide some relief to Sony and may give other companies a break.**


*just heard Patrick Gray's risky.biz podcast from last week call it the pwnpocalypse. Why didn't I think of that?

**Edit 18/6: or maybe they're not, as they're still exposing records.

Managing Geeks

Not specifically security-related, but a friend recently posted a link to an old (2009) article about managing IT staff that is one of the best I have read.


From personal and anecdotal experience it really hit the nail on the head in a few areas about how 'geeks' respond to authority and management and the currency of respect.

National Cyber Security Week

Next week is National Cyber Security Week here in Australia.

Click on the image to go to the Government site for details and great resources such as factsheets, security quizzes and a small business security assessment tool.

So-oh no-ny

Sony's woes continue: although they have restored the PSN network, they are being accused of still having plenty to do, with flaws in their password reset function and multiple vulnerabilities being discovered by researchers in their other websites.
Adding salt into the very public wound, an investigation into Sony's data protection measures by the UK Information Commissioner's Office mirrors the announced investigation by the Australian Privacy Commissioner. It will be interesting to see the findings.

Sony are learning the hard way a lesson that many other organizations should be heeding: computer networks are incredibly complex, and defending them is even more so. If your business is providing online services to a large customer base, security needs to be part of the culture of the company. It needs to be evaluated, implemented and questioned at every level, with every developer, DBA, sysadmin and network engineer taking responsibility for proactively securing their area, and every project manager and business manager understanding the importance of security and the potential damage of a significant breach. Maybe it's too much to ask...?

To my mind it is quite a surprise that Sony did not have a CISO, and unfortunate that it took such a major incident for them to appoint one. It seems it may have been the typical 'it can't happen to us' attitude that many managers and executives adopt.

Hopefully the major publicity surrounding this breach will lead other organizations to reassess their data security efforts.

Breach, breach, baby...

Data breaches are big news recently and it seems no-one is immune...

From Sony Online Entertainment's huge breach (and criticized response) to the Australian Government and the (slightly less recent) incredible embarrassment of Security Vendor RSA's breach and the Epsilon breach, which was largely publicized in Australia as the 'Dell Australia' breach.

Will the sheer number of high-profile data breaches provide some more motivation for businesses to employ better security safeguards and to demand vendors provide more secure products? Will they wake up the general populace to the importance of not using the same password for everything and not opening every attachment that promises dancing pigs?

I won't hold my breath, but I will cross my fingers and hope.

The always interesting Verizon Annual Data Breach report [pdf] is out for 2011 and is (as always) as interesting as it is depressing. A big upswing (+22%) in externally-sourced attacks and a shift in targets from financial institutions to hospitality and retail are interesting. That default or easily-guessable passwords remain the fourth-highest cause of breaches is depressing.
Download it as it is well worth a read.

Cloud Concerns

The cloud has arrived down under! Well at any rate it has registered on the radar (weather radar?) of our Government officials.

Last month the Defence Signals Directorate (DSD) issued a paper on Cloud Computing considerations [pdf] that aims to “assist agencies to perform a risk assessment to determine the viability of using cloud computing services.”

This came hot on the heels of the Federal Privacy Minister voicing his concerns with the compatibility of cloud services and the National Privacy Principles, and has been followed up by the Victorian Privacy Commissioner releasing a Cloud Computing information sheet to give "a brief overview of how the Information Privacy Act 2000 (Vic) applies to cloud computing technologies".

Personally, I'm happy to see privacy concerns are getting some serious consideration. I'm certainly not anti-cloud, in many ways it is very cool, but I don't want to see businesses running headlong into a potentially disastrous (security & privacy-wise) situation without giving the consequences due consideration. Firm cloud standards and Government guidelines (and industry guidelines, e.g. from ASIC) will go a long way to helping any move to the cloud be successful in the long run (again from a security & privacy perspective).
Assuming your cloud service is up and running, that is! (Sorry for the cheap shot, Amazon!)

Finally on privacy, this week is Privacy Awareness Week, so go and check your Facebook privacy settings (because they change pretty often!).

Be prepared!

Being horribly sick at home with a nasty chest infection has some small benefits - such as being able to catch up on some TV. I just finished watching 'The Egyptian Job'  which is a speculative recreation of the robbing of the tomb of pharaoh Amenemhat III.

Amenemhat III was one of the richest rulers of the Middle Kingdom and had a state-of-the-art pyramid protected by the best security of the time - blind passageways, dead-ends, massive immovable stone doors and a 45 ton slab of quartzite sealing the burial chamber.

But it was all for nought! Using stone tools, ingenuity and elbow grease, a determined group of thieves managed to dig a 100-metre tunnel, move several 15+ ton doors and crack through the 45-ton quartzite slab to pull off one of the richest heists in history.

So what's the lesson? The usual one: no matter how good your defences may seem, a truly determined attacker with time on his/her side will find a way through. Amenemhat III's pyramid used static defences, giant blocks that were 'set and forget'. If the graverobbers hadn't made off with his loot 3700 years ago, Egyptologist Flinders Petrie, with 'modern' tools and techniques, would have taken the lot in 1888.

So bearing in mind that one day your defences will fail, the next important step is to be properly prepared for that eventuality. In the aftermath of September 11, and more recently the Christchurch earthquake and Queensland cyclone Yasi, many businesses created or updated their Disaster Recovery and Business Continuity Plans.

While DR/BCP plans are important, such large scale disasters (or even smaller ones, such as your building catching fire) are relatively rare. A statistically more likely occurrence would be for a business to lose critical data - through either malicious or accidental means, or to suffer some other type of network breach such as a large scale virus outbreak or website defacement. But how many businesses have response plans in place to deal with these types of incidents?

Regardless of the business size, having some type of incident response plan to deal with these types of occurrences is a good idea. The very basics of clearly defining who needs to be notified internally (and has the authority to make decisions such as if a compromised critical system can be/should be shut down or if law enforcement needs to be informed) or under what circumstances external bodies must be informed (regulatory bodies or reporting the loss of PII data) is a solid starting point. Predefined statements for the media (or at least determining who is allowed to talk to the media) are also a good idea in case the breach is made public.

Identifying who has the skills to perform an investigation (internally or externally) and who has the budget authority to engage investigators (nothing is ever free!) is the next step, as it is far better to have this sort of thing defined well in advance, in calm circumstances, than to be making high-pressure decisions on the fly at 3am when a major data breach may or may not have occurred (or indeed still be in progress!).

Where investigations are handled internally, having adequately trained and resourced staff is essential - you can't just rely on your 'regular' I.T. staff or Information Security staff to be able to collect evidence and perform forensics without specialized training - and these skills need to be kept up to date through regular incident response drills that expose a sufficient number of staff to the response process (primary responders and backup team members so that a missing key team member doesn't derail the response process).

If a third party is to be used, ensure they have employees with sufficient skills to investigate and collect evidence - this is especially important if the incident ends up going to court - and preferably a proven track record of performing such investigations. Understand how long different types of investigations take and how much they're likely to cost - the cost of the investigation always has to be balanced against the damage of the incident.

Finally of course is being able to tell if an incident has occurred. Sometimes it is easy, but sometimes an organization may not know for months that its network and information systems have been compromised. Sometimes it may be a false positive and no incident may have occurred at all. Understanding what is 'normal' in your environment is critical - as is being able to quickly detect when something is not normal.
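As a toy illustration of that 'know your baseline' idea (my own sketch, not a recommendation for real monitoring tooling), even something as crude as flagging a metric that strays several standard deviations from its history is a start:

```python
import statistics

def is_anomalous(history, current, threshold=3.0):
    """Flag a reading that deviates from its historical baseline
    by more than `threshold` standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        # A perfectly flat history: any change at all is unusual.
        return current != mean
    return abs(current - mean) / stdev > threshold

# Hypothetical metric: failed logins per hour over recent history.
baseline = [12, 9, 14, 11, 10, 13, 12, 8, 11, 10]
print(is_anomalous(baseline, 11))   # False - within the normal range
print(is_anomalous(baseline, 250))  # True - time to investigate
```

Real environments need far richer baselines than a single metric, of course, but the principle is the same: you can't spot 'abnormal' until you've measured 'normal'.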
