I'm currently studying Digital Forensics, and a recent bit of Google-inspired research led me to one of the big stories of late last year (which I vaguely remembered): a Microsoft forensic tool designed for use by law enforcement, called COFEE (Computer Online Forensic Evidence Extractor), was leaked on the internet.

Given the prevalence of computer-based crime and the level of skill required to perform proper forensic analysis, it makes sense for Microsoft (or someone else) to develop a simple-to-use wrapper for what was apparently a number of common forensic tools already available elsewhere on the internet.

The reaction to the leak seems to have been mixed, ranging from Microsoft claiming they weren't bothered by the release of the software (while noting it is licensed for use by law enforcement only) to someone developing a counter-forensic tool called (of course...) DECAF. What was the thinking in creating this counter to COFEE? One of the developers said:

"We saw Microsoft released COFEE and that it got leaked, and we checked it out," the man said. "And just like any kid's first day at the fair, when you walk up to that cotton-candy machine and it smells so good and you see it, it's all fluffy – just so good. You get up there and you grab it and you bite into it, it's nothing in your mouth.

"That's the same thing we did with COFEE. So, knowing that and knowing that forensics is a pretty important factor, and that a lot of other pretty good forensic tools are getting overlooked, we decided to put a stop to COFEE."

This argument seems fairly disingenuous, as COFEE was hardly aimed at replacing any existing tools, but simply at making them easier for a less-well-trained law enforcement operator to use in order to gather crucial forensic evidence. The fact that the tool was released by Microsoft probably had more to do with the creation of a counter-tool than any noble thoughts of 'better tools being overlooked'.

No matter what the task, there is almost always a 'better tool', whose use might not be desirable because of cost, complexity or the expert knowledge required to operate it. Much of the history of software innovation has been about making complex tasks easier so more people can perform them, Windows being the prime example as it took desktop computers from the realm of geeky hobbyists to mainstream use in businesses and homes. While simplifying (or, as some may call it, 'dumbing down') tasks may grate on the nerves of some, it is an inevitable and, in many ways, desirable end goal.

Following the Road Rules

It struck me this evening while driving home that there is a nice analogy to be made between information security and road safety. All that maintains our roads in the organised state of chaos that they are in, rather than total anarchy, is a set of conventions that ensure that we drive on the left (in Australia), stop at stop signs and give way to the right at roundabouts.

I would imagine, though I have nothing to back this up with, that a large proportion of car accidents happen in situations where it is unclear what is expected of the driver. As a case in point, on my drive home there is a place where two lanes merge into one, but nothing indicates which lane is ending. This lack of direction causes the occasional irritated honk of the horn or shake of the fist from drivers who believe they have been wronged and, if it hasn't happened already, at some stage a minor collision is inevitable.

The same applies to information security. Whether browsing the internet, opening an email from an unknown source or disposing of sensitive documents, where a well-known course of action exists the decision is easy; it is when users are presented with the unfamiliar that trouble strikes (scammers are well aware of this and use the familiar to make targets feel comfortable). Ensuring that users know the correct course of action requires an ongoing education program coupled with a strong set of policies to guide them.

I have this picture in my head of the users of a network, be it a corporate network or the internet, as drivers in vehicles of all different sorts, some in Abrams tanks, others on mopeds (the ones in the Abrams are likely Mac users blindly driving around opening files without regard to the consequences).

Other parallels exist too, particularly in corporate networks where user activity is much more heavily regulated, notably the use of incentives, both positive and negative, to ensure compliance with the rules. When drivers don't comply with the regulations they may be fined, and if caught infringing enough times may lose privileges or be compelled to take remedial training. In much the same way, users of a corporate network may be more inclined to comply with and contribute to information security endeavours where it is assessed as part of their job performance and tied back to bonuses, pay increases and advancement within the company. A points system similar to that used with Australian drivers' licences may actually work quite well to identify users requiring remedial training. More on incentives in a later post.
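The demerit-points idea above could be sketched very simply in code. Everything here is hypothetical: the class name, the point values and the threshold of 12 (borrowed from the licence analogy) are illustrative assumptions, not a real system.

```python
# Hypothetical demerit-points tracker for security infringements,
# loosely modelled on the points system used for Australian
# drivers' licences. Names, point values and the threshold are
# illustrative only.

REMEDIAL_THRESHOLD = 12  # assumed cut-off, like a licence suspension


class UserRecord:
    def __init__(self, name):
        self.name = name
        self.points = 0

    def record_infringement(self, points):
        """Add demerit points for a policy breach, e.g. clicking a
        phishing link or leaving a workstation unlocked."""
        self.points += points

    def needs_remedial_training(self):
        # Once accumulated points reach the threshold, flag the
        # user for remedial security awareness training.
        return self.points >= REMEDIAL_THRESHOLD


user = UserRecord("alice")
user.record_infringement(5)   # e.g. opened a suspicious attachment
user.record_infringement(8)   # e.g. shared a password
print(user.needs_remedial_training())  # True: 13 points >= 12
```

The appeal of such a scheme is the same as on the roads: infringements are recorded consistently rather than punished ad hoc, and the threshold makes the consequence predictable.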

Some credit for the ideas in this post has to be given to the paper I am currently reading from the Internet Security Alliance (ISA) and the American National Standards Institute (ANSI).

Photocopier peril

Affinity Health in the US has had to notify approximately 400,000 customers and staff of a potential data breach. A firm suffering a data breach? "Nothing new there!" you say.

In this case, though, the method by which the data was lost is a little more unusual (as was the method of discovery). You see, CBS was investigating the ticking "digital time bomb" of office photocopiers and purchased four copiers. Upon removing the hard drives and running a forensic tool over them, they found confidential police data on two machines, construction plans and payroll data on a third, and on the fourth, patient information from Affinity Health.

A quick search on datalossdb shows a few entries for fax machine breaches (mostly by sending a fax to the wrong number), but only one entry for copiers - the Affinity Health breach.

The CBS article asks, "Has the industry informed the general public of the potential risks involved with a copier?", to which the President of Sharp Imaging says "yes".

They do point out that all the major manufacturers offer 'encryption options' or security packages, but provide no information on what percentage of buyers are willing to pay the extra dollars.

Here's a thought - include it by default! Make it impossible to buy a digital photocopier without encryption or secure deletion!

I think it was in the Mitnick book "Stealing the Network" (or perhaps it was in "The Art of Intrusion") that a hacker stealthily entered a network and took control of a digital copier.

In the meantime, what does your organization do with its old copiers when the lease ends or they reach end-of-life?

Security Incidents in Australia & New Zealand

One of the difficulties of working in the infosec space in Australia can be the lack of region-specific information available. I blogged recently about a Ponemon Institute study that was Australian-based, and have recently discovered that Chris Gatford had started maintaining a record of security incidents in Australia and New Zealand.

This is a nice addition to some of the existing resources available, such as DataLossDB (which records all different kinds of data loss) and sites that keep a good record of website defacements.

The enemy of my enemy is my.....enemy?

Oh McAfee what have you done? Last week McAfee released an update for their antivirus software that crippled Windows XP SP3 machines. This is not the first time McAfee have had this problem, having crippled machines last year with a bad update as well.

Of course, the 'bad guys' immediately jumped on the bandwagon as well, flooding Google with links to scareware sites promising to fix the problem.

What to do? Well I'm not here to bash McAfee (they have enough angry customers right now to do that), and all the big vendors make mistakes, but this does expose a serious problem in the quality control of another big AV vendor.

Last year I sat through a presentation by McAfee where they talked about the massive rise in malware and viruses, a comment that was echoed by Symantec in a presentation around the same time. The Sophos 2010 Security Threat Report [pdf] states that "Sophos’s global network of labs received around 50,000 new malware samples every day during 2009".

Combine that with the constant need to beat the competitors to market with the latest protection and it's no wonder a mistake like McAfee's recent one was made. It seems almost inevitable it will happen again.

But what can be done to protect your servers and desktops? Do AV updates need to be treated like patches and be run through a testing regime before deployment? Is this even feasible in an era of daily (or multiple times daily) signature updates?

I'm no developer and not in the AV business, but it would seem to me having a 'whitelist' of known good items (such as critical windows components) might be a way to stop something like this occurring again...

Government and Google

I came across this very cool tool from Google while perusing the Tech section of SMH Online today. It gives a breakdown of the number of requests from governments around the world asking Google to remove content, either from search results or from sites controlled by Google, and to hand over information about users of Google's products.

A nice bit of transparency from Google, with data that I doubt many governments would be as forthcoming with.

Escaping Documents

This goes squarely in the 'oh dear' category and comes only months after this lapse. It appears government bodies need to be a little more mindful of where they are putting sensitive information...


Just wanted to shout out a congratulations to fellow Security Circus blogger Richard on graduating (with distinction!) from his Master of Information Systems Security degree this week.

Great work Richard!

Airport Security Antics

Not strictly information security, but certainly pertaining to organisational security culture: a story ran today that just makes me sad... or is that mad? Or both?

A security gate at Dubbo Airport has been found with its access PIN printed out and stuck above the keypad.

According to the article, Government officials will review security at Dubbo airport next week. I wonder what else they'll find?

Something this blatantly idiotic is a sign of a generally poor (or non-existent?) security culture. Sure, you may have one 'helpful' person who decides to post the PIN (along with the helpful "please touch pad softly" message), but for others using the gate not to step in and remove the sticker is a worrying sign. Some more of those airport security dollars may need to be spent on basic staff security awareness, and less on security theatre like confiscating nail clippers but not cigarette lighters...

Rewarding Failure?

Richard pointed out to me a great little blog post entitled "Does Software Development Have A Culture Of Rewarding Failure".

The post asks why those who bring home projects over budget and over time with a huge flurry of last minute effort seem to be more rewarded than those who get it all done on time and on budget.

Unfortunately it is not only in software development that this type of behaviour occurs; it permeates many other industries and business environments. But why is this so?

Is it simply everyone loves a hero, the underdog, fighting against incredible odds to achieve the near-impossible?
They certainly are more visible, appearing incredibly dedicated, sacrificing their evenings and weekends as they struggle to complete that big project on time (or at all) as opposed to the other team who 'easily' got it all done on time and within budget.
The author makes the point "...everyone expected the project to go well and when it did, no-one was surprised, everything went according to plan, why would anyone reward or even acknowledge it when things go according to plan?"

It reminds me a little of the old Y2K bug (remember that one?). Lots of people working very hard to ensure nothing went wrong. And when nothing did go wrong (ie: success!) the question was asked by some management: "Geez what did we spend all that time and effort for? Nothing happened!"

Information security is in a similar boat. Money and resources allocated to security projects can seem wasted when, well, nothing happens! Which of course in many cases was the point of the expenditure: to stop the bad thing from occurring.

I'm reading Nassim Nicholas Taleb's excellent book 'The Black Swan: The Impact of the Highly Improbable'* at the moment, which discusses (amongst many things) our cognitive bias towards narratives. We like a story, a bit of colour, and this can affect our rational view of the facts. In regard to the current topic, consider the following:

  • The Project finished on time.
  • The Project finished on time because we all worked 7 days a week, 16 hours a day for the last two weeks to meet the deadline.
Which statement seems more likely? I'd wager that, from the gut, for most people it is the second one.

There can also be a mindset of "if you're not running around in crisis mode at crunch time, then you must have budgeted too much time to start with!". We value effort, and in 'deadline crisis mode' the effort is more visible.
Some of this may also be the result of a vicious circle created by 'rewarding failure' in the past: in people's experience, the projects that arrive with a big bang and a flurry of 'crunch time' activity to meet the deadline are the ones most valued (i.e. rewarded).
Never mind the hidden costs of the project deadline death-march, such as cut corners resulting in quality and security problems to be addressed 'sometime' down the track.

This whole topic brings to mind an old Dilbert comic about an employee getting an award for working overtime and weekends fixing the mistakes he caused in the first place.

I can only agree with Alan Skorkin on this one when he states "I for one would love to see a little bit more appreciation from everyone for projects where things go according to plan and especially for the people on those projects...rather than celebrating the belated delivery of the latest death-march, how about digging into it and trying to figure out why it was 6 months late and why people had to work 80 hour weeks to keep it from complete disaster".

*If you're involved at all in looking at (or trying to second-guess!) future events or trends - like many Information Security professionals - I highly recommend Mr Taleb's book.

Cost of a Data Breach

The Australian has reported that the Ponemon Institute has released a report on the Cost of a Data Breach based on data from the Australian market.

For those of us 'down under' it is great to see some reporting based on local conditions, rather than the usual reports from the US and Europe. Unfortunately the report is based on only 16 completed responses from the 114 companies asked to participate; however, I see it as a good start that I hope will continue.
