What are information security rules for, if not for breaking? If nobody breaks your rules, then how will you know the true impact of such a violation?

It is naive to think that the user community is sitting idly by and following the rules and policies. You can be sure they are busy finding ways to get around your controls, and often they are forced to do so to become more productive.

I often hear IT departments being referred to as Dr. No, and this doesn't have to be the case.

Both technology and user sophistication are growing exponentially, and the trick is to find a way to harness this energy for the benefit of the company.

Obviously, these developments are creating information security risks at the same exponential rate, and this is the tension you need to manage.

An obvious technique is to use pilots. I see, in the not-so-distant future, a realistic demand from the user base to bring their own equipment on a large scale.

To truly understand what this means and how we can deal with it, I bought a Mac laptop and brought it into the office and simply asked my team to make it work.

I asked them to imagine this was brought in by a new employee, and that part of that person's contract was the use of their own equipment.

How would we make that work? What is it about our policies which would truly not allow such behavior? What can we change in our policy and controls to support such a trend?

This experience has given us the knowledge of what this could really mean for us, and we have established additional credibility by being ahead of the game.

Sometimes, going through a formal pilot is simply not enough. When I took this role in my company three years ago, one of the first things I did was have an iPhone purchased in Italy – as at that time it was one of the few places to get a simlock-free version.

I brought it into the office, put it on the table, and explained to my teams that they needed to figure out how to make this work in a secure way. It was a risk, as the answer could have been that it's not possible, or the internal politics of me owning the information security policy and openly breaking it could have turned out badly.

Fortunately, through this process we were able to put more focus on the controls and policies required to support iPhones and other devices than was standard at the time.

Oddly, it was also a test of bring-your-own, although the attraction of making this iPhone work was enough that nobody realised that we were breaking another rule at the same time.

A side effect of finding a secure (enough) way to support iPhones was that I was subsequently able to give key members of top management iPhones for them to use, and it was these people who later openly supported a project proposal for mobile applications.

It wasn't a favour for the gift of the device, it was because these people truly understood the potential of these devices for our business because they had been using them.

Your rules and controls are obsolete the moment they are published, and if you don't provide a mechanism for them to be broken, the users will move on without you.

Whether this is a formal pilot or a disruptive challenge, it might as well be you challenging your own status quo.

I believe the benefits of this far outweigh the costs. A conservative benefit is that you will be able to point the users to a pilot as an explanation of why they need to wait temporarily for something to be supported, or you will have the applied knowledge of why something simply can't be allowed at this time.

A personal benefit, which should not be underestimated, is that such behaviour positions you as a thought leader in emerging technologies, and your executives are looking for people to trust during all this time of change. It might as well be you they trust, before someone else breaks your rules.

Chris Parker is SVP and CIO of LeasePlan

He is speaking on The Consumerisation Of IT - The Dawn Of The B.Y.O. Business at Infosecurity Europe from 19th – 21st April at Earl's Court, London
