Last year I asked a colleague the rhetorical question "how long before we see someone talk about 'Cloud 2.0'?" The sad thing was, I checked online – and it turned out that one vendor was already doing it. Now there are more.

Just in case you didn't already know, the technology industry is a kind of fashion industry – but it's one where (unlike the case in clothing) you can't easily recycle elements of your wardrobe every year to make space for new "on trend" items. If your partner got you some hot new knits for Christmas, you can pass on your old clobber to a charity shop quicker than retail guru Mary Portas can say "fast fashion". If you take on a new approach to software architecture or a new middleware platform, you can't just throw away the old stuff. As I'm probably overly fond of saying, "in IT, nothing ever dies".

Of course this steady accretion of stuff – some in the mould of evergreen classics, others with the lasting power of denim dungarees – brings a number of obvious problems to do with how you control the cost of managing an increasingly diverse and complicated portfolio of technologies and platforms. But there's a more subtle outcome that every CIO needs to be aware of as they seek to navigate a path through the choppy waters of IT investment and portfolio management: increasingly, no investment decision can be made without understanding the broader context of how, why and where the resulting asset(s) will get used. This accretion means that – putting it very simplistically – assets bump into each other and overlap more and more, and unintended consequences become a more frequent occurrence once investments are made.

The importance of understanding investment context might seem like motherhood and apple pie, but it's a problem that's made more pressing because by and large, the IT supplier community and those who analyse that community have a dangerous habit of seeing every technology category or concept in isolation.

One example: the evolving discussion around Cloud Computing. Most of what I've read on the subject implies that Cloud Computing has to be an 'all or nothing' proposition, even though in practical terms that idea is frankly naive. Here are a couple of other examples: software development languages and platforms (you're either focused on Java, or .NET) and methodologies (you're either focused on Agile, or 'waterfall' style).


In the real world, it's clear that enterprises' opportunities and challenges aren't very often about individual, fenced-off areas of technology or competence. Today's CIOs and IT architects live in a world of 'and', not a world of 'or' – individual technology investment and strategy choices have to be made in the context of a bigger picture.

And the context that needs to be considered isn't just technology context – it's also the context of business use. I recently engaged in a debate about business process improvement and solution architecture with various representatives from one client. The CTO was adamant that IT architecture practice was all about setting a group of standards which should always be adhered to: the role of the IT architect, in this IT leader's mind, was to act as a kind of police force with one rigid set of laws against which to work. But the truth was that his business colleagues had very widely varying contexts of IT use they wanted to apply: in some business areas, it made sense to work with very strict standards, and moreover to treat the cost competitiveness of the resulting systems as the most important factor to consider when investing. In a couple of specific business areas, though, the polar opposite was much closer to the ideal: cost was much less of a factor; freedom to acquire, integrate and potentially dispose of new technologies at pace – even when the result wasn't 'architecturally pure' – was what was really needed.


This CTO didn't realise that different contexts of business use for technology mean you have to be prepared to apply different investment and implementation rules in support of different business capabilities and domains. Here, the 'world of and' plays out again: success in the real world isn't about the pursuit of one approach to investment and architecture (cost-driven, flexibility-driven, or whatever) over any other approach. Multiple approaches are likely to be valuable, working side by side.

Vendors' propensity for a kind of tunnel vision is perhaps understandable. When explaining the workings and value of their offerings to the market, vendors have to make generalisations about their prospects' environments. After all, any marketing team which tried to cover all the nuances of potential customer situations would quickly end up with overly complicated stories that nobody would want to hear. But there's a difference between making a set of assumptions about what organisations' existing environments and constraints might look like, and trying to side-step important debate by 'framing' the way that a technology or concept is positioned so that alternative approaches are painted out of the picture. Many industry analysts and pundits – the people who should be helping to show how different perspectives of the world fit together – make the problem worse by overly compartmentalising the way they look at and analyse technologies.

In other words, if you don't work to maintain a big picture of investment context and use that informs your evaluations of new technologies and approaches, it might not be wise to assume that anyone else will do it for you. The 'world of and' is here to stay, but unfortunately there are still many whose 'world of or' view can – perversely – make life difficult by making things look too easy.

What's your view? I'd love to hear your thoughts.