Sunday, 30 December 2007

Here’s to a more balanced 2008

While everyone seems to be busy making predictions about hot technologies, revolutionary industry developments, and various tipping points being reached in 2008, I can’t help hoping that we see a bit more balance emerging in views and opinions over the coming 12 months. It’s probably wishful thinking given that more extreme and/or disruptive ideas are used as a lever for selling everything from hardware and software to management consulting and analyst research, but it would be nice to see us getting away from bandwagons, magic bullets and the simplistic 'single track' thinking that often accompanies them.

Of course that’s not to say that interesting things aren’t happening, and we can look forward to some important trends and developments continuing to unfold in the coming year, such as the ongoing move towards more virtualised, dynamic and service oriented infrastructures, the gradual evolution of sourcing and outsourcing options, and the awakening of more enterprises to the potential of social computing. The only real seismic shifts we are likely to see, however, are in marketing collateral, analyst reports and the media.

So, while many around us are ‘bigging up’ SaaS, cloud computing, open source software, Web 2.0, and so on, we will continue to do what Freeform Dynamics has always done - examine all of the ideas and propositions in a practical, down-to-earth and objective manner, and provide insights and advice for those working in the real and complex world of ‘brown field’ IT and business.

And with this focus, the ‘how?’ is just as important as the ‘what?’ and the ‘why?’, so our emphasis on community research, tapping into the experience of practitioners as well as strategists, will remain a big part of what we do going forward. During 2007, we gathered over 45,000 responses from IT and business professionals in our research studies. Our analysts therefore really do have a good in-depth understanding of what’s going on out there, and it is a position we fully intend to maintain.

Let me finish by saying a big thank you to everyone who has supported Freeform Dynamics since it was founded two years ago, and by wishing all of our subscribers, readers, clients, partners, friends and anyone else who knows us a happy, harmonious and ‘balanced’ 2008.

Saturday, 8 December 2007

Managing signal to noise

A couple of months ago, I decided to get stuck in a bit more to the whole social media thing, as a few conversations with others who were much more active than me had planted the seed in my mind that I might be missing out on something. Those who know me will realise that this wasn’t so much me getting involved in social media for the first time, as I have been a producer and consumer of blogs for a couple of years now. It was more a case of stepping up a level.

Anyway, I made a real effort to go through the blog rolls of the 20 or so blogs to which I was already subscribed, took recommendations on interesting wikis, and signed up for a bunch more feeds. I also decided to explore the extreme real-time end of social media, and signed up to Twitter.

Fast forwarding to this weekend, I have just deleted my Twitter account and got rid of most of the RSS feeds I had added as part of the exercise.

Why?

Well, two reasons. Firstly, I just couldn’t keep up with everything. I struggle to stay on top of my incoming email already, so having too many other streams to monitor and sort through just means more time away from the family and ‘real life’ and/or more chance of missing something important. This last point leads me on to the second reason for paring things back again – the signal to noise ratio got considerably worse as I expanded my subscriptions beyond the hand-picked sources I had already been using.

One of the particular challenges I encountered was that so many bloggers and Twitterers out there are clearly on a mission or pushing a specific agenda. Nothing wrong with that in principle provided you take what you read with a pinch of salt, and I personally find it interesting and useful to understand the range of views that exists. Unless you are on the same mission, though, such sources quickly become very boring. There are only so many ways of making the case for ODF, for example, and a daily stream of evangelism thereafter is really just noise to most people.

However, with the exception of Twitter, which I struggled to see the point of, I did actually get some benefit from exploring things a bit more widely. I now have a list of blogs and wikis that might not have a high enough level of genuinely new insights to subscribe to on an ongoing basis, but do represent sources to browse from time to time to keep up to speed in certain areas or provide input for research. The difference is that it will be me going to them rather than them coming to me from this point onwards – which is pretty much the way I have been using the Web for the last decade.

So, while I remain a big fan and active user of social media, I have discovered that to me it is the content being exchanged that matters more than the act of communicating itself. Perhaps that makes me relatively ‘unsociable’ in the online sense, but when it’s the socialising that takes precedence, it is only natural that the signal to noise ratio deteriorates.

Again, nothing inherently wrong with this, but just like all those ‘put the world to rights’ conversations in pubs, small talk and one-upmanship competitions at parties, and so on, activities that are primarily about social interaction should not be confused with the production or exchange of useful information. Somewhere in between lies the ‘conversation around the water cooler’ that forms an important part of keeping people informed and tuned in, and there are blogs out there that encapsulate this spirit and are therefore very worthwhile subscribing to (e.g. monkchips). Most of the other feeds I am left with are concerned with blogs and wikis that explore issues and debates in an objective, informed and thought provoking manner, with a high level of original content - but these are harder to find than I think many social media advocates like to admit.

At the end of the day, it’s all about how you spend your time, so the trick is to find the optimum balance between continuous incoming streams and sources of information that are useful but better accessed on more of an ‘on demand’ basis. The next stop for me on my social media adventure is therefore tagging and bookmarking.
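For what it’s worth, one crude way of tipping the balance back in your favour is to pull selected feeds on demand and filter them by topic, rather than watching everything arrive in real time. Below is a minimal sketch of that idea using the Python feedparser library; the feed URLs and keywords are hypothetical placeholders rather than a recommendation.

```python
# A rough sketch: pull a few feeds on demand and keep only the items
# that mention topics of interest, discarding the rest as 'noise'.
# Feed URLs and keywords are illustrative placeholders.
import feedparser

FEEDS = [
    "http://example.com/analyst-blog/rss",
    "http://example.org/industry-news/atom",
]
KEYWORDS = ["virtualisation", "saas", "unified communications"]

def interesting_items(feed_url, keywords):
    """Yield (title, link) pairs whose title or summary mentions a keyword."""
    feed = feedparser.parse(feed_url)
    for entry in feed.entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if any(keyword in text for keyword in keywords):
            yield entry.get("title", "(untitled)"), entry.get("link", "")

if __name__ == "__main__":
    for url in FEEDS:
        for title, link in interesting_items(url, KEYWORDS):
            print(f"{title}\n  {link}")
```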

Sunday, 2 December 2007

Avaya crosses the line

This is not going to be an in-depth post. I just wanted to put on record that I was very impressed with a lot of what I heard during the Avaya industry analyst conference a couple of weeks ago.

It was a pretty big gathering, with analysts from across the world rubbing shoulders with each other. I love events like this, as while we here at Freeform are continuously researching the European and North American markets, it is great to talk with people who have in-depth knowledge of thrusting economies like India and China.

With so many analysts in one place, it also brought home the myriad of different styles, approaches and areas of coverage that exist within the research community. I guess it will be no surprise that with Avaya’s heritage, the majority of the delegates were specialists in the communications industry, and I lost count of the number of conversations I had on the nitty gritty of the telephony market that left me way behind.

So why was I impressed?

Well, I am a bit of a hybrid when it comes to coverage in that I think of myself as a business and IT analyst primarily, but with a reasonable working knowledge of how the communications part of the equation touches this world. This is very relevant to the Avaya discussion as one of the big topics of the conference was Unified Communications (UC). I don’t want to dwell on this specifically as Robin Bloor, who was also at the event, has already written a pretty good treatment of the topic, but the main point is that UC represents the clearest business and application level cross-over between the traditional IT and telephony spaces outside of the call centre environment that we have seen to date, and Avaya seems to ‘get’ what’s important to be successful once you cross over the old dividing line. The understanding is multi-dimensional too, i.e. Avaya is thinking as much about partnerships, IT related architectures and standards, and business process enhancement in the broader application sense as about simply neat functionality.

If you are an Avaya customer, I would encourage you to catch up with the firm’s latest developments in unified comms and 'Communications Enabled Business Processes' (CEBP), as ways of bridging the gap between domains that are still considered separate by many.
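To make the CEBP idea a little more concrete, the sketch below shows the general shape of the thing: a business process detects an exception and, rather than just logging it, asks a communications platform to reach whoever is on call. The endpoint, credentials and payload are entirely hypothetical, invented for illustration, and not an Avaya API.

```python
# Illustrative only: a business process raises an exception and asks a
# (hypothetical) communications platform to reach the on-call person.
# The endpoint and payload format are invented for this sketch.
import json
import urllib.request

COMMS_API = "https://comms-platform.example.com/api/notify"  # hypothetical

def notify_on_call(process_name, severity, message):
    """Hand an escalation over to the communications layer (voice, IM or SMS)."""
    payload = {
        "process": process_name,
        "severity": severity,
        "message": message,
        "escalation": ["voice", "im", "sms"],  # try channels in this order
    }
    request = urllib.request.Request(
        COMMS_API,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# e.g. called from an order management system when a shipment is stuck:
# notify_on_call("order-fulfilment", "high", "Shipment 4711 blocked at customs")
```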

I am going to resist saying much more at this stage as Jon Collins and I will be spending some time in a week or so with the most visible player in the unified comms space, Cisco, and one of the objectives we have is to bring ourselves completely up to date with its ideas and developments with regard to IT/comms convergence. I’ll also have to track down the guys at my old firm Nortel, as there have been some interesting developments coming out of that camp in recent times too, and it is a while since I have caught up with them properly.

Looking at the bigger picture, the coming together of communications and IT at the application and process as well as the network level is a significant development which represents opportunities for both suppliers and customers. But it is obviously not just the traditional comms players that are moving into this area – IT incumbents such as Microsoft and IBM are also very active (see here and here) – they are just coming at it from a different direction. You’ll therefore be seeing us spending a lot of time on this topic in 2008.

Meanwhile, it is nice to see Avaya, backed by its new found private equity arrangement, starting to cross the line into the world of IT so convincingly.

Sunday, 11 November 2007

Context and social media

One of the RSS feeds I subscribe to recently threw up a post that provided a link to a YouTube video of an Analyst Relations (AR) professional talking about their job and why they like it. There was no explanation, just a link straight to the clip. Presented in this way, it looked a bit silly, and my first reaction was to ask why on earth the person concerned had published it.

Then it occurred to me that the video was probably made as part of some internal “who’s who” thing or perhaps in a personal capacity to give friends a little insight into what the person did for a living. Whatever the reason, once the clip had lost its original context, it was difficult to know how to take it.

Now in this particular instance, there is probably no harm done, but it got me thinking about people using YouTube and other content hosting sites as essentially a convenient way of storing and retrieving media for embedding in another site. While it’s great to be able to do this, there is a risk that the content may be interpreted and perceived differently when accessed directly or, indeed, via someone else’s site where the content or a link to it is embedded in an entirely different context.

It’s similar to the problem we face as researchers. We have to be very careful when we report the results of our primary research studies to include commentary relating to constraints or restrictions, which, if ignored, could lead to a statistic being taken out of context and spun to mean something that is not supported by the study as a whole. As an aside, this is why we retain copyright of all of our output, even though we make much of it available free of charge and allow anyone to copy it and pass it on. If we placed it into the public domain in an unrestricted manner, it could easily be taken apart and elements presented out of context in the kind of misleading manner we have been discussing.

Zooming out a little, this general issue of maintaining or understanding context highlights the need from an individual perspective to make sure we think before putting something out there that could be picked up in isolation or re-used by someone else for a purpose other than that which was originally intended. Wherever possible, we therefore need to make sure that media objects either contain important context within them or have an explicit reference back to the original source – e.g. your website address.
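As a trivial illustration of the point, if you are generating embed snippets for your own media, it costs almost nothing to build the caption and the link back to the original context into the snippet itself. The sketch below does just that; the helper function, URLs and titles are hypothetical.

```python
# A minimal sketch: generate an embed snippet that carries its own context,
# i.e. a caption and a link back to the page the media originally belonged to.
# The function name, URLs and titles are hypothetical.

def embed_with_context(media_url, source_url, source_title, caption):
    """Return an HTML fragment that keeps the media tied to its origin."""
    return (
        f'<figure>\n'
        f'  <video src="{media_url}" controls></video>\n'
        f'  <figcaption>{caption} (originally published at '
        f'<a href="{source_url}">{source_title}</a>)</figcaption>\n'
        f'</figure>'
    )

print(embed_with_context(
    media_url="http://example.com/media/ar-interview.mp4",
    source_url="http://example.com/blog/why-i-like-analyst-relations",
    source_title="Why I like working in AR",
    caption="Short interview recorded for an internal 'who's who' feature",
))
```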

When we are more in information consumption mode, there is then a need to pay attention to the provenance of content, particularly when looking at a site, page or post that has been assembled by pulling together material from different sources. Personally, I try to track down the original source as much as possible when I look at a quotation, statistic, or even a picture or video, before relying on something I have discovered on the Web. It is so easy to be misled if you are not careful.

Perhaps this all sounds very obvious, but as someone who, like other analysts, has a job that involves gathering, comparing and making sense of intelligence and viewpoints from many different sources, it never ceases to amaze me how often information is misrepresented, either deliberately or unintentionally, by taking it out of context.

With its emphasis on user generated content (UGC) coupled with the absence of editorial processes and other safeguards by design, social media just increases the risk of being caught out. Forethought and vigilance are therefore the watchwords when producing and consuming information in this brave new free-for-all Web 2.0 world.

Monday, 5 November 2007

Dissecting SaaS

I sometimes think I am living in a parallel universe when I get into conversations about Software as a Service (SaaS). People keep talking to me as if there is some kind of seismic shift taking place in the way organisations are acquiring and running software. Then I look around me and down at the results of study after study of buying patterns and investment plans that we carry out here at Freeform Dynamics and all I can see is the continued gradual creep of the price per user per month hosted model that has been taking place in a steady but non-dramatic manner for the best part of a decade now.

When I ask what it is that people are basing their evidence on, they point to solutions such as salesforce.com and Google Office, then at the number of column inches and marketing dollars being spent telling us that SaaS is the future. And yet, beyond salesforce.com finding an opening for a hosted service around a niche application that is largely stand alone and sold into green field environments (at least from a sales force automation perspective), the evidence for the revolution is pretty elusive.

Now before any of you SaaS evangelists write me off as a grumpy old sceptic, I must point out that I am a big SaaS fan, provided you approach it sensibly. Indeed, I have championed SaaS for internal use in both of the companies I have had a hand in building – bet my businesses on it, if you like. It is my firm opinion that there can be little justification for any small business to run email servers and the like in house.

But, I am also a realist, and the evidence I can actually have confidence in tells me that I am unusual in my acceptance of the SaaS model in a business context. For every organisation that says it has SaaS on the agenda, there are about seven saying they don’t. And those that are going down the SaaS route are mostly doing so very selectively – they are not looking at a complete shift in philosophy or approach as some would have us believe.

So, SaaS is definitely a trend and this way of delivering solutions will increasingly find its place in the mix, but, in keeping with the title of this blog, let’s keep it grounded and be realistic about the rate of change that is taking place. Just because vendors say it is exploding, doesn’t make it true.

Putting all of the SaaS mania to one side, though, the individual elements of the typical SaaS proposition are actually quite appealing to many. Paying for software on a subscription basis rather than forking out up front for a perpetual licence can help with both cash flow and the optimisation of accounts (subscriptions can be conveniently categorised as an operational cost). Having your applications hosted on someone else’s servers can be beneficial too, especially if this allows you to take advantage of robust and scalable platform technology that you would otherwise not have access to. Finally, of course, having someone manage your environment for you means not having to worry about the distraction, cost and risk of maintaining the necessary resources and practices in house – IT is, after all, a non-core activity to most businesses.
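To illustrate the cash flow point with some deliberately made-up numbers, a quick back-of-the-envelope comparison of subscription versus perpetual licensing might look something like the sketch below. The figures are hypothetical and ignore discounting, support contracts and the like; the only point is that the shapes of the two spending curves differ.

```python
# Back-of-the-envelope comparison of cumulative spend: perpetual licence
# plus annual maintenance versus a per-user-per-month subscription.
# All figures are invented purely for illustration.

USERS = 50
PERPETUAL_LICENCE_PER_USER = 300      # one-off cost, paid in year 1
ANNUAL_MAINTENANCE_RATE = 0.20        # 20% of licence cost per year
SUBSCRIPTION_PER_USER_PER_MONTH = 10

def cumulative_perpetual(years):
    licence = USERS * PERPETUAL_LICENCE_PER_USER
    maintenance = licence * ANNUAL_MAINTENANCE_RATE * years
    return licence + maintenance

def cumulative_subscription(years):
    return USERS * SUBSCRIPTION_PER_USER_PER_MONTH * 12 * years

for year in range(1, 6):
    print(f"Year {year}: perpetual {cumulative_perpetual(year):>8,.0f}  "
          f"subscription {cumulative_subscription(year):>8,.0f}")
```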

The point is, though, that you don’t need to do all of these things at once. If it is the subscription approach that appeals, you can take advantage of this without having another party manage and/or host your applications. If it is getting rid of the hassle and overhead of looking after systems then there are lots of firms willing to provide a managed service, regardless of where your software and hardware resides and who owns them. Such services, just like traditional hosting models, have been around for years and are nothing new.

So, my advice to anyone trying to figure out where SaaS fits into their IT strategy is to look at the components of the proposition individually in the context of a specific requirement. If all three elements, the subscription approach, hosting model and managed services, seem relevant and attractive then SaaS is worth looking at, but if only one or two out of the three appeal, then look for the products, services and/or commercial terms that fit your requirement. The bottom line is you don’t have to drink the Kool Aid and commit your soul to the church of SaaS in order to benefit from any of these things.

Sunday, 28 October 2007

IP PBX: A natural Linux workload?

Every now and again, I read something or hear something quoted in the media that just doesn’t ring true. The latest was an assertion that voice over IP (VoIP) and IP Telephony (IPT) solutions are fine for larger organisations but are not ready for smaller businesses, with a suggestion, even, that lots of small businesses are putting VoIP/IPT solutions in place then ripping them out a few months later for reliability reasons. Such scare-mongering is great for generating headlines, but is misleading and can easily put people off looking at VoIP/IPT who would otherwise gain significant business benefit from it.

This was a prompt for us to conduct a community research study to get to the bottom of what is really going on out there in VoIP user land. I am still crunching the numbers based on feedback from about 1,500 organisations with experience of VoIP/IPT, and will write up and publish the results over the next few weeks. Suffice it to say for now, though, that VoIP is alive and well in the SMB sector where satisfaction with quality of service, functionality and especially overall return on investment is actually higher among smaller organisations than their larger cousins. So don’t be put off if you are looking at VoIP for your business – it isn’t perfect, but stories of widespread disaster are wildly exaggerated.

Watch this space for more details of the research, including thoughts from participants on what to look out for, what to avoid, and how best to move forward to maximise the chances of success.

Meanwhile, something really interesting came out of the freeform anecdotal feedback gathered during the study that was a bit of a wakeup call for me, namely the popularity of the open source IP PBX solution Asterisk. Now I don't want to create the impression that it has the same penetration as solutions and services from commercial market leaders, but it does seem to be filling an important niche for low cost but highly functional IP PBXs among the more tech-savvy contingent. Here are a few comments that are typical of the feedback on Asterisk we have received:

“We use Asterisk PBX, running on refurbished hardware, and using a Sangoma A100 to terminate an ISDN30e line. Phones are Atcom AT-530, using SIP. This was the only way the charity project could afford a PBX on the funding available.”

“Using Asterisk really does give back benefits in terms of not being tied to one hardware manufacturer for phones. It’s a system that will do most if not anything you ask of it. Now looking at global deployment.”

“We use Asterisk as our IP PBX running on custom hardware. VoIP itself is an excellent solution, with Asterisk being the best of the bunch.”

“By using Asterisk we don't have to pay extra licensing just to have a redundant backup.”

“It's the great functionality that we love - e.g. a single DDI per employee, no matter where they are. We use an Asterisk server and have 6 staff connected to the server using Nokia E61/E70 phones.”

“We are a Microsoft Windows consulting firm, but have found Asterisk to be the killer app that has us using and promoting Linux.”

This last comment is particularly pertinent at the moment given Microsoft’s current campaign to drive VoIP solutions into the market, though I suspect MS is targeting quite a different audience.

Anyway, thought I would share this discovery of a Linux workload that I have not seen discussed that often before (though it may be that I just wasn’t looking). It is also interesting to identify another credible open source solution that appears to have genuine appeal to smaller businesses, at least those with an IT department capable of setting up and managing their own IP PBX solution. Asterisk won’t, of course, appeal to or even be accessible to everyone (skill sets, IT bandwidth, and so on) but those who use it are generally doing so very successfully.
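For those who haven’t come across it, part of Asterisk’s appeal to the tech-savvy crowd is that the whole thing is scriptable. As a flavour of that, here is a minimal sketch that talks to Asterisk’s Manager Interface (AMI) over a socket to originate a call; the host, credentials, channel and extension are placeholders that you would need to substitute for your own setup, so treat it as an illustration rather than a working configuration.

```python
# A rough sketch of driving Asterisk via its Manager Interface (AMI):
# log in over TCP and ask the PBX to originate a call. The host, credentials,
# channel and extension below are placeholders.
import socket

AMI_HOST = "pbx.example.local"   # placeholder
AMI_PORT = 5038                  # default AMI port
AMI_USER = "manager"             # placeholder
AMI_SECRET = "secret"            # placeholder

def send_action(sock, **fields):
    """Send one AMI action as CRLF-separated key/value pairs."""
    message = "".join(f"{key}: {value}\r\n" for key, value in fields.items())
    sock.sendall((message + "\r\n").encode("ascii"))

with socket.create_connection((AMI_HOST, AMI_PORT)) as sock:
    send_action(sock, Action="Login", Username=AMI_USER, Secret=AMI_SECRET)
    # Ring SIP extension 100 and, once answered, connect it to extension 200
    send_action(
        sock,
        Action="Originate",
        Channel="SIP/100",
        Context="internal",
        Exten="200",
        Priority="1",
    )
    print(sock.recv(4096).decode("ascii", errors="replace"))
```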

Thursday, 18 October 2007

Thoughts on Software IP and Patents

A journalist contacted me the other day to ask what I thought about the story that broke earlier in the week about IP Innovation suing Red Hat and Novell over alleged patent infringements associated with the Linux operating system.

My comments on this were along the following lines:

This kind of action was an inevitable development, even though it goes against the grain for many. While most would regard the existence of legal intermediaries who profit from such actions as distasteful, especially when it appears to undermine the positive efforts of the open source community, that community cannot resource the world’s software development requirements. The basic right for commercial organisations to protect the fruits of their investment in research and development therefore remains important to ensure continued innovation across the industry.

That said, while I cannot comment on the specific case at hand, a general problem exists in that there is a continuum of ‘originality’ when it comes to inventions. Some patents are clearly ‘right and proper’, but issues arise in greyer areas, particularly when someone is the first to invent and patent something that others would, or indeed have, subsequently come up with independently because it is an obvious or natural way of solving a problem. The patent review process does not always capture these.

There are also undoubtedly some patents granted many years ago that would probably not be approved today as the world has moved on, and by modern standards, they just don’t seem that special. Such patents are actually counter-productive in that they act as a constraint on innovation in a way that common sense says is unjustified.

Against this background, the cat and mouse games around patents that are being played at the moment are an unwelcome distraction. If vendors are blatantly using or encouraging patent related actions simply as a competitive spoiling tactic, or using the threat of action, whether implicit or explicit, to perpetuate fear, uncertainty and doubt, their customers should speak out against them. That is not the way a good software partner should behave.

It is such a complex area though, and taking an extreme stance either one way or the other doesn’t really help. We need a balanced approach and an appropriate review process to ensure that patents reflecting genuine investment in innovation are respected, but scatter-gun or speculative registrations do not stand in the way of progress.

Tuesday, 16 October 2007

BEA and Oracle: 0.5 + 0.7 = 0.8?

The problem with takeover bids is that once made, things can never be the same again for the ‘target’. At this point, I guess we cannot be absolutely sure that the proposed acquisition of BEA by Oracle will go ahead, or whether other potential buyers will enter the game as James is encouraging with his mischievous egging on of SAP.

Whatever the outcome, the shame is that BEA is now going to find it hard to shake off the cloud of uncertainty. I say ‘shame’ because despite my disagreement with some of the naïve views expressed around packaged applications, I quite liked what I heard in terms of core strategy at BEA World a couple of weeks ago when the ‘Genesis’ story was presented. BEA seemed to have a good solid view of what its customers needed in terms of an SOA based middleware ‘fabric’ that is generally agnostic of specific technologies and applications. Even though Genesis was presented as a work in progress, things seemed to be heading in the right direction.

Meanwhile, as Neil points out, Oracle has had some gaps in its own middleware portfolio being pulled together under the ‘Fusion’ banner. We also know from recent Freeform Dynamics research (which we’ll be publishing soon) that, contrary to the way Oracle often spins the numbers, Fusion middleware adoption is pretty much exclusively aligned to Oracle application incumbency, i.e. there is very little penetration into organisations that do not use Oracle EBS, PeopleSoft, JD Edwards, etc. In this respect, there is little difference in the position of Oracle versus its main application rival, SAP, whose NetWeaver offering is similarly aligned to application incumbency.

So what happens when we put all this together?

Well, according to my own admittedly very subjective metric, I reckon I would put BEA at 0.7 on a scale of 0 to 1, where 1 would indicate an ideal set of open enabling ‘middleware’ solutions to form the linchpin of a future-proof corporate IT infrastructure. I don’t think BEA would argue too much with this – the guys there articulated some ambitious plans but acknowledged that there was still much work to be done.

Turning to Oracle, I think Neil is right when he highlights the solution gaps, but would also call out the challenges Oracle has been having in being taken seriously as an ‘independent’ vendor in this space, given the application alignment we have seen – hence I would put Oracle at 0.5 on our notional scale.

While bringing BEA into the mix would round out the Oracle offering, there is a corresponding risk that it would also undermine BEA’s positioning as a genuinely independent option, especially given Oracle’s almost rabid competitive stance against SAP. There is then, of course, the obvious redundancy between the two portfolios that will need to be resolved in one way or another. Oracle seems to have got away with the ‘Apps Unlimited’ strategy based on maintaining multiple packaged application code lines, but ‘Middleware Unlimited’ would be stretching the concept beyond the realms of credibility – as well as stretching Oracle’s ability to manage an ever more fragmented R&D effort.

So, the acquisition arithmetic is probably something like 0.5 + 0.7 = 0.8. Don’t take this too literally; I am just trying to make the point that while there might be some net overall goodness generated if the acquisition proceeds, the end result is not going to be the answer to everyone’s prayers.

Meanwhile, the most obvious beneficiary in all this is IBM, who can just sit there smiling as the one remaining genuinely independent middleware gorilla, unless, of course, you include Microsoft in the picture, but that’s another story.

Friday, 5 October 2007

BEA: The death of packaged applications revisited?

A couple of us spent some time recently at BEA World in Barcelona. My colleague David Tebbutt has summarised some general thoughts on the overall strategy and direction that was outlined under the “Project Genesis” initiative that you can read over here.

There was an issue raised in the keynote speech and echoed in later sessions that I wanted to pick up specifically, however, as it is another example of a vendor spinning an issue to suit its messaging in a way that can easily mislead if taken at face value.

THE SPIN

BEA has been repositioning itself in recent times from a deliverer of ‘big iron’ infrastructure for building and running business critical Java applications to the custodian of infrastructure enabled business agility and flexibility. Moves into portal technology, social computing and business process management through a combination of acquisitions and in house R&D have all been part of this, and the introduction of the word “liquid” into a lot of its branding has reinforced the positioning.

At this level, BEA is right on the money. Our research has indicated repeatedly that businesses often feel constrained by IT’s inability to respond quickly enough to changing demands – so no arguments there. However, BEA is taking this one step further and questioning the value of packaged applications to support this ‘new world’ of adaptable, service oriented and people/process centric IT. The words are carefully crafted with phrases such as “the days of being able to innovate through packaged applications are over”, but there is a clear objective to sideline the relevance of packaged applications as we look to the future.

It’s another example of the line we hear from other vendors and, indeed, some analysts, which argues that SOA and increasing expectations around the need for IT flexibility spells the death of the application software package. Great though this is for generating headlines or (in the case of BEA) promoting the ‘build’ side of the traditional ‘build versus buy’ argument, there are a few other things we should probably consider before throwing out our ERP systems and other ‘packages’.

THE REALITY

Let’s start with some very obvious stuff. If you examine the way any business works, you will find that the majority of its business processes are ‘non-differentiating’. What we mean by this is that while you need to be efficient and effective in these areas to manage costs and risks, you are unlikely to compete any more effectively in the market by inventing new ways of doing them. Examples include the vast majority of the accounting and administration that takes place in the average business, and for most to whom they are relevant, things like inventory management, manufacturing planning and execution, human resource management, logistics, and so on are non-differentiating too. Sure, there are exceptions such as Dell, which gains significant competitive advantage from the way it manages its supply chain, manufacturing and logistics activities, but if we look across industries as a whole, most of the business processes we see are of the non-differentiating kind, for which it makes sense to simply adopt industry best practice rather than reinventing the wheel for the sake of it.

So let’s be blunt – if you are not using packaged software for non-differentiating business processes then you are mad. Even if you could build a better general ledger or accounts receivable system than SAP or Oracle, you would not actually have gained anything through doing so in business terms. Of course, the chances are that whatever you came up with would not actually be as good as a package solution that has been tuned over the years in line with industry best practice and the requirements of thousands of customers, so the reality is that you would probably be worse off.

Having said this, BEA and others make the argument that traditional packaged applications are relatively closed and monolithic in nature, which is a problem when integrating them into the overall landscape, and when you need to change the processes they support. Even non-differentiating processes need to be modified from time to time for efficiency purposes or to accommodate changes in business structure, merger and acquisition activity, new regulatory requirements, and so on. The argument continues that all of those investments made in the 90’s to put ERP and other packages into place have inadvertently locked down business processes in a way that is constraining by modern standards.

Let's put aside the configurability of most packages for a minute and accept this argument for the purposes of discussion. The question then becomes, if you have an old monolithic package that is holding you back, what should you do about it?

It is at this point that the SOA extremists pipe up with the purist line about packages becoming irrelevant in the future as organisations compose their solutions to meet the exact needs of the business by selecting and plugging together the optimum mix of best of breed software components and services. The problem is, that’s a bit like saying that because the majority of components that make up a PC are now standard and can be plugged together easily, there is no longer a need for pre-built machines. It is basically nonsense. Whether you are a business looking to buy or rent software functionality, a consumer looking for a PC, or someone in the market for a new car, you are likely to want a product that has been assembled, integration tested and delivered as a working unit, with assurances that it can be maintained and supported as such.

The move to assembly from standards-based components and the surfacing of standards-based interfaces have benefits in all contexts, however, and in acknowledgement of this, pretty much all mainstream application package vendors are moving in the direction of SOA. Will they get there overnight? Certainly not, but both SAP and Oracle, for example, are investing huge sums in re-architecting their solutions, which is something the purists often fail to acknowledge or dismiss as just ‘marketing’. The work is real, though, and the aim is to allow standards based ‘services’ to be exposed for easier integration and for selective substitution where it makes sense. Just like you buy an off the shelf PC and perhaps upgrade the graphics card, SOA will allow you to take an ERP package, for example, and ‘upgrade’, say, one of the advanced planning components with a third party alternative.
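The graphics card analogy translates fairly directly into code. A sketch of the idea is shown below, with all names invented for illustration rather than taken from any real product: as long as the package exposes its planning function behind a standard service interface, the built-in implementation and a third party alternative become interchangeable.

```python
# Illustrative sketch of selective substitution behind a service interface.
# The interface and implementations are invented names, not a real product API.
from abc import ABC, abstractmethod

class PlanningService(ABC):
    """The 'standard interface' a suite might expose for advanced planning."""

    @abstractmethod
    def plan_replenishment(self, sku: str, demand_forecast: list[int]) -> int:
        """Return the quantity to reorder for a given SKU."""

class BuiltInPlanner(PlanningService):
    """The planning module that ships with the package."""
    def plan_replenishment(self, sku, demand_forecast):
        return max(demand_forecast)          # naive: cover peak demand

class ThirdPartyPlanner(PlanningService):
    """A best-of-breed replacement plugged in via the same interface."""
    def plan_replenishment(self, sku, demand_forecast):
        return round(sum(demand_forecast) / len(demand_forecast) * 1.2)

def run_mrp(planner: PlanningService):
    # The rest of the suite neither knows nor cares which planner is wired in.
    return planner.plan_replenishment("SKU-42", [80, 95, 110, 90])

print(run_mrp(BuiltInPlanner()), run_mrp(ThirdPartyPlanner()))
```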

So, what we need are flexible packages rather than an abandonment of the package concept altogether.

CONCLUSION

The story we heard from BEA this week is a very strong one, though as David points out, not all of the pieces are in place yet. When listening to the pitch around the need for more IT infrastructure flexibility that can support the ever increasing rapidity of business change, however, it is important to remember that there is still a lot of relatively routine and boring stuff that will remain best dealt with through prescriptive packaged solutions that encapsulate industry best practice. Furthermore, off the shelf packages or services (if SaaS catches on) are increasingly going to be based on flexible architectures anyway, and the trend away from customisation towards ‘soft configuration’, so that packages may be tailored to specific needs without code-cutting, is already well underway. The ERP and CRM packages of 2007, for example, are inherently more malleable than those of the mid-90’s.
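A simple way to picture the ‘soft configuration’ point is that behaviour which would once have meant modifying the package’s code is increasingly expressed as data the package reads at run time. The sketch below is a generic illustration of that idea, with rule names and thresholds made up for the purpose; it is not how any particular ERP or CRM product actually does it.

```python
# Generic illustration of 'soft configuration': tailoring behaviour through
# data rather than code changes. Rule names and thresholds are invented.

APPROVAL_RULES = [
    # (maximum order value, role that must approve)
    (1_000,   "team_leader"),
    (10_000,  "department_head"),
    (100_000, "finance_director"),
]

def required_approver(order_value):
    """Pick the approver from configuration; changing policy means editing
    the table above (or a config screen), not cutting new code."""
    for threshold, role in APPROVAL_RULES:
        if order_value <= threshold:
            return role
    return "board"

for value in (500, 7_500, 250_000):
    print(value, "->", required_approver(value))
```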

In this respect, I would probably on balance disagree with BEA that ‘software package enabled business innovation’ is dead, though I would agree that IT departments should probably be shifting their attention to using the latest infrastructure, tools, techniques and ideas of the kind BEA is promoting to support the ‘differentiating’ elements of the business in as flexible and high impact a way as possible. And in some situations, it may not be sensible to even model processes at all, let alone lock them down in software.

The world and technology landscape is definitely changing, but let’s not assume that embracing new ideas always means abandoning the old – or that the traditional stuff is standing still.

Monday, 1 October 2007

Does OpenOffice.org matter to the mainstream?

I was clicking around on one of the news sites a few days ago and came across a reference to a service portfolio called 'Open Office' that mobile operator Orange launched in the UK earlier this month. It's all about remote working, but it's not the Orange service per se that made me think.

Having been doing a lot of research in the desktop computing space recently, I immediately thought of the open source alternative to Microsoft Office, OpenOffice.org, when I read reports of the Orange announcement. It was then that I realised that the prominence of the term ‘Open Office’ in my mind was almost certainly unrepresentatively high compared to the mainstream. Orange is a branding and marketing savvy outfit, and my guess is that it would not have chosen this term if it felt the term already had a strong enough meaning and association out there to confuse its target audience.

The truth is, of course, that it probably doesn’t. We in the IT profession are more likely to be familiar with it because of the ongoing debate about Microsoft dominance coupled with the noise made by open source evangelists to promote alternatives. But out in the mainstream proper, business people just get on with using Microsoft Office, blissfully unaware of all this – and, in fact, not really that bothered about finding an alternative at all.

We came across some pretty clear evidence of this back in April when we conducted some research in association with The Register news site. You would think that if anything, this would have been biased towards open source advocates (given the nature of the site), but even here, the message was that use of and interest in MS Office alternatives, whether open source or online (e.g. Google Office), is currently very limited outside of a few niches, with much more sentiment of the kind “if it ain’t broke don’t fix it” than anything else.

Whether you think this is good, bad or just doesn’t matter at all is immaterial; that is how it is.

From a personal point of view I am torn. While I strongly endorse the idea of competition, I am also a pragmatist, and common sense (and feedback from a few thousand respondents during research over the past few years) tells me that the last thing we need from a business productivity and communication point of view is to fragment the installed base of desktop office tools in a way that introduces compatibility issues. The key issue here is file formats and how they are handled, rather than the capability of the tools themselves (assuming they do the job to the level that is required).

On that note, we are currently having fun at Freeform Dynamics with MS Office 2007 Open XML files, which are causing some confusion when we inadvertently send one outside of the company. I would imagine anyone adopting ODF would run into the same kind of issues. The difference between the two, of course, is that the evidence suggests that MS Office 2007 will reach critical mass in the not too distant future, pulling Open XML adoption with it, whereas the drivers for ODF adoption are much less clear.

Meanwhile, if you have an interest in alternatives to MS Office, whether OpenOffice.org or anything else, common sense says you need to check out whether the most common file formats used by the majority are well handled. At the moment, this means Microsoft Office binary formats (*.doc, *.ppt, etc), but will increasingly mean Open XML too.
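If you want a quick sanity check before experimenting with alternatives, it is easy enough to take an inventory of what document formats are actually sitting on your shared drives. A rough sketch is below; the path is a placeholder and the extension list is far from exhaustive.

```python
# Rough inventory of office document formats on a file share, as a sanity
# check before changing desktop tools. The share path is a placeholder.
import os
from collections import Counter

SHARE_ROOT = r"\\fileserver\shared"   # placeholder path

OFFICE_EXTENSIONS = {
    ".doc", ".xls", ".ppt",           # Microsoft Office binary formats
    ".docx", ".xlsx", ".pptx",        # Open XML (MS Office 2007)
    ".odt", ".ods", ".odp",           # OpenDocument (ODF)
}

counts = Counter()
for dirpath, _dirnames, filenames in os.walk(SHARE_ROOT):
    for name in filenames:
        ext = os.path.splitext(name)[1].lower()
        if ext in OFFICE_EXTENSIONS:
            counts[ext] += 1

for ext, count in counts.most_common():
    print(f"{ext:6} {count}")
```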

Right now, ODF looks like a bit of a red herring in the mainstream, especially if you consider the prospect of Open XML being embraced as a standard, but there are some big advocates looking to promote it (such as IBM) so there is a chance that this may change, and it is something we are keeping an eye on.

The bottom line, though, is that most IT departments at the moment probably have more pressing priorities than disrupting the MS desktop status quo, which is unlikely to go down well with users anyway.

Despite the level of apathy we are picking up, I am going to keep nosing around this whole area, as there are some segments of the market that struggle to justify the licensing fees associated with Microsoft Office tools, e.g. smaller businesses with relatively simple requirements, and my feeling is that there may well be some niches for alternative solutions opening up as time goes on.