Keeping IT Grounded (Terminated)
This blog has been terminated. Please go to www.openreasoning.com
Dale Vile


BLOG TERMINATED (please go to www.openreasoning.com)
30 November 2008

Hi Everyone

I have made the decision that maintaining two blogs was a bit ambitious, so I have decided to close this one down and focus on the <a href="http://www.openreasoning.com/">Open Reasoning</a> blog instead, so please go there for ongoing posts.

You may also be interested in my more formal output in the form of reports and media articles, which can be found <a href="http://www.freeformdynamics.com/analyst.asp?searchfor=Dale%20Vile">here</a>.

Cheers
Dale


iPhone: First impressions of a BlackBerry user
20 August 2008

For a while now, I have maintained two mobile phones – one for business use and one for personal use – and in terms of requirements, I need different things in each context.

There are four ‘must haves’ for my business device – good battery life, quick and convenient calling from a large directory of contacts, a solid, immediate and user-friendly email capability, and acceptable security.

As a long-time BlackBerry user, three of these have been ‘givens’ for the past five years, the only compromise in the early days being relatively clunky functionality for making and receiving calls.
The BlackBerry Curve I am using at the moment, though, delivers well on this front too, so all of my needs for business use are catered for effectively.

On the personal side of the equation, my requirements are a bit different. From a calling perspective, I tend to be dialling from a much shorter list of contacts – tens rather than hundreds – and telephony use in general is much lighter. As a small business owner, there is still a requirement to access business email (as you never know what might need your direct attention), so connectivity to Microsoft Exchange and security are still important. Immediacy and user-friendliness of email functionality are less so, however – these just need to be good enough to allow periodic inbox browsing and very occasional replies.

Battery life is an interesting one. When using a device off duty, if I see the juice is getting a little low, I can curtail my usage and prolong the life left in the device. This is generally not an option for business use, given the communication-intensive nature of the job I do.

Beyond the communication stuff, there is also the recreational side of things – music, games, photography and perhaps a little web browsing. This brings me to the iPhone: when I was looking for an upgrade to my personal device a few weeks ago before going on holiday, I felt obliged to check out this option.

Like most people who pick up an iPhone for the first time, it immediately felt quite natural, and it is the first device I have used that appeared to deliver a genuinely usable full web browsing experience – at least when connected to WiFi in the O2 store. When digging a little deeper, Microsoft Exchange access seemed pretty well covered, the device was PIN-securable with remote wipe capability, and the embedded iTunes, GPS-enabled mapping, etc. looked great.
The only thing that seemed a little naff was the camera spec, though I figured it was probably good enough for snapshots of the kids, dog, etc.

So, I succumbed, and signed up for an iPhone on the basis that it seemed to do most of the things I wanted. But has it lived up to expectations?

Well, three weeks on, I have to say that I still really like the iPhone and am pleased that I went for it. It has stood the test of real-life use and quite a bit of experimentation over my recent two-week holiday. It functions OK as a phone and call quality seems pretty good. The music playback quality is also good, especially when compared to my iPod Nano. Beyond this, there’s a reasonable number of games available to keep me amused, and, as suspected, the camera is actually OK for family snapshots, though, unsurprisingly, no good for ‘proper’ creative photography.

However, the iPhone is far from a perfect device. The most immediate problem I ran into was battery life. Perhaps optimistically, I started out running with all of the defaults – relatively bright screen setting, 3G enabled, GPS switched on, email delivered from our Exchange server through the ‘push’ mechanism (similar to BlackBerry), etc. After returning home a couple of times at the end of a day out with the battery almost exhausted (despite relatively light use), I suspected a little tuning was in order.

Fortunately, the iPhone allows you to switch off 3G access with the flip of a soft-switch, leaving the device running purely on the GSM network with data access over GPRS or EDGE. This improves battery life considerably, and as I don’t do much browsing when out and about, I have left it this way, figuring I can always enable 3G again for short periods when I really need to.
The other adjustment that seemed to extend battery life significantly (apart from the obvious move of winding down the screen brightness) was disabling the push email mechanism and setting the device to poll the Exchange server every hour instead. Again, this adjustment can be made through soft-switch flicking, allowing the polling frequency to be set to every 15 minutes, 30 minutes or whatever, with more frequent polling clearly consuming more power.

Interestingly, enabling and disabling WiFi doesn’t appear to make a huge difference, so because of the convenience of the iPhone automatically hooking onto my home network when I arrive at the house, and discovering hotspots when out and about, I have tended to leave WiFi switched on.

As a disclaimer, I have to say that my tests have not been that scientific, in that I have just been making adjustments in an attempt to get a configuration that works for me while I get on with my life. Unlike a lab test, no two days’ usage have been exactly the same, so what I am picking up here are gross differences in performance. That said, the one conclusion I have come to is that the combination of battery life limitations and the inability to swap batteries when the power runs out makes the iPhone far from ideal for heavy business use on the basis of the power issue alone.

And while I wasn’t explicitly evaluating the device for business use, there are some other things that would cause me concern in this context. Apart from the widely reported lack of cut-and-paste capability, I noticed some quirks associated with the email client, for example, which make it difficult to do some things offline and require zooming and horizontal panning to read some email messages that refuse to word-wrap.
Then, while I was pleasantly surprised at how usable the soft touch-screen keyboard was for casual text entry, I cannot imagine ever getting to the level of speed and unconscious use that comes naturally with a device that has a decent physical QWERTY keyboard. This may not be a concern for many, but it is a major consideration for me, as I tend to use mobile email very interactively for business purposes.

In terms of issues from a corporate adoption perspective, others may also be concerned about the lack of data encryption on the device itself, but with a sealed unit, PIN access and remote wipe capability, if you take a common-sense approach to assessing risk, there is probably not a huge security exposure for most business users.

When all things are considered, I would say the iPhone comes nowhere near devices such as the BlackBerry Curve or 8800 series in terms of business fitness for purpose, particularly for heavy mobile data users. As a predominantly personal device, however, it is a great example of where mobile technology is going, and, as I said, I am very pleased with the overall package.

As an industry analyst, I should probably grumble at the closed business practices of Apple itself in terms of controlling the distribution of content for the iPhone, but when I then think about the convenience and ease of use for a non-technical user, I can see that there is also an upside to controlling things end to end for mass-market consumer adoption.

So, the bottom line is that, based on my initial impressions, I would not discourage anyone from buying an iPhone for personal use, but I would urge them to think about their requirements and do the appropriate due diligence before investing in the device for business use.
As for large-scale deployment in a business environment for hard-core mobile requirements, I am not sure the device is yet ready in its current form, though if anyone has any experience to the contrary, I would love to hear from them. How do you rate the iPhone from a policy management, software distribution, maintenance and end-user support perspective, for example?

Whatever the current situation, the end-user appeal of the iPhone will ensure that it makes its way into many businesses one way or another, and with Nokia, Microsoft, Palm and others already challenging RIM on fitness for purpose, we can look forward to an interesting couple of years as it all shakes out.


Desktop power management
15 July 2008

It’s encouraging that many of the conversations we are having at the moment in relation to IT and sustainability are moving beyond power management in the data centre. It is not that optimising the use of central IT isn’t important, but it really is only one way to drive an organisation’s environmental agenda. And even before we get to the main question of how technology can enable more eco-friendly working practices, there is another place we can look to for operational IT power savings – the desktop.

When looking in this direction, though, I have noticed that there is a tendency to apply the same kind of thinking that is used on the server side of the equation. Fair enough, accelerating hardware refresh to introduce more power-efficient kit reflects a similar game to that being played in the data centre, but with the carbon cost of manufacture and disposal taken into account, the net gains are hard to establish.
In the data centre, of course, hardware modernisation is augmented by consolidation and virtualisation to drive up average server utilisation and thus improve energy efficiency.

Virtualisation is a different game on the desktop, however. Sure, some will go down the route of running virtual PCs on the server and accessing them through thin client configurations, but it will be a long time before this is the norm. The reality is that most organisations will remain wed to their fat clients for the foreseeable future, so we need to think of the energy question a bit differently. Essentially, the challenge boils down to optimising the power consumption of desktop machines that typically idle for the majority of the time they are switched on.

In order to deal with this problem, we need to think less about utilisation and the inherent power efficiency of hardware and software, and more about controlling the state of machines in terms of their sleep/wake cycle. In practice, a configuration that exhibits a high degree of runtime energy efficiency but has no active policy to transition to a low-power state when idle will consume considerably more power than a less efficient machine whose state is properly managed.

This is something that Microsoft makes a big point of when talking about Vista in the green context, and indeed early adopters with large Vista estates <a href="http://keepingitgrounded.blogspot.com/2008/06/justifying-large-scale-vista-migration.html">corroborate Microsoft’s claims</a> that Vista’s enhanced manageability translates directly to power savings.
The problem is, however, that Windows XP isn’t going away in a hurry, so what about all of those organisations who are interested in desktop power management but will be maintaining older versions of the operating system for some time to come?

Well, the one approach that is generally acknowledged not to work that well is to educate, encourage or threaten users in an attempt to get them to keep their power configuration set in accordance with environmental policy, and/or to manually shut down their PCs or put them to sleep when they are not in use. IT managers relying on this kind of user discipline are probably not going to see the results they were hoping for unless they’re working for a totally green-tinted organisation.

Fortunately, third-party solutions exist that can help to enable and enforce centralised power management – a couple of examples being <a href="http://www.verdiem.com">Verdiem</a> and <a href="http://www.1e.com">1E</a>. Using such technology, you can not only cure PC insomnia from a policy enforcement perspective, but also allow real-time remote control of power state, so machines can be woken up for backup or software distribution purposes and then put to sleep again afterwards. So, if you are serious about saving energy across a large XP estate, the options are there.

Something I haven’t had time to look into is whether similar solutions exist for alternative desktops – namely Mac OS X and Linux. Apple kit is certainly not renowned for its enterprise management friendliness, but perhaps ‘right on’ Mac users aren’t so much of a problem, as they are of course <a href="http://www.theregister.co.uk/2008/01/18/green_poll_results/">more environmentally aware</a>.
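As an aside for the technically curious, the remote wake-up capability these tools rely on is usually nothing more exotic than standard Wake-on-LAN: a UDP broadcast of a ‘magic packet’ consisting of six 0xFF bytes followed by the target machine’s MAC address repeated sixteen times. Here is a minimal sketch in Python of what that looks like on the wire (the MAC address in the example is a placeholder, and this assumes the target NIC has Wake-on-LAN enabled in its BIOS/driver settings):

```python
import socket

def make_magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN magic packet: 6 x 0xFF followed by
    the 6-byte MAC address repeated 16 times (102 bytes total)."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Send the magic packet as a UDP broadcast on the local subnet."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(make_magic_packet(mac), (broadcast, port))

# Example usage (placeholder MAC): wake("00:1A:2B:3C:4D:5E")
```

Because it is a broadcast mechanism, waking machines across subnets from a central console is where the commercial tools earn their keep – but the packet itself is this simple.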
As for Linux, I would be interested in any views, recommendations or experiences.

Meanwhile, it would be great to see a bit more awareness raising from Microsoft on the availability of solutions to centrally manage power consumption by Windows XP, rather than automatically segueing from this discussion into a Vista upgrade pitch.


Justifying a large scale Vista migration
27 June 2008

Over the past couple of months, I have had in-depth conversations with five CIOs who have made a significant commitment to Windows Vista.

One of the main issues I explored with each of them was the foundation upon which the business case for migration was made. The responses I received were remarkably consistent, and not completely in tune with the way Microsoft articulates the Vista proposition.

What all these guys said was that their business case for Vista, i.e. the one put before the board, CFO and/or other significant stakeholders, was founded on benefits in two key areas – security risk management and operational cost control.

From a security perspective, the focus tended to be on three specific attributes of Vista – better run-time security in the operating system itself, more effective policy enforcement, and the ability to encrypt data on notebook PCs through BitLocker.

What I found interesting was the view that while all three of these security-related benefits were considered to be significant, it was the last one in particular that was most frequently highlighted as resonating directly with business stakeholders. Recent high-profile press coverage of notebooks storing sensitive data being lost or stolen was seen to have an influence here in terms of awareness.
Against this background, Vista’s ability to deal with an acknowledged business risk straight out of the box was perceived to be of significant value.

Beyond security, double-digit reductions in operational cost generally formed the substance of the business case in financial terms. The general streamlining of the management and maintenance process was highlighted as part of this, and the dramatic simplification of image management in particular was seen as a significant contributor to the savings in large multi-national environments.

Something I was personally very sceptical about, but which three of the five CIOs defended very strongly, was the savings in relation to desktop power consumption. Numbers from 50 Euros per desktop per year upwards were cited as savings, though to be absolutely clear, the benefit comes from better centralised control and enforcement of power management policies rather than efficiencies in the way Vista uses hardware resources.

When asked about the element that was clearly missing from these business cases, namely improved user productivity, the general consensus was that this was a red herring. The most positive view was that there is likely to be some impact in this area, but it is impossible to measure in any tangible way, so why would you dilute an otherwise solid business case with something that could easily discredit it? Best to stick the list of intangibles in your bottom drawer and run with what you can defend with confidence.

And it is on this point that the CIOs I have been speaking with diverge from the view articulated by Microsoft. In fact, one said that the obsessive references to the great user interface, user-facing productivity features, etc. caused a lot of distraction and confusion when he invited a Microsoft executive to meet some of his business sponsors.
When a stakeholder says, “I don’t understand, I thought we were doing this to save money”, it doesn’t actually help to get the investment case signed off.

There are a couple of lessons that fall out of this. Firstly, if you are going through the process of evaluating the business case for Vista yourself, the abovementioned criteria will hopefully provide some pointers based on where at least a few others have put the emphasis – particularly in a large corporate or public sector environment.

Secondly, the feedback suggests that you should be prepared for business sponsors to get confused about the rationale for migrating based on the messages broadcast by Microsoft both directly and indirectly through advertising, the media, marketing collateral, etc. The trick here is agreeing that it will be a great spin-off benefit if all of the claimed or suspected end-user productivity gains are realised, while keeping the investment case itself focused on the more solid stuff that can be defended under cross-examination.

Finally, there is a message in here for any Microsoft executives reading this. If you can curb your enthusiasm for obsessing about the Wow! and focus on the things that drive decisions, you might see more movement in the market.


Business Intelligence and the bolting horse
12 June 2008

There appears to be a revival of interest in Business Intelligence (BI) among IT vendors at the moment. Some pretty big guns, the likes of Oracle, IBM, SAP and Microsoft, are trying to position themselves more aggressively in this space following the recent spate of acquisitions.

So is this renewed vigour justified?

Well, from a customer perspective it undoubtedly is.
It is pretty clear when you research BI that the gap between business need and IT capability is as great as ever. When we interviewed a bunch of senior business managers from City of London financial institutions last year, for example, they were very clear about this gap:

<img src="http://www.freeformdynamics.com/media/2008/0804-Business-performance/Chart-03.jpg" />

And if you look at this chart closely, you will notice something quite interesting. While business information availability isn't that bad at an overall financial and arguably operational performance level, it is not very good when you look at more detailed measures and indicators.

Why is this interesting?

Well, because it tells us that by the time those managing the business find out about something important, it is often too late to do anything about it. Stories of product, client or partner-related issues only coming to light when someone starts investigating why a higher-level number has been missed are quite common.

To put it another way, business managers usually have what they need to monitor the ‘effects’ of doing business, but are typically underserved when it comes to the information required to manage the underlying ‘causes’ of those effects. We discuss this more in the <a href="http://www.freeformdynamics.com/fullarticle.asp?aid=127">research report</a> from the study if you are interested, but it does bring home the importance of incorporating continuous analytics capability into the business process itself, as well as having traditional retrospective BI operating off to one side.

The aforementioned vendors are therefore spot-on when it comes to making a big noise about the principle of integrating BI capability into applications in a more embedded fashion.
Now, whether they have done a good job of integrating their recent acquisitions into their broader solution set in practice is another question, but it is at least worth hearing them out.


Talking at cross purposes, or being deliberately misled?
1 June 2008

Ever had one of those conversations where you debate something for a while, then it dawns on you that each party has been talking about something different? It has happened to me quite a few times recently.

One example was in relation to Business Process Modelling (<a href="http://en.wikipedia.org/wiki/Business_Process_Modeling">BPM</a>), which is something I grew up with and in my mind is about, well, modelling business processes. It’s a discipline that business analysts have been involved with for years, and while the technology to support it has moved on, and arguably some of the methodologies too, the fundamental principles haven’t changed much for a long time now. Then someone asked Freeform Dynamics to design a research study to figure out the level to which organisations had adopted BPM. When I argued during an internal project start-up meeting that you couldn’t really ask someone about when and how they were taking something on board that they had been doing for a decade or two, it turned out that the ‘BPM’ we were being asked to investigate was actually ‘Business Process <em>Management</em>’, and was based on a definition which included the technical side of things – workflow rules engines, SOA orchestration, and so on.
Not quite the technology-independent business view of BPM that I was taught earlier in my career, but as soon as the misunderstanding was cleared up, we could design the research accordingly.

Another example was prompted by a report I read the other day claiming that Software as a Service (<a href="http://en.wikipedia.org/wiki/Software_as_a_Service">SaaS</a>) is now a mature and pervasive model. This was reminiscent of claims made during a number of other conversations I have had recently with SaaS advocates, which I have been struggling to reconcile with the findings of our <a href="http://www.freeformdynamics.com/fullarticle.asp?aid=318">own research</a>. The latter has shown quite conclusively that while larger organisations are starting to make selective use of SaaS for delivering business application functionality, ‘pervasive’ is certainly not a word that applies in this area. Then I realised that some of the advocates were throwing a whole bunch of stuff into their definition of SaaS (or the related <a href="http://en.wikipedia.org/wiki/S%2BS">S+S</a> model) that I would never dream of including when discussing the delivery of business application functionality. Internet search, traditional ISP services, and even things like consumer content services, online help and automatic updates associated with desktop software can sometimes be lumped together when referring to the ‘SaaS market’. Again, once the ambiguity is cleared up, you can see where people are coming from, and make a judgement on the usefulness (or otherwise) of what they are saying.

I guess we at Freeform are particularly sensitive to precision when it comes to discussing market activity, as primary research designed to figure out what’s really going on behind the buzzwords and the hype is so central to what we do. The experiences I have outlined, however, highlight how easily people can be misled by imprecise or ambiguous definitions if they are not on their guard.
And with so much vested interest and evangelism driving the market, the temptation for some to spin and exploit our ever-changing vocabulary is significant, so we all need to be careful about what is behind those stats and definitions.


Cloud Computing and Web 2.0
27 April 2008

Don’t you just hate it when another woolly, ambiguous term is forced upon us? When I was approached by yet another journalist the other day asking me my thoughts on the impact of cloud computing, I simply sighed and told them it is a bit like Web 2.0. In itself, it is difficult to pin down exactly what is meant by it. The best you can do is say that both of these terms refer to a general direction in which the industry appears to be moving.

In the case of Web 2.0, it is about the Web becoming a generally more interactive medium. This can manifest itself at a technology level through everything from Ajax through mash-ups to SOA, and at a behavioural level through social media and the simple fact that websites are generally now more geared up to a two-way dialogue than they used to be.

In the case of cloud computing, it is about the evolution of dynamic virtualised infrastructure that allows us to think more in terms of resource pools than individual IT components. This in turn opens the door to delivering computing resource on a utility basis, which is equally applicable both internally (i.e. with regard to the way you use your data centre) and externally – which takes you into the realm of utility computing and software as a service.

The point about both Web 2.0 and cloud computing is that they both sprang up arbitrarily on the evolutionary timeline, and seemingly embraced anything and everything that could be thrown into the mix.
While the very specific phenomenon of social networking is certainly noteworthy, it bears little relationship to the evolution of rich user interfaces and composite applications – in fact, many social networking sites have appalling UIs by traditional standards. Yet Web 2.0 can mean either of these things, and, confusingly, lots of other concepts too.

Similarly, we have been talking about virtualisation ultimately leading to computing grids and utility computing for years, and giving it a new name doesn’t actually change anything in terms of the underlying trend. In fact, you knew where you stood much better when you could talk about virtualisation and grid technology as the enabling stuff, and utility computing and application services as what it enables. As everyone jumps onto the cloud computing bandwagon, it all gets mixed up and confused, just like Web 2.0.

So, if you are one of those people wondering what cloud computing is really all about after listening to the IBM explanation, the Microsoft one, and the evangelical rhetoric we have heard recently from the Google and Salesforce.com camps, don’t worry, you are not alone. The trick is to think of it as a label for a trend at one level, and an industry bandwagon at another, and keep your expectations pretty low in terms of clarity and consistency for the time being. Don’t, however, dismiss the underlying trend itself.
While we are not looking at a revolution here, some of the developments in this general area are really quite interesting and valuable – though you probably knew that already, even before the marketing hype was thrust upon us.


Oracle and Collaboration
14 April 2008

I was interested to read about <a href="http://www.it-analysis.com/blogs/MWD/2008/4/the_mysterious_oracle.html">Angela’s experience</a> trying to secure a briefing from Oracle on its collaboration-related offerings and activities. As Angela pointed out, the ‘Big O’ was the only large vendor that ‘should’ have a story in this space that declined to tell her what it was up to.

When I later commented on this (with a link to the above) via Twitter, someone else came back to me to say that they too had been having trouble getting Oracle to open up in this area.

I have to say that this doesn’t surprise me. It must be quite challenging for Oracle at the moment trying to figure out how to position itself in this space. The Oracle Collaboration Suite was launched a few years ago, supposedly to save the world from flaky Microsoft Exchange installations, and pretty much fell flat.
Oracle believed its own rhetoric about the world hating Microsoft, so looked silly to most people when it aggressively launched an initiative that would only work if customers ditched their existing Microsoft messaging infrastructure, which was never going to happen.

In addition to some of the things Angela mentioned, we have also seen the portal wars, in which Oracle has consistently been on the back foot, and lately, the march of Microsoft SharePoint and a range of collaboration and unified communications offerings from IBM under the Lotus and WebSphere brands that are largely messaging-system agnostic.

Then most recently, we have seen the BEA collaboration offerings thrown into the mix, which, before the acquisition, were beginning to look pretty good. BEA had a very sound grasp of the heterogeneous world in which customers live and was taking a very mature view of social media in the enterprise, for example. And, of course, it wasn’t encumbered by competitive obsession, which, as an aside, is arguably one of the biggest obstacles to Oracle being accepted as a truly strategic partner in many major accounts. Telling CIOs and business executives that they have been stupid over the years to waste their money on SAP, Microsoft and IBM, for example, is not the best way to win friends in high places. While competition is good, destructive messaging generally only appeals to junior-level activists. It is a huge turn-off in senior management circles.

Coming back to the original question, we should probably continue to expect Oracle to be tight-lipped on not just collaboration, but middleware strategy in general for a little while yet. I have personally been told on a couple of occasions to refer to the ‘official line on oracle.com’ when looking for clarity on <a href="http://www.freeformdynamics.com/fullarticle.asp?aid=128">open questions</a> that we hear from Oracle’s customers (old or newly acquired).
Irritating though this might be, and frustrating though it is to be fobbed off with ‘Mom and Apple Pie’ type feel-good policy statements, the truth is that there is little else Oracle can do until it gets its act together properly.<br /><br />And to be fair, given some of the <a href="http://www.freeformdynamics.com/fullarticle.asp?aid=73">confusion</a> that came about as a result of articulating nice sounding stories around work-in-progress plans associated with its CRM and ERP acquisitions in the past (plans that later had to be ‘adjusted’), it is probably better for us to hang on until Oracle really has worked out what it is trying to do in collaboration, as it <a href="http://www.freeformdynamics.com/fullarticle.asp?aid=107">has in the enterprise application space</a>.<br /><br />Oracle is undoubtedly already aware that it needs to be careful that the collaboration and closely related unified communications markets do not slip away from it, and will be doing what it can to make sure it doesn't get left behind again. In the meantime, it goes without saying that customers should challenge the company hard before making major commitments to it in these areas.<br /><br />Dale Vile<br /><br /><strong>Making chipsets interesting</strong> (28 March 2008)<br /><br />At the risk of offending all those who love to talk for hours about cores, caches and clock speeds, I have to say that I personally find discussions about the innards of silicon chips and how they are wired together intensely boring.
In fact, I’ve probably already used all the wrong words and phrases, even in that first sentence, which is no doubt going to annoy some people further.<br /><br />So, when <a href="http://www.freeformdynamics.com/pdf/Biography%20-%20Tony%20Lock.pdf">Tony</a>, <a href="http://www.freeformdynamics.com/pdf/Biography_Martin_Atherton.pdf">Martin</a> and I were invited to a dinner to meet with some of AMD’s European executives, I was understandably in two minds about attending, especially as I am also not really into all this wining and dining stuff as some other analysts are.<br /><br />I went along, though, and I’m glad I did. Sure, I found myself sucked into the odd eye glazing conversation that I only partially understood, but something that came across clearly was that AMD is investing quite a bit in ‘reaching through’ relationships with its direct customers (largely the OEMs) to the ultimate customers – enterprises, SMBs and consumers.<br /><br />Of course there is nothing new or unique in this; in fact, I ran a team at Nortel Networks back in the early 00’s which did exactly the same thing (in that case, reaching through the mobile operators to understand how 3G related to their subscribers). The basic idea is that you can gain insights and tune your R&D based on direct end user/buyer input that would not be possible if you worked second hand through your customer as an intermediary.
To do this well, however, you really need people who understand that end user environment and the trends taking place within it, and those are not necessarily the same people who deal with your core product design from an internal perspective.<br /><br />Anyway, this end-user oriented view of the world shifted discussions to more familiar territory for me during the dinner, and I enjoyed hearing people like Giuseppe Amato, who goes under the title “Director, Value Proposition Team”, explaining how the whole process works in relation to data centre evolution, high performance computing and mobile working. It changed my perception of AMD quite a bit, from simply “the alternative to Intel” to that of an independent player that is committed to driving industry development in its own way.<br /><br />While I am not qualified to comment on the relative merits of AMD technology versus the competition, nor its ability to execute in the cut throat world of OEM deals and supply chains, I now have a much better appreciation of why what AMD does actually matters. It is not just about price/performance or performance per watt of energy consumed; it is about shifting thresholds to make things economically or practically possible in the mainstream market that previously were not. That’s why the “what if you could...?” conversations with end customers, as suppliers like AMD reach through to them, are so important.
And it is also why, for the first time in my life, I actually had some genuinely interesting conversations about silicon that were directly relevant to the world in which I live.<br /><br />Dale Vile<br /><br /><strong>Downgrading from Vista to XP</strong> (12 March 2008)<br /><br />I blogged a while back on how a Vista upgrade effectively rendered my old desktop machine useless for business purposes (see <a href="http://freeformcomment.blogspot.com/2007/07/retiring-leonardo.html">Retiring Leonardo</a> from last year). I got a lot of feedback at that time, as many people out there were obviously trying to get a handle on the viability of upgrading older kit.<br /><br />While this debate continues, the related question has now arisen of whether even some PCs pre-installed with Vista are capable of running it adequately. Based on my own experience, this is a very pertinent question to ask if you are considering buying anything with less than a 1.8GHz Core 2 Duo processor and 2GB of memory – the current minimum spec I work on for serious business use. Yet there are lots of Vista machines out there on the market that are significantly less powerful than this.<br /><br />Without getting into the rights or wrongs of this state of affairs, if you are unlucky enough to be struggling with Vista on a lower spec machine, you may be interested in a recent experience I had, which was a bit of a wakeup call – not just in terms of the physical performance side of things, but also on the broader question of the value of Vista from an end user perspective in a business environment.<br /><br />A few months ago, I needed to replace my notebook. As a notebook to me is a companion to my desktop rather than my main machine, I wasn’t looking for anything very powerful – size, weight and battery life were much more important considerations.
So, after a happy couple of hours cruising up and down all of the hi-tech shops in London’s Tottenham Court Road trying all the latest kit, I opted for a Sony TZ Series – about 1.2 kilos in weight, fantastic screen, reduced size but really nice keyboard, embedded cellular modem, and lots of other good stuff.<br /><br />The machine came with Windows Vista Business Edition pre-installed, and when I was playing with it in the shop, it was pretty responsive – the 1.2GHz Core 2 Duo processor seemed to be up to the job. When I got the machine back to the ranch and loaded everything onto it, though, I have to admit to being a little disappointed with the speed. Nevertheless, it was good enough, so I just got on with using it.<br /><br />Over the course of the next four months, however, the performance gradually degraded and the user experience became awful. It eventually got to the stage where it was taking 12 minutes to boot and about 6-7 minutes to shut down, with very sluggish performance in between and frequent hangs requiring a forced shutdown (which in itself was probably making matters worse).<br /><br />When researching the problem on the Web, it was clear that I was not the only one experiencing issues with Vista on the TZ Series, and the more I read, the more the answer to my problems became obvious – ‘downgrade’ the machine to Windows XP. A few forum entries mentioned a kit on the Sony website designed to allow you to do this, with all of the relevant drivers and utilities, and a set of instructions to guide you through the process. I duly downloaded this, followed the instructions, and it just worked. The longest part was installing and patching XP itself (which you have to buy separately, by the way – your Vista licence doesn’t cover it <em><span style="color:#990000;">** See clarification below</span></em>).<br /><br />The end result is fantastic.
The word ‘downgrade’ seems totally inappropriate – in fact, it feels like the machine has gone through a significant upgrade. It now boots in well under 2 minutes (with all the same applications loaded as before), is highly resilient (it has gone through a lot of sleep/wake cycles without crashing once) and, interestingly, many of the Sony utilities work much more naturally (I suspect they were designed for XP in the first place then ported to Vista).<br /><br />The one thing I was a bit worried about was going back to XP from a usability and functionality perspective, having got so used to Vista, but I was surprised to find that the experience was actually quite a positive one. Everything seemed more crisp, immediate and uncluttered, and so far, the only thing I have missed is the enhanced application switching mechanism in Vista, i.e. the Alt-Tab and Windows-Tab functionality. That’s a minor sacrifice for the other benefits, though, and it only took me an hour or two to get used to the old mechanism again.<br /><br />The switch back to XP was such a breath of fresh air that I have also ‘downgraded’ the desktop machine I am using at the moment. On a reasonable spec PC you don’t see the same increase in actual performance, but the XP interface still feels a lot cleaner and snappier (at least to me). Having both machines running the same OS obviously has its advantages too.<br /><br />Now before everyone goes rushing out to downgrade their Vista machines based on this little story, it would be irresponsible of me not to point out that during my research, I read accounts from many happy Vista users, many of whom seemed to be getting on fine with the TZ and similarly spec’d machines. I would suspect the number and range of applications you work with has a bearing on this – remember I said that the TZ felt fine when I was just playing with the OS, with no applications installed, before buying it.
It could also, of course, be that people just accept the out-of-the-box experience as normal and don’t really question whether they are getting the best performance from their hardware. All I can say is that the downgrade was definitely the right thing for me, and is something to consider if you find yourself in a similar situation.<br /><br />In the meantime, we continue to experiment with various desktop options here at <a href="http://www.freeformdynamics.com/">Freeform Dynamics</a>, and those looking at alternatives may be interested in a post from my colleague Jon Collins entitled <a href="http://totalimmersion.wordpress.com/2007/11/19/why-ive-replaced-vista-with-linux/">Why I’ve replaced Vista with Linux</a>.<br /><br />Finally, as I type this, I have a brand new MacBook sitting next to me here on my desk, and over the coming few weeks I am going to be looking at the practicalities of using the Mac in a Windows dominated mainstream business environment, so watch this space for experiences with that.<br /><br /><em><span style="color:#990000;">** Clarification re licensing terms: The right to downgrade Vista depends on which edition you have. Vista Ultimate and Business may be downgraded within the terms of the Microsoft EULA at no additional cost, but this right does not apply to other editions of the software.</span></em><br /><br />Dale Vile<br /><br /><strong>Are your IT staff adequately trained?</strong> (31 January 2008)<br /><br />An interesting finding emerged from one of our recent studies into IT Service Management (ITSM). It concerns a cause and effect that is pretty obvious once it is highlighted. Put simply, IT departments operate much more smoothly and efficiently if IT staff are adequately trained.<br /><br />The data, which is derived from over 1,100 responses to an online survey, is difficult to argue with.
There is a clear relationship between the attention paid to IT staff training and the perceived level of burden experienced by IT. To put it another way, properly trained staff find it easier to cope with the demands placed on them in areas such as infrastructure optimisation and management (to keep service levels up and costs down), effective maintenance of desktops (to manage user satisfaction and keep security risks under control), and provision of helpdesk services (to meet user expectations with regard to support).<br /><br />What’s more, the relationship between training and operational efficiency and effectiveness is a linear one. What does that mean? Well, it doesn’t really matter whether training requirements have been neglected, whether the organisation already has its act together, or whether it’s somewhere in between – indications are that incremental training will always have a positive impact. To put this into perspective, another finding from the same report was that investment in other areas, such as systems management automation and integration, does not deliver benefits in the same linear fashion. Essentially, you need to get past a threshold of capability before significant improvements are generated.<br /><br />There are some interesting lessons in here for all organisations, but particularly those that have a tendency to skimp on investment in skills development. If this study is anything to go by, such an approach is clearly a false economy. In fact, if you have anything to do with running an IT department that is underperforming on IT service delivery and operational efficiency, then the first port of call when looking for improvements should probably be staff development. While upgrading your systems management tools and technology may also be a necessity, investment in this way will take time to pay back.
Meanwhile, a bit of additional training at a fraction of the cost is likely to have a much more immediate impact.<br /><br />Oh yeah, and the study also quite clearly shows that training end users can have a similar impact, reducing the burden placed on IT in areas such as desktop management and help desk delivery. The basic principle here is that adequately trained users encounter (and create) fewer problems, and when problems do occur, users are much better placed to sort themselves out.<br /><br />There’s a lot more to this research than the stuff we have been talking about above, so if you’d like to learn more, you can download a full copy of the findings from <a href="http://www.freeformdynamics.com/fullarticle.asp?aid=130">here</a>. And if you’re interested in a companion report looking at the future of IT Service Management (ITSM) in general, you can download that from <a href="http://www.freeformdynamics.com/fullarticle.asp?aid=129">here</a>.<br /><br />Dale Vile<br /><br /><strong>The customer view of BEA’s acquisition by Oracle</strong> (25 January 2008)<br /><br />When the BEA Oracle deal was finally announced last week, my first instinct, like many analysts and journalists I would guess, was to rush to the keyboard and bash something out. But what was there to be said that hadn’t already been covered? After re-reading my <a href="http://keepingitgrounded.blogspot.com/2007/10/bea-and-oracle-05-07-08.html">previous post</a> on the topic, I didn’t have a great deal more to say at that point.<br /><br />So, instead of writing a blog post, I composed a little questionnaire and reached out to Oracle and BEA customers through an online survey to capture opinion where the rubber meets the road. In a very short space of time, I gathered nearly 300 responses, including a lot of freeform feedback.
I then spent an interesting few hours reading through and categorising people’s views, which is the part of this job I really enjoy. Gathering statistics through tick and bash surveys is one thing, but reading a few hundred comments in which a bunch of smart people tell you what they think in a totally unconstrained manner is a great way to get under the skin of a topic.<br /><br />In this case, I quickly uncovered a bunch of angles on the BEA acquisition that I hadn’t previously considered. Here is a quick summary of the themes, both positive and negative, that I managed to pull out (ranked in order of frequency of mention):<br /><br /><strong>Reasons given for why the acquisition is bad news</strong><br />1. Reduced choice and competition in the market<br />2. Uncertainties for customers with existing product investments<br />3. Loss of innovation – Oracle will smother the goodness of BEA<br />4. Concerns about Oracle as a supplier (style and nature)<br />5. Increased cost for BEA users (particularly maintenance)<br />6. Fear of lock-in as Oracle optimises between stack components<br /><br /><strong>Reasons given for why the acquisition is good news</strong><br />1. A stronger and more mature solution will emerge (eventually)<br />2. Rescue of good technology from a company that had lost its way<br />3. Creation of stronger and more credible competition for IBM<br />4. Better synergy between BEA technology and Oracle RDBMS, tools, etc<br />5. Reinforcement of distinction between commercial offerings and OSS<br />6.
More integrated approach to customers and account management<br /><br />Even though a lot of these are pretty obvious, I’m sure most people looking at this list will spot a couple of angles that they hadn’t previously thought of, and if you are a customer trying to work out the impact of the acquisition, then this probably isn’t a bad starting point for assessing the balance between risk and opportunity in what is actually quite a complex situation.<br /><br />Of course we also gathered some stats, and I’ll throw in this chart here that illustrates the sentiment overall.<br /><br /><img height="346" alt="Oracle and BEA survey" src="http://www.freeformdynamics.com/media/2008/01/BEA-Oracle-1.jpg" width="460" /><br /><br />So, the initial reaction to the acquisition, while mixed, is definitely net negative.<br /><br />Anyway, if you are interested in a drill down on the above chart broken down by customer type (BEA versus Oracle versus joint customers), along with a fuller discussion of the findings, you can check out the more complete analysis I put together <a href="http://www.freeformdynamics.com/fullarticle.asp?aid=128">here</a> or <a href="http://www.theregister.co.uk/2008/01/24/oracle_bea_acquisition/">here</a>.<br /><br />Dale Vile<br /><br /><strong>Here’s to a more balanced 2008</strong> (30 December 2007)<br /><br />While everyone seems to be busy making predictions about hot technologies, revolutionary industry developments, and various tipping points being reached in 2008, I can’t help hoping that we see a bit more balance emerging in views and opinions over the coming 12 months.
It’s probably wishful thinking given that more extreme and/or disruptive ideas are used as a lever for selling everything from hardware and software to management consulting and analyst research, but it would be nice to see us getting away from bandwagons, magic bullets and the simplistic 'single track' thinking that often accompanies them.<br /><br />Of course that’s not to say that interesting things aren’t happening, and we can look forward to some important trends and developments continuing to unfold in the coming year, such as the ongoing move towards more virtualised, dynamic and service oriented infrastructures, the gradual evolution of sourcing and outsourcing options, the awakening of more enterprises to the potential of social computing, etc. The only <em>real</em> seismic shifts we are likely to see, however, are in marketing collateral, analyst reports and the media.<br /><br />So, while many around us are ‘bigging up’ SaaS, cloud computing, open source software, Web 2.0, and so on, we will continue to do what Freeform Dynamics has always done – examine all of the ideas and propositions in a practical, down-to-earth and objective manner, and provide insights and advice for those working in the real and complex world of ‘brown field’ IT and business.<br /><br />And with this focus, the ‘how?’ is just as important as the ‘what?’ and the ‘why?’, so our emphasis on <a href="http://www.freeformdynamics.com/services.asp">community research</a>, tapping into the experience of practitioners as well as strategists, will remain a big part of what we do going forward. During 2007, we gathered over 45,000 responses from IT and business professionals in our research studies.
Our analysts therefore really do have a good in-depth understanding of what’s going on out there, and it is a position we fully intend to maintain.<br /><br />Let me finish by saying a big thank you to everyone who has supported Freeform Dynamics since it was founded two years ago, and wish all of our subscribers, readers, clients, partners, friends and anyone else who knows us a happy, harmonious and ‘balanced’ 2008.<br /><br />Dale Vile<br /><br /><strong>Managing signal to noise</strong> (8 December 2007)<br /><br />A couple of months ago, I decided to get stuck in a bit more to the whole social media thing, as a few conversations with others who were much more active than me had planted the seed in my mind that I might be missing out on something. Those who know me will realise that this wasn’t so much me getting involved in social media for the first time, as I have been a producer and consumer of blogs for a couple of years now. It was more a case of stepping up a level.<br /><br />Anyway, I made a real effort to go through the blog rolls of the 20 or so blogs to which I was already subscribed, took recommendations on interesting wikis, and signed up for a bunch more feeds. I also decided to explore the extreme real-time end of social media, and signed up to <a href="http://twitter.com/">Twitter</a>.<br /><br />Fast forwarding to this weekend, I have just deleted my Twitter account and got rid of most of the RSS feeds I had added as part of the exercise.<br /><br />Why?<br /><br />Well, two reasons. Firstly, I just couldn’t keep up with everything. I struggle to stay on top of my incoming email already, so having too many other streams to monitor and sort through just means more time away from the family and ‘real life’ and/or more chance of missing something important.
This last point leads me on to the second reason for paring things back again – the signal to noise ratio got considerably worse as I expanded my subscriptions beyond the hand-picked sources I had already been using.<br /><br />One of the particular challenges I encountered was that so many bloggers and Twitterers out there are clearly on a mission or pushing a specific agenda. There is nothing wrong with that in principle, provided you take what you read with a pinch of salt, and I personally find it interesting and useful to understand the range of views that exists. Unless you are on the same mission, though, such sources quickly become very boring. There are only so many ways of making the case for ODF, for example, and a daily stream of evangelism thereafter is really just noise to most people.<br /><br />However, with the exception of Twitter, which I struggled to see the point of, I did actually get some benefit from exploring things a bit more widely. I now have a list of blogs and wikis that might not have a high enough level of genuinely new insights to subscribe to on an ongoing basis, but do represent sources to browse from time to time to keep up to speed in certain areas or provide input for research. The difference is that it will be me going to them rather than them coming to me from this point onwards – which is pretty much the way I have been using the Web for the last decade.<br /><br />So, while I remain a big fan and active user of social media, I have discovered that to me it is the content being exchanged that matters more than the act of communicating itself.
Perhaps that makes me relatively ‘unsociable’ in the online sense, but when it’s the socialising that takes precedence, it is only natural that the signal to noise ratio deteriorates.<br /><br />Again, there is nothing inherently wrong with this, but just like all those ‘put the world to rights’ conversations in pubs, small talk and one-upmanship competitions at parties, etc, activities that are primarily about social interaction should not be confused with the production or exchange of useful information. Somewhere in between lies the ‘conversation around the water cooler’ that forms an important part of keeping people informed and tuned in, and there are blogs out there that encapsulate this spirit and are therefore very worthwhile subscribing to (e.g. <a href="http://www.redmonk.com/jgovernor/">monkchips</a>). Most of the other feeds I am left with are concerned with blogs and wikis that explore issues and debates in an objective, informed and thought provoking manner, with a high level of original content – but these are harder to find than I think many social media advocates like to admit.<br /><br />At the end of the day, it’s all about how you spend your time, so the trick is to find the optimum balance between continuous incoming streams and keeping tabs on the sources of information that are useful to access but on more of an ‘on demand’ basis. The next stop for me on my social media adventure is therefore tagging and bookmarking.<br /><br />Dale Vile<br /><br /><strong>Avaya crosses the line</strong> (2 December 2007)<br /><br />This is not going to be an in-depth post.
I just wanted to put on record that I was very impressed with a lot of what I heard during the <a href="http://www.avaya.com/">Avaya</a> industry analyst conference a couple of weeks ago.<br /><br />It was a pretty big gathering, with analysts from across the world rubbing shoulders with each other. I love events like this, as while we here at <a href="http://www.freeformdynamics.com/">Freeform</a> are continuously researching the European and North American markets, it is great to talk with people who have in-depth knowledge of thrusting economies like India and China.<br /><br />With so many analysts in one place, it also reinforced the myriad of different styles, approaches and areas of coverage that exist within the research community. I guess it will be no surprise that, with Avaya’s heritage, the majority of the delegates were specialists in the communications industry, and I lost count of the number of conversations I had on the nitty gritty of the telephony market that left me way behind.<br /><br />So why was I impressed?<br /><br />Well, I am a bit of a hybrid when it comes to coverage, in that I think of myself as a business and IT analyst primarily, but with a reasonable working knowledge of how the communications part of the equation touches this world. This is very relevant to the Avaya discussion, as one of the big topics of the conference was Unified Communications (UC). I don’t want to dwell on this specifically as <a href="http://havemacwillblog.com/">Robin Bloor</a>, who was also at the event, has already written a <a href="http://havemacwillblog.com/2007/11/16/what-is-unified-communications/">pretty good treatment of the topic</a>, but the main point is that UC represents the clearest business and application level cross-over between the traditional IT and telephony spaces outside of the call centre environment that we have seen to date, and Avaya seems to ‘get’ what’s important to be successful once you cross over the old dividing line.
The understanding is multi-dimensional too, i.e. Avaya is thinking as much about partnerships, IT related architectures and standards, and business process enhancement in the broader application sense as about simply neat functionality.<br /><br />If you are an Avaya customer, I would encourage you to catch up with the firm’s latest developments in <a href="http://www.avaya.com/pillars/usa/UnifiedCommunications/Landing.html">unified comms</a> and 'Communications Enabled Business Processes' (<a href="http://www.avaya.com/pillars/usa/CEBP/Landing.html">CEBP</a>), as ways of bridging the gap between domains that are still considered separate by many.<br /><br />I am going to resist saying much more at this stage, as <a href="http://totalimmersion.wordpress.com/">Jon Collins</a> and I will be spending some time in a week or so with the most visible player in the unified comms space, <a href="http://www.cisco.com/en/US/netsol/ns151/networking_solutions_unified_communications_home.html">Cisco</a>, and one of the objectives we have is to bring ourselves completely up to date with its ideas and developments with regard to IT/comms convergence. I’ll also have to track down the guys at my old firm Nortel, as there have been some <a href="http://www.nortel.com/promotions/uc/index.html?NT_promo_T_ID=hp_box2_08_16_07_uc_offer">interesting developments</a> coming out of that camp in recent times too, and it is a while since I have caught up with them properly.<br /><br />Looking at the bigger picture, the coming together of communications and IT at the application and process as well as the network level is a significant development which represents opportunities for both suppliers and customers.
But it is obviously not just the traditional comms players that are moving into this area – IT incumbents such as Microsoft and IBM are also very active (see <a href="http://www.microsoft.com/uc/Default.mspx">here</a> and <a href="http://www-306.ibm.com/software/lotus/unified-communications/">here</a>) – they are just coming at it from a different direction. You’ll therefore be seeing us spending a lot of time on this topic in 2008.<br /><br />Meanwhile, it is nice to see Avaya, backed by its new found <a href="http://www.computing.co.uk/crn/news/2191388/avaya-acquired-private-equity">private equity arrangement</a>, starting to cross the line into the world of IT so convincingly.<br /><br />Dale Vile<br /><br /><strong>Context and social media</strong> (11 November 2007)<br /><br />One of the RSS feeds I subscribe to recently threw up a post that provided a link to a YouTube video of an Analyst Relations (AR) professional talking about their job and why they like it. There was no explanation, just a link straight to the clip. Presented in this way, it looked a bit silly, and my first reaction was to ask why on earth the person concerned had published it.<br /><br />Then it occurred to me that the video was probably made as part of some internal “who’s who” thing, or perhaps in a personal capacity to give friends a little insight into what the person did for a living. Whatever the reason, once the clip had lost its original context, it was difficult to know how to take it.<br /><br />Now in this particular instance, there is probably no harm done, but it got me thinking about people using YouTube and other content hosting sites as essentially a convenient way of storing and retrieving media for embedding in another site.
While it’s great to be able to do this, there is a risk that the content may be interpreted and perceived differently when accessed directly or, indeed, via someone else’s site where the content or a link to it is embedded in an entirely different context.<br /><br />It’s similar to the problem we face as researchers. We have to be very careful when we report the results of our primary research studies to include commentary relating to constraints or restrictions, which, if ignored, could lead to a statistic being taken out of context and spun to mean something that is not supported by the study as a whole. As an aside, this is why we retain copyright of all of our output, even though we make much of it available free of charge and allow anyone to copy it and pass it on. If we placed it into the public domain in an unrestricted manner, it could easily be taken apart and elements presented out of context in the kind of misleading manner we have been discussing.<br /><br />Zooming out a little, this general issue of maintaining or understanding context highlights the need from an individual perspective to make sure we think before putting something out there that could be picked up in isolation or re-used by someone else for a purpose other than that which was originally intended. Wherever possible, we therefore need to make sure that media objects either contain important context within them or have an explicit reference back to the original source – e.g. your website address.<br /><br />When we are more in information consumption mode, there is then a need to pay attention to the provenance of content, particularly when looking at a site, page or post that has been assembled by pulling together material from different sources. Personally, I try to track down the original source as much as possible when I look at a quotation, statistic, or even a picture or video, before relying on something I have discovered on the Web. 
It is so easy to be misled if you are not careful.<br /><br />Perhaps this all sounds very obvious, but as someone who, like other analysts, has a job that involves gathering, comparing and making sense of intelligence and viewpoints from many different sources, it never ceases to amaze me how often information is misrepresented, either deliberately or unintentionally, by taking it out of context.<br /><br />With its emphasis on user generated content (UGC) coupled with the absence of editorial processes and other safeguards by design, social media just increases the risk of being caught out. Forethought and vigilance are therefore the watchwords when producing and consuming information in this brave new free-for-all Web 2.0 world.Dale Vilehttp://www.blogger.com/profile/04136788355130256923noreply@blogger.comtag:blogger.com,1999:blog-7365701000433889040.post-89840255950246594292007-11-05T08:07:00.000+00:002007-11-05T08:17:57.237+00:00Dissecting SaaSI sometimes think I am living in a parallel universe when I get into conversations about Software as a Service (SaaS). People keep talking to me as if there is some kind of seismic shift taking place in the way organisations are acquiring and running software. Then I look around me and down at the results of study after study of buying patterns and investment plans that we carry out here at <a href="http://www.freeformdynamics.com/">Freeform Dynamics</a> and all I can see is the continued gradual creep of the price per user per month hosted model that has been taking place in a steady but non-dramatic manner for the best part of a decade now.<br /><br />When I ask what it is that people are basing their evidence on, they point to solutions such as salesforce.com and Google Office, then at the number of column inches and marketing dollars being spent telling us that SaaS is the future. 
And yet, beyond salesforce.com finding an opening for a hosted service around a niche application that is largely stand-alone and sold into greenfield environments (at least from a sales force automation perspective), the evidence for the revolution is pretty elusive.<br /><br />Now before any of you SaaS evangelists write me off as a grumpy old sceptic, I must point out that I am a big SaaS fan, provided you approach it sensibly. Indeed, I have championed SaaS for internal use in both of the companies I have had a hand in building – bet my businesses on it, if you like. It is my firm opinion that there can be little justification for any small business to run email servers and the like in house.<br /><br />But I am also a realist, and the evidence I can actually have confidence in tells me that I am unusual in my acceptance of the SaaS model in a business context. For every organisation that says it has SaaS on the agenda, there are about seven saying they don’t. And those that are going down the SaaS route are mostly doing so very selectively – they are not looking at a complete shift in philosophy or approach as some would have us believe.<br /><br />So, SaaS is definitely a trend and this way of delivering solutions will increasingly find its place in the mix, but, in keeping with the title of this blog, let’s keep it grounded and be realistic about the rate of change that is taking place. Just because vendors say it is exploding doesn’t make it true.<br /><br />Putting all of the SaaS mania to one side, though, the individual elements of the typical SaaS proposition are actually quite appealing to many. Paying for software on a subscription basis rather than forking out up front for a perpetual licence can help with both cash flow and the optimisation of accounts (subscriptions can be conveniently categorised as an operational cost).
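To make the cash-flow point concrete, here is a rough back-of-the-envelope sketch. All the figures in it – licence price, maintenance rate, subscription fee – are invented for illustration, not real vendor pricing, so plug in your own numbers before drawing any conclusions:

```python
def breakeven_month(licence_fee, annual_maintenance_rate, monthly_sub, horizon_months=120):
    """Return the first month in which cumulative subscription spend exceeds
    the up-front perpetual licence plus accrued maintenance, or None if the
    subscription stays cheaper over the whole horizon."""
    perpetual = licence_fee          # perpetual licence is paid in full, up front
    subscription = 0.0
    for month in range(1, horizon_months + 1):
        subscription += monthly_sub
        perpetual += licence_fee * annual_maintenance_rate / 12  # maintenance accrues monthly
        if subscription > perpetual:
            return month
    return None

# Illustrative only: a 500-per-user licence with 20% annual maintenance
# versus a 50-per-user-per-month subscription
print(breakeven_month(500, 0.20, 50))  # → 13
```

The interesting output is not the break-even month itself but the shape of the spend: the subscription defers cost (good for cash flow) even when it works out more expensive over the life of the system.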
Having your applications hosted on someone else’s servers can be beneficial too, especially if this allows you to take advantage of robust and scalable platform technology that you would otherwise not have access to. Finally, of course, having someone manage your environment for you means not having to worry about the distraction, cost and risk of maintaining the necessary resources and practices in house – IT is, after all, a non-core activity to most businesses.<br /><br />The point is, though, that you don’t need to do all of these things at once. If it is the subscription approach that appeals, you can take advantage of this without having another party manage and/or host your applications. If it is getting rid of the hassle and overhead of looking after systems, then there are lots of firms willing to provide a managed service, regardless of where your software and hardware resides and who owns them. Such services, just like traditional hosting models, have been around for years and are nothing new.<br /><br />So, my advice to anyone trying to figure out where SaaS fits into their IT strategy is to look at the components of the proposition individually in the context of a specific requirement. If all three elements, the subscription approach, hosting model and managed services, seem relevant and attractive then SaaS is worth looking at, but if only one or two out of the three appeal, then look for the products, services and/or commercial terms that fit your requirement. The bottom line is you don’t have to drink the Kool-Aid and commit your soul to the church of SaaS in order to benefit from any of these things.Dale Vilehttp://www.blogger.com/profile/04136788355130256923noreply@blogger.comtag:blogger.com,1999:blog-7365701000433889040.post-58854411628752282402007-10-28T18:03:00.000+00:002007-10-28T18:48:31.750+00:00IP PBX: A natural Linux workload?Every now and again, I read something or hear something quoted in the media that just doesn’t ring true.
The latest was an assertion that voice over IP (VoIP) and IP Telephony (IPT) solutions are fine for larger organisations but are not ready for smaller businesses, with a suggestion, even, that lots of small businesses are putting VoIP/IPT solutions in place and then ripping them out a few months later for reliability reasons. Such scare-mongering is great for generating headlines, but is misleading and can easily put off people who would otherwise gain significant business benefit from looking at VoIP/IPT.<br /><br />This was a prompt for us to conduct a community research study to get to the bottom of what is really going on out there in VoIP user land. I am still crunching the numbers based on feedback from about 1,500 organisations with experience of VoIP/IPT, and will write up and publish the results over the next few weeks. Suffice it to say for now, though, that VoIP is alive and well in the SMB sector, where satisfaction with quality of service, functionality and especially overall return on investment is actually higher among smaller organisations than among their larger cousins. So don’t be put off if you are looking at VoIP for your business – it isn’t perfect, but stories of widespread disaster are wildly exaggerated.<br /><br />Watch this space for more details of the research, including thoughts from participants on what to look out for, what to avoid, and how best to move forward to maximise the chances of success.<br /><br />Meanwhile, something really interesting came out of the freeform anecdotal feedback gathered during the study that was a bit of a wake-up call for me, namely the popularity of the open source IP PBX solution <a href="http://www.asterisk.org/">Asterisk</a>. Now I don't want to create the impression that it has the same penetration as solutions and services from commercial market leaders, but it does seem to be filling an important niche for low-cost but highly functional IP PBXs among the more tech-savvy contingent.
Here are a few comments typical of the feedback on Asterisk we have received:<br /><blockquote><em><span style="color:#3333ff;">“We use Asterisk PBX, running on refurbished hardware, and using a Sangoma A100 to terminate an ISDN30e line. Phones are Atcom AT-530, using SIP. This was the only way the charity project could afford a PBX on the funding available.”<br /><br />“Using Asterisk really does give back benefits in terms of not being tied to one hardware manufacturer for phones. It’s a system that will do most if not anything you ask of it. Now looking at global deployment.”<br /><br />“We use Asterisk as our IP PBX running on custom hardware. VoIP itself is an excellent solution, with Asterisk being the best of the bunch.”<br /><br />“By using Asterisk we don't have to pay extra licensing just to have a redundant backup.”<br /><br />“It's the great functionality that we love - e.g. a single DDI per employee, no matter where they are. We use an Asterisk server and have 6 staff connected to the server using Nokia E61/E70 phones.”<br /><br />“We are a Microsoft Windows consulting firm, but have found Asterisk to be the killer app that has us using and promoting Linux.”<br /></span></em></blockquote><br />This last comment is particularly pertinent at the moment given Microsoft’s current campaign to drive VoIP solutions into the market, though I suspect MS is targeting quite a different audience.<br /><br />Anyway, I thought I would share this discovery of a Linux workload that I have not seen discussed that often before (though it may be that I just wasn’t looking). It is also interesting to identify another credible open source solution that appears to have genuine appeal to smaller businesses, at least those with an IT department capable of setting up and managing their own IP PBX solution.
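To give a flavour of what “setting up and managing their own IP PBX” involves, a classic Asterisk installation is driven by a handful of plain-text configuration files. The fragment below is purely illustrative – the extension numbers, secret and ring timeout are invented placeholders, not details from any respondent’s setup:

```ini
; sip.conf -- defines the SIP handsets (names and secrets are placeholders)
[6001]
type=friend          ; peer can both make and receive calls
secret=changeme
host=dynamic         ; the handset registers itself, no fixed IP needed
context=internal     ; dialplan context applied to calls it originates

[6002]
type=friend
secret=changeme
host=dynamic
context=internal

; extensions.conf -- the dialplan: route calls between the extensions
[internal]
exten => 6001,1,Dial(SIP/6001,20)   ; ring handset 6001 for up to 20 seconds
exten => 6002,1,Dial(SIP/6002,20)
```

With just this much, a handset registered as peer 6001 can dial 6002 and vice versa; trunks out to the PSTN (for example via ISDN cards like the Sangoma mentioned above), voicemail and the rest are layered on in the same files, which is why the tech-savvy find it so approachable and everyone else may not.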
Asterisk won’t, of course, appeal to or even be accessible to everyone (skill sets, IT bandwidth, and so on), but those who use it are generally doing so very successfully.Dale Vilehttp://www.blogger.com/profile/04136788355130256923noreply@blogger.comtag:blogger.com,1999:blog-7365701000433889040.post-50697880156063548792007-10-18T23:05:00.000+01:002007-10-18T23:14:29.543+01:00Thoughts on Software IP and PatentsA journalist contacted me the other day to ask what I thought about the story that broke earlier in the week about IP Innovation suing Red Hat and Novell over alleged patent infringements associated with the Linux operating system.<br /><br />My comments on this were along the following lines:<br /><br />This kind of action was an inevitable development, even though it goes against the grain for many. While most would regard the existence of legal intermediaries who profit from such actions as distasteful, especially when it appears to undermine the positive efforts of the open source community, that community cannot resource the world’s software development requirements. The basic right for commercial organisations to protect the fruits of their investment in research and development therefore remains important to ensure continued innovation across the industry. <br /><br />That said, while I cannot comment on the specific case at hand, a general problem exists in that there is a continuum of ‘originality’ when it comes to inventions. Some patents are clearly ‘right and proper’, but issues arise in greyer areas, particularly when someone is the first to invent and patent something that others would, or indeed have, subsequently come up with independently because it is an obvious or natural way of solving a problem. The patent review process does not always capture these.
<br /><br />There are also undoubtedly some patents granted many years ago that would probably not be approved today as the world has moved on, and by modern standards, they just don’t seem that special. Such patents are actually counter-productive in that they act as a constraint on innovation in a way that common sense says is unjustified.<br /><br />Against this background, the cat-and-mouse games around patents that are being played at the moment are an unwelcome distraction. If vendors are blatantly using or encouraging patent-related actions simply as a competitive spoiling tactic, or using the threat of action, whether implicit or explicit, to perpetuate fear, uncertainty and doubt, their customers should speak out against them. That is not the way a good software partner should behave. <br /><br />It is such a complex area though, and taking an extreme stance either one way or the other doesn’t really help. We need a balanced approach and an appropriate review process to ensure that patents reflecting genuine investment in innovation are respected, but that scatter-gun or speculative registrations do not stand in the way of progress.Dale Vilehttp://www.blogger.com/profile/04136788355130256923noreply@blogger.comtag:blogger.com,1999:blog-7365701000433889040.post-43601362246437947832007-10-16T02:32:00.001+01:002008-09-16T16:47:20.983+01:00BEA and Oracle: 0.5 + 0.7 = 0.8?The problem with takeover bids is that once made, things can never be the same again for the ‘target’. At this point, I guess we cannot be absolutely sure that the proposed acquisition of BEA by Oracle will go ahead, or whether other potential buyers will enter the game as James is encouraging with his <a href="http://www.redmonk.com/jgovernor/2007/10/12/come-on-sap-make-it-interesting-make-an-offer-for-bea/">mischievous egging on</a> of SAP.<br /><br />Whatever the outcome, the shame is that BEA is now going to find it hard to shake off the cloud of uncertainty.
I say ‘shame’ because despite <a href="http://keepingitgrounded.blogspot.com/2007/10/bea-death-of-packaged-applications.html">my disagreement</a> with some of the naïve views expressed around packaged applications, I quite liked what I heard in terms of core strategy at BEA World a couple of weeks ago when the ‘<a href="http://www.bea.com/framework.jsp?CNT=pr01856.htm&FP=/content/news_events/press_releases/2007">Genesis</a>’ story was presented. BEA seemed to have a good solid view of what its customers needed in terms of an SOA-based middleware ‘fabric’ that is generally agnostic of specific technologies and applications. Even though Genesis was presented as a work in progress, things seemed to be heading in the right direction.<br /><br />Meanwhile, as <a href="http://www.mwdadvisors.com/blog/2007/10/oracle-proposes-to-buy-bea.html">Neil points out</a>, Oracle has had some gaps in its own middleware portfolio, which is being pulled together under the ‘Fusion’ banner. We also know from recent Freeform Dynamics research (which we’ll be publishing soon) that, contrary to the way Oracle often spins the numbers, Fusion middleware adoption is pretty much exclusively aligned to Oracle application incumbency, i.e. there is very little penetration into organisations that do not use Oracle EBS, PeopleSoft, JD Edwards, etc. In this respect, there is little difference in the position of Oracle versus its main application rival, SAP, whose NetWeaver offering is similarly aligned to application incumbency.<br /><br />So what happens when we put all this together?<br /><br />Well, according to my own admittedly very subjective metric, I reckon I would put BEA at 0.7 on a scale of 0 to 1, where 1 would indicate an ideal set of open enabling ‘middleware’ solutions to form the linchpin of a future-proof corporate IT infrastructure.
I don’t think BEA would argue too much with this – the guys there articulated some ambitious plans but acknowledged that there was still much work to be done.<br /><br />Turning to Oracle, I think Neil is right when he highlights the solution gaps, but would also call out the challenges Oracle has been having in being taken seriously as an ‘independent’ vendor in this space, given the application alignment we have seen – hence I would put Oracle at 0.5 on our notional scale.<br /><br />While bringing BEA into the mix would round out the Oracle offering, there is a corresponding risk that it would also undermine BEA’s positioning as a genuinely independent option, especially given Oracle’s almost rabid competitive stance against SAP. There is then, of course, the obvious redundancy between the two portfolios that will need to be resolved in one way or another. Oracle seems to have got away with the ‘Apps Unlimited’ strategy based on maintaining multiple packaged application code lines, but ‘Middleware Unlimited’ would be stretching the concept beyond the realms of credibility – as well as stretching Oracle’s ability to manage an ever more fragmented R&D effort.<br /><br />So, the acquisition arithmetic is probably something like 0.5 + 0.7 = 0.8.
Don’t take this too literally – I am just trying to make the point that while there might be some net overall goodness generated if the acquisition proceeds, the end result is not going to be the answer to everyone’s prayers.<br /><br />Meanwhile, the most obvious beneficiary in all this is IBM, who can just sit there smiling as the one remaining genuinely independent middleware gorilla, unless, of course, you include Microsoft in the picture, but that’s another story.Dale Vilehttp://www.blogger.com/profile/04136788355130256923noreply@blogger.comtag:blogger.com,1999:blog-7365701000433889040.post-88319060876583016702007-10-05T15:55:00.000+01:002007-10-05T16:13:04.340+01:00BEA: The death of packaged applications revisited?A couple of us spent some time recently at BEA World in Barcelona. My colleague David Tebbutt has summarised some general thoughts on the overall strategy and direction that was outlined under the “Project Genesis” initiative that you can read over <a href="http://teblog.typepad.com/david_tebbutt/2007/10/bea-struts-its-.html">here</a>.<br /><br />There was an issue raised in the keynote speech and echoed in later sessions that I wanted to pick up specifically, however, as it is another example of a vendor spinning an issue to suit its messaging in a way that can easily mislead if taken at face value.<br /><br /><strong>THE SPIN</strong><br /><br />BEA has been repositioning itself in recent times from a deliverer of ‘big iron’ infrastructure for building and running business-critical Java applications to the custodian of infrastructure-enabled business agility and flexibility. Moves into portal technology, social computing and business process management through a combination of acquisitions and in-house R&D have all been part of this, and the introduction of the word “liquid” into a lot of its branding has reinforced the positioning.<br /><br />At this level, BEA is right on the money.
Our research has indicated repeatedly that businesses often feel constrained by IT’s inability to respond quickly enough to changing demands – so no arguments there. However, BEA is taking this one step further and questioning the value of packaged applications to support this ‘new world’ of adaptable, service-oriented and people/process-centric IT. The words are carefully crafted with phrases such as “the days of being able to innovate through packaged applications are over”, but there is a clear objective to sideline the relevance of packaged applications as we look to the future.<br /><br />It’s another example of the line we hear from other vendors and, indeed, some analysts, which argues that SOA and increasing expectations around the need for IT flexibility spell the death of the application software package. Great though this is for generating headlines or (in the case of BEA) promoting the ‘build’ side of the traditional ‘build versus buy’ argument, there are a few other things we should probably consider before throwing out our ERP systems and other ‘packages’.<br /><br /><strong>THE REALITY</strong><br /><br />Let’s start with some very obvious stuff. If you examine the way any business works, you will find that the majority of its business processes are ‘non-differentiating’. What we mean by this is that while you need to be efficient and effective in these areas to manage costs and risks, you are unlikely to compete any more effectively in the market by inventing new ways of doing them. Examples include the vast majority of the accounting and administration that takes place in the average business, and for most to whom they are relevant, things like inventory management, manufacturing planning and execution, human resource management, logistics, and so on are non-differentiating too.
Sure, there are exceptions such as Dell, which gains significant competitive advantage from the way it manages its supply chain, manufacturing and logistics activities, but if we look across industries as a whole, most of the business processes we see are of the non-differentiating kind, for which it makes sense to simply adopt industry best practice rather than reinventing the wheel for the sake of it.<br /><br />So let’s be blunt – if you are not using packaged software for non-differentiating business processes then you are mad. Even if you could build a better general ledger or accounts receivable system than SAP or Oracle, you would not actually have gained anything through doing so in business terms. Of course, the chances are that whatever you came up with would not actually be as good as a package solution that has been tuned over the years in line with industry best practice and the requirements of thousands of customers, so the reality is that you would probably be worse off.<br /><br />Having said this, BEA and others make the argument that traditional packaged applications are relatively closed and monolithic in nature, which is a problem when integrating them into the overall landscape, and when you need to change the processes they support. Even non-differentiating processes need to be modified from time to time for efficiency purposes or to accommodate changes in business structure, merger and acquisition activity, new regulatory requirements, and so on. The argument continues that all of those investments made in the 90s to put ERP and other packages into place have inadvertently locked business processes in a way that is constraining by modern standards.<br /><br />Let's put aside the configurability of most packages for a minute and accept this argument for the purposes of discussion.
The question then becomes, if you have an old monolithic package that is holding you back, what should you do about it?<br /><br />It is at this point that the SOA extremists pipe up with the purist line about packages becoming irrelevant in the future as organisations compose their solutions to meet the exact needs of the business by selecting and plugging together the optimum mix of best-of-breed software components and services. The problem is, that’s a bit like saying that because the majority of components that make up a PC are now standard and can be plugged together easily, there is no longer a need for pre-built machines. It is basically nonsense. Whether you are a business looking to buy or rent software functionality, a consumer looking for a PC, or someone in the market for a new car, you are likely to want a product that has been assembled, integration-tested and delivered as a working unit, with assurances that it can be maintained and supported as such.<br /><br />The move to assembly from standards-based components and the surfacing of standards-based interfaces has benefits in all contexts, however, and in acknowledgement of this, pretty much all mainstream application package vendors are moving in the direction of SOA. Will they get there overnight? Certainly not, but both SAP and Oracle, for example, are investing huge sums in re-architecting their solutions, which is something the purists often fail to acknowledge, or dismiss as just ‘marketing’. The work is real, though, and the aim is to allow standards-based ‘services’ to be exposed for easier integration and for selective substitution where it makes sense. Just like you buy an off-the-shelf PC and perhaps upgrade the graphics card, SOA will allow you to take an ERP package, for example, and ‘upgrade’, say, one of the advanced planning components with a third-party alternative.
<br /><br />So, what we need are flexible packages rather than an abandonment of the package concept altogether.<br /><br /><strong>CONCLUSION</strong><br /><br />The story we heard from BEA this week is a very strong one, though as David points out, not all of the pieces are in place yet. When listening to the pitch around the need for more IT infrastructure flexibility that can support the ever-increasing rapidity of business change, however, it is important to remember that there is still a lot of relatively routine and boring stuff that will remain best dealt with through prescriptive packaged solutions that encapsulate industry best practice. Furthermore, off-the-shelf packages or services (if SaaS catches on) are increasingly going to be based on flexible architectures anyway, and the trend away from customisation to ‘soft configuration’ so packages may be tailored to specific needs without code-cutting is already well underway. The ERP and CRM packages of 2007, for example, are inherently more malleable than those of the mid-90s.<br /><br />In this respect, I would probably on balance disagree with BEA that ‘software package enabled business innovation’ is dead, though I would agree that IT departments should probably be shifting their attention to using the latest infrastructure, tools, techniques and ideas of the kind BEA is promoting to support the ‘differentiating’ elements of the business in as flexible and high-impact a way as possible.
And in some situations, <a href="http://freeformcomment.blogspot.com/2006/10/process-modelling-and-design-dont-get.html">it may not be sensible to even model processes at all</a>, let alone lock them down in software.<br /><br />The world and the technology landscape are definitely changing, but let’s not assume that embracing new ideas always means abandoning the old – or that the traditional stuff is standing still.Dale Vilehttp://www.blogger.com/profile/04136788355130256923noreply@blogger.comtag:blogger.com,1999:blog-7365701000433889040.post-16373124328677365122007-10-01T15:52:00.000+01:002007-10-01T14:08:02.641+01:00Does OpenOffice.org matter to the mainstream?I was clicking around on one of the news sites a few days ago and came across reference to a service portfolio called 'Open Office' that mobile operator Orange launched in the UK earlier this month. It's all about remote working, but it's not the Orange service <em>per se</em> that made me think.<br /><br />Having been doing a lot of research in the desktop computing space recently, I immediately thought of the open source alternative to Microsoft Office, <a href="http://www.openoffice.org/">OpenOffice.org</a>, when I read reports of the Orange announcement. It was then that I realised that the prominence of the term ‘Open Office’ in my mind was almost certainly unrepresentatively high compared to the mainstream. Orange is a branding and marketing savvy outfit, and my guess is that it would not have chosen this term if it felt the term already had a strong enough meaning and association out there to confuse its target audience.<br /><br />The truth is, of course, that it probably doesn’t. We in the IT profession are more likely to be familiar with it because of the ongoing debate about Microsoft dominance coupled with the noise made by open source evangelists to promote alternatives.
But out in the mainstream proper, business people just get on with using Microsoft Office, blissfully unaware of all this – and, in fact, not really that bothered about finding an alternative at all.<br /><br />We came across some pretty clear evidence of this back in April when we conducted some <a href="http://www.theregister.com/2007/04/20/desktop_office_suites/">research</a> in association with <em><a href="http://www.theregister.com/">The Register</a></em> news site. You would think that if anything, this would have been biased towards open source advocates (given the nature of the site), but even here, the message was that use of and interest in MS Office alternatives, whether open source or online (e.g. Google Office), are currently very limited outside of a few niches, with much more sentiment of the kind “if it ain’t broke don’t fix it” than anything else.<br /><br />Whether you think this is good, bad or just doesn’t matter at all is immaterial; that is how it is.<br /><br />From a personal point of view I am torn. While I strongly endorse the idea of competition, I am also a pragmatist, and common sense (and feedback from a few thousand respondents during research over the past few years) tells me that the last thing we need from a business productivity and communication point of view is to fragment the installed base of desktop office tools in a way that introduces compatibility issues. The key issue here is file formats and how they are handled, rather than the capability of the tools themselves (assuming they do the job to the level that is required).<br /><br />On that note, we are currently having fun at Freeform Dynamics with MS Office 2007 <a href="http://en.wikipedia.org/wiki/Office_Open_XML">Open XML</a> files, which are causing some confusion when we inadvertently send one outside of the company. I would imagine anyone adopting <a href="http://en.wikipedia.org/wiki/OpenDocument">ODF</a> would run into the same kind of issues.
The difference between the two, of course, is that the evidence suggests that MS Office 2007 will reach critical mass in the not-too-distant future, pulling Open XML adoption with it, whereas the drivers for ODF adoption are much less clear.<br /><br />Meanwhile, if you have an interest in alternatives to MS Office, whether OpenOffice.org or anything else, common sense says you need to check out whether the most common file formats used by the majority are well handled. At the moment, this means Microsoft Office binary formats (*.doc, *.ppt, etc), but will increasingly mean Open XML too.<br /><br />Right now, ODF looks like a bit of a red herring in the mainstream, especially if Open XML ends up being embraced as a standard, but there are some big advocates looking to promote it (such as IBM), so there is a chance that this may change and it is something we are keeping an eye on.<br /><br />The bottom line, though, is that most IT departments at the moment probably have more pressing priorities than disrupting the MS desktop status quo, which is unlikely to go down well with users anyway.<br /><br />Despite the level of apathy we are picking up, I am going to keep nosing around this whole area, as there are some segments of the market that struggle to justify the licensing fees associated with Microsoft Office tools, e.g. smaller businesses with relatively simple requirements, and my feeling is that there may well be some niches for alternative solutions opening up as time goes on.Dale Vilehttp://www.blogger.com/profile/04136788355130256923noreply@blogger.com
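As a practical footnote to the file-format point above: both Open XML and ODF documents are really just ZIP archives with well-known marker entries inside, so checking which formats are landing in a shared folder is easy to script. The sketch below is a rough heuristic rather than a full format validator, and the classification strings it returns are my own labels:

```python
import zipfile

def office_family(path_or_file):
    """Rough guess at an office document's family by peeking inside its
    ZIP container. Legacy binary formats (.doc, .ppt, etc) are not ZIPs
    at all, so they fall through to the default."""
    try:
        with zipfile.ZipFile(path_or_file) as z:
            names = set(z.namelist())
            if "mimetype" in names:             # ODF stores its MIME type as a ZIP entry
                return z.read("mimetype").decode("ascii", "replace")
            if "[Content_Types].xml" in names:  # Open XML declares its part types here
                return "Open XML"
    except zipfile.BadZipFile:
        pass
    return "legacy binary or unknown"
```

Running something like this over an inbox attachment folder gives a quick census of which formats your correspondents will actually have to cope with, which is a more useful input to the “do we need an alternative suite?” question than any amount of advocacy.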