Sunday 30 November 2008

BLOG TERMINATED (please go to www.openreasoning.com)

Hi Everyone

I have come to the conclusion that maintaining two blogs was a bit ambitious, so I have decided to close this one down and focus on Open Reasoning instead. Please go there for ongoing posts.

You may also be interested in my more formal output in the form of reports and media articles, which can be found here.

Cheers
Dale

Wednesday 20 August 2008

iPhone: First impressions of a Blackberry user

For a while now, I have maintained two mobile phones – one for business use and one for personal use – and in terms of requirements, I need different things in each context.

There are four ‘must haves’ for my business device – good battery life, quick and convenient calling from a large directory of contacts, a solid, immediate and user friendly email capability, and acceptable security.

As a long time Blackberry user, three of these have been ‘givens’ for the past five years, the only compromise in the early days being relatively clunky functionality for making and receiving calls. The Blackberry Curve I am using at the moment, though, delivers well on this front too, so all of my needs for business use are catered for effectively.

On the personal side of the equation, my requirements are a bit different. From a calling perspective, I tend to be dialling from a much shorter list of contacts – tens rather than hundreds – and telephony use in general is much lighter. As a small business owner, there is still a requirement to access business email (as you never know what might need your direct attention), so connectivity to Microsoft Exchange and security are still important. Immediacy and user friendliness of email functionality are less so, however – these just need to be good enough to allow periodic inbox browsing and very occasional replies.

Battery life is an interesting one. When using a device off duty, if I see the juice is getting a little low, I can curtail my usage and prolong the life left in the device. This is generally not an option for business use given the communication intensive nature of the job I do.

Beyond the communication stuff, there is also the recreational side of things – music, games, photography and perhaps a little web browsing. This brings me to the iPhone, and when I was looking for an upgrade to my personal device a few weeks ago before going on holiday, I felt obliged to check out this option.

Like most people who pick up an iPhone for the first time, it immediately felt quite natural, and it is the first device I have used that appeared to deliver a genuinely usable full web browsing experience – at least when connected to WiFi in the O2 store. When digging a little deeper, the Microsoft Exchange access seemed pretty well covered, the device was pin-securable with remote wipe capability, and the embedded iTunes, GPS enabled mapping, etc looked great. The only thing that seemed a little naff was the camera spec, though I figured it was probably good enough for snapshots of the kids, dog, etc.

So, I succumbed, and signed up for an iPhone on the basis that it seemed to do most of the things I wanted. But has it lived up to expectations?

Well three weeks on, I have to say that I still really like the iPhone and am pleased that I went for it. It has stood the test of real life use and quite a bit of experimentation over my recent 2-week holiday. It functions OK as a phone and call quality seems pretty good. The music playback quality is also good, especially when compared to my iPod Nano. Beyond this, there’s a reasonable number of games available to keep me amused, and, as suspected, the camera is actually OK for family snapshots, though, unsurprisingly, no good for ‘proper’ creative photography.

However, the iPhone is far from a perfect device. The most immediate problem I ran into was battery life. Perhaps optimistically, I started out running with all of the defaults – relatively bright screen setting, 3G enabled, GPS switched on, email delivered from our Exchange server through the ‘push’ mechanism (similar to Blackberry), etc. After returning home a couple of times at the end of a day out with the battery almost exhausted (with relatively light use), I suspected a little tuning was in order.

Fortunately, the iPhone allows you to switch off 3G access with the flip of a soft-switch, leaving the device running purely on the GSM network with data access over GPRS or EDGE. This improves battery life considerably, and as I don’t do much browsing when out and about, I have left it this way, figuring I can always enable 3G again for short periods when I really need to. The other adjustment that seemed to extend battery life significantly (apart from the obvious move of winding down the screen brightness) was disabling the push email mechanism and setting the device to poll the Exchange server every hour instead. Again, this adjustment can be made through soft-switch flicking, allowing the polling frequency to be set to every 15 minutes, 30 minutes or whatever, with more frequent polling clearly consuming more power.
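
As a purely illustrative aside, the arithmetic behind the polling trade-off is easy to sketch. The figures below are assumptions I have picked to make the point, not measured iPhone data:

```python
# Back-of-envelope only: waking hours and intervals are illustrative
# assumptions, not measured iPhone data.

WAKING_HOURS_PER_DAY = 16  # assume the phone is in active service 16h/day

for interval_min in (15, 30, 60):
    checks_per_day = WAKING_HOURS_PER_DAY * 60 // interval_min
    print(f"Poll every {interval_min} min -> ~{checks_per_day} mail checks/day")

# Push, by contrast, holds a data connection open continuously, so its
# radio cost scales with elapsed time rather than the number of checks.
```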

Interestingly, enabling and disabling WiFi doesn’t appear to make a huge difference, so because of the convenience of the iPhone automatically hooking onto my home network when I arrive at the house, and discovering hotspots when out and about, I have tended to leave WiFi switched on.

As a disclaimer, I have to say that my tests have not been that scientific, in that I have just been making adjustments in an attempt to find a configuration that works for me while I get on with my life. Unlike a lab test, no two days’ usage has been exactly the same, so what I am picking up here are gross differences in performance. That said, the one conclusion I have come to is that limited battery life, coupled with the inability to swap batteries when the power runs out, makes the iPhone far from ideal for heavy business use on the basis of the power issue alone.

And while I wasn’t explicitly evaluating the device for business use, there are some other things that would cause me concern in this context. Apart from the widely reported lack of cut-and-paste capability, I noticed some quirks associated with the email client, for example, which make it difficult to do some things offline, and which require zooming and horizontal panning to read some email messages that refuse to word-wrap. Then, while I was pleasantly surprised at how usable the soft touch-screen keyboard was for casual text entry, I cannot imagine ever getting to the level of speed and unconscious use that comes naturally with a device that has a decent physical QWERTY keyboard. This may not be a concern for many, but it is a major consideration for me, as I tend to use mobile email very interactively for business purposes.

In terms of issues from a corporate adoption perspective, others may also be concerned about lack of data encryption on the device itself, but with a sealed unit, pin access and remote wipe capability, if you take a common sense approach to assessing risk, there is probably not a huge security exposure for most business users.

When all things are considered, I would say the iPhone comes nowhere near devices such as the Blackberry Curve or 8800 series in terms of business fitness for purpose, particularly for heavy mobile data users. As a predominantly personal device, however, it is a great example of where mobile technology is going, and as I said, I am very pleased with the overall package.

As an industry analyst, I should probably grumble at the closed business practices of Apple itself in terms of controlling the distribution of content for the iPhone, but when I then think about the convenience and ease of use for a non-technical user, I can see that there is also an upside to controlling things end to end for mass market consumer adoption.

So, the bottom line is that based on my initial impressions, I would not discourage anyone from buying an iPhone for personal use, but I would urge them to think about their requirements and do the appropriate due diligence before investing in the device for business use. As for large-scale deployment in a business environment for hard-core mobile requirements, I am not sure the device is yet ready in its current form, though if anyone has any experience to the contrary, I would love to hear from them. How do you rate the iPhone from a policy management, software distribution, maintenance and end-user support perspective for example?

Whatever the current situation, the end-user appeal of the iPhone will ensure that it makes its way into many businesses one way or another, and with Nokia, Microsoft, Palm and others already challenging RIM on fitness for purpose, we can look forward to an interesting couple of years as it all shakes out.

Tuesday 15 July 2008

Desktop power management

It’s encouraging that many of the conversations we are having at the moment in relation to IT and sustainability are moving beyond power management in the data centre. It is not that optimising the use of central IT isn’t important, but it really is only one way to drive an organisation’s environmental agenda. And even before we get to the main question of how technology can enable more eco-friendly working practices, there is another place we can look to for operational IT power savings – the desktop.

When looking in this direction, though, I have noticed a tendency to apply the same kind of thinking that is used on the server side of the equation. Fair enough, accelerating hardware refresh to introduce more power-efficient kit is a similar game to the one being played in the data centre, but once the carbon cost of manufacture and disposal is taken into account, the net gains are hard to establish. In the data centre, of course, hardware modernisation is augmented by consolidation and virtualisation to drive up average server utilisation and thus improve energy efficiency.

Virtualisation is a different game on the desktop, however. Sure, some will go down the route of running virtual PCs on the server and accessing them through thin client configurations, but it will be a long time before this is the norm. The reality is that most organisations will remain wed to their fat clients for the foreseeable future, so we need to think of the energy question a bit differently. Essentially, the challenge boils down to optimising the power consumption of desktop machines that typically idle for the majority of time they are switched on.

In order to deal with this problem, we need to think less about utilisation and the inherent power efficiency of hardware and software, and more about controlling the state of machines in terms of their sleep/wake cycle. In practice, a configuration that exhibits a high degree of runtime energy efficiency but has no active policy to transition to a low-power state when idle will consume considerably more power than a less efficient machine whose state is properly managed.
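
To put some rough numbers on this, here is a minimal sketch comparing the two scenarios. All wattages and usage hours are assumptions chosen to illustrate the principle, not measurements:

```python
# Illustrative annual energy comparison -- all wattages and usage hours
# are assumptions chosen to make the point, not measurements.

def annual_kwh(active_w, sleep_w, active_hours_per_day):
    active_hours = active_hours_per_day * 365
    sleep_hours = (24 - active_hours_per_day) * 365
    return (active_hours * active_w + sleep_hours * sleep_w) / 1000.0

# Efficient machine (60W) left fully on around the clock
never_sleeps = annual_kwh(active_w=60, sleep_w=60, active_hours_per_day=24)

# Less efficient machine (90W) that sleeps at 4W outside a 9-hour day
managed = annual_kwh(active_w=90, sleep_w=4, active_hours_per_day=9)

print(f"Efficient but never sleeps: {never_sleeps:.0f} kWh/year")  # ~526
print(f"Less efficient but managed: {managed:.0f} kWh/year")       # ~318
```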

This is something that Microsoft makes a big point of when talking about Vista in the green context, and indeed early adopters with large Vista estates corroborate Microsoft’s claims that Vista’s enhanced manageability translates directly into power savings. The problem, however, is that Windows XP isn’t going away in a hurry, so what about all of those organisations who are interested in desktop power management but will be maintaining older versions of the operating system for some time to come?

Well the one approach that is generally acknowledged not to work that well is to educate, encourage or threaten users in an attempt to get them to keep their power configuration set in accordance with environmental policy, and/or to manually shut down their PCs or put them to sleep when they are not in use. IT managers relying on this kind of user discipline are probably not going to see the results they were hoping for unless they’re working for a totally green-tinted organisation.

Fortunately, third party solutions exist that can help to enable/enforce centralised power management – a couple of examples being Verdiem and 1E. Using such technology, you can not only cure PC insomnia from a policy enforcement perspective, but also allow real-time remote control of power state so machines can be woken up for backup or software distribution purposes then put to sleep again afterwards. So, if you are serious about saving energy across a large XP estate, the options are there.
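
For the curious, the remote wake-up part of this typically builds on the standard Wake-on-LAN ‘magic packet’ mechanism. Here is a minimal sketch of the idea in Python, using a hypothetical MAC address; real products layer policy, scheduling and reporting on top of something like this:

```python
import socket

def wake_machine(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Send a standard Wake-on-LAN magic packet: 6 bytes of 0xFF followed
    by the target MAC address repeated 16 times, over UDP broadcast."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, (broadcast, port))

# Hypothetical MAC address: wake a PC ahead of an overnight backup window
wake_machine("00:11:22:33:44:55")
```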

Something I haven’t had time to look into is whether similar solutions exist for alternative desktops – namely Mac OS X and Linux. Apple kit is certainly not renowned for its enterprise management friendliness, but perhaps ‘right on’ Mac users aren’t so much of a problem as they are of course more environmentally aware. As for Linux, I would be interested in any views, recommendations or experiences.

Meanwhile, it would be great to see a bit more awareness raising from Microsoft on the availability of solutions to centrally manage power consumption by Windows XP, rather than automatically segueing from this discussion into a Vista upgrade pitch.

Friday 27 June 2008

Justifying a large scale Vista migration

Over the past couple of months, I have had in-depth conversations with five CIOs that have made a significant commitment to Windows Vista.

One of the main issues I explored with each of them was the foundation upon which the business case for migration was made. The responses I received were remarkably consistent, and not completely in tune with the way Microsoft articulates the Vista proposition.

What all these guys said was that their business case for Vista, i.e. the one put before the board, CFO and/or other significant stakeholders, was founded on benefits in two key areas - security risk management and operational cost control.

From a security perspective, the focus tended to be on three specific attributes of Vista - better run-time security in the operating system itself, more effective policy enforcement, and the ability to encrypt data on notebook PCs through BitLocker.

What I found interesting was the view that while all three of these security related benefits were considered to be significant, it was the last one in particular that was most frequently highlighted as resonating directly with business stakeholders. Recent high profile press coverage about notebooks storing sensitive data being lost or stolen was seen to have an influence here in terms of awareness. Against this background, Vista’s ability to deal with an acknowledged business risk straight out-of-the-box was perceived to be of significant value.

Beyond security, double-digit reductions in operational cost generally formed the substance of the business case in financial terms. The general streamlining of the management and maintenance process was highlighted as part of this, and the dramatic simplification of image management in particular was seen as a significant contributor to the savings in the large multi-national environment.

Something I was personally very sceptical about, but which three of the five CIOs defended very strongly, was the savings in relation to desktop power consumption. Numbers from 50 Euros per year per desktop upwards were cited as savings, though to be absolutely clear, the benefit comes from better centralised control and enforcement of power management policies rather than efficiencies in the way Vista uses hardware resources.

When asked about the element that was clearly missing from these business cases, namely improved user productivity, the general consensus was that this was a red herring. The most positive view was that there is likely to be some impact in this area, but it is impossible to measure in any tangible way, so why would you dilute an otherwise solid business case with something that could easily discredit it? Best to stick the list of intangibles in your bottom drawer and run with what you can defend with confidence.

And it is on this point that the CIOs I have been speaking with diverge from the view articulated by Microsoft. In fact, one said that the obsessive references to the great user interface, user-facing productivity features, etc caused a lot of distraction and confusion when he invited a Microsoft executive to meet some of his business sponsors. When a stakeholder says, “I don’t understand, I thought we were doing this to save money”, it doesn’t actually help to get the investment case signed off.

There are a couple of lessons that fall out of this. Firstly, if you are going through the process of evaluating the business case for Vista yourself, the abovementioned criteria will hopefully provide some thoughts based on where at least a few others have put the emphasis – particularly in a large corporate or public sector environment.

Secondly, the feedback suggests that you should be prepared for business sponsors to get confused about the rationale for migrating, based on the messages broadcast by Microsoft both directly and indirectly through advertising, the media, marketing collateral, etc. The trick here is to agree that it will be a great spin-off benefit if all of the claimed or suspected end user productivity gains are realised, but to keep the investment case itself focused on the more solid stuff that can be defended under cross-examination.

Finally, there is a message in here for any Microsoft executives reading this. If you can curb your enthusiasm for obsessing about the Wow! and focus on the things that drive decisions, you might see more movement in the market.

Thursday 12 June 2008

Business Intelligence and the bolting horse

There appears to be a revival of interest in Business Intelligence (BI) among IT vendors at the moment. Some pretty big guns – the likes of Oracle, IBM, SAP and Microsoft – are trying to position themselves more aggressively in this space following the recent spate of acquisitions.

So is this renewed vigour justified?

Well from a customer perspective it undoubtedly is. It is pretty clear when you research BI that the gap between business need and IT capability is as great as ever. When we interviewed a bunch of senior business managers from City of London financial institutions last year, for example, they were very clear about this gap:

[Chart: business managers’ ratings of the availability of business information, from overall financial and operational performance down to more detailed measures and indicators]

And if you look at this chart closely, you will notice something quite interesting. While business information availability isn't that bad at an overall financial and arguably operational performance level, it is not very good when you look at more detailed measures and indicators.

Why is this interesting?

Well because it tells us that by the time those managing the business find out about something important, it is often too late to do anything about it. Stories of product, client or partner related issues only coming to light when someone starts investigating why a higher level number has been missed are quite common.

To put it another way, business managers usually have what they need to monitor the ‘effects’ of doing business, but are typically underserved when it comes to the information required to manage the underlying ‘causes’ of those effects. We discuss this more in the research report from the study if you are interested, but it does bring home the importance of incorporating continuous analytics capability into the business process itself, as well as having traditional retrospective BI operating off to one side.
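
To make the ‘cause versus effect’ distinction concrete, here is a minimal sketch of the embedded analytics idea. The metric names and threshold are invented purely for illustration:

```python
# Sketch of in-process ('cause') monitoring versus retrospective
# ('effect') reporting. Metric names and the threshold are invented
# purely for illustration.

CANCELLATION_ALERT_THRESHOLD = 3  # per client per week (assumed)

weekly_cancellations = {}

def record_cancellation(client):
    """Called from inside the business process as each event occurs."""
    weekly_cancellations[client] = weekly_cancellations.get(client, 0) + 1
    if weekly_cancellations[client] >= CANCELLATION_ALERT_THRESHOLD:
        # Surface the 'cause' immediately, while something can still be done
        print(f"ALERT: unusual cancellation rate for {client}")

def quarterly_revenue(transactions):
    """Traditional retrospective BI: accurate, but often too late."""
    return sum(transactions)
```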

The aforementioned vendors are therefore spot-on when it comes to making a big noise about the principle of integrating BI capability into applications in a more embedded fashion. Now, whether they have done a good job of integrating their recent acquisitions into their broader solution set in practice is another question, but it is at least worth hearing them out.

Sunday 1 June 2008

Talking at cross purposes, or being deliberately misled?

Ever had one of those conversations where you debate something for a while then it dawns on you that each party has been talking about something different? It has happened to me quite a few times recently.

One example was in relation to Business Process Modelling (BPM), which is something I grew up with and which, in my mind, is about, well, modelling business processes. It’s a discipline that business analysts have been involved with for years, and while the technology to support it has moved on, and arguably some of the methodologies too, the fundamental principles haven’t changed much for a long time now. Then someone asked Freeform Dynamics to design a research study to figure out the level to which organisations had adopted BPM. When I argued during an internal project start-up meeting that you couldn’t really ask someone about when and how they were taking something on board that they had been doing for a decade or two, it turned out that the ‘BPM’ we were being asked to investigate was actually 'Business Process Management', based on a definition which included the technical side of things – workflow rules engines, SOA orchestration, and so on. Not quite the technology-independent business view of BPM that I was taught earlier in my career, but as soon as the misunderstanding was cleared up, we could design the research accordingly.

Another example was prompted by a report I read the other day claiming that Software as a Service (SaaS) is now a mature and pervasive model. This was reminiscent of claims made during a number of other conversations I have had recently with SaaS advocates, which I have been struggling to reconcile with the findings of our own research. The latter has shown quite conclusively that while larger organisations are starting to make selective use of SaaS for delivering business application functionality, 'pervasive' is certainly not a word that applies in this area. Then I realised that some of the advocates were throwing a whole bunch of stuff into their definition of SaaS (or the related S+S model) that I would never dream of including when discussing the delivery of business application functionality. Internet search, traditional ISP services, and even things like consumer content services, online help and automatic updates associated with desktop software can sometimes be lumped together when referring to the 'SaaS market'. Again, once the ambiguity is cleared up, you can see where people are coming from, and make a judgement on the usefulness (or otherwise) of what they are saying.

I guess we at Freeform are particularly sensitive to precision when it comes to discussing market activity, as primary research designed to figure out what’s really going on behind the buzzwords and the hype is so central to what we do. The experiences I have outlined, however, highlight how easily people can be misled by imprecise or ambiguous definitions if they are not on their guard. And with so much vested interest and evangelism driving the market, the temptation for some to spin and exploit our ever-changing vocabulary is significant, so we all need to be careful about what is behind those stats and definitions.

Sunday 27 April 2008

Cloud Computing and Web 2.0

Don’t you just hate it when another woolly, ambiguous term is forced upon us? When I was approached by yet another journalist the other day asking for my thoughts on the impact of cloud computing, I simply sighed and told them it is a bit like Web 2.0: difficult to pin down exactly what is meant by it. The best you can do is say that both of these terms refer to a general direction in which the industry appears to be moving.

In the case of Web 2.0, it is about the Web becoming a generally more interactive medium. This can manifest itself at a technology level through everything from Ajax through mash-ups to SOA, and at a behavioural level through social media and the simple fact that websites are generally now more geared up to a two-way dialogue than they used to be.

In the case of cloud computing, it is about the evolution of dynamic virtualised infrastructure that allows us to think more in terms of resource pools than individual IT components. This in turn opens the door to delivering computing resource on a utility basis, which is equally applicable both internally (i.e. with regard to the way you use your data centre) and externally – which takes you into the realm of utility computing and software as a service.

The point about both Web 2.0 and cloud computing is that they both sprang up arbitrarily on the evolutionary timeline, and seemingly embraced anything and everything that could be thrown into the mix. While the very specific phenomenon of social networking is certainly noteworthy, it bears little relationship to the evolution of rich user interfaces and composite applications; in fact, many social networking sites have appalling UIs by traditional standards. Yet Web 2.0 can mean either of these things, and, confusingly, lots of other concepts too.

Similarly, we have been talking about virtualisation ultimately leading to computing grids and utility computing for years, and giving it a new name doesn’t actually change anything in terms of the underlying trend. In fact, you knew where you stood much better when you could talk about virtualisation and grid technology as the enabling stuff, and utility computing and application services as what it enables. As everyone jumps onto the cloud computing bandwagon, it all gets mixed up and confused, just like Web 2.0.

So, if you are one of those people wondering what cloud computing is really all about after listening to the IBM explanation, the Microsoft one, and the evangelical rhetoric we have heard recently from the Google and Salesforce.com camps, don’t worry, you are not alone. The trick is to think of it as a label for a trend at one level, and an industry bandwagon at another, and keep your expectations pretty low in terms of clarity and consistency for the time being. Don’t, however, dismiss the underlying trend itself. While we are not looking at a revolution here, some of the developments in this general area are really quite interesting and valuable – though you probably knew that already, even before the marketing hype was thrust upon us.

Monday 14 April 2008

Oracle and Collaboration

I was interested to read about Angela’s experience trying to secure a briefing from Oracle on its collaboration related offerings and activities. As Angela pointed out, the ‘Big O’ was the only large vendor that ‘should’ have a story in this space but declined to tell her what it was up to.

When I later commented on this (with a link to the above) via Twitter, someone else came back to me to say that they too had been having trouble getting Oracle to open up in this area.

I have to say that this doesn’t surprise me. It must be quite challenging for Oracle at the moment trying to figure out how to position itself in this space. The Oracle Collaboration Suite was launched a few years ago supposedly to save the world from flaky Microsoft Exchange installations, and pretty much fell flat. Oracle believed its own rhetoric about the world hating Microsoft, so it looked silly to most people when it aggressively launched an initiative that would only work if customers ditched their existing Microsoft messaging infrastructure, which was never going to happen.

In addition to some of the things Angela mentioned, we have also seen the portal wars in which Oracle has consistently been on the back foot, and lately, the march of Microsoft SharePoint and a range of collaboration and unified communications offerings from IBM under the Lotus and WebSphere brands that are largely messaging system agnostic.

Then most recently, we have seen the BEA collaboration offerings thrown into the mix, which, before the acquisition, were beginning to look pretty good. BEA had a very sound grasp of the heterogeneous world in which customers live and was taking a very mature view of social media in the enterprise, for example. And, of course, it wasn’t encumbered by competitive obsession, which, as an aside, is arguably one of the biggest obstacles to Oracle being accepted as a truly strategic partner in many major accounts. Telling CIOs and business executives that they have been stupid over the years to waste their money on SAP, Microsoft and IBM, for example, is not the best way to win friends in high places. While competition is good, destructive messaging generally only appeals to junior level activists. It is a huge turn-off in senior management circles.

Coming back to the original question, we should probably continue to expect Oracle to be tight-lipped on not just collaboration, but middleware strategy in general for a little while yet. I have personally been told on a couple of occasions to refer to the ‘official line on oracle.com' when looking for clarity on open questions that we hear from Oracle’s customers (old or newly acquired). Irritating though this might be, and frustrating though it is to be fobbed off with ‘Mom and Apple Pie’ type feel-good policy statements, the truth is that there is little else Oracle can do until it gets its act together properly.

And to be fair, given some of the confusion that came about as a result of articulating nice-sounding stories around work-in-progress plans associated with its CRM and ERP acquisitions in the past (plans that later had to be ‘adjusted’), it is probably better for us to hang on until Oracle really has worked out what it is trying to do in collaboration, as it has in the enterprise application space.

Oracle is undoubtedly already aware that it needs to be careful that the collaboration and closely related unified communications markets do not slip away from it, and it will be doing what it can to make sure it doesn't get left behind again. In the meantime, it goes without saying that customers should challenge the company hard before making major commitments to it in these areas.

Friday 28 March 2008

Making chipsets interesting

At the risk of offending all those who love to talk for hours about cores, caches and clock speeds, I have to say that I personally find discussions about the innards of silicon chips and how they are wired together intensely boring. In fact, I’ve probably already used all the wrong words and phrases, even in that first sentence, which is no doubt going to annoy some people further.

So, when Tony, Martin and I were invited to a dinner to meet with some of AMD’s European executives, I was understandably in two minds about attending, especially as I am not really into all this wining and dining stuff as some other analysts are.

I went along, though, and I’m glad I did. Sure, I found myself sucked into the odd eye-glazing conversation that I only partially understood, but something that came across clearly was that AMD is investing quite a bit in ‘reaching through’ relationships with its direct customers (largely the OEMs) to the ultimate customers – enterprises, SMBs and consumers.

Of course there is nothing new or unique in this; in fact, I ran a team at Nortel Networks back in the early 2000s which did exactly the same thing (in that case, reaching through the mobile operators to understand how 3G related to their subscribers). The basic idea is that you can gain insights and tune your R&D based on direct end user/buyer input that would not be possible if you worked second hand through your customer as an intermediary. To do this well, however, you really need people who understand that end user environment and the trends taking place within it, and they are not necessarily the same people who deal with your core product design from an internal perspective.

Anyway, this end-user oriented view of the world shifted discussions to more familiar territory for me during the dinner, and I enjoyed hearing people like Giuseppe Amato, who goes under the title “Director, Value Proposition Team”, explaining how the whole process works in relation to data centre evolution, high performance computing and mobile working. It changed my perception of AMD quite a bit from simply “the alternative to Intel” to that of an independent player that is committed to driving industry development in its own way.

While I am not qualified to comment on the relative merits of AMD technology versus the competition, nor its ability to execute in the cut throat world of OEM deals and supply chains, I now have a much better appreciation of why what AMD does actually matters. It is not just about price/performance or performance per watt of energy consumed, it is about shifting thresholds to make things economically or practically possible in the mainstream market that previously were not. That’s why the “what if you could....?” conversations with end customers as suppliers like AMD reach through to them are so important. And also why, for the first time in my life, I actually had some genuinely interesting conversations about silicon that were directly relevant to the world in which I live.

Wednesday 12 March 2008

Downgrading from Vista to XP

I blogged a while back on how a Vista upgrade effectively rendered my old desktop machine useless for business purposes (see Retiring Leonardo from last year). I got a lot of feedback at that time as many people out there were obviously trying to get a handle on the viability of upgrading older kit.

While this debate continues, the related question has now arisen of whether even some PCs pre-installed with Vista are capable of running it adequately. Based on my own experience, this is a very pertinent question to ask if you are considering buying anything with less than a 1.8GHz Core2 Duo processor with 2GB of memory - the current minimum spec I work on for serious business use. Yet there are lots of Vista machines out there on the market that are significantly less powerful than this.

Without getting into the rights or wrongs of this state of affairs, if you are unlucky enough to be struggling with Vista on a lower spec machine, you may be interested in a recent experience I had which was a bit of a wakeup call – not just in terms of the physical performance side of things, but also on the broader question of the value of Vista from an end user perspective in a business environment.

A few months ago, I needed to replace my notebook. As a notebook to me is companion to my desktop rather than my main machine, I wasn’t looking for anything very powerful – size, weight and battery life were much more important considerations. So, after a happy couple of hours cruising up and down all of the hi-tech shops in London’s Tottenham Court Road trying all the latest kit, I opted for a Sony TZ Series – about 1.2 kilos in weight, fantastic screen, reduced size but really nice keyboard, embedded cellular modem, and lots of other good stuff.

The machine came with Windows Vista Business Edition pre-installed, and when I was playing with it in the shop it was pretty responsive – the 1.2GHz Core2 Duo processor seemed to be up to the job. When I got the machine back to the ranch and loaded everything onto it, though, I have to admit to being a little disappointed with the speed. Nevertheless, it was good enough, so I just got on with using it.

Over the course of the next four months, however, the performance gradually degraded and the user experience became awful. It eventually got to the stage where it was taking 12 minutes to boot and about 6-7 minutes to shut down, with very sluggish performance in between and frequent hangs requiring a forced shutdown (which in itself was probably making matters worse).

When researching the problem on the Web, it was clear that I was not the only one to be experiencing issues with Vista on the TZ Series, and the more I read, the more the answer to my problems became obvious – ‘downgrade’ the machine to Windows XP. A few forum entries mentioned a kit on the Sony website designed to allow you to do this, with all of the relevant drivers and utilities, and a set of instructions to guide you through the process. I duly downloaded this, followed the instructions, and it just worked. The longest part was installing and patching XP itself (which you have to buy separately, by the way – your Vista licence doesn’t cover it ** See clarification below) .

The end result is fantastic. The word ‘downgrade’ seems totally inappropriate – in fact, it feels like the machine has gone through a significant upgrade. It now boots in well under 2 minutes (with all the same applications loaded as before), is highly resilient (has gone through a lot of sleep/wake cycles without crashing once) and, interestingly, many of the Sony utilities work much more naturally (I suspect they were designed for XP in the first place then ported to Vista).

The one thing I was a bit worried about was going back to XP from a usability and functionality perspective having got so used to Vista, but I was surprised to find that the experience was actually quite a positive one. Everything seemed more crisp, immediate and uncluttered and so far, the only thing I have missed is the enhanced application switching mechanism in Vista, i.e. the Alt-Tab and Windows-Tab functionality. That’s a minor sacrifice for the other benefits, though, and it only took me an hour or two to get used to the old mechanism again.

The switch back to XP was such a breath of fresh air that I have also ‘downgraded’ the desktop machine I am using at the moment. On a reasonable spec PC you don’t see the same increase in actual performance, but the XP interface still feels a lot cleaner and snappier (at least to me). Having both machines running the same OS obviously has its advantages too.

Now before everyone goes rushing out to downgrade their Vista machines based on this little story, it would be irresponsible of me not to point out that during my research, I read accounts from many happy Vista users, many of whom seemed to be getting on fine with the TZ and similarly spec’d machines. I would suspect the number and range of applications you work with has a bearing on this – remember I said that the TZ felt fine when I was just playing with the OS with no applications installed before buying it. It could also, of course, be that people just accept the out-of-the-box experience as normal and don’t really question whether they are getting the best performance from their hardware. All I can say is that the downgrade was definitely the right thing for me, and is something to consider if you find yourself in a similar situation.

In the meantime, we continue to experiment with various desktop options here at Freeform Dynamics, and those looking at alternatives may be interested in a post from my colleague Jon Collins entitled Why I’ve replaced Vista with Linux.

Finally, as I type this, I have a brand new MacBook sitting next to me here on my desk, and over the coming few weeks I am going to be looking at the practicalities of using the Mac in a Windows dominated mainstream business environment, so watch this space for experiences with that.

** Clarification re licensing terms: The right to downgrade Vista depends on which edition you have. Vista Ultimate and Business may be downgraded within the terms of the Microsoft EULA at no additional cost, but this right does not apply to other editions of the software.

Thursday 31 January 2008

Are your IT staff adequately trained?

An interesting finding emerged from one of our recent studies into IT Service Management (ITSM). It concerns a cause and effect that is pretty obvious once it is highlighted. Put simply, IT departments operate much more smoothly and efficiently if IT staff are adequately trained.

The data, which is derived from over 1,100 responses to an online survey, is difficult to argue with. There is a clear relationship between the attention paid to IT staff training and the perceived level of burden experienced by IT. To put it another way, properly trained staff find it easier to cope with the demands placed on them in areas such as infrastructure optimisation and management to keep service levels up and costs down, effective maintenance of desktops to manage user satisfaction and keep security risks under control, and provision of helpdesk services to meet user expectations with regard to support.

What’s more, the relationship between training and operational efficiency and effectiveness is a linear one. What does that mean? Well, it doesn’t really matter whether training requirements have been neglected, whether the organisation already has its act together, or whether it’s somewhere in between – indications are that incremental training will always have a positive impact. To put this into perspective, another finding from the same report was that investment in other areas, such as systems management automation and integration, does not deliver benefits in the same linear fashion. Essentially, you need to get past a threshold of capability before significant improvements are generated.
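
As a purely illustrative way of picturing the contrast, the two response curves might look something like this. The shapes are assumptions chosen to visualise the finding, not functions fitted to the survey data:

```python
# Purely illustrative: the curve shapes are assumptions to visualise
# the finding, not functions fitted to the survey data.

def training_benefit(spend):
    """Linear response: every increment of training investment pays back."""
    return 0.8 * spend

def tooling_benefit(spend, threshold=5.0):
    """Threshold response: little visible benefit until a capability
    level is passed, then significant gains."""
    return 0.0 if spend < threshold else 1.5 * (spend - threshold)

for spend in (1, 3, 5, 7, 9):
    print(spend, training_benefit(spend), tooling_benefit(spend))
```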

There are some interesting lessons in here for all organisations, but particularly those that have a tendency to skimp on investment in skills development. If this study is anything to go by, such an approach is clearly a false economy. In fact, if you have anything to do with running an IT department that is underperforming on IT service delivery and operational efficiency, then the first port of call when looking for improvements should probably be staff development. While upgrading your systems management tools and technology may also be a necessity, investment in this area will take time to pay back. Meanwhile, a bit of additional training at a fraction of the cost is likely to have a much more immediate impact.

Oh yeah, and the study also quite clearly shows that training end users can have a similar impact, reducing the burden placed on IT in areas such as desktop management and help desk delivery. The basic principle here is that adequately trained users encounter (and create) fewer problems, and when problems do occur, they are much better placed to sort themselves out.

There’s a lot more to this research than the stuff we have been talking about above, so if you’d like to learn more, you can download a full copy of the findings from here. And if you’re interested in a companion report looking at the future of IT Service Management (ITSM) in general, you can download that from here.

Friday 25 January 2008

The customer view of BEA’s acquisition by Oracle

When the BEA Oracle deal was finally announced last week, my first instinct, like many analysts and journalists I would guess, was to rush to the keyboard and bash something out. But what was there to be said that hadn’t already been covered? After re-reading my previous post on the topic, I didn’t have a great deal more to say at that point.

So, instead of writing a blog post, I composed a little questionnaire and reached out to Oracle and BEA customers through an online survey to capture opinion where the rubber meets the road. In a very short space of time, I gathered nearly 300 responses, including a lot of freeform feedback. I then spent an interesting few hours reading through and categorising people’s views, which is the part of this job I really enjoy. Gathering statistics through tick and bash surveys is one thing, but reading a few hundred comments in which a bunch of smart people tell you what they think in a totally unconstrained manner is a great way to get under the skin of a topic.

In this case, I quickly uncovered a bunch of angles on the BEA acquisition that I hadn’t previously considered. Here is a quick summary of the themes, both positive and negative, that I managed to pull out (ranked in order of frequency of mention):

Reasons given for why the acquisition is bad news
1. Reduced choice and competition in the market
2. Uncertainties for customers with existing product investments
3. Loss of innovation, Oracle will smother the goodness of BEA
4. Concerns about Oracle as a supplier (style and nature)
5. Increased cost for BEA users (particularly maintenance)
6. Fear of lock-in as Oracle optimises between stack components

Reasons given for why the acquisition is good news
1. A stronger and more mature solution will emerge (eventually)
2. Rescue of good technology from a company that had lost its way
3. Creation of stronger and more credible competition for IBM
4. Better synergy between BEA technology and Oracle RDBMS, tools, etc
5. Reinforcement of distinction between commercial offerings and OSS
6. More integrated approach to customers and account management

Even though a lot of these are pretty obvious, I’m sure most people looking at this list will spot a couple of angles that they hadn’t previously thought of, and if you are a customer trying to work out the impact of the acquisition, then this probably isn’t a bad starting point for assessing the balance between risk and opportunity in what is actually quite a complex situation.

Of course we also gathered some stats, and I’ll throw in a chart here that illustrates the overall sentiment.

[Chart: Oracle and BEA customer survey – overall sentiment towards the acquisition]

So, the initial reaction to the acquisition, while mixed, is definitely net negative.

Anyway, if you are interested in a drill-down on the above chart broken down by customer type (BEA versus Oracle versus joint customers), along with a fuller discussion of the findings, you can check out the more complete analysis I put together here or here.