At the risk of offending all those who love to talk for hours about cores, caches and clock speeds, I have to say that I personally find discussions about the innards of silicon chips and how they are wired together intensely boring. In fact, I’ve probably already used all the wrong words and phrases, even in that first sentence, which is no doubt going to annoy some people further.
So, when Tony, Martin and I were invited to a dinner to meet with some of AMD’s European executives, I was understandably in two minds about attending, especially as I am not really into all this wining and dining stuff in the way some other analysts are.
I went along, though, and I’m glad I did. Sure, I found myself sucked into the odd eye-glazing conversation that I only partially understood, but something that came across clearly was that AMD is investing quite a bit in ‘reaching through’ relationships with its direct customers (largely the OEMs) to the ultimate customers – enterprises, SMBs and consumers.
Of course there is nothing new or unique in this; in fact, I ran a team at Nortel Networks back in the early 2000s which did exactly the same thing (in that case, reaching through the mobile operators to understand how 3G related to their subscribers). The basic idea is that you can gain insights and tune your R&D based on direct end user/buyer input that would not be possible if you worked second-hand through your customer as an intermediary. To do this well, however, you really need people who understand that end user environment and the trends taking place within it, and those are not necessarily the same people who deal with your core product design from an internal perspective.
Anyway, this end-user oriented view of the world shifted discussions to more familiar territory for me during the dinner, and I enjoyed hearing people like Giuseppe Amato, who goes under the title “Director, Value Proposition Team”, explaining how the whole process works in relation to data centre evolution, high performance computing and mobile working. It changed my perception of AMD quite a bit from simply “the alternative to Intel” to that of an independent player that is committed to driving industry development in its own way.
While I am not qualified to comment on the relative merits of AMD technology versus the competition, nor its ability to execute in the cut-throat world of OEM deals and supply chains, I now have a much better appreciation of why what AMD does actually matters. It is not just about price/performance or performance per watt of energy consumed, it is about shifting thresholds to make things economically or practically possible in the mainstream market that previously were not. That’s why the “what if you could...?” conversations with end customers, as suppliers like AMD reach through to them, are so important. And also why, for the first time in my life, I actually had some genuinely interesting conversations about silicon that were directly relevant to the world in which I live.
Friday, 28 March 2008
Wednesday, 12 March 2008
Downgrading from Vista to XP
I blogged a while back on how a Vista upgrade effectively rendered my old desktop machine useless for business purposes (see Retiring Leonardo from last year). I got a lot of feedback at that time as many people out there were obviously trying to get a handle on the viability of upgrading older kit.
While this debate continues, the related question has now arisen of whether even some PCs pre-installed with Vista are capable of running it adequately. Based on my own experience, this is a very pertinent question to ask if you are considering buying anything with less than a 1.8GHz Core 2 Duo processor and 2GB of memory – the current minimum spec I work to for serious business use. Yet there are lots of Vista machines out there on the market that are significantly less powerful than this.
Without getting into the rights or wrongs of this state of affairs, if you are unlucky enough to be struggling with Vista on a lower spec machine, you may be interested in a recent experience I had which was a bit of a wakeup call – not just in terms of the physical performance side of things, but also on the broader question of the value of Vista from an end user perspective in a business environment.
A few months ago, I needed to replace my notebook. As a notebook to me is a companion to my desktop rather than my main machine, I wasn’t looking for anything very powerful – size, weight and battery life were much more important considerations. So, after a happy couple of hours cruising up and down all of the hi-tech shops in London’s Tottenham Court Road trying all the latest kit, I opted for a Sony TZ Series – about 1.2 kilos in weight, fantastic screen, reduced size but really nice keyboard, embedded cellular modem, and lots of other good stuff.
The machine came with Windows Vista Business Edition pre-installed and when I was playing with it in the shop, it was pretty responsive – the 1.2GHz Core 2 Duo processor seemed to be up to the job. When I got the machine back to the ranch and loaded everything onto it, though, I have to admit to being a little disappointed with the speed. Nevertheless, it was good enough, so I just got on with using it.
Over the course of the next four months, however, the performance gradually degraded and the user experience became awful. It eventually got to the stage where it was taking 12 minutes to boot and about 6-7 minutes to shut down, with very sluggish performance in between and frequent hangs requiring a forced shutdown (which in itself was probably making matters worse).
When researching the problem on the Web, it was clear that I was not the only one to be experiencing issues with Vista on the TZ Series, and the more I read, the more the answer to my problems became obvious – ‘downgrade’ the machine to Windows XP. A few forum entries mentioned a kit on the Sony website designed to allow you to do this, with all of the relevant drivers and utilities, and a set of instructions to guide you through the process. I duly downloaded this, followed the instructions, and it just worked. The longest part was installing and patching XP itself (which you have to buy separately, by the way – your Vista licence doesn’t cover it; ** see clarification below).
The end result is fantastic. The word ‘downgrade’ seems totally inappropriate – in fact, it feels like the machine has gone through a significant upgrade. It now boots in well under 2 minutes (with all the same applications loaded as before), is highly resilient (has gone through a lot of sleep/wake cycles without crashing once) and, interestingly, many of the Sony utilities work much more naturally (I suspect they were designed for XP in the first place then ported to Vista).
The one thing I was a bit worried about was going back to XP from a usability and functionality perspective having got so used to Vista, but I was surprised to find that the experience was actually quite a positive one. Everything seemed more crisp, immediate and uncluttered and so far, the only thing I have missed is the enhanced application switching mechanism in Vista, i.e. the Alt-Tab and Windows-Tab functionality. That’s a minor sacrifice for the other benefits, though, and it only took me an hour or two to get used to the old mechanism again.
The switch back to XP was such a breath of fresh air that I have also ‘downgraded’ the desktop machine I am using at the moment. On a reasonable spec PC you don’t see the same increase in actual performance, but the XP interface still feels a lot cleaner and snappier (at least to me). Having both machines running the same OS obviously has its advantages too.
Now before everyone goes rushing out to downgrade their Vista machines based on this little story, it would be irresponsible of me not to point out that during my research, I read accounts from many happy Vista users, many of whom seemed to be getting on fine with the TZ and similarly spec’d machines. I would suspect the number and range of applications you work with has a bearing on this – remember I said that the TZ felt fine when I was just playing with the OS, with no applications installed, before buying it. It could also, of course, be that people just accept the out-of-the-box experience as normal and don’t really question whether they are getting the best performance from their hardware. All I can say is that the downgrade was definitely the right thing for me, and is something to consider if you find yourself in a similar situation.
In the meantime, we continue to experiment with various desktop options here at Freeform Dynamics, and those looking at alternatives may be interested in a post from my colleague Jon Collins entitled Why I’ve replaced Vista with Linux.
Finally, as I type this, I have a brand new MacBook sitting next to me here on my desk, and over the coming few weeks I am going to be looking at the practicalities of using the Mac in a Windows dominated mainstream business environment, so watch this space for experiences with that.
** Clarification re licensing terms: The right to downgrade Vista depends on which edition you have. Vista Ultimate and Business may be downgraded within the terms of the Microsoft EULA at no additional cost, but this right does not apply to other editions of the software.
Thursday, 31 January 2008
Are your IT staff adequately trained?
An interesting finding emerged from one of our recent studies into IT Service Management (ITSM). It concerns a cause and effect that is pretty obvious once it is highlighted. Put simply, IT departments operate much more smoothly and efficiently if IT staff are adequately trained.
The data, which is derived from over 1,100 responses to an online survey, is difficult to argue with. There is a clear relationship between the attention paid to IT staff training and the perceived level of burden experienced by IT. To put it another way, properly trained staff find it easier to cope with the demands placed on them in areas such as infrastructure optimisation and management to keep service levels up and costs down, effective maintenance of desktops to manage user satisfaction and keep security risks under control, and provision of helpdesk services to meet user expectations with regard to support.
What’s more, the relationship between training and operational efficiency and effectiveness is a linear one. What does that mean? Well, whether training requirements have been neglected, the organisation already has its act together, or it’s somewhere in between, the indications are that incremental training will always have a positive impact. To put this into perspective, another finding from the same report was that investment in other areas, such as systems management automation and integration, does not deliver benefits in the same linear fashion. Essentially, you need to get past a threshold of capability before significant improvements are generated.
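For what it’s worth, here is a minimal sketch (in Python, with entirely made-up numbers) of the difference between the two shapes. It is purely an illustration of the idea described above, not the survey data or any model used in the report:

```python
# Purely illustrative sketch: hypothetical numbers, not Freeform Dynamics' data or method.
# It contrasts a "linear" benefit curve (training), where every extra unit of investment
# reduces the IT burden a little, with a "threshold" curve (e.g. systems management
# automation), where payback only starts once a minimum capability level is reached.

def training_benefit(investment: float) -> float:
    """Linear: benefit grows steadily with each increment of training investment."""
    return 0.8 * investment  # illustrative slope only

def automation_benefit(investment: float) -> float:
    """Threshold: little or no benefit until a minimum capability level is passed."""
    threshold = 5.0  # illustrative threshold only
    return 0.0 if investment < threshold else 1.5 * (investment - threshold)

if __name__ == "__main__":
    for level in range(0, 11, 2):
        print(f"investment={level:>2}  "
              f"training benefit={training_benefit(level):4.1f}  "
              f"automation benefit={automation_benefit(level):4.1f}")
```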
There are some interesting lessons in here for all organisations, but particularly those that have a tendency to skimp on investment in skills development. If this study is anything to go by, such an approach is clearly a false economy. In fact, if you have anything to do with running an IT department that is underperforming on IT service delivery and operational efficiency, then the first port of call when looking for improvements should probably be staff development. While upgrading your systems management tools and technology may also be a necessity, investment in this way will take time to pay back. Meanwhile, a bit of additional training at a fraction of the cost is likely to have a much more immediate impact.
Oh yeah, and the study also quite clearly shows that training end users can have a similar impact, reducing the burden placed on IT in areas such as desktop management and help desk delivery. The basic principle here is that adequately trained users encounter (and create) fewer problems, and when problems do occur, users are much better placed to sort themselves out.
There’s a lot more to this research than the stuff we have been talking about above, so if you’d like to learn more, you can download a full copy of the findings from here. And if you’re interested in a companion report looking at the future of IT Service Management (ITSM) in general, you can download that from here.
Friday, 25 January 2008
The customer view of BEA’s acquisition by Oracle
When the BEA Oracle deal was finally announced last week, my first instinct, like many analysts and journalists I would guess, was to rush to the keyboard and bash something out. But what was there to be said that hadn’t already been covered? After re-reading my previous post on the topic, I didn’t have a great deal more to say at that point.
So, instead of writing a blog post, I composed a little questionnaire and reached out to Oracle and BEA customers through an online survey to capture opinion where the rubber meets the road. In a very short space of time, I gathered nearly 300 responses, including a lot of freeform feedback. I then spent an interesting few hours reading through and categorising people’s views, which is the part of this job I really enjoy. Gathering statistics through tick and bash surveys is one thing, but reading a few hundred comments in which a bunch of smart people tell you what they think in a totally unconstrained manner is a great way to get under the skin of a topic.
In this case, I quickly uncovered a bunch of angles on the BEA acquisition that I hadn’t previously considered. Here is a quick summary of the themes, both positive and negative, that I managed to pull out (ranked in order of frequency of mention):
Reasons given for why the acquisition is bad news
1. Reduced choice and competition in the market
2. Uncertainties for customers with existing product investments
3. Loss of innovation, Oracle will smother the goodness of BEA
4. Concerns about Oracle as a supplier (style and nature)
5. Increased cost for BEA users (particularly maintenance)
6. Fear of lock-in as Oracle optimises between stack components
Reasons given for why the acquisition is good news
1. A stronger and more mature solution will emerge (eventually)
2. Rescue of good technology from a company that had lost its way
3. Creation of stronger and more credible competition for IBM
4. Better synergy between BEA technology and Oracle RDBMS, tools, etc
5. Reinforcement of distinction between commercial offerings and OSS
6. More integrated approach to customers and account management
Even though a lot of these are pretty obvious, I’m sure most people looking at this list will spot a couple of angles that they hadn’t previously thought of, and if you are a customer trying to work out the impact of the acquisition, then this probably isn’t a bad starting point for assessing the balance between risk and opportunity in what is actually quite a complex situation.
Of course we also gathered some stats, and I’ll throw in this chart here to illustrate the overall sentiment.
[Chart: overall customer sentiment towards Oracle’s acquisition of BEA]
So, the initial reaction to the acquisition, while mixed, is definitely net negative.
Anyway, if you are interested in a drill down on the above chart broken down by customer type (BEA versus Oracle versus joint customers), along with a fuller discussion of the findings, you can check out the more complete analysis I put together here or here.
Sunday, 30 December 2007
Here’s to a more balanced 2008
While everyone seems to be busy making predictions about hot technologies, revolutionary industry developments, and various tipping points being reached in 2008, I can’t help hoping that we see a bit more balance emerging in views and opinions over the coming 12 months. It’s probably wishful thinking given that more extreme and/or disruptive ideas are used as a lever for selling everything from hardware and software to management consulting and analyst research, but it would be nice to see us getting away from bandwagons, magic bullets and the simplistic 'single track' thinking that often accompanies them.
Of course that’s not to say that interesting things aren’t happening, and we can look forward to some important trends and developments continuing to unfold in the coming year, such as the ongoing move towards more virtualised, dynamic and service oriented infrastructures, the gradual evolution of sourcing and outsourcing options, the awakening of more enterprises to the potential of social computing, etc. The only real seismic shifts we are likely to see, however, are in marketing collateral, analyst reports and the media.
So, while many around us are ‘bigging up’ SaaS, cloud computing, open source software, Web 2.0, and so on, we will continue to do what Freeform Dynamics has always done - examine all of the ideas and propositions in a practical, down-to-earth and objective manner, and provide insights and advice for those working in the real and complex world of ‘brown field’ IT and business.
And with this focus, the ‘how?’ is just as important as the ‘what?’ and the ‘why?’, so our emphasis on community research, tapping into the experience of practitioners as well as strategists, will remain a big part of what we do going forward. During 2007, we gathered over 45,000 responses from IT and business professionals in our research studies. Our analysts therefore really do have a good in-depth understanding of what’s going on out there, and it is a position we fully intend to maintain.
Let me finish by saying a big thank you to everyone that has supported Freeform Dynamics since it was founded two years ago, and wish all of our subscribers, readers, clients, partners, friends and anyone else who knows us a happy, harmonious and ‘balanced’ 2008.
Saturday, 8 December 2007
Managing signal to noise
A couple of months ago, I decided to get stuck into the whole social media thing a bit more, as a few conversations with others who were much more active than me had planted the seed in my mind that I might be missing out on something. Those who know me will realise that this wasn’t so much me getting involved in social media for the first time, as I have been a producer and consumer of blogs for a couple of years now. It was more a case of stepping up a level.
Anyway, I made a real effort to go through the blog rolls of the 20 or so blogs to which I was already subscribed, took recommendations on interesting wikis, and signed up for a bunch more feeds. I also decided to explore the extreme real-time end of social media, and signed up to Twitter.
Fast forwarding to this weekend, I have just deleted my Twitter account and got rid of most of the RSS feeds I had added as part of the exercise.
Why?
Well, two reasons. Firstly, I just couldn’t keep up with everything. I struggle to stay on top of my incoming email already, so having too many other streams to monitor and sort through just means more time away from the family and ‘real life’ and/or more chance of missing something important. This last point leads me on to the second reason for paring things back again – the signal to noise ratio got considerably worse as I expanded my subscriptions beyond the hand-picked sources I had already been using.
One of the particular challenges I encountered was that so many bloggers and Twitterers out there are clearly on a mission or pushing a specific agenda. Nothing wrong with that in principle provided you take what you read with a pinch of salt, and I personally find it interesting and useful to understand the range of views that exists. Unless you are on the same mission, though, such sources quickly become very boring. There are only so many ways of making the case for ODF, for example, and a daily stream of evangelism thereafter is really just noise to most people.
However, with the exception of Twitter, which I struggled to see the point of, I did actually get some benefit from exploring things a bit more widely. I now have a list of blogs and wikis that might not have a high enough level of genuinely new insights to subscribe to on an ongoing basis, but do represent sources to browse from time to time to keep up to speed in certain areas or provide input for research. The difference is that it will be me going to them rather than them coming to me from this point onwards – which is pretty much the way I have been using the Web for the last decade.
So, while I remain a big fan and active user of social media, I have discovered that to me it is the content being exchanged that matters more than the act of communicating itself. Perhaps that makes me relatively ‘unsociable’ in the online sense, but when it’s the socialising that takes precedence, it is only natural that the signal to noise ratio deteriorates.
Again, there is nothing inherently wrong with this, but just like all those ‘put the world to rights’ conversations in pubs, small talk and one-upmanship competitions at parties, and so on, activities that are primarily about social interaction should not be confused with the production or exchange of useful information. Somewhere in between lies the ‘conversation around the water cooler’ that forms an important part of keeping people informed and tuned in, and there are blogs out there that encapsulate this spirit and are therefore very worthwhile subscribing to (e.g. monkchips). Most of the other feeds I am left with are blogs and wikis that explore issues and debates in an objective, informed and thought-provoking manner, with a high level of original content – but these are harder to find than I think many social media advocates like to admit.
At the end of the day, it’s all about how you spend your time, so the trick is to find the optimum balance between continuous incoming streams and keeping tabs on the sources of information that are useful to access but on more of an ‘on demand’ basis. The next stop for me on my social media adventure is therefore tagging and bookmarking.
Sunday, 2 December 2007
Avaya crosses the line
This is not going to be an in-depth post. I just wanted to put on record that I was very impressed with a lot of what I heard during the Avaya industry analyst conference a couple of weeks ago.
It was a pretty big gathering, with analysts from across the world rubbing shoulders with each other. I love events like this, as while we here at Freeform are continuously researching the European and North American markets, it is great to talk with people who have in-depth knowledge of thrusting economies like India and China.
With so many analysts in one place, it also reinforced just how many different styles, approaches and areas of coverage exist within the research community. I guess it will be no surprise that, given Avaya’s heritage, the majority of the delegates were specialists in the communications industry, and I lost count of the number of conversations I had on the nitty gritty of the telephony market that left me way behind.
So why was I impressed?
Well, I am a bit of a hybrid when it comes to coverage in that I think of myself as a business and IT analyst primarily, but with a reasonable working knowledge of how the communications part of the equation touches this world. This is very relevant to the Avaya discussion, as one of the big topics of the conference was Unified Communications (UC). I don’t want to dwell on this specifically as Robin Bloor, who was also at the event, has already written a pretty good treatment of the topic, but the main point is that UC represents the clearest business and application level cross-over between the traditional IT and telephony spaces outside of the call centre environment that we have seen to date, and Avaya seems to ‘get’ what’s important to be successful once you cross over the old dividing line. The understanding is multi-dimensional too, i.e. Avaya is thinking as much about partnerships, IT-related architectures and standards, and business process enhancement in the broader application sense as about simply neat functionality.
If you are an Avaya customer, I would encourage you to catch up with the firm’s latest developments in unified comms and 'Communications Enabled Business Processes' (CEBP), as ways of bridging the gap between domains that are still considered separate by many.
I am going to resist saying much more at this stage as Jon Collins and I will be spending some time in a week or so with the most visible player in the unified comms space, Cisco, and one of the objectives we have is to bring ourselves completely up to date with its ideas and developments with regard to IT/comms convergence. I’ll also have to track down the guys at my old firm Nortel, as there have been some interesting developments coming out of that camp in recent times too, and it is a while since I have caught up with them properly.
Looking at the bigger picture, the coming together of communications and IT at the application and process as well as the network level is a significant development which represents opportunities for both suppliers and customers. But it is obviously not just the traditional comms players that are moving into this area – IT incumbents such as Microsoft and IBM are also very active (see here and here) – they are just coming at it from a different direction. You’ll therefore be seeing us spending a lot of time on this topic in 2008.
Meanwhile, it is nice to see Avaya, backed by its new found private equity arrangement, starting to cross the line into the world of IT so convincingly.