19 October 2005 Archive

Web 2.0: you're not even slightly whelmed

Letters This month's Web 2.0 Conference has been heralded as the birth of a new technology frontier: by people who want to sell you tickets to Web 2.0 conferences, and by web designers who want VC funding. We didn't coin the phrase Bubble 2.0 - The Economist beat us to that one by a fortnight - but we did suggest there were many other areas of technology that could usefully be improved first. Now here's what you think. We also recommend Nicholas Carr's The Amorality of Web 2.0, which homes in on the New Age aspect - the web as transcendent religious experience - and Robin Miller's terrific confession that he was tempted to sell the VCs a blank CD, such was their eagerness to fund any old rubbish.

'Just as you can't build a house on sand, you can't build "a global operating system" based on a presentation layer and a few scripting kludges.' And I laughed out loud over the notion of CSS as a transmission protocol. The self-congratulation of the digerati truly knows no bounds. Bubble 2.0 indeed.

Best,
Ralph Lombreglia

Brilliant and spot-on.... Please write more!

Robert Shaw
ITU Internet Strategy and Policy Advisor

The webcon people should probably be told that IPOs and overpriced buyouts by big tech companies are very much Web 1.0.

Kind regards,
James Hoskins

Loved what you said about Bubble 2.0. Personally I feel these techno-love-ins are about as far removed from the reality of networked technology as it's possible to get. I make my living from networked services and the number one thing I run into every day is stuff that doesn't work. Just about everything is a kludge: a hopeless rag-bag of ancient code and other bits of crap slopped together into a finished system just because it has a shiny front-end.

With Flickr: no one can tell me that a company that was bought for $18 million has a backend infrastructure that cost anything like that. It's amazing it's as reliable as it is, considering the whole thing is probably hanging together on cheapo hardware and all kinds of hacked-together scripts that fix *that day's problem*, which ends up as such an enormous legacy of badly written code that almost no one knows how it works and no one knows how to render it back down to a cohesive whole again. And these problems are prevalent in commercial products, free products, open source products - you name 'em and there's a Titanic-sized software engineering problem that goes with them. We've created such a monster, such a headless goliath built on quicksand, that we're all sinking without trace. Everyday users are confronted with the same old problems we never resolve.

But let me get back to the Utopians. Hardly anything they suggest is capable of challenging the real problems our species faces now. An army of bloggers is not going to stop the ice caps from melting. But seriously, outside their pampered middle-class lives so much of this drivel has hardly any profound application. Take online voting. Just supposing someone came up with a perfect, fraud-free and simple way to do it. Then do we suppose it would actually improve the credibility of the democratic process by one iota? Edward Thompson said in the New Statesman more than 25 years ago that elections were "just a theatrical show of legitimacy", so all we'd end up with is crappy leaders, faster, because no one has gone back to re-examine the substance.

The Internet: it's good for news, current affairs and guerilla journalism, sure. What else? An armchair rummage sale courtesy of eBay? Tracking down celebrity skin from Google?
Badly encoded MP3s from eDonkey? Downloading porn and warez from newsgroups? It's hard not to feel utterly cynical about the whole thing when you see Skype making money from infrastructure it didn't invest a cent in, or more hugely overvalued dot-com cowboys starting gambling sites to rip more people off on a scale that would even make Vegas casinos blush. Sometimes I find myself asking one question: is this kind of technology itself the problem? Are we really convinced it has anything substantial to offer other than peddling even more rubbish? If you ask me the technology at our disposal now has decades to go before it's anything like as transparent, reliable and purposeful as it could be.

Kevin Hall

On the issue of Balkanization, Bill Nicholls is more optimistic. One size doesn't fit all, he argues, but:

There may be another way to look at this. Your comments about the deficiencies of Internet 1.0 are dead on, but certainly not the fault of the originators in 1968 and 197x. Our use is outgrowing the design of the original net, with somewhat predictable results. QoS, priorities, filtering, etc. are all adaptive attempts to remedy these limitations. The high speed Internet2 used by science and government is slightly different from 1.0, but, more specifically, limited in who connects for what currently. It will probably remain that way.

What will be the result? We will eventually realize that not only does one size not fit all, it *cannot* fit all. We are in the process of building multiple nets that will interconnect at well defined points, not everywhere, with well defined protocols and limits. It must be that way to serve the separate interests. There are security, performance and access requirements that a single net can only approximate, at considerable expense and overhead. Yet it will not result in Balkanization, but speciation. Nets will be optimized for classes of traffic and classes of applications. They will be invisible but there if you need them, efficient for specific purposes. Overall, a network of nets will be more robust and more efficient than one master net that has to either warp itself or warp traffic to work. One net cannot work well with all these new demands, therefore we are seeing sprouts, seedlings of a new growth that will deliver the MultiNet(C). I think Google's dark fiber may be the first step in this direction.

Bill Nicholls

And right into the nitty-gritty, Adesso's Phil Stanhope takes issue with many of the technical assumptions behind Web 2.0. Like the fact that we're always on, all the time, with a flawless broadband connection:

Andrew,

Well said. I sat in a workshop a few weeks ago with a collection of Global 500 CIOs on SOA. Interestingly, they understood that SOA/web services, when you boil away the hoopla, is simply an interface API. That most web services are simple encapsulations over other (older) systems. That there is a huge difference between offering a beta service (perhaps indefinitely) in the consumer space, versus an intranet service that allows composite applications to be created in ways not anticipated in advance. But once those apps are created they become an instant legacy. Those of them who'd had enough of an internal ecosystem of web services up and running were now tackling the inevitable: change management and version control problems. Yes, good problems to have, but nonetheless, old problems. You can't have the Flickr experience once you have an internal application in production that sales/customer service are depending on.
In many, many ways, SOA as we currently have it is simply interface-based componentry where we've solved the transport/serialize/deserialize problem. We do deeply believe in rich clients. And web services. And SOA. But there are many misconceptions out there. If you're dumb enough to deprecate a method in a web service ... or add another one but change the semantics ... or have a required semantic usage pattern that can't be readily conveyed in WSDL ... well, is it really any different than doing CORBA/OMG, DSOM, DCOM, COM++, or Enterprise Beans? No. Are people doing this? Yes. Is it better to get a SOAP exception when you fail to call a method in a service? Or is it better to never be able to compile/link in the first place? There used to be a problem called "DLL Hell". How is changing a DLL after the original software shipped any different than revising a web-services interface and having no way to change the code that called the previous one? It isn't.

Another favorite misconception of mine is that Google et al are thin clients. Google downloads megabytes of javascript onto devices in order to do some of their latest tricks. These products have detailed knowledge of browser programming models and have developed means to improve on the plain old, dumb HTML presentation layer. But these are the same sandbox problems that virus writers, spammers, etc have exploited for years. Google Maps, in many ways, is a "clever exploit". [This was written before the MySpace worm knocked out the site. That was one guy's first AJAX app, it took him a week to write, studying just one hour a day - ed]

My favorite way to show that Google depends on deep knowledge of the browser that it's trying to deliver bits to is to try and load news.google.com over a GPRS connection on a PocketPC Phone. It simply fails after a long period of time (i.e. it can DNS resolve, it can start to receive some bits from the source, but then it simply never loads). Hit news.bbc.co.uk (or theregister.co.uk for that matter), and guess what you get? An actual web page delivered to you that you can read. Wow ... what a concept!

LOCKSS is interesting in many ways, but it just shows one example of a valid use case. At Adesso we focus on another one: only delivering the data to your devices that you need. Now, based on rules, that may be all the data you have on your other devices. Or it may be all that you need on a PDA. Or it may be all that you need to process your field service tickets. What I object to about the Bubble 2.0 hallucination is the presumption that we're always on, always connected.

Phil Stanhope
CTO, Adesso Systems

When I see "Bubble 2.0" I can't help but think of the real estate market! Anyway, I read your article - since you discuss such a wide range of topics I think it might have been better to split it up into multiple articles, but I found many of the things discussed interesting. I agree Marketing/PR departments shouldn't decide how to build products. As a fellow Californian I'm not sure if your stereotype of us being utopians who think the Internet will solve all our problems applies.

Karl

No fear, there are many here in California who build real systems, and have very little time for Web 2.0 crapola. And finally, Adrian Burdess has a suggestion: Save The Utopians!

The starry-eyed, sun-stroked utopians are a lovely bunch of people.
They are a unique representation of a breed of the human race seemingly psychologically incapable of any form of cynicism whatsoever, and like any rare curiosity they should be preserved for study in their natural environment. If we don't act now we stand to lose these rare and beautiful creatures, and technology gives us options we never had in the 60s, when the last bunch died off in the pandemic that was the 70s. I suggest the best place should be some form of "Nature Reserve" or "Wildlife Game Reserve" where we can study these unusual creatures remotely from inside hides and from behind fences, so as best not to interfere with their rather cute, almost delicious society. Of course we would have to carefully restrict access to this "Reserve" from the outside whilst ensuring that the Utopians were able to interrelate without obstruction, barrier, impediment or, most importantly, any lack of potential trust. These "Trust-based Reserves" would help ensure that they continue to exist for many generations by "Walling" them off from periodic outbreaks of cynicism and, worse, the more common and insidious disease, realism.

Adrian Burdess

Awesome. Thank you. ®
Andrew Orlowski, 19 Oct 2005

Macromedia outs Flash on BREW

Macromedia will develop a version of its popular Flash player for Qualcomm's BREW runtime, helping both companies dig deeper into the market for mobile application developers.
Gavin Clarke, 19 Oct 2005

Fayrewood still mulling bid proposal

Fayrewood, the European distie, says it is still in talks that could lead to an offer for all or part of the group. It says discussions are still at an early stage. So not much movement then, since 26 August, when the company first announced that it was in discussions. What's new today is a trading update from the directors, who "believe that the operational performance of the Group has been unaffected by these discussions and report that trading in the year to date has been in line with management expectations and normal seasonal patterns". ®
Taylor Walton, 19 Oct 2005

Two-factor banking

People who lived through the Second World War, like my grandparents, had a very different view of money than those of us who grew up in the Information Age. Many of us still remember being told how foolish it is to keep one's life savings under a bed mattress, because the banks were known as trusted entities that would always do a better job of looking after your money. Even my grandparents, albeit reluctantly, came to realize that putting trust in financial institutions was the only way to go. That trust is eroding, however, in light of a massive onslaught of phishing scams on the Internet. The irony is that the security issues surrounding this kind of financial theft are by-and-large due to the poor security and social engineering of an individual - and therefore the responsibility for losses is similarly owned by that individual, not the bank. There are all sorts of toolbars [1] [2] [3], security approaches, and browser extensions that try to mitigate this threat, but they're all ineffective - not because they don't work, but because they'll never get installed on the computers of people who really need them. The forced use of two-factor authentication for banking systems accessible over the Internet is our only hope for mitigating the phishing threat. And since the banks have no financial responsibility to do this on their own, the only way this is ever going to happen is by requiring them to do it through legislation.

Some approaches in the banking world

In the US, federal regulators are now requiring banks to have at least two-factor authentication with their websites by the end of 2006. The Federal Financial Institutions Examination Council (made up of the FDIC - Federal Deposit Insurance Corp, the US Federal Reserve, the US Comptroller of the Currency, and others) has very recently issued a press release as well as specific, though technology-neutral, guidance (PDF) on the need for two-factor authentication. It's an idea being sold to banks and the public as a way to address identity theft in a supposedly proactive manner. In Sweden, one Internet bank has used the interesting idea of one-time passwords mailed out on a "scratch-pad," but even that novel approach has been attacked and compromised by a recent phishing scam. There has been some suggestion of using drop-down menus on Internet banking sites to thwart the use of keyloggers, but many Trojans also capture screenshots, so this approach really isn't very good. While not quite phishing-specific, here's a funny one for you. Sometimes a con-artist is so slick he can convince senior people at several major European banks to hand over hundreds of thousands of dollars (or rather, Euros) in the bathroom stall at a public bar. "Psst, I'm a secret agent and I need your help." When they caught up with this guy, he was already suntanning on a beach.

A case for tokens

I've been doing online banking for over five years, and many of our readers have been doing it longer. Five years is more than enough time for the banks to figure out a cost-effective, long-term solution to the problem of stolen passwords (which soon becomes stolen money). Today they secure their internal systems just fine, and they've trained their staff on how to absolve all responsibility when a customer's machine is infected with a Trojan and their bank account has been compromised: "Don't worry, our internal banking systems are quite secure. Have a nice day." We've all known people infected with Trojans, keyloggers, spyware, and the like.
The first thing I tell people when they call for advice is to get off the phone with me and immediately call their bank - reset their passwords or disable Internet access to their accounts altogether - and hope that it isn't too late. A token is often a small keychain-like device with a non-repeating number that changes every minute. These are made by a number of companies, and they've been used in the corporate world for many years. It's time that (1) banks eat the cost of providing these tokens, (2) more governments besides just the US force the use of two-factor authentication in the banking world, and (3) people who understand security - meaning all of us - lobby their elected officials to get the proper legislation in place. I have to agree with what Bruce Schneier wrote recently, that pushing all the responsibility from consumers to financial institutions (and most likely, doing it through legislation, if you ask me) is the only way to get this done.

A secure public terminal?

I look at many people's computers as unsafe public terminals. When I'm invited over to a friend's place for dinner, I'm afraid to do anything on their machine because I know all the nasty things it could be infected with... logging my passwords, stealing my identity, and so much more. I always wonder how badly it's owned. If you've ever checked your bank account from a public terminal at an Internet café like I have, you immediately realize two things: one, it's an incredibly dumb thing to do, and two, having a token as a password that changes every minute would dramatically lower the overall risk - regardless of how 0wn3d the machine really is. In certain unexpected circumstances, either using a public terminal or abstaining from access altogether may be the only choice.

Where are our tokens?

The average person doesn't understand how phishing works or how it is prevented, because the security world is so complicated - and yet the risk of losing money through one's Internet banking account is a very simple concept to understand. It's time that more governments around the world step in to ensure that Internet banking remains safe. Copyright © 2005, SecurityFocus
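[For the curious: the token described above is, at heart, just a code derived from a secret that the bank's server and the device share, plus the current time. Real banking tokens use their vendors' own, often proprietary, algorithms, so the sketch below is purely illustrative - a six-digit code computed from an HMAC over the current minute - and not what any particular bank or vendor actually ships. The server, holding the same secret, computes the same six digits and compares, which is why a code phished or keylogged today is worthless a minute later - ed]

```python
# Illustrative sketch only: a six-digit code derived from a shared secret
# and the current minute using HMAC-SHA1. Real banking tokens use their
# vendors' own algorithms; the secret below is obviously made up.
import hashlib
import hmac
import struct
import time

SECRET = b"example-shared-secret"  # hypothetical seed programmed into the token


def one_time_code(secret, now=None, step=60):
    """Return a six-digit code that changes every `step` seconds."""
    counter = int((time.time() if now is None else now) // step)
    msg = struct.pack(">Q", counter)                        # 8-byte counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                              # dynamic truncation
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{value % 1000000:06d}"


print(one_time_code(SECRET))  # the device shows this; the server recomputes it
```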
Kelly Martin, 19 Oct 2005

Woman drops sprog on Dutch Big Brother

Dutch channel Talpa TV has pushed back the envelope of broadcasting excellence by becoming the first to broadcast a reality TV birth - albeit eight hours after the actual event. Big Brother contestant Tanja dropped baby Joscelyn Savanna on Tuesday night in accordance with the guidelines laid down by the Dutch Social Affairs Ministry, viz: no live broadcast of the event and birth in a specially-prepared private room with midwife in attendance. Tanja, 27, was seven months pregnant when she entered the BB house in August, the BBC reports. She's one of nine remaining hopefuls eyeing the €400,000 prize for being the last social inadequate standing when the show ends on 22 December. Joscelyn Savanna, meanwhile, can only be filmed a few hours a day and stays in a private room where mum and relatives can visit unmolested by the cameras. Footage of the birth itself apparently focussed mainly on mum's face and the midwife doing her stuff, which is a relief. Naturally, some have questioned whether the whole thing is just a stunt to boost ratings. Not so, said BB spokeswoman Stephanie Dekker, who reckons the scenes were not explicit, and that more graphic birth footage had been shown on educational and medical programmes. Which, as we all know, is a complete and utter load of cobblers, but it all makes for good, clean family entertainment. We look forward to BB's first live suicide with eager anticipation. ®
Lester Haines, 19 Oct 2005

Hola Gringo, can you get me a mobile?

European and North American mobile users irked by high tariffs and spotty coverage should spare a thought for the citizens of Cuba. For the land of sun, salsa and socialism is so afflicted with bureaucracy that the only way ordinary people can get a pre-paid mobile phone is to have a foreigner sign up for them. Cuba's tightly regulated state-run mobile providers (Cubacel and one other) only offer services to foreigners or authorised Cubans. The only way around the restrictions for ordinary Cubans is to persuade a tourist to sign up on their behalf. Even then, coverage is limited to the major cities (Havana, Santiago, Trinidad, Pinar del Rio etc). And it's expensive by local standards: sign-up fees are 50 Cuban pesos convertibles (pegged to the US dollar, so $50), which come with $15-worth of prepaid calls or texts. The situation has existed for two years or more, but an eyewitness account from reader Chris still makes for an interesting read.

In the city of Trinidad, where I was staying with a local family, I was given a lift in a dangerous old Chevrolet to Cubacel's offices to help the landlady's son to open his account. Here I was sat for 40 minutes filling in four separate forms, submitting a photocopy of my passport and providing my signature, along with the son AND the Cubacel official as 'witness'. The form read something along the lines of "I consent for (insert name) to open a pre-paid mobile account on my behalf". Effectively, it's my account. Cubans just shrugged when I showed my complete and utter dumbfoundedness at this unnecessary barrier to development and said "este pais" (this country), very Alan Partridge. After two and a half hours in immigration and baggage reclaim at Havana's Jose Marti airport, where I had to declare bottles of mineral water that I had in my rucksack, I thought none of the country's bureaucracy would surprise me. But this incident was astounding. While I respect Fidel for making Cuba self-sufficient under the most unjust embargo, he really needs to open up the country to new technology. There was no broadband either; it was like the South Park episode with the man from 1996 [Prehistoric Ice Man], when they find a man frozen in ice for three years and, to make him feel at home, play him Ace of Base and give him an inadequate internet connection.

According to the latest figures from the CIA world fact book there were only 17,900 mobiles in Cuba in 2002. "Wireless service is expensive and remains restricted to foreigners and regime elites," it notes. ®
John Leyden, 19 Oct 2005

Nominet votes for Argentinian solution to net ownership

In an historically unusual decision, the company running all .uk internet domains, Nominet, has voted for an Argentinian solution to the current crisis over internet ownership. In an official statement, Nominet's legal and policy director Emily Taylor said the company preferred Argentina's proposal over the other seven on the table - including the EU proposal put forward by the UK government. Argentina's proposal would see things continue pretty much as they are, but with the creation of a worldwide forum in which governments, the private sector, civil society and international organisations would all play a part. "From Nominet's perspective, nothing radical needs to change in internet governance," Taylor said. "Intervention by governments worldwide, each with their own political agenda and cultural beliefs to uphold, threatens to consign the internet to a future of over-regulation."

The issue of internet governance - who should run the internet and how - exploded last month at a UN conference in Geneva. The United States, which currently has unilateral overall control of the internet, wants to preserve the status quo, whereas most countries, particularly Brazil, China and Iran, want control to be shared among many governments. Late in the conference, the EU arrived with a radical proposal that spliced the two sides together in an effort to reach compromise. Of that effort, Nominet says: "We do not see the EU and US positions as fundamentally incompatible: the EU position does, after all, talk about 'not replacing existing structures' and emphasises 'complementarity' between different actors."

Despite there being eight proposals (from Africa, Argentina, Brazil, Canada, the EU, Iran, Japan and Russia), there are essentially three models on offer:

The status quo: the system continues as is, with ICANN in charge, and a new forum is created to come up with solutions to public policy issues such as spam, cybercrime or new top-level domains. (Africa, Argentina, Canada)

The hybrid: a new forum is created, as well as a new body that is given overall control of ICANN. Essentially the hands-off US government role is replaced with a more hands-on international government consortium. (EU, Japan)

The government approach: a new body run by governments takes over from ICANN. (Brazil, Iran, Russia)

It is hardly surprising that Nominet would go for the more free-market approach, being a company that has benefitted from a gentle hand on the controls. Plus, of course, being the UK, Nominet has little to fear from, or to be ideologically opposed to, the current US-led and run internet. Nominet's Taylor summed up the company's position: "Amidst calls for international intervention to avoid dominance of the Internet by a single state, Nominet believes that we should be looking to take a more pragmatic, incremental approach to internet governance and not seek to completely overhaul a model that allows for flexibility, innovation and is founded on private sector investment. We hear the political debate with regard to the root zone - our perspective is operational: it should be secure and authoritative. Requests for changes must be authenticated and acted on quickly." Nominet's - and many others' - fear is that the EU proposal would see governments fight out world political issues at the top level of the internet, as opposed to letting the internet get on with its own thing - a situation that undeniably led to its enormous success.
The EU is extremely keen to spell out, however, that it sees the governmental body it proposes as being very hands-off. The issue is simply: why should the US government remain in control of a vital global resource? ®
Kieren McCarthy, 19 Oct 2005

Saddam trial set for live webcast

Legal junkies needing an as-it-happens fix of court-on-former-dictator action will be pleased to learn that the trial of Saddam Hussein will be broadcast live on the web - sort of. The Iraqi Special Tribunal charged with bringing Saddam to justice has its own website, but will not be running live footage itself, CNET reports. Instead it will provide live untranslated material for exterior agencies, leaving the news boys to add their own English soundtrack and disseminate to an expectant world. Associated Press says it will stream proceedings "as close to a live timeframe as possible", although the Tribunal has already imposed a delay of either 20 or 30 minutes, depending on who you talk to. MSNBC.com will offer "some kind of a video stream" but has yet to decide on the final mix of courtroom drama and correspondents. CNN affiliate Court TV will show the whole trial to subscribers, while CNN itself will concentrate on TV coverage, dishing up selected morsels on its website. ABC News, meanwhile, intends to stream the trial tapes through its 24-hour broadband channel. As of this morning, the BBC's website was promising continuous coverage of the trial with a 20 minute delay. Saddam's date with legal destiny is due to kick off today. Whether it proves an OJ Simpson-style ratings hit remains to be seen. ®
Lester Haines, 19 Oct 2005

Bullguard signs up Ingram Micro

Bullguard, the British IT security firm, has signed up Ingram Micro Europe to aid expansion onto the Continent. Ingram will carry Bullguard's software for PCs and mobile devices, and says the technology is "attractive to our value-added resellers [and] represents an interesting opportunity for the retail market". Bullguard is gunning for "rapid market capture and territory growth in Europe", mirroring its experience in the UK. It says it has experienced "very high levels of success" with an aggressive distribution strategy in its home country. ®
Team Register, 19 Oct 2005

Leeds City Council scoops technology Oscar

Leeds City Council has scooped the tenth annual Socitm Excellence in IT award for a new mobile working initiative for its Social Services and Home Care workers. The digital pen and paper project allows staff to file information about service activity directly from clients' homes, cutting out a huge amount of paperwork and duplication of effort. Doug Sutherland, Leeds council's head of corporate business relationship management (really, that is his job title), says that the project has improved staff morale as well as efficiency. He estimates that the council's 1,700 carers look after more than 6,000 clients and, in so doing, used to have to fill in upwards of two million pieces of paper every year. This used to involve a lot of duplication of effort, something he says is "very demoralising" for staff who really want to be out in the field, doing their jobs. The Digital Pen and Paper Technology Within The Home Care Environment project means staff now use digital pen and paper to record information when visiting clients. The data captured is then sent back to head office via a secure mobile phone link. "Carers have more time for patients now, more time to work with each other," Sutherland said. "It is also much more efficient for the council." He explained that the project has not been without its challenges, particularly in relation to the lifecycle of the mobile phones, and the phone company's keenness to follow a regular and (we think) fast upgrade cycle. Sutherland says his team has made sure the learning process has been well documented, along with the return on investment figures the council has achieved. "The information is there for other councils who want to do this," he says, adding that Leeds City Council will support other councils who are interested in implementing a similar project. Charles Ward, chief operating officer of IT trade association Intellect and chair of the awards judging panel, said that the project is "an excellent example of how emerging technologies can be used in a simple way to improve service delivery to citizens, and make life easier for front-line delivery staff". Lambeth Council and Cumbria County Council were also recognised for their projects. More information on those can be found here. ®
Lucy Sherriff, 19 Oct 2005

Bruce Schneier talks cyber law

RSA Europe 2005 ISPs must be made liable for viruses and other bad network traffic, Bruce Schneier, security guru and founder and CTO of Counterpane Internet Security, told The Register yesterday. He said: “It’s about externalities – like a chemical company polluting a river – they don’t live downstream and they don’t care what happens. You need regulation to make it bad business for them not to care. You need to raise the cost of doing it wrong.” Schneier said there was a parallel with the success of the environmental movement – protests and court cases made it too expensive to keep polluting and made it better business to be greener. Schneier said ISPs should offer consumers “clean pipe” services: “Corporate ISPs do it, why don’t they offer it to my Mum? We’d all be safer and it’s in our interests to pay. This will happen, there’s no other possibility.” He said there was no reason why legislators should do such a bad job of drafting technology laws; Schneier said short-sighted lobbyists were partly to blame. He said much cyber crime legislation was unnecessary because it should be covered by existing laws – “theft is theft and trespass is still trespass”. But Schneier conceded that getting international agreements in place would be very difficult and that we remain at risk from the country with the weakest laws – in the same way we remain at risk from the least well-protected computer on the network. ®
John Oates, 19 Oct 2005

Boffins hook bone-eating snot flower

Scientists from London's Natural History Museum and Göteborg University have pulled off a bit of a coup in discovering a previously-unknown species of worm feeding on the bones of a dead minke whale. The 1-2cm creature, boasting "frond-like tentacles", was found devouring a minke skeleton off the Swedish coast at a depth of 120m, the Times reports, adding little except to note that it has been named Osedax mucofloris - or "bone-eating snot flower" to its mates. Not the most attractive of scientific monikers, it must be said, but not as silly as Cummingtonite (a mineral made up of magnesium iron silicate hydroxide and named after Cummington, Massachusetts where it was first discovered) or Arsole - the arsenic equivalent of pyrrole. ® Bootnote There's more on the bone-eating snot flower down at the BBC.
Lester Haines, 19 Oct 2005

Tackle Linux kernel programming

Site offer The Linux® Kernel Primer is the definitive guide to Linux kernel programming. The authors' unique top-down approach makes kernel programming easier to understand by systematically tracing functionality from user space into the kernel and carefully associating kernel internals with user-level programming fundamentals. Their approach helps you build on what you already know about Linux, gaining a deep understanding of how the kernel works and how its elements fit together. One step at a time, the authors introduce all the tools and assembly language programming techniques required to understand kernel code and control its behavior. They compare x86 and PowerPC implementations side-by-side, illuminating cryptic functionality through carefully-annotated source code examples and realistic projects. The Linux® Kernel Primer is the first book to offer in-depth coverage of the rapidly growing PowerPC Linux development platform, and the only book to thoroughly discuss kernel configuration with the Linux build system. If you know C, this book teaches you all the skills and techniques you need to succeed with Linux kernel programming. Whether you're a systems programmer, software engineer, systems analyst, test professional, open source project contributor, or simply a Linux enthusiast, you'll find it indispensable. Save 30% on this and thousands of other comprehensive IT guides at The Register Bookshop.

The Linux® Kernel Primer - RRP £35.99 - Reg price £25.19 - Saving £10.80 (30%). Learn Linux kernel programming, hands-on: a uniquely effective top-down approach.

Network Administrators Survival Guide - RRP £35.99 - Reg price £25.19 - Saving £10.80 (30%). The all-in-one practical guide to supporting your Cisco network.

Microsoft Exchange Server 2003 Unleashed - RRP £42.99 - Reg price £30.09 - Saving £12.90 (30%). The most extensive Exchange 2003 reference on the market, from one of the world's leading Microsoft server experts.

The Build Master - RRP £28.99 - Reg price £20.29 - Saving £8.70 (30%). Written by a top Microsoft consultant, this book will become the standard guide to the build process in the software engineering lifecycle.

Unix for Mac OS X 10.4 Tiger - RRP £21.99 - Reg price £15.39 - Saving £6.60 (30%). Handy reference upgraded and updated for Mac OS X Tiger.

WebWork in Action - RRP £31.99 - Reg price £22.39 - Saving £9.60 (30%). WebWork helps developers build well-designed applications quickly by creating re-usable, modular, web-based applications.

Novell GroupWise 7 User's Handbook - RRP £24.99 - Reg price £17.49 - Saving £7.50 (30%). Master the improved and enhanced GroupWise client from the authoritative source, Novell Press.

Sams Teach Yourself Macromedia Dreamweaver 8 in 24 Hours - RRP £21.99 - Reg price £15.39 - Saving £6.60 (30%). Fully updated to the newest version of Dreamweaver, providing you with a comprehensive and up-to-date learning and reference tool.

Macromedia Fireworks 8 - RRP £31.99 - Reg price £22.39 - Saving £9.60 (30%). Officially authorized by Macromedia; hands-on, project-based lessons include 16-20 hours of instruction aimed at getting readers completely up to speed with Fireworks 8.

Tak - RRP £10.99 - Reg price £7.69 - Saving £3.30 (30%). BradyGames’ Tak: The Great Juju Challenge Official Strategy Guide. Includes detailed area maps and a comprehensive walkthrough of the entire adventure.
Don’t forget your opportunity to review current and previous offers: The Reg Bestsellers - Last week at The Reg - Great new releases - This week's book bag.
Team Register, 19 Oct 2005

Tom's Hardware probes the front bottom

Reader Alan Potter has just written in to point us in the direction of an illuminating Tom's Hardware review of Chenbro's Granite Case. Yes indeed, what caught Alan's eye was the unit's Front Bottom - not a common sight on a PC, it must be said. What's more, the front bottom in question "has two interesting features. The first is the screwless fan bracket". OK, that's enough. Back to work for you lot and we're off to write some proper stories about Symbian driving the m-commerce economy. ®
Lester Haines, 19 Oct 2005

Google loses its G-spot

A trademark dispute has forced Google to re-brand its Gmail web mail service in the UK. Existing users get to retain their Gmail address (at least for now) but from Wednesday onwards new UK users will be given a Googlemail email address instead. UK-based financial services firm Independent International Investment Research (IIIR) said its subsidiary ProNet Analytics has been using the Gmail name for a web-mail application since the middle of 2002, two years before Google began offering Gmail accounts to consumers. The email service offered by ProNet, by contrast, is used mainly by investors in currency derivatives. The two companies entered talks over the right to use the Gmail brand, but the negotiations broke down several months ago after they failed to agree a financial settlement. An IIIR-commissioned assessment put a minimum value on the Gmail brand of £25m ($46m), a figure Nigel Jones, Google's senior European counsel, described as "exorbitant". Google continues to dispute IIIR's trademark claim. Jones told the BBC that to "avoid any distraction to Google and our users" it was switching brands to Googlemail in the UK while trademark lawyers attempt to resolve the dispute. A separate trademark dispute forced Google to switch from Gmail to Googlemail in Germany back in May. The BBC reports that German Googlemail users who are sent email addressed to their username at "@gmail.com" instead of "@googlemail.com" still receive the misdirected messages. ® Related links Google's statement
John Leyden, 19 Oct 2005

BT waking up to unbundling threat

Analysis Thanks to local loop unbundling (LLU), internet users in the UK are now able to hook up to broadband services with speeds of up to 24 meg - a million miles away from just a couple of years ago when getting your hands on a 512k service was all that many could hope for. Snag is, LLU is only economically viable in large towns and cities, which means large chunks of the UK are unlikely to benefit from increased competition and will instead be forced to rely on wholesale services from incumbent telco BT. Yet despite its recent setbacks, LLU is starting to have a marked impact on the UK's telecoms sector - not least because it has forced BT to act more swiftly to compete with rivals.

So says Douglas Lamont, Corporate Development Director at Wanadoo UK, who believes that without LLU, BT Wholesale would still be plodding along with little or no impetus to offer improved services. "If LLU was not around, we would still be at 1 meg," Lamont told The Register. Instead, the UK's dominant fixed line telco looks set to increase broadband speeds across its network to 8 meg from next spring as it plays catch-up to an ever-increasing number of ISPs lining up to take advantage of LLU and the promise of faster speeds and more innovative services. LLU - the process by which ISPs and telcos install their own kit in telephone exchanges to provide services direct to end users while by-passing BT - is on the verge of taking off in the UK. According to the latest figures there are around 125,000 unbundled lines in the UK, with an extra 4,000 being added each week.

On Monday, Wanadoo UK, which is owned by France Telecom and boasts more than 800,000 broadband users in the UK, became the biggest name to date to press the button for LLU. From next month, Wanadoo will begin connecting customers to 150 unbundled exchanges across Leeds, London, Bristol, Manchester and Birmingham. "The clear economic driver [for investing in LLU] is our existing customer base," said Lamont. Migrating them over to LLU gives Wanadoo greater freedom to provide other services such as VoIP and video-on-demand. And it can make providing broadband more cost effective than reselling a service from BT Wholesale. But Lamont insists that the ISP is not looking solely to cherry-pick the most lucrative exchanges. Instead, the ISP plans to target specific towns and cities so that its marketing and advertising can be tailored to draw in new consumers. Over the next year or so Wanadoo expects to install its kit in around 500 BT exchanges, and has plans to unbundle more still. But unless the current economics of LLU change noticeably, Lamont reckons that Wanadoo is only ever likely to unbundle between 1,000 and 1,200 exchanges.

Which is still a formidable task given the "disappointment" registered by the Office of the Telecoms Adjudicator (OTA) last week. The OTA - whose task it is to oversee the development of LLU - warned that operational problems reported over the last couple of months "continue to persist and are giving me significant cause for concern". With so much at stake, isn't Wanadoo being rash by going ahead with LLU before all the processes needed to switch customers are slick and error free, especially in light of the damage done to Bulldog's reputation? The Cable & Wireless-owned LLU operator was the centre of an Ofcom investigation after the regulator received hundreds of complaints from people who had signed up to the service. Lamont doesn't see it that way.
"The Bulldog service is fully unbundled [providing unbundled phone and broadband] which makes the migration of consumers much more difficult. Ours is shared unbundling [just broadband] and we have had no such problems with BT...no more than you would expect with existing services." What's more, Wanadoo is confident it has the nous and the people to deal with any spike in calls to its support centre if something were to go wrong. "Our expertise in customer service means Wanadoo won't get caught out," he said. But he acknowledges that "the biggest test is still to come" as LLU becomes a mass market product. "The systems are not perfect. It is a concern," he admits. "But, we're working very closely with BT to get it right." "We've got to get it right," he said. ®
Tim Richardson, 19 Oct 2005

Ofcom gives Bulldog the all clear

Ofcom has closed its investigation into Bulldog after receiving assurances from the broadband ISP that it's introduced a range of measures to improve customer service. Bulldog has also promised to compensate punters who were with the ISP back in July during the height of the problems. And the ISP - which is owned by Cable & Wireless (C&W) - has agreed to update Ofcom each month to ensure that it doesn't see a recurrence of the problems. Ofcom launched an official investigation into Bulldog at the end of August after receiving hundreds of complaints from customers that they had been left without phone and broadband services. In its statement today Ofcom said it had obtained "detailed information from Bulldog relating to the operation of its business, including details of the circumstances leading to the difficulties some customers experienced during the summer". "The information Bulldog provided...indicates that Bulldog has implemented a range of steps to improve its customer service operations. Since August 2005, a number of indicators of customer service, such as call waiting times and call abandonment rates, have shown significant improvement. "Bulldog has also presented evidence to Ofcom demonstrating that it has implemented a number of improvements to both its billing and operational processes." It went on: "Ofcom is satisfied that as a result of these measures, customers affected by Bulldog's earlier service issues will receive, in aggregate, a material level of credit. "Ofcom is also satisfied that Bulldog has put in place, and committed to adhere to through the provision of monthly reports to Ofcom, practices to help avoid any recurrence going forwards. Accordingly, Ofcom has closed this investigation." But Bulldog isn't off the leash yet with Ofcom prepared to re-open the investigation if it receives another spike in complaints about the ISP. In a statement Bulldog said: "We regret the customer support issues that occurred during summer 2005, which have been addressed. We have implemented a wide range of measures to ensure that we are now able to deliver and maintain a high level of customer satisfaction." ®
Tim Richardson, 19 Oct 2005

Men still take lion's share of IT management salaries

Exclusive Number-crunchers at Britain's Office for National Statistics have trashed claims that women are earning more than men in IT management positions. Women IT managers still earn less than men, ONS figures produced for The Register show. But their salaries are growing - and men's are starting to droop. In September the Chartered Management Institute pulled the rug from beneath equality campaigners with a survey that showed women IT managers earning more than men for the first time. The upshot was that either women were not the underdogs of the profession, as is widely claimed, or campaigners had succeeded in their mission and could hang up their hats. As it turns out, the data was misleading and the CMI is at a loss to explain why. The ONS has answered a Register request with an exclusive analysis of its labour force survey that shows equality campaigners still have work to do. However, women's lot has improved considerably in the last twelve months. In 2004 women IT managers earned a whopping £14,500 less than men, the ONS data reveals. In 2005 men earned £4,900 a year more than women. The gap was closed by the salaries of women in lower management positions, which jumped by nearly £200 a week in 2005, yet are still just short of men's. Senior women’s wage packets lagged men’s by £170 a week - or nearly £9,000 a year. However, it appears that some of the increase in women's salaries has come out of men's pay packets. Men's salaries have decreased in both senior and junior management positions by an average of £50 a week. The CMI's findings that women IT managers now earn more than men caused a bit of a stir among equality campaigners, so much so that Intellect, the IT industry association, held a special meeting of its Women in IT Forum this week to discuss the matter. Gillian Arnold, chairwoman of the Intellect Women in IT Forum and a regional sales executive for IBM, said: "Anecdotally, it seemed we had an issue with pay within the industry, and clearly the private sector does not make pay scales as transparent as in the public sector." "It's because of this that we decided to hold the conference... We were also interested in the views of the Chartered Institute of Management given their recent press coverage," she added. The CMI found that women IT managers, averaged across all levels, earned £45,869 a year in 2005, which was £780 more than the average man. A CMI spokesman speculated that there might be a level of management, perhaps middle management, in which women were earning considerably more than men and this had skewed the overall average. But he could not be sure because a breakdown was not available. The crucial difference between the surveys appears to be the sample sizes. The CMI gathers its data from interviews with 21,000 people in 200 organisations across all sectors. The labour force survey is based on a considerably larger sample of 56,000 households. Elizabeth Pollitzer, chairwoman of Equalitec, an industry partnership that helps women in IT careers, was also sceptical of the CMI findings. "One reason is that women take a career break and get paid less when they return," she said. "There should be a principle of equal pay for equal jobs," said Pollitzer. "At present the problem is that it's equal jobs and not equal pay." ®
Mark Ballard, 19 Oct 2005

DSG speeds up Euro expansion

DSG international plc is stepping up its expansion into Central Europe. The UK-based retail outfit, formerly known as Dixons, is to open an Electro World store in Warsaw tomorrow with a second store due to open in Poland before Christmas. Electro World is also accelerating its store opening plan for Central Europe with plans to add a further 20 new stores "as soon as possible". With this latest store opening in Poland, DSG international now trades in 14 countries with international operations accounting for more than a third of the group's turnover. Dixons changed its name during the summer as part of a corporate makeover to reflect the international nature of the electricals chain. ®
Tim Richardson, 19 Oct 2005

Kaliski not convinced on electronic passports

RSA Europe 2005 RSA's cryptography guru Burt Kaliski has warned that the US' planned introduction of electronic passports represents a long-term challenge for the security industry. The US government will begin trialling the passports, which contain an RFID chip, in December before the full introduction early next year. United Airlines staff have been testing one version of the technology since June. Speaking at the RSA Security conference in Vienna, Kaliski, chief scientist at RSA Security, told the Reg: “You have to look at the whole deployment over its lifetime to make sure you don't introduce new problems and it must improve on the prior generation.” “A passport obviously contains personal information so you need fine-grained access control. But with a chip you don’t know what information you are giving away. You don’t know where and what data you are giving away.” Kaliski said: “There’s been good public dialogue – it’s good to see it getting this attention. But it’s important not to just stop with release number one.” Kaliski said there was a tendency in the security industry to look at the early stages of any effort but then move attention on to the next “new” thing. RSA began working with other companies on a specification for one-time passwords in February. Workshops were held in May and earlier this week. The informal collaboration has one specification ready for formal acceptance and five more in draft form. Looking back at the conference, Kaliski said he enjoyed Dame Stella Rimington’s endorsement and appreciation of the industry, because the intelligence services had initially been so suspicious of encryption technology. More on Kaliski’s blog here.
John Oates, 19 Oct 2005

RAS puts the case for manned space missions

The Royal Astronomical Society Commission has called on the UK government to reconsider its deep-seated opposition to space exploration by humans, and outlined several areas of scientific exploration that can only be done with real people. The University of Oxford's Professor Frank Close, lead author on the report and chair of the RAS Commission, told The Register: "The received opinion among scientists is that it [human space exploration] is a waste of money. And if you start by looking at the cost, that is likely to continue to be the conclusion people will come to." He argues that the more important question, and the question he and his colleagues started by trying to answer, is: is there science that can only be done if humans are in space, and if so, how highly do you prioritise that research?

In the report, the commissioners conclude that "profound scientific questions relating to the history of the solar system and the existence of life beyond Earth can best - perhaps only - be achieved by human exploration on the Moon or Mars, supported by appropriate automated systems". They identify three specific areas of research that need human involvement, often because present or "foreseeable" robotics technology isn't capable of recovering or analysing deep core rock or permafrost samples. The first research project is mapping the solar system. Deep lunar core samples, down to 100m in depth, are needed here to track the history of cometary bombardment, perhaps looking for signs of organic molecules. Secondly, a thorough search for life on Mars will probably also involve deep drilling. "We are not persuaded that a robotics approach alone can deliver this now or in the foreseeable future," the commissioners write. Finally, the commissioners argue, if Mars is found to be dead, further exploration of the planet would be better handled by humans than by robots. The report also concludes that although such exploration would be expensive, the wider commercial, educational, social and political benefits would help justify the financial outlay.

"There is no doubt that any such venture would have to be multinational. Given that the US is pretty committed at least to going back to the moon, then Europe, and in turn the UK, need to decide how involved it wants to be in this process," Professor Close continues. "The question then becomes what are the implications for the UK, in terms of industry and human spirit, if we decide not to join in?" As to the question of cost, Professor Close is not a fantasist. He acknowledges that the funding for such missions could not be raised by trimming other space exploration budgets: "New money would have to be raised". He also argues that robotic exploration would have to continue alongside any manned missions. "Human space flight beyond Mars is just not possible because of the increased exposure to galactic cosmic rays," Professor Close told us. Epidemiologists can just about be satisfied that a six-month trip to Mars, and the same return journey, leads to an acceptable accumulation of rads, provided there is no exposure to solar flares. Traditional shielding, when bombarded with galactic cosmic rays, would only induce secondary radiation as bad as anything you started with, Close goes on. "Until such time as we have found ways of doing things like genetic repair, human space exploration will not go beyond Mars, and the realities of this should be the focus of wider research."
Ultimately, Close says, people should read the report carefully before coming to any snap judgments. You can do this by clicking here (PDF). ®
Lucy Sherriff, 19 Oct 2005

Read two biometrics, get worse results - how it works

A regular correspondent (thanks, you know who you are) points us to some calculations by John Daugman, originator of the Daugman algorithms for iris recognition. These ought to provide disturbing reading for Home Office Ministers who casually claim that by using multiple biometrics you'll get a better result than by using just the one. Although that may seem logical, it turns out that it isn't, necessarily. Daugman presents the two rival intuitions, then does the maths. On the one hand, a combination of different tests should improve performance, because more information is better than less information. But on the other, the combination of a strong test with a weak test to an extent averages the result, so the result should be less reliable than if one were relying solely on the strong test. (If Tony McNulty happens to be with us, we suggest he fetches the ice pack now.)

"The key to resolving the apparent paradox," writes Daugman, "is that when two tests are combined, one of the resulting error rates (False Accept or False Reject rate) becomes better than that of the stronger of the two tests, while the other error rate becomes worse even than that of the weaker of the tests." If the two biometric tests differ significantly in their power, and each operates at its own cross-over point, then combining them gives significantly worse performance than relying solely on the stronger biometric. This is of particular relevance to the Home Office's current case for use of multiple biometrics, because its argument is based on the use of three types of biometric - fingerprint, facial and iris - which are substantially different in power.

Daugman produces the calculations governing the use of two hypothetical biometrics, one with both false accept and false reject rates of one in 100, and the second with the two rates at one in 1,000. On its own, biometric one would produce 2,000 errors in 100,000 tests, while biometric two would produce 200. You can treat the use of two biometrics in one of two ways - the subject must be required to pass both (the 'AND' rule) or the subject need only pass one (the 'OR' rule). Daugman finds that under either rule there would be 1,100 errors, i.e. 5.5 times more errors than if the stronger test were used alone. He concludes that a stronger biometric is therefore better used alone than in combination, but only when both are operating at their crossover points. If the false accept rate (when using the 'OR' rule) or the false reject rate (when using the 'AND' rule) is brought down sufficiently (to "smaller than twice the crossover error rate of the stronger test", says Daugman) then use of two can improve results.

If we recklessly attempt to put a non-mathematical gloss on that, we could think of the subject having to pass two tests (in the case of the 'AND' rule) of, say, facial and iris. Dropping the false reject rate of the facial test (i.e. letting more people through) in line with Daugman's calculations would produce a better result than using iris alone, but if the facial system rejects fewer people wrongly, then it will presumably be accepting more people wrongly. Which suggests to us that simply regarding a second or third biometric as a fallback, to be used only if earlier tests fail, constructs a scenario where the combined results will be worse than use of the single stronger test: in such cases the primary biometric test would have to be sufficiently strong to stand on its own, because you won't always be using the second or third test.
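For anyone who'd rather check the sums than take them on trust, here is a quick sketch (ours, not Daugman's code) that reproduces the figures quoted above, reading them as error counts over 100,000 impostor attempts and 100,000 genuine attempts:

```python
# Reproduces the worked example above: biometric one has false accept and
# false reject rates of 1 in 100, biometric two of 1 in 1,000. Counts are
# taken over 100,000 impostor attempts plus 100,000 genuine attempts
# (our reading of "errors in 100,000 tests").
ATTEMPTS = 100_000

FAR1 = FRR1 = 1 / 100     # weaker test
FAR2 = FRR2 = 1 / 1000    # stronger test

def errors(far, frr):
    # false accepts among impostors plus false rejects among genuine users
    return ATTEMPTS * far + ATTEMPTS * frr

print(round(errors(FAR1, FRR1)))  # 2000 - weaker test used alone
print(round(errors(FAR2, FRR2)))  # 200  - stronger test used alone

# 'AND' rule: accept only if both tests accept.
far_and = FAR1 * FAR2                     # both must falsely accept
frr_and = 1 - (1 - FRR1) * (1 - FRR2)     # a false reject by either rejects
print(round(errors(far_and, frr_and)))    # 1100

# 'OR' rule: accept if either test accepts.
far_or = 1 - (1 - FAR1) * (1 - FAR2)      # a false accept by either accepts
frr_or = FRR1 * FRR2                      # both must falsely reject
print(round(errors(far_or, frr_or)))      # 1100
```

Either way the combination makes 5.5 times as many errors as the stronger test used alone, which is Daugman's point: mixing a strong biometric with a markedly weaker one, with both running at their crossover points, buys you worse performance, not better.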
The deployment of biometric testing equipment in the field is also likely to have a confusing effect on relative error rates, because environmental factors will tend to impact the different tests to different degrees. Poor lighting may have an effect on iris and facial but not on fingerprint, while the aircon breaking down may produce greasy fingers and puffy red faces, but leave iris intact. Which would presumably mess up attempts to sync error rates. But we feel ourselves beginning to intuit, and had perhaps best back off before phalanxes of irate mathematicians come after us. On the upside for the Home Office, Daugman points out that the combination of two tests of equal power - the iris patterns of both eyes, or two of a person's fingerprints - can enhance performance fairly easily. This actually provides some justification for the Home Office starting to count eyes and fingers individually, although the way they're putting it still sounds like the techies told them something, and now they're trying to repeat it without really understanding. The extent to which they really do count the biometrics separately will also be important. Daugman points out that his calculations only deal with "decision-level fusion" (i.e. applying the decision rules to the individual biometrics separately), but there are other approaches such as sensor fusion, where the data is combined before decision rules are applied, or combining similarity scores before applying decision rules. As far as fingerprint is concerned, the Home Office certainly intends to have all ten prints on file, but there are all sorts of different ways that a test could read the data. Is a 'handslap' reading five individual biometrics read at once, or just the one? It depends how you treat it and how you use the decision rules on the data, and how you do this will have an effect on the validity of your claims about multiple biometrics.
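To make the fusion distinction concrete, here is a small and entirely hypothetical Python sketch contrasting decision-level fusion (threshold each biometric separately, then combine the yes/no decisions) with score-level fusion (combine the raw similarity scores first, then apply a single threshold). The scores, weights and thresholds are invented for illustration and bear no relation to any real system the Home Office might deploy.

```python
# Hypothetical similarity scores for one subject (0 = no match, 1 = perfect match).
scores = {"iris": 0.93, "face": 0.58, "fingerprint": 0.81}

# Per-biometric acceptance thresholds (invented for illustration).
thresholds = {"iris": 0.90, "face": 0.70, "fingerprint": 0.75}

# Decision-level fusion: threshold each biometric on its own, then combine
# the individual accept/reject decisions with an 'AND' or 'OR' rule.
decisions = {name: scores[name] >= thresholds[name] for name in scores}
accept_and = all(decisions.values())   # must pass every test
accept_or = any(decisions.values())    # need only pass one test

# Score-level fusion: combine the raw scores first (here a weighted sum),
# then apply a single decision threshold to the fused score.
weights = {"iris": 0.5, "face": 0.2, "fingerprint": 0.3}
fused = sum(weights[name] * scores[name] for name in scores)
accept_fused = fused >= 0.75

print(decisions)       # {'iris': True, 'face': False, 'fingerprint': True}
print(accept_and)      # False - the weak face score vetoes the match
print(accept_or)       # True
print(round(fused, 3)) # 0.824
print(accept_fused)    # True - the strong iris score outweighs the weak face score
```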
John Lettice, 19 Oct 2005

Whereonearth is Yahoo!'s

Yahoo! has beamed up British local search developer Whereonearth as the big global web players continue their scramble to get really, really local. If nothing else, the deal means Yahoo! redresses one of its biggest current failings: the lack of a major product line with Earth in the title. Since the launch of Google Earth and MSN Virtual Earth, Yahoo had been embarrassingly terra free. Whereonearth’s technology is already used by Yahoo!, as well as Hutchison 3G, lastminute.com, Royal and Sun Alliance and Recruitsoft. The ten-year-old London-based company is currently owned by Elderstreet Capital Partners, M J Technologies, and Reuters Venture Capital. Yahoo! said Whereonearth’s technology would soon find its way into its search marketing, search, local/maps, mobile and personals offerings. The companies did not discuss how much Yahoo paid. It’s fair to assume the price tag will barely dent Yahoo’s cash balance, which, according to results published yesterday, currently stands at $2bn, up from a comparatively skimpy $650m a year ago. Revenues for the third quarter ended September 30 were up 47 per cent on the year to $1.3bn. This led to operating income of $270m, up 57 per cent, while net income was $254m, virtually unchanged on the previous year. ®
Team Register, 19 Oct 2005

DVD Jon joins MP3tunes.com

Norwegian programmer Jon Lech Johansen, AKA "DVD Jon", has moved to San Diego after being hired by tech entrepreneur Michael Robertson to work on a new digital media project called Oboe. Wired describes the link up as the most portentous since Butch linked up with Sundance, and although this one is unlikely to end in a shoot-out with the Bolivian Army we can even now imagine the entertainment industry lining up a legal posse to range against two thorns in its side. Johansen, a world-famous reverse engineer while still in his early teens, was twice acquitted on charges relating to his involvement in creating and distributing a utility (called DeCSS) for playing back DVDs on his own Linux PC. Since then his hacks on copy protection components of Apple's iTunes and Microsoft's Media Player software have made him a hero to programmers and the bete noire of media conglomerates. Robertson is no stranger to controversy either, having founded digital music company MP3.com, sued by recording companies over its online music lockers service, before taking on Microsoft on the desktop with an open source alternative that changed its name from Lindows to Linspire last year in settling a protracted legal battle with the software giant. Robertson founded a new firm, MP3tunes.com, last year. In a posting on his blog, Robertson said he'd hired Johansen to work on Oboe, which is cryptically described as an open system project at MP3tunes.com to "bring digital music into the 21st century". "I knew he'd be a great fit for the team, so I quickly extended him a job offer. It took a few months to process the immigration paperwork, but now he's living in San Diego and working on Oboe," Robertson writes. Johansen told Wired that since Norway adopted a European Union directive outlawing the circumvention of copy protection technology in July, he is no safer from legal assault in his home country than in the US. "In Norway, you have the same laws (as in the United States) now. So it makes no difference if I'm doing my work here or there," he said. "I plan to continue my research, but I won't be writing any tools (while in the US)," he added. Johansen said that he'd moved to the US because he wanted to work on consumer (as opposed to enterprise) software development projects. "I'm not scared about being arrested now that I'm here. Michael has good lawyers," he said. ® Related links Johansen's "So Sue Me" blog
John Leyden, 19 Oct 2005

Fujitsu and Software AG set sail to chart services

Fujitsu Software and Software AG are collaborating to make it at least a little bit easier to map the software services floating free around the typical organisation. One of the troubles with Service-Oriented Architectures is the very obvious fact that just about every software vendor in the world has a story to tell users, and most of the stories are subtly – and sometimes significantly – different. If the vendors were ancient mariners, they would be spinning yarns of adventures in uncharted waters, and if they had any charts at all they would be littered with the legend: 'Here Be Dragons'. So, if users looking towards the notion of exploiting SOA can get their hands on some technology that provides a map of some sort, it could be an advantage, especially if it also helps them turn some of the dragons into pussy cats. That is, at least in part, the objective of CentraSite, a joint development between Fujitsu Software and Software AG. This is a repository of applications metadata that aims to provide a topology of the available services within an enterprise, covering the relationships between the functions being employed to create them. The idea is that it should become the map of what makes up a service, be it new applications code, existing legacy applications, or component functions effectively 'extracted' from those legacy apps. This latter capability should mean that users can get a better level of re-use out of existing applications investments. The idea of a repository or map of a service is not new. Companies such as Systinet have already launched products in this area. But Fujitsu and Software AG claim that CentraSite adds new capabilities, such as the management of metadata concerning the initiation of services, plus providing dependency information and impact analysis reports. To begin with, Fujitsu and Software AG are aiming to exploit the development themselves. Fujitsu will be incorporating the technology into its Interstage Business Process Manager, while Software AG will be adding it to its Enterprise Information Integrator and Enterprise Service Integrator. The pair does, however, plan to start a community program aimed at getting both external business partners and other software vendors to start using CentraSite. ®
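We have no sight of CentraSite's actual interfaces, but the dependency-and-impact idea is simple enough to sketch. The Python fragment below models a toy registry of services and the services they call, then walks the graph to answer the impact-analysis question - if this service changes, what else needs retesting? Every name and the structure itself are ours, purely for illustration, and say nothing about how CentraSite is built.

```python
from collections import defaultdict

# Toy registry: each service maps to the services it depends on.
# Service names are invented for illustration only.
registry = {
    "order_entry":     ["customer_lookup", "credit_check"],
    "credit_check":    ["legacy_ledger"],
    "customer_lookup": ["legacy_crm"],
    "shipping":        ["order_entry"],
    "legacy_ledger":   [],
    "legacy_crm":      [],
}

# Invert the graph: for each service, which services depend on it?
dependents = defaultdict(set)
for service, deps in registry.items():
    for dep in deps:
        dependents[dep].add(service)

def impact(service):
    """Return every service directly or indirectly affected if `service` changes."""
    affected, stack = set(), [service]
    while stack:
        for parent in dependents[stack.pop()]:
            if parent not in affected:
                affected.add(parent)
                stack.append(parent)
    return affected

# If the legacy ledger changes, credit checking, order entry and shipping
# all need looking at.
print(impact("legacy_ledger"))  # credit_check, order_entry, shipping (in some order)
```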
Martin Banks, 19 Oct 2005

Cautious welcome for new MS shared source licences

Microsoft has announced three new licence templates for its Shared Source Initiative that it says should help combat the problem of licence proliferation in the open source developer community. Jason Matusow, director of the SSI, said Microsoft has tried to write licences that are simple to understand without the assistance of a highly priced lawyer, that make it easier for the developer to know what is and isn't allowed under the licence conditions, and that make it easier for Microsoft developers to get code shared. He made the announcement at the EuroOSCON conference in Amsterdam. The three licence templates will not retrospectively replace licences already issued, but Matusow says Microsoft will consider changing projects over to one of the new templates on a case by case basis, if developers ask. The first of the three templates is the Permissive Licence - Ms-PL. As its name suggests, this is the most flexible of the three. Matusow says the only restrictions it places on what you do with the code are that you mustn't use the Microsoft trademark, and that the code must carry proper attribution. "It should be compatible with most open source licences, but I think it would still conflict with the GPL," Matusow said. Second is the Community Licence, Ms-CL. This is a reciprocal licence that Matusow says is probably most similar to the Mozilla public licensing model. "It basically says that if you modify and distribute the code, you must give that file back to the community," Matusow told us. There are also limited versions of both the Ms-PL (Ms-LPL) and the Ms-CL (Ms-LCL) that restrict the use of the source code to the Windows environment. The final template is the Reference Licence, Ms-RL, which is essentially a look-but-don't-touch licence for developers working on interoperability issues and so on. Georg Greve, president of the Free Software Foundation Europe, says that at first glance the licences look "quite interesting", adding that more careful analysis is required. "At a first glance, the Ms-PL and Ms-CL both appear to satisfy the four freedoms defining Free Software. In particular the Ms-CL also appears to implement a variation of the Copyleft idea, which was first implemented by the GNU General Public License (GPL)," Greve told The Register. This, he added, is something of a turnaround for the company, which he says has previously referred to the GPL as "viral", "cancerous" and "communist". He went on to stress that while publishing well crafted licences is one thing, making software freely available is quite another: "It would have been preferable if Microsoft had made the decision to use the GNU General Public License (GPL) and Lesser General Public License (LGPL) for its Shared Source program." Professor Larry "Creative Commons" Lessig and Ronald Mann, Law Professor at the University of Texas, have also given the licences the thumbs up. Lessig welcomed the move because it reduced the number of licences floating around the open source community, arguing that the proliferation of licences is the single most important problem in the world of free and open source licensing, because many are incompatible. You can read the new licence templates for yourself here. ®
Lucy Sherriff, 19 Oct 2005

Snort plugs Back Orifice as Oracle issues mega-fix

Patch roundup Wednesday became a busy patching day for sys admins with the release of Oracle's quarterly patch roundup - boasting an impressive 85 software fixes - and an update designed to defend the popular Snort open source intrusion detection application against possible hacker attack. Oracle's mega update covers a variety of security flaws involving vulnerabilities in its database software and business applications that, at worst, create a means to launch potent SQL injection and cross-site scripting attacks. The uber fix also includes less serious security bug fixes and software tweaks unrelated to security issues. Oracle has put together a patching matrix designed to help IT managers make sense of the importance and impact of these patches within their environments. The vulns were discovered by David Litchfield of NGSSoftware, among others, and summarised by security notification service Secunia here. Elsewhere there's a buffer overflow vulnerability in the Snort security tool to worry about. A component of the product designed to recognise Back Orifice hacker tool traffic is flawed to such an extent that it might be used to inject hostile code onto affected systems. "A stack-based overflow can be triggered with a single UDP packet, allowing an attacker to fully compromise a Snort or Sourcefire installation," warns security tools vendor ISS, which unearthed the security glitch. The vulnerability has been reported in Snort versions 2.4.0, 2.4.1, and 2.4.2. Users are advised to upgrade to version 2.4.3. Sourcefire, the commercial firm whose technology is based on Snort, issues similar advice here. Possible workarounds and more info can be found in a US-CERT advisory. ®
John Leyden, 19 Oct 2005

EMC cheers software surge and Q3 sales swell

Aided by higher software and services revenue, EMC posted double-digit growth in its third quarter. It was the ninth consecutive quarter of double-digit gains for the storage giant. EMC posted revenue of $2.37bn - a 17 per cent year-over-year rise from the $2.03bn posted in the same quarter last year. Net income during the third quarter reached $422m, which compares to net income of $218m reported in 2004. All of EMC's major businesses enjoyed gains, but software and services showed the most significant growth. The solid quarter put CEO Joe Tucci in the mood to praise a job well done. "Few companies in our industry have been able to match the record of consistency and execution that EMC has achieved so far this year," Tucci said. "Each of our major business segments and geographies continued to grow by double-digits during the third quarter, reflecting the power of our model, the soundness of our strategy and the excitement customers have about our expanding portfolio of solutions." EMC's extended software company buying spree continues to pay dividends, especially from a growth perspective. The company often points to subsidiary VMware as one of its high-flyers and has seen software revenue swell to account for 37 per cent of total sales. It's hard to say if EMC can keep up such growth in the long run without making similar, large acquisitions. In the meantime, however, it has the luxury of gloating over hardware rivals stuck in single-digit growth territory. EMC's storage systems revenue increased 15 per cent year-over-year to $1.09bn. Software license and maintenance revenue jumped 16 per cent to $865m, while services revenue surged 25 per cent to $402m. In the fourth quarter, EMC expects revenue to come in between $2.67bn and $2.69bn. ®
Ashlee Vance, 19 Oct 2005

IBM unsheathes blade PCs

PCs live on at IBM, where Big Blue reckons it can squeeze up to 20 traditional desktop users on a single blade server, and has tapped VMware and Citrix as software partners for its new "IBM Virtualized Hosted Client Infrastructure" program. Start-up ClearCube has been hammering away at the blade PC idea for some time, and continues to partner with IBM. HP too has a PC blade, although IBM was quick to remind us that this product is designed for a single user instead of numerous users on one machine. The big sell with these products has always been that they're easier to manage, quieter, cheaper and more secure than standard PCs. The big downside has always been that they're not PCs. Few companies can stomach the idea of ripping out their traditional PC infrastructure in favor of this newfangled blade thing. By tapping close partner VMware's technology, IBM thinks it may have an edge over other PC blade and thin client rivals. VMware's code helps divvy up IBM's Xeon- and Opteron-based blade servers. "VMware virtual infrastructure boosts server utilization rates to up to 80 per cent, driving more efficiencies than a solution that is merely dedicated hardware to support desktop features," said VMware VP Brian Byun, who many of you will remember from yesterday. IBM will then use Citrix Presentation Server to transmit data to the client terminals and give users a virtual Windows XP experience. IBM has started beta testing the Virtualized Hosted Client Infrastructure and will sell the package through global services starting in the first quarter of 2006. ®
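As a back-of-the-envelope illustration only: taking IBM's "up to 20 users per blade" figure at face value, sizing such a deployment is simple division plus headroom. The helper below and its numbers are ours, not IBM's, and are no substitute for real capacity planning.

```python
import math

def blades_needed(desktop_users, users_per_blade=20, spares=1):
    """Rough sizing sketch for a hosted-client deployment.

    users_per_blade reflects IBM's 'up to 20 users per blade' claim;
    spares adds standby blades so a failure doesn't strand users.
    Purely illustrative - real workloads will vary widely.
    """
    return math.ceil(desktop_users / users_per_blade) + spares

# A 500-seat office at the optimistic 20-users-per-blade figure:
print(blades_needed(500))       # 26 blades (25 + 1 spare)
# The same office if heavier workloads only allow 12 users per blade:
print(blades_needed(500, 12))   # 43 blades (42 + 1 spare)
```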
Ashlee Vance, 19 Oct 2005

Apple intros dual core PowerPC Macs

With what may be the final crank of the PowerPC handle, Apple has introduced dual-core machines into its professional desktop line-up, including a liquid-cooled virtual four-way machine. The PowerBook range doesn't gain any chip improvements, but two of the models gain higher resolution screens. The base PowerMac now features a single dual-core 2.0GHz G5 970MP CPU and an NVidia 6600 LE display card for $1,999. A faster 2.3GHz model with the uncompromised 6600 card and a larger disk is available for $2,499, and a "quad" - with two dual-core 2.5GHz G5 processors - will ship at $3,299. These replace the two dual-CPU models, running at 2.5GHz and 2.7GHz, previously offered by Apple. Apple claims the quad encodes video from Final Cut Pro 60 per cent faster than the two-way single-core 2.7GHz PowerMac. Prospective PowerBook purchasers will be pleased with the new displays on the 15" and 17" models, but disappointed by the absence of faster CPUs and no changes at all to the popular 12" model. Apple's 15" PowerBook has shipped with a 1280x854 resolution display since April 2002, when it was in the Titanium enclosure; this has been increased to 1440x960. Apple's 17" PowerBook has shipped with a 1440x900 display since its launch in January 2003, but is upped to 1680x1050. Both machines retain the $1,999 and $2,499 price points. Apple also claims to have wrung an extra hour's battery life from the new models. The twelve incher, described as the "redheaded stepchild" of the PowerBook family, is untouched but receives a $100 price cut in the United States. Unlike its 15" sibling, the 12" model has no FireWire 800 port, inferior graphics, and a lower memory ceiling. When IBM unveiled updated 970 processors back in July, it also gave details of the 970FX. This low power notebook chip consumes less power than a Centrino running flat out, and is available at 1.2GHz, 1.4GHz or 1.6GHz clock speeds. According to an AppleInsider report, Apple's depleted PowerPC engineering teams were fighting bugs in the 970FX chipset, and decided to stick with the Freescale processors used today. Apple also announced a professional photography tool, Aperture, today: a major application with hefty system requirements. ®
Andrew Orlowski, 19 Oct 2005

Acopia promises to give you a good look at a very broad NAS

The last thing the folks at Acopia Networks said to us was, "Be nice." So, of course, we'll have to be mean as hell. Only it's hard to eviscerate a storage and networking virtualization start-up. You can hardly ever tell exactly what these companies do. Their customers, although few, tend to love the product in question. And there's a whole bunch of companies floating around that seem to do very similar things, which would be bad if it didn't validate the market. Acopia fits into that series of broad generalizations well. The company makes a switch for linking numerous NAS (network attached storage) systems from the likes of NetApp and EMC. Using proprietary software, Acopia can create a single file system that stretches across all of these NAS boxes and creates a global namespace - something all the big boys are trying to do. It boasts big name customers such as Merrill Lynch and Warner Music Group that just worship its products. These guys use Acopia's gear to tie together huge numbers of systems and to wring better performance out of each box. (You probably know this already though, because Acopia has been pointing to the same two customers during presentations since at least June of 2004, which makes one curious to say the least.) In addition, Acopia dukes it out with other NAS start-ups NeoPath Networks, NuView and Rainfinity, which was purchased by EMC in August. It's hard to say how accurate customer counts from any of these vendors are, but analysts seem to give the edge to NuView and Rainfinity - both of which are a couple of years older than Acopia. The basics: having proven that Acopia fits the storage start-up mold, we should probably go ahead and tell you a bit more about what the company does and how it does it. Acopia sells three flavors of its ARX switches. On the low-end is the ARX500 "departmental switch," which is a 1U box that can handle close to 100MB/sec of data and manage more than 120m files. In the midrange sits the ARX1000 - a 2U box capable of 400MB/sec data rates and managing up to 300m files. On the high-end is the 13U monster ARX6000, which can pump data at 2GB/sec and manage more than 1bn files. The company reckons that customers will typically pay $200,000 for a deployment, including two of the Acopia systems for redundancy. Larger configurations can hit $600,000 or even over $1m, said Joe Wisniewski, product marketing manager at Acopia. Customers will plug the Acopia switches into their existing data centers and use the boxes to virtualize CIFS and NFS file systems. The nice bit about the Acopia technology is that customers can leave their NAS hardware up and running when first installing the gear. In addition, administrators can increase file system sizes or move a file system from one NAS box to another without shutting down any end user services. "You can do things like moving data off filers onto cheaper storage or setting up more complex tiered storage systems," Wisniewski said. "All this happens without end users ever knowing the data was migrated." So far, customers have tended to use the Acopia gear to virtualize large pools of NetApp and EMC systems. Increasingly, however, the company says that customers are looking to use the technology for heterogeneous networks as well. Acopia faces competition from the other start-ups as well as larger players such as EMC, IBM and even Cisco that all have broad virtualization aspirations. The debate rages on as to whether servers, storage boxes or switches will ultimately control most of the virtualization functions.
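Acopia hasn't shown us its internals, so the following Python sketch is only our way of illustrating the general file-virtualization idea: a global namespace sitting in front of several filers, with a mapping table that lets data be migrated from one back-end box to another while the client-visible path stays the same. Every hostname and path here is invented.

```python
# Toy model of a global namespace in front of several NAS filers.
# Clients see only virtual paths; the switch maps them to physical exports.
# All hostnames and paths are made up for illustration.

namespace = {
    "/corp/engineering": "filer-netapp-01:/vol/eng",
    "/corp/finance":     "filer-emc-02:/export/fin",
    "/corp/archive":     "filer-cheap-03:/export/old",
}

def resolve(virtual_path):
    """Map a client-visible path onto the physical filer export behind it."""
    for prefix, physical in namespace.items():
        if virtual_path.startswith(prefix):
            return physical + virtual_path[len(prefix):]
    raise FileNotFoundError(virtual_path)

def migrate(prefix, new_physical):
    """Re-point a virtual directory at different back-end storage.

    In a real deployment the data would be copied first; clients keep
    using the same virtual path throughout, which is the whole point.
    """
    namespace[prefix] = new_physical

print(resolve("/corp/finance/q3/results.xls"))
# -> filer-emc-02:/export/fin/q3/results.xls

# Tier the finance share down onto cheaper storage...
migrate("/corp/finance", "filer-cheap-03:/export/fin")

# ...and the same client-visible path now resolves to the new box.
print(resolve("/corp/finance/q3/results.xls"))
# -> filer-cheap-03:/export/fin/q3/results.xls
```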
In the meantime, Acopia can help out customers with NAS management disasters on their hands. The company has vowed to make its management software better and to add more sophisticated tools for keeping track of large numbers of systems - all of which sounds lovely and nice. Two of Acopia's staffers chatted with us this week but didn't seem to know thing one about El Reg. As mentioned, asking us to be nice is typically a recipe for total disaster. Luckily for the marketing drones, we have it on high authority that members of the Acopia engineering staff rely on El Reg for their daily information dose. As usual, the engineers have saved the day. ®
Ashlee Vance, 19 Oct 2005

Exchange Server SP arrives

Microsoft has beefed up mobile capabilities and security in its collaboration platform with the release of Exchange Server 2003 Service Pack (SP) 2. The updated email server now uses an HTTP connection to automatically push new email messages, calendar and contact information out to the end user, instead of relying on the short message service (SMS). Improvements to Outlook include a new Global Address List (GAL) for looking up other users. Security has been enhanced on a number of levels. Greater device control is introduced, with the ability to decide how many incorrect log-on attempts are permitted and to reset lost, stolen or misplaced passwords over the web. Protection against spam is also improved: SP2 includes the latest updates for the Exchange Intelligent Message Filter and adds support for Microsoft's Sender ID email authentication protocol to tackle phishing and mailbox spoofing. The next full version of Exchange Server, codenamed Exchange 12, is expected in 2006. ®
Gavin Clarke, 19 Oct 2005

Process in crosshairs at Rational, Borland

The sparring match for developer mindshare between Borland Software and IBM/Rational has continued, as the companies focus on "process" instead of tools. IBM/Rational is reported to have replaced its Rational Unified Process (RUP) product and Rational Process Workbench with a new offering, the Rational Method Composer, based on Eclipse. RUP provides processes and methodologies to help developers improve the way they build software. Rational Method Composer extends RUP, co-developed by Rational chief scientist Grady Booch, by adding best practices for developing Service Oriented Architectures (SOAs), portfolio management and distributed development. Last week, IBM released a subset of RUP to the open source Eclipse Foundation. IBM/Rational wishes to foster an "industry wide effort to synthesize, share and automate development processes and best practices." IBM/Rational's Java and application lifecycle management (ALM) competitor Borland, meanwhile, announced its acquisition of IT management and governance specialist Legadero Software. Legadero's tools help organizations manage requests for changes to systems, enforce policy, and identify and assign resources to projects. Legadero's tools will be re-branded Borland Tempo. The acquisition follows Borland's purchase of TeraQuest Metrics in January, to help improve aspects of application development such as project planning and requirements gathering. TeraQuest was home to co-authors of the Capability Maturity Model (CMM) used to drive best practices in software development. Borland said the Legadero purchase helped fulfill its vision for Software Delivery Optimization (SDO): streamlining application development and reducing costs. Financial terms of the deal were not revealed. ®
Gavin Clarke, 19 Oct 2005

What does Microsoft's new shared source mean for you?

Analysis Open source advocates are doing what was once unthinkable - giving the thumbs up to a Microsoft source code licensing program.
Gavin Clarke, 19 Oct 2005