The promise that space travel will one day become as cheap, as safe, and as mind-numbingly tedious as air travel will inspire millions of youngsters to dedicate their lives to science and engineering, SpaceShipOne Master and Commander Burt Rutan apparently believes. Today's young lack the inspiration of heroic figures like Yuri Gagarin, Alan Shepard, Chuck Yeager, and others who flew rickety junkers to the outer edge of acceptable risk and came back to talk about it, Rutan suggested during a Washington press conference last week. After ridiculing NASA's appalling cowardice in creating "an environment in which we right now don't have the courage to go back to [repair] the Hubble telescope," and noting that NASA's profound risk-aversion has actually made leaving the atmosphere more dangerous than it used to be, Rutan suggested that gutsy entrepreneurs like himself can revive a dormant sense of hero worship among children and so lead them to productive careers in aerospace engineering. NASA is destined to be sidestepped by commercial outfits, because it is not doing anything fun or inspiring, and it kills too many people. "The public is not excited about spending billions for a space station that gets only a tiny amount of science and isn't built as a staging area to go somewhere and explore," he said. "I don't think the American taxpayer is excited about any orbital or moon work that doesn't involve being able to fly us." Not only will space tourism satisfy some tremendous, pent-up demand for novel holiday activity, it will bring about an exploration Renaissance. "I think the big thing we need to do is to inspire our present generation of kids, because, if we don't, what are we going to expect in the future?" Rutan asked rhetorically. "The folks that were inspired by the invention and quick development of the airplane turned out to be - every darn one of them turned out to be on my list of the top ten mentors and heroes."
"I was inspired by the opening up of the jet age and the missile age," he added. "But we have to have something to inspire our kids now, and we have to do it by taking risks and we have to do that by moving there," he explained. However, space travel must first become safe enough, and dull enough, for mass public consumption, he said. Rutan echoed his previous testimony before the House Subcommittee on Space and Aeronautics back in April, when he noted that so long as space travel remains as expensive and dangerous as, say, climbing Mount Everest, likely consumers of such services will remain in the range of 300 to 500 per year. Nevertheless, Rutan sees a mass payoff for all humanity if space flight can be made safe, fun, and predictable, like a Disney cruise. He reckons that something like 100,000 eager consumers per year can be expected to line up for relatively cheap flights above the atmosphere within twelve years' time, so long as the industry shapes itself as he imagines. Of course, suggesting that mass space tourism could possibly inspire the next generation of Alan Shepards is to suggest that Carnival cruises have been inspiring the next generation of Jacques Cousteaus. It's a preposterous claim. But Rutan seems intelligent enough to discern the glaring differences between, say, Ferdinand Magellan on the one hand, and a pleasantly drunk tourist wandering about in what amounts to a floating Vegas hotel, to whom the sea is the single least noticeable feature in his vicinity, on the other. So if we rule out dullness, Rutan's proposition can only be explained as a deliberate hustle, playing on techno-utopian fantasies. He needs to attract capital, and he's also likely seeking legislation that will, at least, keep NASA out of his way, to do what he intends to do - which is to enact his own ambitions inspired by his own heroes.
And since claiming the X prize, he's automatically been granted access to the audience that he needs to translate his scheme into action. If he does turn space travel into the next Disney cruise, so be it. He's certainly chosen the ideal partner in stunt-master Richard Branson, a man who clearly knows how to make a pretty penny off tourism, entertainment, and manufactured fun. But he's got to drop the heroism hustle: a trip on Virgin Galactic is hardly going to become a badge of honor. And no amount of marketing rhetoric will make it otherwise. Flying Concorde between New York and Paris did not make us Chuck Yeager - after all, he had to disdain the drinks cart to stay alive up there. ®

Related stories
Sigourney Weaver books flight with Virgin Galactic
Amazon supremo joins space race
Burt Rutan takes a V2-powered wander down memory lane
SpaceShipOne claims X-Prize
Joichi Ito, the American businessman who after a kind of immaculate conception hatched forth as a pre-formed internet celebrity a couple of years ago, is having a crisis. Ito's gauche weblog has built up a cult following amongst hopeful software authors, lonely maternal housewives, out-of-luck marketing consultants, and excitable South American computer enthusiasts. His reports from the high table of the internet's High Society have enthralled an audience that runs well into three figures, and will keep sociologists busy for years to come. "He doesn't need employees; he has the posse," burbled one typical piece of ass-kissing published in Fast Company magazine. But now Ito wonders if the public diary format of the weblog is too restrictive for his talents as venture capitalist, tireless networker and ICANN board member. Soon after he blogged this, a coterie of blog fans urged him to keep his fingers firmly welded to the keyboard. The blog must go on, they insisted. One fan begged, "please do continue to post silly-opinionated-not-well-thought through stuff. i for one love your blog for just that reason". "Don't let the bastards install a cop-chip in your head," was the rather bizarre advice from someone called Cory Doctorow. Having got to know Ito personally, we're in a privileged position. For a start, the private Ito is an engaging fellow who bears only a passing resemblance to the naif who writes the Joi Ito weblog. Far from being the "Help - I can't speak Engrish so well!" character that prickles the maternal instincts of menopausal Coffee Klatch Mums across the web, the real Ito is a funny, smart and slick American businessman. He asks the right questions - a rarity amongst his cult following. Maybe we just recognize a cynic when we see one, but Ito is clearly a survivor. So what's behind his deeper malaise? "Emergence is our religion," Ito once told your reporter with a knowing wink, after a few beers.
Over at life-enhancement.com, Ito recently enthused about how "a sort of intelligence will form just by connecting everyone together." It's the old 1980s AI rhetoric, updated for the TCP/IP age. But society isn't a computer network, and computer networks are a lousy metaphor for society. Systems thinking once excited a lot of people because it appeared to offer a way out of the over-specialization of the sciences. That may still be so, but it's a mistake to believe it can offer a substitute for a religion. Compared to the rich metaphysical belief system offered by a real religion, faith in computer networks invariably leads to disappointment, and from there, it's a short step to fatalism and, in some cases, a very cynical brand of misanthropy. (When "getting everyone connected" is the goal, and a third to a half of the world stubbornly refuses to "get connected", the techno-utopian invariably blames the people, not the computer). So this kind of thinking attracts a lot of flakes, and it also produces rather flakey computer systems, which is where we begin to take an interest. If technology is going to benefit society it has to be much, much better than it is now. If it isn't, the results could be catastrophic. So you don't need to be a paid-up God-botherer to see the shortcomings of this faith. The collected works of the Brothers Grimm - or perhaps even Captain WE Johns - offer a more coherent and useful framework. And the techno-utopians have great plans for us, if only the world would listen - and the call to arms weren't so comically bathetic: "I think the recent back and forth including Paul Boutin's defense of Andrew shows that we're hitting a nerve, but that we probably should show some progress soon," Ito once wrote in a mailing to the "Emergent Democracy" mailing list, shortly before "Emergent Democracy" sank out of sight for good. "Also, can someone send me the link to the Wiki? I moved machines and lost my bookmarks."
Would you buy a used PC, let alone a New Model Democracy, from these people? Perhaps, if Joichi could leave his career considerations aside for a moment, he could make a lasting impact by blogging how ICANN really works from the inside. Or perhaps he doesn't need an online identity anymore, and he'll vanish from the "blogosphere" as rapidly as he arrived, leaving only trackbacks behind, as a kind of cybernetic placenta. We rather hope not, because in contrast to the gallery of grotesques that appears daily on his "Random Faceroll" - self-selected to be thin-skinned, humorless and outright creepy - he's been quite entertaining. ®

Related stories
The RoTM™ Vaulting into a Rapturous techno-future with Jaron Lanier
US netizens: white, wealthy and full of it - shock!
Google founder dreams of Google implant in your brain
Digital memories: cheap to take, cheaper to lose
Anti-war slogan coined, repurposed and Googlewashed in 42 days
This story has expired from The Register's archive. You can now find it at its original location on the Forbes.com website: http://www.forbes.com/business/2005/05/25/cz_dl_0525linux.html?partner=theregister.
Asia is shaping up as a competitive area for two open source middleware organizations, following JBoss's latest partnering deal. The open source specialist has signed a consulting and systems integration agreement with Nomura Research Institute (NRI). NRI will promote deployments of JBoss's Enterprise Middleware System (JEMS), making JEMS a "safe choice" for Asian enterprise customers, JBoss says. NRI will provide Level 1 and Level 2 professional support, systems integration and management services while JBoss will provide NRI with development support, training and professional certification. JBoss's partnership follows increased activity in the region by ObjectWeb Consortium. ObjectWeb is developing its own open source middleware stack that includes a Java application server; this month it reached an agreement with China's Guangzhou Middleware Research Center (GMRC) which will foster ObjectWeb's presence in China. Among its activities, GMRC will recruit new consortium members and promote contributions to the ObjectWeb code. The consortium, meanwhile, is sponsoring Vietnam's COSGov conference on use of open source in e-government. ®

Related stories
IBM moves onto JBoss turf with Gluecode buy
JBoss moves up to business processes
Red Hat makes money, pledges full open architecture
A Novell-backed project seeking to create an open source version of web services technology destined for future versions of Windows could hit an IP hurdle from Microsoft. Evaluation has begun on an open source version of Indigo, the web services communications layer due in Longhorn and updates to Windows XP and Windows Server 2003. Supporters of open source Indigo, part of the long-running Mono Project, say it will spread the adoption of web services, because it enables developers to use Indigo on MacOS, Unix and Linux. But Microsoft suggests there could be issues over licensing of its intellectual property (IP) used in Indigo - even though previous elements of Mono have been developed by the community (thanks to some nifty standards work by Microsoft). So far, Mono has delivered an implementation of Microsoft's .NET Framework that includes ASP.NET, ADO.NET and compilers and tools for the Visual Basic .NET and Visual C# .NET languages, along with open source Unix and Gnome libraries. Mono has benefited from Microsoft's decision to register some key elements of the .NET Framework as standards in 2001 with the European Computer Manufacturers' Association (ECMA) and in 2003 with the International Organization for Standardization (ISO). But it seems that either Longhorn falls outside of Microsoft's definition of the .NET Framework or that Microsoft is irritated that a competitor is now supporting Mono. Novell has begun using Mono in its iFolder file sharing application for Linux, Windows and Mac workgroups. Novell inherited Mono and project leader Miguel de Icaza with the acquisition of open source desktop specialist Ximian in 2003. A Microsoft spokeswoman told The Register: "While Microsoft is quite open to discussing with Novell the licensing of potentially applicable intellectual property, Novell has not licensed anything or even approached Microsoft on this topic." Novell could not be reached for comment at the time of writing.
At least one ISV, specializing in cross-platform application development, is enthusiastic about the project's potential. Yaacov Cohen, Mainsoft president and chief executive, called Indigo "phenomenal", saying it would help developers build Service Oriented Architectures (SOAs). Mainsoft has worked on Mono for two years and intends to contribute resources to this latest project. De Icaza told The Register that Indigo has most promise for middleware companies, adding that it would likely emerge in pieces through the Mono community's work on elements such as a transaction manager and queuing system. Mainsoft, meanwhile, hopes it can encourage Windows developers to enhance Mono. Mainsoft's MainWin for J2EE Developer Edition, launched yesterday, exposes Mono's ASP.NET and ADO.NET class libraries to Visual Studio developers, enabling them to add class libraries, bug fixes and enhancements. MainWin for J2EE Developer Edition will also help Windows developers build applications for Linux using their existing Visual Studio tools. Developer Edition is a plug-in that can re-compile Visual Basic .NET and Visual C# .NET applications to Java bytecode for deployment on Linux. The suite features a Linux Virtual Machine, the PostgreSQL database and Apache Tomcat web server. Mainsoft hopes it can tap some of the 22 per cent of Visual Studio developers who built applications for Linux during 2004. "Linux is becoming so popular and getting into the mainstream that Visual Studio developers want to be part of it... they don't want to be left out of the Linux phenomenon - Linux is seen as a very cool operating system," Cohen said. "We are making it very easy for Visual Studio developers." ®

Related stories
Avalon faces axe as Microsoft dismembers Longhorn
Novell debuts open source toolkit for .NET
Mono and dotGNU: what's the point?
IBM and Oracle remain locked in a tight race to control the lucrative relational database market, with Oracle's business enjoying a sizeable boost from Linux. According to Gartner, IBM retained its narrow lead over Oracle in terms of world-wide revenue during 2004, having scooped Oracle for pole position in 2002. However, the analyst firm notes that Oracle's long-term investments in Linux are paying off, and attributes most of a 15 per cent increase in new license business to Linux. IBM's nine per cent growth was generated by sales of DB2 on the zSeries and Unix. "Oracle saw strong growth of nearly 15 per cent, much of it coming from its performance on the Linux platform," Gartner said. The difference between the giants in terms of revenue was only $30m, making it too tight to declare a clear winner, it says. Worldwide database sales in 2004 grew 10.3 per cent to $7.8bn. IBM took 34.1 per cent of the market while Oracle came second on 33.7 per cent, compared to last year's 35.5 per cent and 32.4 per cent respectively. Business for relational database management systems (RDBMS) on Linux was the fastest growing segment, outpacing Windows, the sector's other hot performer. Linux RDBMS new license revenue grew 118.4 per cent to $654.8m, with Oracle accounting for 80.5 per cent of that business. A small decline in RDBMS on Unix, meanwhile, was attributed to Linux - Unix dropped almost one per cent. Linux outstripped growth in sales of RDBMS on Windows. That platform grew 10 per cent to $3.1bn in 2004, with Microsoft accounting for 50.9 per cent of business. ®

Related stories
IBM outfits blade servers with cheap middleware for the masses
Why do people hate Oracle?
Open source databases - a sword that cuts both ways?
Gartner warns of inappropriate Oracle sales tactics
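The $30m gap squares with Gartner's own percentages. A quick sanity check (the figures are from the story; the arithmetic is ours):

```python
# Gartner's 2004 worldwide RDBMS figures as quoted above
total_market = 7.8e9     # worldwide database sales, USD
ibm_share = 0.341        # IBM's share of the market
oracle_share = 0.337     # Oracle's share of the market

# 0.4 percentage points of a $7.8bn market...
gap = (ibm_share - oracle_share) * total_market
print(f"Revenue gap: ${gap / 1e6:.0f}m")  # roughly $31m - consistent with the ~$30m quoted
```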
Cable & Wireless is accelerating its broadband rollout by doubling its investment in local loop unbundling (LLU). Through its LLU ISP, Bulldog, C&W has already installed its kit in 400 BT exchanges, enabling the firm to provide broadband services direct to punters - hitting its unbundling target seven months ahead of schedule and meaning that a third of UK punters are wired up to an exchange served by Bulldog. It now plans to invest even more in broadband and wire up a further 400 exchanges by the end of next year. "This will increase our investment losses in 2005/6 but we see it as vital in capturing the real and increasing UK customer demand for broadband," said the company in a statement today. C&W's decision to invest more in LLU - the process that enables rival companies to provide telecoms services direct to end users by cutting BT out of the loop - comes as regulator Ofcom is about to utter its latest pronouncement on the future direction of the UK's telecoms market. It's spent the last 18 months trying to figure out how to promote competition in the UK's telecoms sector at a time when former monopoly BT still retains a dominant position, especially in the provision of wholesale broadband. Indeed, C&W's decision to ramp up its investment in LLU is based on the premise that the regulatory framework will not be skewed in favour of BT. "In particular, Ofcom's Strategic Telecoms Review provides an opportunity to create a more transparent and effective regulatory regime in the UK. We are encouraged by the review's emphasis on infrastructure-based competition and the principle of equivalence. "It is vital, however, the review delivers an effective and enforceable regulatory settlement. Fair competition must be at the heart of the UK telecoms market if customers are to benefit from the variety of services that new technologies can offer," it said.
Publishing its results for the 12 months to the end of March, C&W reported that revenues dipped to £3.22bn from £3.67bn. At the same time profit before tax and exceptional items rose from £278m to £377m. ®

Related stories
Billing snafu hits Bulldog
Bulldog to extend reach of unbundled broadband
Bulldog service suffers 'performance issue'
Coming months 'critical for LLU'
Bulldog to resubmit BT complaint
UK LLU improving says Telecoms Adjudicator
Scientists in the US have linked the spread of the hospital superbug MRSA to a sharp increase in the use of technology in hospitals. Researchers working in hospitals have found that the deadly bacteria clings to the keys of the computer keyboards used to update patient records and therefore can re-infect the hands of staff even after they have washed them. There were 55 deaths from MRSA in UK hospitals in 1993, but fatalities have increased every year since and by 2003 were running at nearly 1,000 annually, according to the Office for National Statistics. The US findings, which were presented to the Society for Healthcare Epidemiology of America earlier this month, found that just touching a keyboard is enough to pick up the bacteria and pass it on to a patient. The researchers also found that cleaning IT equipment with soap and water was not enough to remove the bacteria. The only way to clear the infection from the keyboards, according to Dr Gary Norskin from the Northwestern Memorial Hospital in Chicago, who carried out the study, was to rinse the keyboard with disinfectant. "A computer keyboard is like any other surface in a hospital and has to be sterilised," said Norskin, Northwestern Hospital's director of healthcare. The Chicago study is part of a new trend in the US towards taking a long, hard look at how the introduction of computer equipment into hospitals can often represent a health risk. Computers quickly become magnets for airborne dust and bacteria-harbouring dirt, which builds up on their internal cooling fans. The fans represent a further health hazard because of their potential to blow that same dust around a ward. "Anything that can put bacteria into the air is a risk," said Norskin. "If you bang into a computer and disturb that dust you can effectively create a dust cloud." Doctors at the Oklahoma Heart Hospital have already started to address the problem.
"We have computers everywhere because our goal is to have a paperless hospital and to have computers everywhere a patient goes," said Jeff Jones, Oklahoma Hospital's lead system specialist. "Computers harbouring bacteria is a very big concern of ours because we have computers only three feet away from patients in our operating rooms and we can't have dirt in places like that," he said, adding that tuberculosis is another potential risk from technology as it is the world's number one airborne disease. "We did experiment with waterproof keyboards that you can wipe clean but found out that they were generating a lot of keystroke errors that could have been just as dangerous for patients," said Jones. A spokesman for the UK's National Health Service confirmed that the department's computer specialists were looking into the concerns and that the agency responsible, NHS Connecting for Health, was conducting a study into the issue at University College London to assess the risks. Dr Paul Grime, the British Medical Association's spokesman on MRSA, commented: "If computers and keyboards are going to be next to people's beds then this is something that we have to be aware of because this equipment is no different from any other hard surface in a hospital but the key to this is hand hygiene and staff have to get used to washing their hands before and after touching a patient." Such health risks have not gone unnoticed by the computer industry, which has moved quickly to respond to the threat created by technology in hospitals. "Very shortly UK hospitals are expected to switch to electronic medical records in line with the National Programme for IT which means there is going to be a computer device in every patient care room," said Ken Nott, of the computer company ClearCube, which supplies clean computer systems without fans. "You can wash your hands but not your PC." ® Peter Warren is a freelance journalist specialising in technology, undercover investigations and science issues.
You can find out more about him at Future Intelligence.

Related stories
UK docs demand live liver transplants
Malaysia to fingerprint all new-born children
US hospital loses patient info
BMA tells doctors: avoid NPfIT's flagship project
The European Union has approved a €660,000 grant for FLOSSWorld, a two-year project to promote FLOSS collaboration involving 17 partners in 12 countries. The funding is part of the EU's 6th Framework Research programme. FLOSS stands for Free/Libre/Open Source Software, and the project aims to support research and policy development within the OSS community on a global level. This, the organisers say, will pave the way for better collaboration between the EU and developing countries on open source projects. Organisers argue that while free or open source software is one of the best examples of collaborative work in the world, there is still very little empirical data on the impact of such work, its use and development. The work that was begun under the EU's 5th Framework (FP5) has helped to understand how FLOSS is developed and used in Europe, but now a more global approach is needed. FLOSSWorld is structured around three main research tracks: Human capacity building, looking at the economic impact of FLOSS communities; Software Development, and how approaches vary by region and country; and finally e-Government Policy, looking at how this varies from country to country. The project is led by the Maastricht Economic Research Institute on Innovation and Technology (MERIT) in the Netherlands and involves researchers from MERIT and the United Nations University Institute for New Technologies (UNU-INTECH). It will itself be a collaboration between major European research institutes and leading public research institutes in the target countries Argentina, Brazil, Bulgaria, China, Croatia, India, Malaysia and South Africa. More about the FLOSSWorld project is here. ®

Related stories
Open source search engine trawls free code
MS unfazed by OSS schools report
Open source ahoy!
Financial analysts like to second-guess the results of the firms they cover: it's a harmless sport - after all they are guided heavily by the firms - and they grab some press coverage into the bargain. So, caveat aside, on to Credit Suisse First Boston's predictions for Tech Data, the world's second biggest IT distie, which reports later today. CSFB expects Tech Data to meet consensus Q1 estimates of 60 cents a share on revenues of $5.03bn. But it reckons that "Tech Data's results and outlook will pale in comparison to chief rival Ingram Micro, which has Asia to cushion the fall" of continued price pressure in Europe, Forbes.com reports.
A massive collection of highly-skilled, dedicated, brave law enforcement officials managed yesterday to shut down a web site alleged to facilitate the illegal trade of the latest Star Wars movie and other content. Yes, it took the FBI and the Homeland Security Department to pull off "Operation D-Elite" - an action directed at BitTorrent hub Elite Torrents. The Feds, working off 10 search warrants, seized control of the site's central server in a quick, decisive maneuver and obtained information from the site's alleged administrators. More than 17,800 movie titles were shuffled about by 133,000 Elite Torrent members, according to a statement from the US DoJ (Department of Justice). "Our goal is to shut down as much of this illegal operation as quickly as possible to stem the serious financial damage to the victims of this high-tech piracy - the people who labor to produce these copyrighted products," said Acting Assistant Attorney General John Richter. "Today's crackdown sends a clear and unmistakable message to anyone involved in the online theft of copyrighted works that they cannot hide behind new technology." And later. "Internet pirates cost U.S. industry hundreds of billions of dollars in lost revenue every year from the illegal sale of copyrighted goods and new online file-sharing technologies make their job even easier," said Assistant Secretary Michael Garcia. "Through today's landmark enforcement actions, ICE (Customs Enforcement) and the FBI have shut down a group of online criminals who were using legitimate technology to create one-stop shopping for the illegal sharing of movies, games, software and music." And later. "The theft of copyrighted material is far from a victimless crime," said Assistant Director Louis Reigel of the FBI. "When thieves steal this data, they are taking jobs away from hard workers in industry, which adversely impacts the U.S. economy.
The FBI remains committed to working with our partners in law enforcement at all levels and private industry to identify and take action against those responsible." The Feds always use almost comical language to describe P2P and BitTorrent sites, portraying them as the work of evil, swollen-brained mad computer scientists. This time we find that Elite Torrent was a "technologically sophisticated P2P network" and not just a link hub or search engine like you might find in myriad forms on the internet. One gets the feeling that such language is meant to cover the P2P operations with a very sinister aura in the hopes that this will explain why the Homeland Security department is wasting time making sure George Lucas receives all his cash instead of protecting citizens from actual danger. Not to mention that Silicon Valley churns out far more cash for the US economy than Hollywood, meaning that jobs taken away from Disney might end up at Intel or Microsoft because of a P2P breakthrough. But such foresight would be asking a bit much of bureaucrats, especially ones greased by pigopolist pork. We digress. The Feds were especially pleased that visitors to the hijacked Elitetorrents.org would see the message "This Site Has been Permanently Shut Down by the Federal Bureau of Investigation and U.S. Immigration and Customs Enforcement." That message, however, seems to have been quickly replaced by the "Coming Soon" note that is up now. Have the Feds been bested so soon? "The content selection available on the Elite Torrents network was virtually unlimited and often included illegal copies of copyrighted works before they were available in retail stores or movie theatres," the DoJ said. "For example, the final entry in the Star Wars series, 'Episode III: Revenge of the Sith,' was available for downloading on the network more than six hours before it was first shown in theatres. In the next 24 hours, it was downloaded more than 10,000 times." 
Kinda makes tapping phone calls seem more worthwhile, doesn't it? ®

Bootnote
This is the last call for submissions in our "Biting the Pigopolists" badge design contest. Be sure to show off your Photoshop skills.

Related stories
Shiver me timbers: we are all pirates
Mashboxx opens beta test scheme
Hollywood calls BitTorrent Brits to US Court
Hong Kong scouts gain IP proficiency badge
Movie downloads will be a big business... but for whom?
Silent tech majority invites Mickey Mouse to poison P2P
Digital rights activists are celebrating this week with the expiry of powers in the UK's Electronic Communications Act of 2000 that gave the Government the right to regulate companies selling encryption services. The Foundation for Information Policy Research (FIPR), an independent body that studies the interaction between IT and society, said the expiry of the rights marks the end of the "crypto wars". The FIPR says these wars began in the 1970s when the US government started treating cryptographic algorithms and software as munitions and interfering with university research in cryptography. In the early 1990s, the Clinton administration tried to get industry to adopt the US government's own encryption system – the so-called Clipper chip – an encryption chip for which the government had a back-door key. When this failed, they tried to introduce key escrow – a policy that all encryption systems providers should leave a spare key with a 'trusted third party'. The third party would have to hand the key over to the FBI on demand. They tried to crack down on encryption products that did not contain key escrow. When software developer Phil Zimmermann developed PGP, the free mass-market encryption product for emails and files, the US Government even began a prosecution against him. The FIPR says the crypto wars were eventually won in the US when Al Gore, the most outspoken advocate of key escrow, lost the presidential election of 2000. Despite a number of proposals to introduce a compulsory key escrow system in the UK, the Government finally conceded in 1999 that controls would be counterproductive. But the intelligence agencies remained nervous about this decision, and in the Electronic Communications Act passed in May 2000 the Home Office left in a vestigial power to create a registration regime for encryption services. That power was subject to a five year "sunset clause", whose clock finally ran out on 25 May 2005.
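To make the key-escrow idea concrete, here is a deliberately toy sketch: an XOR "cipher" stands in for real cryptography (it is trivially breakable and for illustration only), and a plain dictionary stands in for the hypothetical escrow database. The point is purely the key flow - the provider lodges a copy of the user's key with a 'trusted third party', who can surrender it on demand:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric 'cipher': XOR against a repeating key. Illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The provider generates a key for the user and encrypts their traffic...
user_key = secrets.token_bytes(16)
ciphertext = xor_cipher(b"private message", user_key)

# ...but, under key escrow, a copy of the key is lodged with a third party
escrow_store = {"user@example.net": user_key}   # hypothetical escrow database

# The third party hands the key over on demand, so an agency
# can decrypt without the user's knowledge or cooperation
surrendered = escrow_store["user@example.net"]
recovered = xor_cipher(ciphertext, surrendered)  # b'private message'
```

This is exactly what made escrow so contentious: the security of every user's traffic rests on the escrow database never being compelled, leaked or stolen.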
Ross Anderson, chair of the FIPR and a key campaigner against government control of encryption, commented: "We told government at the time that there was no real conflict between privacy and security. On the encryption issue, time has proved us right. The same applies to many other issues too – so long as lawmakers take the trouble to understand a technology before they regulate it." Phil Zimmermann, an FIPR Advisory Council member and the man whose role in developing PGP was crucial to winning the crypto wars in the US, commented, "It's nice to see the last remnant of the crypto wars in Great Britain finally laid to rest, and I feel good about our win. Now we must focus on the other erosions of privacy in the post-9/11 world." Gavin McGinty, an IT lawyer with Pinsent Masons, the law firm behind OUT-LAW.COM, also welcomed today's expiry of the provisions for regulating the industry. But he warns that this does not mean that there are no controls on the use of encryption software. "There are still licensing requirements for the transfer of encryption software, which could include encrypted material, to other countries," he said. While the UK's Export Control Act sets out the procedures for transfer out of the UK, McGinty says it is important to also consider the import restrictions in the country into which the software or material is being transferred. He also points to the powers potentially available to the security services, the Police, the Courts and others under the Regulation of Investigatory Powers Act, better known as RIPA. "RIPA grants a power which allows certain authorities to force the disclosure of information that is stored in an encrypted form," said McGinty, "and in certain circumstances it can force the disclosure of the encryption key itself." 
He added: "Although the relevant sections of RIPA have not been brought into force, the existence of these powers will have given the Government confidence to decide against enforcing the regulatory measures in Part 1 of the Electronic Communications Act." © Pinsent Masons 2000 - 2005 See: The Electronic Communications Act Related stories Crypto regs still tricky UK gov't reveals Big Brother bill The maths prof., free speech and encryption
SAS introduced a number of new point releases in March of this year, notably to its Enterprise Reporting and ETL (extract, transform and load) solutions. Many of the new features are, to a certain extent, catch-up capabilities, in the sense that SAS has been relatively late to become a serious general-purpose player in the ETL and conventional BI markets. For example, its Microsoft Office plug-in only now includes integration with PowerPoint as well as Word and Excel. Similarly, it is with the latest release that Web Report Studio supports rich text. There are still some features that we would like to see. For example, Web Report Studio only partially supports asymmetric reports. That is, you can have different divisions within different countries, say, which is asymmetric in terms of rows but you cannot do the same thing with columns. For example, you might want to show sales figures for the last four quarters but in the latest quarter you might also want to see forecast sales, which you cannot do at present. But progress is encouraging and some of the new features will definitely be beneficial. For example, the Web OLAP Viewer is now integrated with ESRI (for geographic mapping) directly rather than via a bridge. I am also pleased to see: that debugging in the ETL tool is very much simpler and no longer requires any knowledge of SAS code; that impact analysis is more granular, now going down into information maps (metadata); that there are significant new facilities for reporting against metadata; and that there is improved flat file support. However, the neatest new feature is one that I have not seen in any other ETL product, which is what SAS calls “forecasting transforms”. Given SAS’ reputation in the field of data mining and predictive analytics it is not hard to imagine what these are but, in short, they allow you to build (via a wizard) transformations that directly feed analytic applications. 
Apart from the new features that SAS has introduced, it is worth pointing out that the company is now on a 120-day release cycle. In other words, it will be adding features to these products roughly three times a year. If you now think about the likes of Business Objects and Cognos on the one hand, and Informatica on the other (it remains to be seen what IBM does with Ascential), then you are lucky to get one update a year. Of course, this is a consequence of the fact that SAS remains a private company and can afford to put resources into R&D. The long and the short of it is that SAS is beginning to achieve its objective of being recognised as one of the major players in both the BI and ETL spaces. I have certainly heard that from elsewhere as well as from SAS itself, and this process looks likely to accelerate as SAS continues to upgrade its capabilities. © IT-analysis.com Related stories EII - it's all go out there SAS 9 on intelligence-gathering mission SAS gets tough with rivals
Virgin Mobile says it has been unaffected by a broader slowdown in consumer spending even though the amount of cash spent by each user is down on last year. Takings over the last year were up 15 per cent from £453.3m to £521.3m, while operating profit was also up from £48.6m to £53.7m. Reporting numbers for the year to March, Virgin Mobile - which is predominantly a pre-pay business but recently unveiled a contract service - said the number of active users was up 24 per cent to 4.03 million from 3.24 million. The snag for Virgin Mobile - which doesn't own its own network but instead piggybacks on T-Mobile's infrastructure - is that the average revenue per user (ARPU) it gets from punters is sliding. In 2004, ARPU was £147 but this year it's fallen to £127, blamed in part on a decline in interconnect fees (call charges between operators). With ARPU falling and competition from no-frills rivals such as easyMobile yet to bite, Virgin Mobile is looking to drive down costs. Even so, Virgin Mobile chief exec Tom Alexander is extremely chipper today: "In the past three years we've tripled our revenues, and 2005 has been our best year yet. Our strong performance shows Virgin Mobile has the ability to sustain sector-leading service revenue growth within a competitive environment." Looking ahead, Virgin Mobile remains "confident in achieving strong growth" for the year ahead and expects to increase revenue by around 15 per cent over the year. ® Related stories Virgin Mobile hits five million UK users easyMobile.com takes aim at easymobile Fresh undercuts discounted easyMobile tariffs
A security architecture touted as one of the core benefits of Microsoft's next major Windows upgrade looks like being the next casualty of the Longhorn death march. Mary Jo Foley reports that only some parts of Longhorn will be based on .NET 2.0, rather than the entire OS, as originally intended. If this latest bout of indigestion is true, developers gain compatibility at the expense of the superior developer environment of .NET. But the casualty, as we predicted a year ago, looks like being the ground-up Managed Code architecture which forbids one process from hijacking another: a favorite ploy of malware. (Managed Code is very succinctly summarized here.) So Managed Code is a good thing: a key weapon in the war against viruses. It's long been rumored as a casualty, as we wrote in May 2004 - "All of this points to the Managed Code API project being offshored to somewhere closer to Siberia, and more modest lock-downs, such as No Execute pages (due to appear in XP Service Pack 2) being promoted as a good-enough answer." Mary Jo reminds us that last summer Microsoft "decoupled" the WinFS search and storage technology from Longhorn and out into a service pack, throwing in the sweetener for corporates reluctant to upgrade that the Avalon UI libraries would be available on XP. Now she wonders if the latest roadmap modification will be publicly acknowledged. We think it will. Probably with a press release entitled "Longhorn Promises Greater Compatibility", with the task of imagining the additional words "than we originally planned" being left as an exercise for the reader. 
® Related decouplings Avalon, WinFS decoupled for Windows Shorthorn (Almost) everything may go, as Longhorn rushes to release MS Trusted Computing back to drawing board No Windows XP SE as Longhorn jettisons features MS delays Yukon Windows Shorthorn is dead-on-arrival Even Microsoft can't wait for Longhorn MS moves into get Longhorn on the road mode Longhorn to erase Cairo mis-step with 1995 ship date Windows Longhorn build leak starts hype two years early Longhorn RTM what it means to you Microsoft delays Longhorn. Again Only kidding? MS may ship Longhorn server after all Gates confirms Windows Longhorn for 2003, Blackcomb MIA? Related stories Indigo not so open as .NET Framework? New Microsoft Longhorn chief is indigestion expert Microsoft going to JavaOne
Review Today, Intel officially released Pentium CPUs that offer both high clock speeds and dual-core loveliness, although you won't get both in one package. The Pentium 4 660, which is a 3.6GHz 'Prescott' chip with 2MB of L2 cache, will now play second fiddle to the Pentium 4 670. All the same internals; just 200MHz faster and, obviously, more expensive. On the other hand, the near-£700 3.2GHz HyperThreading-capable Pentium Extreme Edition 840 gets a little brother. The Pentium D 820 runs at 2.8GHz, is dual-core, but does not support HyperThreading. The end result is a dualie that comes in at a more palatable £200 or so. Which is better: high clock speed and HT (the 670) or relatively low MHz and two cores (the 820)? Are either of them worth it?
Visto's dominance of intellectual property in email push - and its willingness to enforce its legal rights - has seen Vodafone, and now Nextel, go with the "ConstantSynch" technology, rather than wait for Microsoft and Exchange to support this technique. The Nextel announcement yesterday claims that it not only works with today's version of Microsoft Exchange - where Windows Mobile 5.0 requires a new Service Pack in July - but also works with rival email servers such as Lotus Notes, or web services like Earthlink and Yahoo! mail. To use the Visto system, you need a network provider that supports it, and an ordinary Java-based mobile phone. The advantage of the RIM and Microsoft (when it's available) alternatives will be the ability to work over other wireless networks; the drawback is the need for a special smartphone or two-way pager. Can Microsoft nonetheless make a big splash with "Magneto" and Exchange push? Well, yes: Exchange is easily the dominant corporate email server on the planet, and people who have an Exchange Server machine on the premises will almost certainly go with the next Service Pack, and will follow their primary supplier with mobile Pocket PC email and Smartphone email, in preference to rivals. But the longer the legal dispute delays Microsoft, the bigger the share that will go to rivals, and the smaller the Redmond tsunami will be in the pool when it makes its big entrance. © NewsWireless.Net Related stories RIM takes Blackberry harvest to 3m users Market gears up for MS Magneto Mobile email hits the road Mobile email consolidation kicks off Microsoft goes after Blackberry with Magneto Seven snaps up Smartner Kill the Crackberry!
Tiscali UK will begin to migrate broadband punters in London onto its new unbundled platform from tomorrow. Those getting the local loop unbundling (LLU) treatment shouldn't notice much difference - not to start with anyhow. For Tiscali, though, uncoupling itself from BT and providing broadband directly to consumers by installing its own kit in exchanges means increased financial margins and better control over the service it provides to end users. It also gives the ISP the flexibility to offer faster services and new products such as internet telephony (VoIP) and video on demand. Tiscali won't say how many punters it intends to migrate initially but revealed that 27 BT exchanges in London were part of this trial. If this proves successful, Tiscali has ambitious plans to install its kit in a further 200 exchanges by the end of the year with an eye on 600 exchanges by 2006/7. News of Tiscali's LLU plans was revealed on the same day the ISP announced that it has chosen Huawei Technologies to supply the kit needed to roll out LLU in the UK. The Huawei equipment - which is already supporting more than a million lines across Europe - is already being installed as part of Tiscali's initial LLU trial. In April Tiscali announced that it planned to plough €90m (£61m) into unbundled broadband services in the UK over the next three years as part of a renewed effort to secure a sizeable share of this ever-growing sector. The ISP plans to cherry-pick the most lucrative exchanges in the UK to develop its "selective unbundled network". At the time the ISP said: "The UK is an extremely attractive market, offering substantial growth opportunities which Tiscali plans to seize through significant investments, spending around €90 million over the next three years, on developing its selective unbundled network." ® Related stories Tiscali UK to invest £61m in LLU UK flies broadband flag for Tiscali C&W goes large on LLU Tiscali confirms sale of French opo - finally
Quocirca's Changing Channels IBM, a high priest of enterprise IT delivery, wants to spend more time ministering to the needs of small and mid-sized businesses (SMBs). Will it find a receptive flock? IBM has transformed itself in the last 15 years by turning into a services-led organisation. Sure, it still sells software and hardware, but this is on the back of a range of business and technical consultancy services that it offers to enterprises. Can IBM scale this model down for SMBs or is an alternative approach required? What are the alternatives? IBM’s legacy is as a hardware supplier. But this is now the most highly commoditised area of IT delivery. IBM itself admitted this last year by selling off its personal computer division. But IBM still has a strong server business – albeit focused mainly at enterprises. The trouble is that to reach the SMB market with a hardware message IBM runs directly into the king of the tin shifters – Dell – and into the long-term incumbent – HP. Dell is taking an increasing share of the SMB market without even bothering to have a reseller programme. Resellers, who tend to the day-to-day needs of SMBs, are increasingly buying servers from Dell, rather than IBM or HP, because it is often the cheapest source for a well known brand. To sell hardware to SMBs IBM must add value through software and/or services. Three of IBM’s five software brands are based around deployment and collaboration software (the other two, Tivoli and Rational, are for systems management and application development respectively). All three are well known and respected names: Lotus for collaboration and email management, DB2 for data management and WebSphere for application deployment. IBM has been busy producing “Express” versions of products from the Lotus, DB2 and WebSphere brands targeted specifically at the SMB market. These are designed to be easy to deploy, manage, install, learn and use, at a competitive price point. 
They are also packaged to be easy to demonstrate, pilot and sell, hopefully making them attractive to resellers. To simplify things further IBM has also released “IBM Express Run Time”, a single environment for running all of its Express offerings. IBM’s problem with a software-led sale is that it will find itself head to head in the majority of sales with Microsoft, which already dominates the SMB market and has software solutions in all the same areas. For IBM a “me too” strategy will be a hard grind; it will find it hard to convert the faithful – resellers and SMBs alike – on a software message alone (although, if any have lost faith, IBM and its press team will welcome them with open arms). So what about IBM scaling down its enterprise success and going for a service-led sale? Dell and Microsoft don’t have a huge service capability like IBM, but this in itself is part of the point. They don’t have services capability because they rely on resellers to deliver services. IBM marching in with a services offering could be seen as a threat to resellers themselves who, as has been said, tend to the day-to-day needs of SMBs. IBM sees a way round this. Nearly all IBM’s consulting, be it business or technical, is delivered large scale and bespoke for enterprise customers, so it plans to bundle up its capabilities into service packages which its resellers can deliver to SMBs. IBM will use its enterprise experience to help build the content for “IBM Express Managed Services” but the resellers will deliver them, their own skills only being supplemented by IBM’s when required. This may well work, but resellers are fairly bullish about their own capability to deliver services, and Microsoft and HP can certainly respond to this, although they would not be able to claim the same enterprise experience on which to base their advice to resellers, especially when it comes to business consulting. But IBM has one final trick up its sleeve, also under the guise of services. 
It has started accumulating a series of applications that it will resell as hosted services. Is this IBM re-entering the applications market that it stepped out of several years ago? Then its plan was to focus on delivering applications to its customers through independent software vendors (ISVs). This move does not change that: IBM is simply providing a platform for its ISVs to deliver hosted solutions to SMBs through its “Application Enablement Programme”. Microsoft is also working on delivery of hosted applications to SMBs through some of its larger partners. The difference is that IBM will be hosting the applications itself, and its sheer size will mean, if successful, it will achieve economies of scale that Microsoft’s partners will find hard to match. IBM will be able to sell its hosted applications via any resellers, including the smallest, who would not be able to build up hosted offerings themselves, and who spend their time tending the remotest parts of the SMB flock. Of the four approaches IBM can take, a direct hardware push would almost certainly fail; a direct software approach will be hard, slow and will probably fail; the pure services play will be an interesting punt, especially if IBM leverages its business consulting experience; but the one that could outflank all its main competitors is hosted services. IBM should speed up its deployment of these – and when possible they might as well be based on IBM’s own hardware and software – the nature of hosted services is such that the end user organisations, whatever their size, probably won’t know or care. Bob Tarzey is a service director at Quocirca focussed on the route to market for IT products and services in Europe. Quocirca (www.quocirca.com) is a UK based research and analysis firm with a focus on the European and global markets for IT. Related stories IBM has moment of SOA clarity IBM moves onto JBoss turf with Gluecode buy Can IBM's Euro problems continue?
Good news for radio hams: communications regulator Ofcom plans to replace annual amateur radio licences with a new electronic licence that lasts for life. The regulator says it is seeking a balance between maintaining regulatory control and reducing expensive and unnecessary bureaucracy. The proposed new system would mean licences only need to be changed if the licence holder's details - such as home address - change. Updating the licence could also be done online, making it faster and cheaper for everyone. Although the new licensing system would be web-based, with licences issued for free to those using the online system, Ofcom says it will continue to offer licences by post to those who either don't want to use a computer, or who don't have access to the internet. Postal applications will be subject to an administration fee, but Ofcom promises that disabled licence holders will not be disadvantaged. The terms of the licence will not change, however, and the access rights granted will remain the same: Ofcom will still hold a database of names and addresses, and anyone who wants a licence will still have to pass the Radio Amateur Examination, and will need a valid Pass Certificate. The licence would still come with the same conditions, and could be revoked by Ofcom if the holder is deemed to have broken the rules. The regulator will continue to monitor frequencies and will deal with undue interference as it always has done. The closing date for contributions to the public consultation on the proposals is 18 August, so point your browser here to find out more. Over, and out. ® Related stories PC tax could replace TV licence Ofcom ponders open UWB spectrum Ofcom outlines radio spectrum plans
UK government ministers gave a vote of confidence to the technology underpinning its controversial ID card scheme, as proposals for the national scheme were reintroduced in Parliament on Wednesday. The scheme will link personal information such as names and addresses to biometrics - a computer scan of a person's iris, face or fingerprint. From 2008, UK passport applicants will also receive an identity card, under plans outlined in the government's ID Card Bill. Junior Home Office minister Andy Burnham told reporters that biometric technology is already used in identity documents in countries such as Hong Kong, the Philippines and Belgium. These are much smaller deployments than envisaged in the UK, where government IT schemes have a famously patchy record. Burnham acknowledged there had been problems in the past but said the phased introduction of the scheme, and support from the IT industry throughout the planning process, would help a smooth introduction. He said the technology is ready for widespread deployment. Under the Identity Cards Bill, ID cards would be phased in from 2008 before being made compulsory at some later (as yet unspecified) date. The government estimates running costs will amount to £584m a year - or £93 per card, around 9 per cent up on November 2004 estimates of £85 per card. The government says 70 per cent of these costs will be spent to introduce biometric passports in any case, arguing now is the perfect time to introduce ID cards. The reasons - and emphasis - behind "why we need ID cards" vary each time we visit the Home Office. This time around guarding against identity theft, a crime estimated to cost £1.3bn a year, was highlighted as the top reason ahead of strengthening immigration controls, guarding against the misuse of public services and (last year's number one) fighting organised crime and terrorism. 
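As a back-of-envelope check on the Home Office figures quoted above (taking the reported £584m annual running cost and the two per-card estimates as given), the "around 9 per cent" rise and the implied annual card volume work out as follows:

```python
# Sanity check of the reported ID card cost figures (reported numbers only).
annual_cost = 584_000_000      # £584m estimated running cost per year
per_card_2005 = 93             # £93 per card, current estimate
per_card_2004 = 85             # £85 per card, November 2004 estimate

rise = (per_card_2005 - per_card_2004) / per_card_2004 * 100
print(f"Rise on 2004 estimate: {rise:.1f}%")            # 9.4% - "around 9 per cent"

implied_cards = annual_cost / per_card_2005
print(f"Implied cards per year: {implied_cards:,.0f}")  # roughly 6.3 million
```

The rounding checks out: £93 against £85 is a 9.4 per cent increase, which the government rounds down to "around 9 per cent".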
It's questionable whether ID cards can play any meaningful role in combating ID fraud, but Burnham is sticking to his guns on this point. Banks would pay for verification services based on ID card technology that left customers less open to fraud, he said. ® Bootnote The Register arrived for the press conference fashionably late and was further detained at reception by a Home Office security guard who demanded an NUJ card as identification. A passport, with a US-issued biometric visa inside, and a Register business card were not enough for our man. He called a press officer who arrived quickly and helpfully ushered us into the meeting. Related stories UK ID scheme rides again, as biggest ID fraud of them all EU biometric visa trial opts for the tinfoil sleeve ID cards: Part II Home Office defends ID card plans (again)
Review When Rio launched the CE2100 at the Consumer Electronics Show in Las Vegas in January, the internet was awash with excitement. As a 2.5GB hard drive-based player in a small casing and with 20 hours of playback, what was not to like? Despite that buzz, five months in the world of the gadget industry is a long, long time, writes Stuart Miles. In that time we've had the iPod Shuffle, the Sony NW-E507, plus a slew of other players from Creative and iRiver. Amazing as it sounds, the Rio has become over-sized, over-priced and just not as exciting as it was on announcement day. Based on the chassis of the Rio Carbon, the controls - the D-pad and central button on the front - are identical. This makes navigation very simple, and the D-pad offers fast-forward, rewind, stop, play and pause. The central button has been designed as an action button and is used to select songs, menu options or to reveal information about the songs themselves. On the side of the player is the switch that accesses the menu system, and that's it as far as buttons on the player go. Just like the Carbon, the CE2100 uses a standard USB 2.0 connector. Songs can be dragged and dropped onto the player, which means it works on both Macs and PCs - the first Rio to do so. You don't have to rely on dedicated software, although Rio has bundled Rio Music Manager if you really must. Listening to the player, it all sounds good. The sound was equivalent to the Karma and the Carbon, with plenty of bass. Although the bundled earphones feel cheap, that's par for the course for most personal music players released in the past decade. Verdict While Rio got it right with the Carbon, simply putting in a smaller hard drive and re-badging it as the CE2100 doesn't really wash with us. 
Had we been playing with this model back in January when it was announced, without knowing what we now know about the rest of the market and the releases that followed, we're sure that the review would have been very different. As it is, any ground gained by the Carbon has been lost with the CE2100. It might sound good, but as we said at the beginning, it's overly large for the size of the drive - the Shuffle might be half the capacity, but it's the size of a pack of chewing gum - and overpriced - Rio is quoting £149. More to the point, it just doesn't offer the wow factor the competition does. Disappointing. Review by Stuart Miles Rio CE2100 Rating 60% Pros Easy to use; good sound quality. Cons Its size and styling when compared to competition. Price £149 More info The Rio site Recent reviews Intel Pentium D dual-core desktop CPU ATI X600 Pro All-in-Wonder PalmOne Treo 650 smart phone Garmin iQue M5 GPS PocketPC Dell Inspiron XPS Gen 2 gaming notebook Epson P-2000 Multimedia Storage Viewer
New Intel CEO Paul Otellini has flashed a level of marketing savvy unseen with his predecessor by making the unusual suggestion that consumers buy Apple's Mac computers if they wish to avoid immediate security risks. Confused? You're not alone. Otellini had attendees of a Wall Street Journal technology conference in Carlsbad, California scratching their silicon this week, as they puzzled through his pro-Mac statements. The paper recounts the episode as follows: Pressed about security by (a reporter), Mr Otellini had a startling confession: He spends an hour a weekend removing spyware from his daughter's computer. And when further pressed about whether a mainstream computer user in search of immediate safety from security woes ought to buy Apple Computer Inc.'s Macintosh instead of a Wintel PC, he said, "If you want to fix it tomorrow, maybe you should buy something else." Apple advocates will, of course, declare that Otellini is speaking the gospel. Others will claim Otellini's statement is really a public flirtation with Apple chief Steve "Bono" Jobs, who has been rumored to want a line of Intel-based PCs. We suspect the truth is a little more prosaic. Isn't Otellini, who this month took over the CEO post from Craig Barrett, simply laying the groundwork for a long, merciless marketing campaign? Your current computers are insecure. They're frightening. They're disasters. They are a risk to your business and your home. It's upgrade or die time, friends. Or so the less subtle message goes. Intel today, in fact, released a new desktop platform, which includes improved security tools for business customers, such as the ability to audit PCs and contain viruses. What a coincidence. Intel has been trying to push this "platform" idea on customers now that it's unable to rely solely on improved GHz as the main sales point of new product. 
Changes in chip manufacturing mean that processors will arrive at close to the same speeds as their predecessors, but with more tools for churning through different types of software and for even more advanced functions such as running multiple operating systems on the same chip. A big chunk of the platform idea for Intel is better security. And it will roll out lots of jazzy things for keeping code under control. In a larger context, Otellini is clearly waving the flag now for this future product, saying the PCs you have really must be replaced if you want to operate a secure business. It's nice that Microsoft and Intel can benefit from the insecure world they've nurtured for so long. Until Intel's goodies arrive, Apple may sell a couple more Macs because of Otellini's advertisement. But even Apple's best quarter is hardly noticed at Intel. Otellini clearly knows what he's doing. ® Related stories Fearless Feds sink Star Wars pirate website GoogleNet - the ultimate embrace and extend? SMT vulnerability 'not critical' - Intel Intel partner sues... Intel (and AMD too) That classified US military report's secrets in full
The anticipated "web affiliate program" which Skype announced today didn't generate much of a stir here in Stockholm. Instead, a passionate plea to regulatory authorities was made by Niklas Zennström, asking them to regulate the incumbent carriers, not the newcomers. The spur for this clearly heartfelt call was the decision by the Norwegian regulator to insist that any Voice over IP providers in that country should provide standard emergency calls. Skype responded by disabling its links to the PSTN in Norway. That means no SkypeOut and no SkypeIn services, pending resolution of the dispute. The decision echoes the decision in America to force VoIP providers to offer standard "911 emergency" - 999 or 112 fire, police and ambulance calling - within 120 days. Zennström said: "Let’s make sure we know what we want from these emergency services. Don’t force legacy emergency calling onto innovative networks." For example, he said, the emergency service should not force you to make a voice call. "When there is a burglar in my house, I don’t want to call the police; I want to email or IM them. The burglar may hear my voice!" He concluded: "Let’s work with emergency centres and IETF and others, to come up with emergency services that match what people have. The solution is to provide an open interface to emergency centrals, to receive text, voice and video over IP; and also to build up national IP geographic mapping databases, managed by national authorities." © NewsWireless.Net Related stories Cost of net phone calls may rise Parents blame Vonage over girl's death Vonage bows to 911 pressure
Toshiba and Sony this week as near as makes no odds confirmed there is no chance their rival blue-laser optical disc formats will be combined into a single offering. The two companies' presidents, along with the head of Matsushita, met this week to see if they could thrash out a solution to stalled negotiations on bringing Toshiba's HD DVD and Sony's Blu-ray Disc formats together. However, all three executives appear to have taken much the same line chosen by their underlings. In essence, each told the others to abandon their preferred format and embrace the other. Sony's Ken Kutaragi told reporters in Japan today that "the only hope [of getting a uniform disc] is if we can reach an agreement in a week or two on a new format that is not that different from Blu-ray physically". Therein lies the rub. HD DVD and Blu-ray Disc, despite using comparable laser wavelengths to increase the capacity of a 12cm optical disc, have different physical disc structures. That's what gives Blu-ray the higher capacity, and what makes it possible to produce HD DVDs with existing DVD pressing equipment, albeit with some modifications. Toshiba President Tadashi Okamura took a more gloomy view: "We may actually have a situation where merchandise from both sides is put on store shelves," he said, according to Reuters. "But the market would not allow that situation to last very long." Not necessarily. The real problem with twin formats is the content industry, which doesn't want to have to offer the same movie, music album or whatever in different formats. Not that that stopped it continuing to offer vinyl, cassette and MiniDisc content long after CD became the dominant format, of course. Standardisation makes for lower production costs - though, as we saw with CD, not always lower prices for consumers - but multiple formats don't necessarily mean prices will remain high. And there is room for both, as pre-recorded formats and future recordable/rewriteable media. 
The battle over recordable and rewriteable DVD formats doesn't appear to have hindered the market, and as we've seen there, disparate formats will eventually be united in drives that support them all. ® Related stories TDK touts 100GB recordable Blu-ray Disc Sony details PlayStation 3 Toshiba unveils 45GB HD DVD Toshiba slams Blu-ray/ HD DVD convergence claims Sony to add Blu-ray and DSD to Vaio Sony 'open' to Blu-ray Disc/HD DVD bonding talks
The German government plans to record the biometric facial features of those present in stadiums during the World Cup in 2006. By comparing these features with images stored in a database, the police hope to identify potential hooligans. When the software recognises a suspicious person, security forces on location can immediately be alerted. The security plan was presented this week in Stuttgart by Germany's Interior Minister Otto Schily. Face recognition, of course, is not new, not even to identify hooligans. Although there are about 80 differences in facial features between people, only 14 are needed to confirm identification. Plastic surgery, beards and glasses can't obscure these measurements. In Germany, however, they will go one step further. The police, Heise Online reports, will be equipped with mobile optical fingerprint systems for fast identification based on data for people with a criminal record. Furthermore, it won't be possible for soccer fans to enter stadiums without an official RFID ticket. ® Related stories World Cup 2006 'abused for mega-surveillance project' World Cup tickets will contain RFID chips World Cup 2006 organisers clash with eBay.de
A new Belgian electronic ID card contains typos introduced deliberately to confound potential fraudsters, Luc Vanneste, Director General of Population and Institutions at the Belgian Home Office, proudly announced this week. To trick fraudsters, the Home Office has introduced three circular arcs on the card - just beneath the identity photo - where you will find the name of the country in the official languages spoken in Belgium - French, Dutch and German - as well as in English. But instead of 'Belgien' in German, the ID card incorrectly uses the name 'Belgine', and instead of 'Belgium' in English, the card reads 'Belguim'. Vanneste has promised other errors will be printed on the card to "further confuse fraudsters". With any luck, these will not be revealed. Belgium is the first European country with a nationwide electronic ID card. The personal information on it is held in the country's central population register, and the card contains a digital certificate so that users can securely access e-government applications. The card - valid for five years - will gradually replace the existing ID card system in Belgium. By end-2005, over three million eID cards will have been distributed in the country. ® Related stories ID cards technology is ready, says UK minister EU biometric visa trial opts for the tinfoil sleeve Malaysia to fingerprint all new-born children
Shares in software company Novell slumped on Wednesday after the company posted a $16m loss in its second quarter results. The loss was attributed to Novell's increased investment in repositioning the company as a Linux provider. After the announcement, shares in Novell fell seven per cent in after-market activity. Despite favourable foreign currency exchange rates adding $8m in revenue, Novell failed to meet Wall Street expectations of $0.03 per share in quarterly profits on revenue of $301.9m. Reported revenues increased by $3m on last year's figures, reaching $297m. The $16m loss translates into a loss of $0.04 per share, matching the per-share loss in the second quarter of 2004, when the company posted a $15m loss. New license revenue also fell compared to the previous year's figures, reaching only $45.8m in comparison to $60.3m in 2004. Cash and short-term investments were $1.6bn as of 30 April 2005, compared with $1.7bn at 31 January 2005. Excluding restructuring costs and other expenses, the company broke even with $1m net income. In the same period in 2004, the figures were a more robust $14m, or $0.03 per share. Despite the disappointing results, Novell chairman Jack Messman remained upbeat. "Our results this quarter reflect the significant investments we are making to reposition Novell," he said. "I am confident that these investments will lead to increased customer acceptance of our solid solution offerings in the Linux and identity driven computing segments." © ENN Related articles IBM and Red Hat to browbeat Sun Solaris users for free OSS gains business ground in Europe Linux versus XP on the desktop: Reg readers speak Netline Open-Xchange - the next Firefox? Red Hat Q4 sales soar Deutsche Bahn dumps Intel, pumps SUSE onto IBM mainframe
Cisco is advising users of its IP telephony kit to update their software following the discovery of a flaw that might allow hackers to mount denial of service attacks. The bug, involving flaws in the processing of maliciously crafted DNS (Domain Name System) packets, also affects some of Cisco's content networking and secure router products. The vulnerability is limited to Cisco products running DNS clients, rather than DNS Server functions, and creates a means for remote attackers to crash vulnerable devices, Cisco warns. Cisco has made a series of free software upgrades available to address the vulnerability. The scope of the vulnerability - and the number of products affected - promises to create a lot of work in Cisco shops, so users are advised to scope out remedial work sooner rather than later. More technical details (but not a list of affected vendors) can be found in a UK government UNIRAS alert here. ® Related stories Unholy trio pose DDoS risk for Cisco kit Cisco patches VoIP vuln Networks on yellow alert over ICMP flaw Cisco source code theft part of 'mega-hack'
Review While the third update to Mac OS X, Panther, was an essential upgrade for Mac users, the fourth has presented Apple's marketeers with something of a challenge. The ritual that we call the annual OS upgrade is Apple's best publicity showcase after January MacWorld - a chance to remind the world that it doesn't just make iPods. And it's a sensible occasion to introduce major system-wide updates. It's also an opportunity to charge rent - and a predictable revenue stream is something software vendors have longed for for years, Microsoft, Oracle and Sun amongst them. But Microsoft's Licensing 6.0 scheme has flopped even amongst business customers, not least because enterprises are sceptical that the company can deliver the goods within the lifetime of the subscription. Apple has a different problem. As OS X improves, it becomes harder to convince OS X users to make the jump. If Microsoft had announced that the next version of Windows XP would sleep and wake up within three seconds with near 99.99 per cent reliability, would pick up a WiFi network within 10 seconds with similar consistency, and was now free of viruses, then users would flock to upgrade. But even the first, barely usable version of Mac OS X boasted all this when it first appeared in March 2001. The issue here isn't tempting Windows switchers, but whether an annual $129 can be justified. 2003's Panther release was everything OS X should have been in the first place, fixing many long-standing performance issues. Tiger continues that all-round improvement, and feels crisp compared to Panther. (It's hard to overestimate the cumulative effect of these tiny performance gains. Computer enthusiasts will often spend hours or days writing a macro that saves a few seconds, and isn't used very often. Yet the tiny improvement in Preview, and its ability to allow you to view a slideshow, will probably gain the typical user as much as a complicated workflow.) So OS X Tiger sees the Mac in excellent health. 
However, for many existing OS X users there isn't a single compelling reason to upgrade. As a consequence, some users reckon, the focus has fallen on the cosmetic and the superficial. Which is exactly where the focus shouldn't be. "Today's reality is that Apple has to convince consumers that making these [kernel] changes is worthwhile. The only way they can do this is to hype visible, cosmetic changes with lots of attention getting bling," writes one poster on an Apple enthusiast board, in a thread entitled "Were the Most Anticipated Features of Mac OS X 10.4 Just A Gimmick?" Is this justified? Noose media After a month of hands-on use, it's hard not to sympathize with the accusation. Apple claims to have made over 200 improvements in Tiger. But on closer examination these include such essentials as "Buy Printing Supplies", a graphics equalizer for the DVD Player, and "Export Bookmarks" from Safari, giving the impression that Apple was stumped as to how to sell the Tiger upgrade. Each bundled Dashboard widget is listed as a separate improvement. We counted over 20 new desktop backgrounds in Tiger - why aren't these listed as new features, too? More troubling, however, is that in areas where Mac OS X Tiger does offer impressive potential advantages over its predecessors, these are hampered by poor and often inexplicable interface design decisions. Tiger also loses points by removing features computer users have long taken for granted. Let's deal with the latter first. Saving an MP3 that you've loaded from a web page and played in Safari now requires an additional $29.99 payment for QuickTime Pro. That will be reason enough not to upgrade for many. Roxio's Toast no longer burns songs purchased from Apple's Music Store. Right-clicking to download a file from Safari still works, but for how long is anyone's guess. The trend isn't in the right direction - Apple has gradually been removing multimedia features from its software products (see Apple de-socializes iTunes). 
It's hard to escape the conclusion that Apple now views the Mac as a platform for a closed home entertainment system - based on iTunes and QuickTime - rather than an open computing platform. AirPort Express is a great example of how a little vision, and terrific engineering, can be spoiled by this new approach. Using AirPort Express, it ought to be possible to pipe audio wirelessly from any Mac application to the remote speakers - which should appear as another sound output device in the control panel. But Apple crippled the software, forcing the user to pipe music through iTunes. The Mac is becoming the incredible vanishing media platform! The Life Aquatic Apple makes much of three Tiger features: Dashboard, Automator and Spotlight. Of these, two are laudable attempts to solve long-standing issues with personal computing. The other is a silly gimmick that typifies the demoware approach to software development. Dashboard widgets ripple onto the desktop with what we hope will be the last Aquatic metaphor from the UI team. But as a metaphor, a watery desktop never made much sense to begin with. Documents "liquefy" to and from the Dock - but when documents get wet, shouldn't they curl up, and the ink smudge, too? Configuring Dashboard slides the entire watery pool to the top of the screen, rather like a Bond villain's swimming pool sliding back to reveal an ICBM launch pad. When metaphors start silly, they can only get worse. On Apple's own support boards, concern about the performance impact of Dashboard is widely expressed. But this is overstated, and isn't the real issue with Dashboard. (With 30MB of memory free - on a machine with 1.5GB of RAM - Dashboard claims half of what's available when invoked. It doesn't matter how many widgets are active. The memory is slowly reclaimed by the garbage collector, and this compares very favorably with Konfabulator, a clunky rip-off of Stardock's DesktopX product for Windows.) 
The real issue with Dashboard is that it's a solution looking for a problem. We've had equivalents such as DesktopX and Konfabulator for several years now, and they've yielded thousands of clocks, media controllers and dancing Hula girls. Shed the gimmicks and redundant applications and what's left isn't too different from the set of desk accessories that shipped with the original Mac. As a consequence, Sherlock has been neglected, and the third-party application that inspired it, Watson, has been acquired by Sun Microsystems and forgotten. Someone needs to go dumpster diving at Sun and rescue the latter. We always thought the hype behind web services was overstated - but Sherlock is still an excellent way of navigating eBay and the easiest way of finding CD album cover artwork (try the FirstRiver Amazon plug-in if you don't believe us). The marketing focus on the slick but useless Dashboard does rather detract from Spotlight and Automator, which are extremely promising. We'll leave an in-depth review of Automator for when it's had time to mature (our version has two top-level "Help" menus); but it was surprising to see it didn't pick up AppleEvents exposed by third-party applications. It does, however, support shell scripts. Which leaves Tiger's crown jewel, Spotlight. In the Spotlight Spotlight is a system-level content indexing engine that's also available as a service for developers. Mail and the Address Book make good use of it. You can group all your contacts at Sunshine Desserts in a moment, or find email on particular topics, or from certain senders. We were delighted to discover that the 101KB limit on content that Spotlight indexes has been removed since the first Tiger beta. A similar cap renders Google useless as a serious research tool, although the search engines have been loosening their corsets over the past year, indexing slightly larger files. We can't say what the upper bound is, but test documents over 1.5MB were fully indexed in Tiger. 
Because the kernel now supports notifications, Spotlight queries are kept right up to date. Add a word to a file and it's reflected in the queries almost immediately. So from a technical point of view, the file system team has done a terrific job. The problem with Spotlight is in everyday use. For example, some simple searches are now much harder. If you merely want to search for a particular file by name, you'll need to use an undocumented feature, and wrap the search term in quotes. Unlike the search results window in Panther, or in Mac OS versions prior to Sherlock, the Spotlight window is now an orphan. If it loses the focus, you need to use another undocumented feature: apple-space-space to return the focus. If you want a simple list of files over a certain size, you need to re-sort the results displayed. More seriously, the user interface severely hampers what queries can be made. The dearth of boolean operators (only AND is permitted) means that it isn't possible to query for documents containing "Microsoft" and "Antitrust" but not "EU". There is a query builder that allows content to be specified as a field, but without even the limited boolean qualification that an iTunes Smart Playlist allows (match all or match any), it's useless. As a "smart folder" this is pretty dumb. Nor is it possible to tell if the document contains one or many instances of the phrase. Most serious of all, however, is the lack of context provided by the search results. Google returns a couple of fragments from the sentences surrounding the text, and mature standalone search tools such as dtSearch and Copernic highlight the text in a two-pane view. But in its first implementation the Spotlight API doesn't provide this information. Well-organized users may be able to infer the context. 
If you're spectacularly well-organized, you may have a folder called "EU Court of Appeals Decisions", but Spotlight results don't display that alongside the result (you need to click each item returned), and of course most of us aren't that methodical. If Google results offered this much context, Larry and Sergey would still be at school. Hopefully this will be implemented in the next update. It's so very nearly there. For example, a query for "Tevanian deposition QuickTime" found a PDF document and loaded it with the search term. But Preview doesn't support multiple-word queries, and so mistakenly returns "0 occurances". Mail is another example where searching is slower than before. Quite idiotically, it isn't possible to search by sender or subject without first embarking on a mailbox-wide query. The previous version of Mail permitted this. Now with the Spotlight-enabled Mail search (which takes up twice as much disk space as before, 2GB in this case) you need to start a search, stop it right away, and then use one of the buttons that appear only after the first results from the aborted search have been displayed. There is no dialog that allows you to limit the query to specific fields or build a specific query. Does no one at Apple use Mail? Persistent queries, or as Apple calls them, Smart Folders, show the promise of system-level searching, but also show how poor UI design can scare users away. "All documents in the last three days" displays images, even though Apple lists Images (or "Presentations") as a kind of Document. Want to look only for files by extension? Good luck - search by extension has disappeared. A simple boolean "NOT" would solve this kind of nonsense, but it isn't an option. A UI summary Some of the cosmetic changes smack of change for change's sake - such as the pale blue look for the Mail application. We won't dwell on these particular UI aspects - John Siracusa has done a fine job over the years and goes into depth here. 
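The boolean logic Spotlight's interface lacks is trivial to express in code. A minimal sketch (our own illustration in Python, not Apple's Spotlight API; the corpus and names are hypothetical) of the query Tiger's UI cannot build - documents containing "Microsoft" and "Antitrust" but not "EU":

```python
# Illustrative sketch only - not Apple's Spotlight API.
def matches(text, must=(), must_not=()):
    """True if text contains every `must` term and none of the `must_not` terms."""
    lower = text.lower()
    return (all(t.lower() in lower for t in must)
            and not any(t.lower() in lower for t in must_not))

# Hypothetical corpus standing in for indexed documents.
docs = {
    "brussels.txt": "Microsoft faces a fresh antitrust probe in the EU",
    "seattle.txt": "Microsoft antitrust ruling upheld in the US",
}
hits = [name for name, body in docs.items()
        if matches(body, must=("Microsoft", "Antitrust"), must_not=("EU",))]
# hits contains only "seattle.txt"
```

Three lines of filtering logic, in other words; the omission is a UI decision, not an engineering constraint.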
What's of more concern is that Tiger's best feature, Spotlight, violates two of the founder's favorite maxims. One was expressed in a Wired interview in 1996. "Some people think design means how it looks. But of course, if you dig deeper, it's really how it works. The design of the Mac wasn't what it looked like, although that was part of it. Primarily, it was how it worked," he said, correctly. The other is the often repeated, but unofficial, corporate goal of "making great technology easy to use". Tiger fails on both of these counts. Spotlight is great technology, but it fails because the poor UI lets it down: its potential isn't tapped. And Dashboard was only ever about bling. How did this happen? Plenty of people will blame the demo sensibility: if it looks great in a demo, then Apple considers the job done. But I suspect it owes more to corporate paranoia. By guarding Tiger's secrets so closely, Apple has created an echo chamber. Surely someone, somewhere could have pointed out the lack of boolean operators. Is Tiger in good shape? Resoundingly so. Is it worth the upgrade? Well, at Pantip Plaza in Bangkok I was offered Tiger on DVD for 600 Baht. At $15, or £8.20, you'd be silly not to. ®
Strong sales from HP and Dell carried the worldwide server market higher during the first quarter of 2005, according to the latest data from Gartner. Total server sales hit $12.3bn - up 4 per cent from the $11.8bn sold in the same period last year. Gartner characterized this rise in sales as a solid start to the year. In particular, the uptick in server shipments seemed to indicate that a feared slowdown in corporate spending is not in play. "It is fairly good news for the quarter," said Adrian O'Connell, principal analyst at Gartner, in an interview. "There were a few reports and concerns about the economy, and this shows that the market is still relatively stable." HP stood out as a vendor on a mission during the first quarter. The company has struggled against rivals IBM and Dell over the past year but managed strong 13.1 per cent year-over-year growth in the period. Dell was the only vendor to beat HP, with 13.4 per cent growth. Meanwhile, IBM saw sales rise just 1.1 per cent, and Sun watched as sales fell once again - this time by 4.2 per cent. "HP is improving here," O'Connell said. "Their ProLiant (x86 servers) sales are up, and that's really where they had lost out to Dell. It's too early to say if it's permanent, but they've turned it around in the near-term." HP out-shipped all vendors with 498,000 boxes sold. Dell followed with 402,000, IBM with 250,000, Sun with 81,000 and Fujitsu with 72,000. IBM, however, led in revenue with $3.7bn in sales - a figure boosted by more expensive Unix and mainframe systems. HP followed with $3.5bn, Dell with $1.3bn, Sun with $1.2bn and Fujitsu with $887m. All in all, HP has to be most pleased with the quarter. It and Sun have watched IBM and Dell make huge gains over the past year, eating up parts of the Unix and x86 markets. For all the turmoil HP's server division has been through, this was a good showing. 
® Related stories NetApp opens fire on EMC Intel's platform shift BEA: services up, licensing down New HP CEO coasts through mediocre Q2 Sun plays hide and seek with key Solaris 10 goodies Dell keeps double-digit growth groove going in Q1 Intel feels healthy and dual-core happy Sun shares surge on 'joke' that firm will go private
Some of the UK's leading telcos have finally agreed to work together to try to stamp out those "few rotten apples" that rip off punters with expensive phone services. They've effectively signed up to an early warning system that should help the industry spot scams and take action more swiftly against rogue operators that lure unwary punters into running up huge phone bills by calling numbers charging up to £1.50 a minute. 3, BT, Carphone Warehouse, Kingston Communications, NTL, O2, Onetel, Orange, T-Mobile, Telewest, Vodafone and Your Communications have all agreed to keep an eye on premium-rate services. If they suspect a service provider is acting illegally - by employing expensive premium-rate numbers for rogue diallers or non-existent prize scams - then these telcos will swap information and, if necessary, report companies to watchdog ICSTIS. The Memorandum of Understanding (MOU) was signed after a report by communications regulator Ofcom in December called on the industry to do more to stamp out rogue operators. ICSTIS denied that today's MOU was "forced" on the industry following the report, and defended operators against accusations that they had not done more in the past to weed out illegal operators. In a statement, Telewest's Bryan Petch, who took a lead in drawing up the MOU, said that the "vast majority of service providers operate legitimately under the ICSTIS Code". "Unfortunately a few rotten apples have found ways of exploiting PRS (premium rate services) for their own ends. In publishing this MOU, the communications providers involved are expressing their determination to stamp out abuses and allow consumers to continue to receive the benefits from genuine premium rate services," he said. Last week ICSTIS confirmed it had yet to recoup a single penny after fining 16 premium-rate phone services £1.3m following a crackdown on rogue operators in March. 
® Related stories Ofcom slaps premium rate industry UK phone scammers yet to pay fines Watchdog fines prize call telco £100k 16 scammers fined £1.3m New 0871 rogue dialler scam spotted Citizens Advice warns of 'shocking' rogue dialler scams UK unfurls ratings system for adult content on mobiles
Cisco has bought application optimisation start-up FineGround Networks for approximately $70m in cash and options. The deal, announced Thursday, is expected to close before the end of July 2005, subject to regulatory approvals. Privately-held FineGround makes network appliances that "accelerate, secure, and monitor application delivery". Cisco plans to integrate the data centre-orientated kit within its extensive portfolio of networking gear as a means to help its customers accelerate application response time and minimise costs. According to FineGround, its technology can improve end-user response times by up to five times, reduce application bandwidth usage by up to 90 per cent, and reduce the load on servers by up to 90 per cent. The technology also allows firms to add application firewall functions to web-enabled business transactions. FineGround was founded in June 2000 and has 42 employees. Post-acquisition, the firm will join Cisco's Security Technology Group, reporting to Jayshree Ullal. Last month Cisco's main rival Juniper Networks splashed out $500m on two firms in a move also broadly geared towards making IP-based applications more secure and reliable. Wide Area Network optimisation technology firm Peribit Networks and application front-end firm Redline Networks have joined the Juniper stable ahead of plans by both Cisco and Juniper to make application optimisation a key plank in their product development plans. ® Related stories Juniper takes two for improved application push Cisco pays $65m for Protego Cisco warns over DNS glitch
Estonian president Arnold Ruutel has put the kybosh on plans to allow internet voting in the country, saying that the process needs to be made more secure. He called for a more thorough debate on the uniformity of elections and the reliability of voter identification. According to press agency AFP, Ruutel specifically mentioned the principle of uniformity in his ruling. He argued that the provision in the e-voting bill that would allow online voters to change their minds several times violated this principle. Voters using traditional ballots can, of course, only vote once, which, Ruutel said, means voters are not given equal opportunities to vote. The Estonian parliament okayed internet voting a month ago, for use in local elections coming up this October. The system uses the country's identity card system as the basis for voter identification. Voters would need an ID card reader attached to their computer to vote online, and although around 60 per cent of the population owns an ID card, the number with card readers is much lower. The original plan was to extend this system to the national elections in 2007, but Ruutel wants more debate before that happens. He still backs the scheme in principle, though. "Electronic voting is an important additional opportunity in the development of our state, which would also help highlight Estonia's progress in promoting e-governance," he said. ® Related stories Councils not generating interest in local elections Brits voice fraud fears over high-tech voting Ireland faces 50m e-voting write-off
Supercomputer maker Cray has been on a fantastic voyage this week, receiving another dose of funding from the US government, two class action lawsuits and a new CFO. Cray nailed down $17m from the Feds over the next two years to continue work on the company's future system code-named Black Widow. The computer is due out in 2006 and will be Cray's pride and joy. The government often subsidizes Cray's work in the hopes of receiving very specialized, very powerful systems for crunching away on military and scientific tasks. "The development of these systems advances our product roadmap," said Jim Rottsolk, the CEO at Cray. "With continued funding, we expect Black Widow to reach a peak performance of several hundred teraflops in its initial design, and to exceed a petaflops (a thousand trillion calculations per second) in its product lifetime." That, however, was the good news for Cray. A pair of shareholder lawsuits have been filed, alleging that Cray hid business problems from investors. Cray's stock dropped 40 per cent last July after it revealed disappointing second quarter revenue and announced a 15 per cent workforce reduction. "We intend to show the leadership at Cray intentionally misled investors through a well-orchestrated campaign of misinformation," said Steve Berman of law firm Hagens Berman Sobol Shapiro and lead attorney in one lawsuit. "Their actions have hurt thousands of individuals who should have had the benefit of the same information that executives used when they sold their stock." Lerach Coughlin Stoia Geller Rudman & Robbins also filed suit for investors who held shares of Cray between 31 July, 2003 and 12 May, 2005. This suit, however, focuses more on Cray's decision to delay the filing of its 10-K annual report with the government and what the law firm describes as poor operational controls within the hardware maker. In a statement, Cray responded to the first lawsuit. 
"We believe this lawsuit is without merit, and the Company intends to defend it aggressively," Rottsolk said. Cray did not immediately return a call seeking comment on the second lawsuit. Away from the lawsuits, Cray this week also named a new CFO when it tapped Brian Henry for the post. Henry previously served as CFO of Onyx Software and arrives at Cray with quite the turmoil on his hands. Cray's former CFO departed in October, its auditor Deloitte & Touche resigned last month and earlier this month the Nasdaq threatened to delist Cray due to its failure to meet a Sarbanes-Oxley compliance requirement. ® Related stories French bail out Bull with $690m US compute labs in desperate need of Federal swill Major server vendors in giant, supercomputing cluster cluck NASA's Columbia benchmarks 43 teraflops Cray comes to market with XD1 Dell dances past the IT sector with strong Q2 Cray pours Red Drizzle over anxious investors Cray's Q2 revenue gigaflops Met Office bags shiny new supercomputer
Netcraft this week released a Firefox version of its free anti-phishing toolbar. The release follows the availability of a similar Internet Explorer plug-in, released in December 2004. The toolbar blocks access to phishing sites reported by other members of the Netcraft Toolbar community and validated by Netcraft. More than 7,000 such phishing sites have been detected and blocked so far, Netcraft reports. The release runs on any operating system supported by Firefox, displaying the hosting location, popularity, and an "abstracted risk rating" for each site visited. In addition, the toolbar defends against pop-up windows which attempt to hide the navigational controls, and traps suspicious URLs containing obfuscated characters. The toolbar can be downloaded here. Scam emails that form the basis of phishing attacks commonly pose as 'security check' emails from well-known businesses. These messages attempt to trick users into handing over their account details and passwords to bogus sites. This simple trick has become an increasing focus of fraudulent activity. In response, security firms have developed various paid-for and free products designed to help surfers stay one step ahead of fraudsters. For example, digital certificate firm Comodo has developed technology to tie the visual components of a site to its website address using digital certificate technology. Its Vengine has been around for the best part of 18 months as a discrete download but was this week made available to third-party toolbar vendors. Earlier this month, digital certificate rival GeoTrust released a free IE plug-in, called Trustwatch, which verifies the security and trustworthiness of websites using a simple traffic-light system. The toolbar works in conjunction with the Anti-Phishing Working Group blacklists. 
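The sort of "obfuscated characters" check such a toolbar performs can be sketched with a simple heuristic. This is our own illustration in Python, not Netcraft's implementation: flag URLs that embed credentials before the host (the classic "www.bank.com@evil.example" trick), use percent-encoded characters in the host, or point at a bare IP address where a brand name would be expected.

```python
# Illustrative heuristic only - not Netcraft's actual implementation.
import re
from urllib.parse import urlsplit

def looks_obfuscated(url):
    """Flag URLs that use common phishing obfuscation tricks."""
    parts = urlsplit(url)
    if "@" in parts.netloc:   # credentials trick: http://www.bank.com@evil.example/
        return True
    if "%" in parts.netloc:   # percent-encoded characters hiding the real host
        return True
    host = parts.hostname or ""
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):  # raw IP instead of a name
        return True
    return False
```

A real toolbar layers far more on top - blacklists, lookalike-character detection, hosting-country data - but the principle is the same: inspect the URL's structure before the user trusts what it appears to say.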
® Related stories Netcraft crafts anti-phishing service Underground showdown: defacers take on phishers UK banks hope to send phishing mules packing Opera beefs up browser to thwart phishers Gone Phishin'