Hollywood has abandoned its attempt to stifle publication of DVD decryption code, by dropping its lawsuit against a Californian publisher. The DVD CCA (Copy Control Association) filed a trade secrets lawsuit against Andrew Bunner (and others) for disclosing details of DeCSS, which circumvents the CSS encryption scheme used on DVD discs. Three years to the day after the courts issued a preliminary injunction against Bunner et al, the DVD CCA yesterday filed a motion to dismiss its case, noting that "this court should not be reviewing this case on the basis of a less than fully developed record." It had become impossible to argue that Bunner was publishing a trade secret: CSS simply wasn't a secret any more. You can read the final ruling here [PDF, 1MB]. John Gilmore's lucid testimony giving background to the case can be found here. The Norwegian programmer who published DeCSS was acquitted on retrial earlier this month. Related Stories California Supremes issue DVD crack setback California Supremes hear DeCSS case DeCSS temporarily banned from the Net US inspired copyright laws set to sweep the globe – for fun and profit Norway throws in the towel in DVD Jon case iTunes DRM cracked wide open for GNU/Linux. Seriously
For the first time in Internet history there are more DNS rootservers outside the United States than within, following this week's launch in Frankfurt of an anycast "instance" of the RIPE NCC-managed K root server. The K root DNS server is one of the 13 official DNS rootservers which answer lookups for domain names all over the world. It is operated by RIPE NCC, the organization in charge of IP address allocation throughout Europe, the Middle East and parts of Africa, and sits at the London Internet Exchange. Of the original 13 rootservers only three were outside the US, which fuelled criticism of US centrism in Internet management. "The launch of the K anycast instance will make the German Internet community more independent and the DNS in general more reliable," said Denic CEO Sabine Dolderer. The registry for .de addresses co-sponsored the new server together with the German ISP association ECO. According to Axel Pawlik, CEO of RIPE NCC, the anycast system will, for example, help to mitigate DDoS attacks on the thirteen root servers. RIPE NCC technicians were among the pioneers of the anycast concept for the root servers. So far they have brought instances of the K server, which sits at the London Internet Exchange (LINX), to their headquarters in Amsterdam and to Frankfurt. By the end of the year there will be up to ten identical servers across Europe. The anycast system makes the central zone files of the original root server available under the same IP address on different machines in different locations. Spreading the servers across the net makes it more difficult to attack them and lowers response times for local communities. "We talk about milliseconds," says Dolderer. Most users would not notice the change, and only traceroutes would show that requests are now answered by the Frankfurt server. But tests at the anycast instance of the F root server in Dubai resulted in a drop from 130 to 30 milliseconds.
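The route selection that anycast relies on is done by BGP in the network itself, but the principle can be sketched in a few lines of Python. This is an illustrative model, not RIPE NCC's implementation: the instance locations and routing costs below are invented for the example, though 193.0.14.129 is the real K root address.

```python
# Illustrative model of anycast: several server instances advertise the
# same IP address, and each query is delivered to the lowest-cost instance.
# Instance names and costs are hypothetical; the address is the real K root.

K_ROOT_ADDRESS = "193.0.14.129"

# (instance location, routing cost in arbitrary path-length-like units)
INSTANCES = {
    "London": 5,
    "Amsterdam": 3,
    "Frankfurt": 1,
}

def route_query(address, instances):
    """Return the instance that would answer a query for an anycast address."""
    if address != K_ROOT_ADDRESS:
        raise ValueError("not an anycast address in this model")
    # BGP-style selection: the topologically nearest instance wins
    return min(instances, key=instances.get)

# A client near Frankfurt is served locally; the IP address never changes.
print(route_query(K_ROOT_ADDRESS, INSTANCES))  # Frankfurt
```

The same selection rule also illustrates the DDoS point: if one instance is knocked out and withdraws its route, queries simply flow to the next-nearest surviving instance.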
The operators of F, the Internet Software Consortium, have so far spread most rapidly over the globe. F root servers have already been installed in Ottawa, Madrid, Hong Kong, Rome, Auckland, Sao Paulo, Beijing, Seoul, Moscow, Taipei, Dubai, Paris and Singapore. "It happens that our offers for a K instance were answered by 'oh, we already have F'," says Pawlik. But there was no race between root server operators. Five of them have already joined the anycast effort and more will join, said Pawlik. "The more instances of the root servers the better," he says. Some well-connected places house the slaves of several root servers. London, for example, has the K root and also hosts instances of I and J. Now, with the K instance at the Frankfurt German Internet Exchange (DeCIX), there are 24 root server locations outside the US, compared to 23 within. Not counted are instances of root servers organized by local communities like the N-IX Internet Exchange at Nuremberg, Germany. The N-IX is anycasting root servers of ICANN, NASA and Japan's WIDE. "Technically this changes the concentration," says Dolderer. "But the political problem is still there." What information is fed into the system has to be decided by the Internet Corporation for Assigned Names and Numbers (ICANN) and be propagated through the master, the A root server in Dulles. Each of the other 12 root servers is a so-called slave, as are their new "children". And master A is under the oversight of the US Department of Commerce. ®
Intel will follow its recently delayed 'Dothan' mobile microprocessor with 'Jonah' late next year, Japan's PC Watch claims, speculating that the part may even be one of Chipzilla's first 65nm products. The latter claim is based on indications from the paper's sources that Jonah is likely to appear late in 2005 and will come in with a power consumption rating of 45W. The figure comes from a statement made by an IBM ThinkPad engineer that the company is designing next-generation thin'n'light notebooks with that kind of power consumption in mind. Dothan, by contrast, consumes 21W of power, the article suggests, citing OEM sources. The design envelope is 25W. That will rise to 30W when Intel launches a version that supports a 533MHz effective bit rate frontside bus later this year. The article reckons, not unreasonably, that the Low-Voltage and Ultra-low Voltage versions of Jonah will consume significantly more power than their Dothan equivalents. Little else is known about Jonah other than that it will almost certainly clock beyond 2.13GHz, the highest speed Dothan reaches in current roadmaps. The upshot of all this, PC Watch suggests, is an opener for Transmeta, whose 90nm Efficeon is expected to clock comparably with Jonah but potentially with much lower power consumption - around 25W. ® Related Stories Dothan slips again Intel Centrino 2 to offer 533MHz FSB - report
Strike action among Bradford's IT staff has been put back a month while union officials continue negotiations over the council's plans to privatise its IT department. Earlier this week 100 members of UNISON voted overwhelmingly to strike at the end of the month. They're concerned that any move to bring in private finance to run the IT department would put their jobs and pay and conditions at risk. Yesterday afternoon, though, workers voted to give the union and council officials more time to discuss the plans in what has been described as a "goodwill" measure. Unless staff receive the necessary assurances about pay and conditions, industrial action will go ahead on 26-27 February. A further seven days of action will begin early in March. Speaking after the meeting, UNISON Regional Officer Chris Jenkinson told The Register: "Our members wanted us to explore issues further with employers. It is a matter of goodwill. But unless something substantial comes out of the talks this is the last time they are prepared to defer action." In a statement, Simon Cooke, Deputy Leader of Bradford Council, said: "We are very pleased that staff have voted to suspend industrial action until the end of February. "This will allow us to continue working with trade union representatives and bidders to find a solution that meets the needs of the people of the district but also addresses the genuine concerns of staff," he said. ® Related Story Bradford IT staff vote to strike
Microsoft posted its first ever $10 billion quarter, although profits were down. "Home and Entertainment" more than doubled to $1.26 billion; Microsoft says the Xbox Live service now has 750,000 users. "Information Worker" revenue grew 27 per cent, helped by the release of Office 2003, to $2.89 billion in the quarter. Client revenues reached $3 billion and Server and Tools $2.1 billion. After a blip in the preceding quarter, Business Solutions revenue was up to $190 million, which, in the corresponding SEC statement, Microsoft attributed to licensing of Navision products and Business Contact Manager. The group also includes Great Plains and bCentral. The MSN group grossed $546m, up 19 per cent year on year, a rise Microsoft puts down to growth in paid search and an improving advertising market; MSN subscriptions were down two per cent. But the Mobile and Embedded group continues to be the runt of the litter, bringing in $63 million, up $10 million from the previous quarter. Microsoft expects stock-based compensation expenses to be $5.7 billion in the current quarter. ®
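The segment figures quoted above do in fact add up to the headline. A quick sanity check in Python, using only the numbers reported in the story (figures in $ billion):

```python
# Segment revenues as quoted in the story, in $ billion.
segments = {
    "Home and Entertainment": 1.26,
    "Information Worker": 2.89,
    "Client": 3.0,
    "Server and Tools": 2.1,
    "Business Solutions": 0.19,
    "MSN": 0.546,
    "Mobile and Embedded": 0.063,
}

total = sum(segments.values())
print(round(total, 2))  # 10.05 - Microsoft's first $10bn quarter
```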
Siemens' mobile phone business sold 26.7 per cent more handsets year on year during the group's most recently completed fiscal quarter, Q1 2004, the company said yesterday. The group's Information and Communication Mobile (ICM) division shipped 15.2 million handsets in the three months to 31 December, up from 12 million in Q1 2003. Mobile phone sales totalled €1.47 billion, up from €1.31 billion in the year-ago quarter, contributing €64 million to group profits, up from €52 million this time last year. However, competitive pressures and a product mix weighted toward lower-priced phones continued to limit average selling price per unit, the company admitted. ICM's other component, Mobile Networks, recorded quarterly sales of €1.2 billion, generating earnings of €26 million - reversing the loss of €25 million it reported this time last year. In total, ICM achieved sales of €2.96 billion and income of €123 million. The group as a whole reported a net income of €726 million (82 cents a share), up from €521 million (59 cents a share) during the year-ago quarter. First-quarter sales of €18.33 billion were two per cent higher year-over-year on a comparable basis, the company said. ® Related Stories Phone sales drive Nokia Q4 profitability Motorola beats the Street with profits surge Related Products Find Siemens phones in The Reg mobile store
Adoption of virtual private networks (VPNs) based on the browser-based SSL technology is shaping up to be one of the key trends of 2004, especially as enterprises mobilize their workforces. SSL allows for secure access to corporate networks from virtually any browser and so provides flexibility for roaming workers with laptops or smartphones, as well as simple mechanisms that reduce support costs. This week, two security companies, NetScreen and F5, said that their first quarter revenues had been boosted by their acquisitions last year of SSL specialists, whose products have been added to their more traditional IPsec VPNs and firewalls. The main demand is coming from companies with large mobile workforces, said F5. The company bought the FirePass VPN with uRoam last year, and the product contributed $1.5m or 5% of revenue in the first quarter. This was 50% above the targets F5 had set, and in the second quarter SSL sales of over $2m are expected. Total revenue grew 33% to $36.1m and the company plans to incorporate FirePass into its core product, Big-IP. NetScreen said that it sold $2.2m in SSL VPNs, about $1m more than anticipated, between acquiring Neoteris on December 14 and the end of its first quarter on 31 December. Neoteris now forms a separate unit called Secure Access Products. NetScreen’s total revenue grew 59% year-on-year to $81m. According to a survey sponsored by AT&T, simpler remote access to corporate networks is the number one factor driving uptake of remote and mobile workforces. The mobile phone and networking giants – such as Nokia, Cisco and Nortel - are increasingly introducing SSL VPNs into their ranges and most have seen an upsurge in interest and sales in the past three or four months. Frost & Sullivan believes sales of SSL will pass the $500m mark by 2006, from just $20m in 2002. Lower costs are another attraction to large companies. 
F&S estimates that the cost per user is $60-$220, compared to $150-$300 per user for the more cumbersome IPsec, which is optimized for site-to-site communications rather than mobile access. The savings are mainly down to lower support requirements and the fact that there is no need to preconfigure every PC. IPsec still dwarfs SSL in terms of revenue - in 2003, revenues were about $2.4bn for IPsec, compared to $100m for SSL. The two products are complementary and some companies, including Cisco and Nortel, plan to integrate them into a single VPN. The main challenge now is to improve the security features of SSL itself, particularly authentication, which is far weaker than under IPsec. Nokia has developed the Secure Access System, which exchanges digital certificates with the machine in use and performs client integrity scans to check for any weaknesses and adjust user privileges accordingly. Many SSL specialists will look for security partners, and companies such as NetScreen are aiming to provide an integrated suite of security products. © Copyright 2004 News IS News IS is a weekly newsletter published by Rethink Research, a London-based publishing and consulting firm. News IS covers the news announcements, business transactions and financial statements of the top 150 or so IT vendors, along with other news of interest to the modern senior IT manager working within data centre technologies. Subscription details here.
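The "no need to preconfigure every PC" point is the crux of SSL's cost advantage: any machine with a browser already trusts the public certificate authorities, so the client side of an SSL VPN is just a standard TLS handshake. A minimal sketch using Python's standard ssl module (the gateway hostname is a placeholder, and the connection itself is left commented out):

```python
import ssl

# An SSL/TLS remote-access client needs no per-machine setup: the default
# context loads the system's trusted CA certificates and enforces both
# certificate and hostname verification - exactly what a browser does.
context = ssl.create_default_context()

assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# Connecting is then one call; "vpn.example.com" is a hypothetical gateway.
# import socket
# with socket.create_connection(("vpn.example.com", 443)) as sock:
#     with context.wrap_socket(sock, server_hostname="vpn.example.com") as tls:
#         ...  # authenticated application traffic

# IPsec, by contrast, typically requires installing and configuring a
# client on every PC before the first connection can be made.
```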
It's just over a month since new anti-spam legislation was introduced in the UK, to almost universal condemnation, with critics predicting the new laws would have a limited effect in the fight against junk email. The £5,000 fine for offenders has been branded by some experts as an "inadequate deterrent". And even those who've welcomed the new legislation - which is part of an EU-wide directive - doubt it will have any real impact on combating spam. For although there is one EU anti-spam law, how it is interpreted and implemented is up to each individual European country. Earlier this week, for example, Denmark fined a telecoms hardware company a record £37,000 for sending fax spam. In the UK, on the other hand, spammers can expect to receive fines of up to £5,000 if the case makes it to a Magistrates' Court. What's more, the process of bringing legal action against spammers is so cumbersome and clunky that it could be at least a year before anyone makes it inside a courtroom. The Information Commissioner, the body charged with administering the anti-spam law in the UK, declined to reveal how many complaints it had received in the first month, except to say that it had "received a steady stream of complaints". In the UK at least, it comes as no surprise that some people simply don't believe the new anti-spam laws will work. Nevertheless, there are some who believe the new legislation is a significant step in the right direction. Joe McNamee, EU Policy Director at Political Intelligence, a public affairs consultancy specialising in ICT, accepts that the new legislation "is not a complete answer to the problem of spam" and that there are plenty of issues that still need to be resolved. Despite the limitations, he's convinced that the new rules - which outlaw the sending of electronic communications to people unless they've agreed beforehand to receive them - deserve a better press.
"The new EU legislation is about far more than just email spam and the wider picture must be understood to make a fair appraisal of the legislation," he told The Register. Here, he explains why: over to you, Joe. "It would be fair to say that the introduction of new anti-spam legislation has not been well received. Most, if not all, of the articles I have read take the view that the legislation simply will not work. As far as I can see, there are three main arguments that keep being rolled out in much of the coverage. While it's much easier to be critical than constructive, I'm keen to give some balance to a debate that is at risk of being completely one-sided. "In essence, critics claim the anti-spam law is unenforceable. They insist that the borderless nature of the Internet means that it is impossible to enforce regional or national legislation. Others throw up their hands in despair, casting doubt on a law that covers an infringement that occurs almost always outside our jurisdiction.

The spam law is unenforceable

"Many argue that the spam law is unenforceable. But I want to clear up a key issue here. The recent legislation outlawing unsolicited commercial email is not just a law on email spam - it is a law on all unsolicited electronic communications. Have you ever considered how much spam is received via SMS, MMS, etc? What's more, how much more spam could be sent via SMS, MMS etc, if safeguards weren't in place? "For me, the issue of whether mobile spam should be addressed is beyond question - failure to regulate this would have serious consequences for the development of mobile marketing and would act as a disincentive to using mobile technology, thus seriously damaging the whole sector. This has been recognised by the mobile marketing industry itself, which supports the view that consumers must "opt-in" to all mobile messaging programmes.
"Faced with the huge difficulties presented by trying to frame any law on unsolicited electronic communications, legislators were presented with essentially three options: "First, they could have ignored the problem and decided not to interfere in the issue. Clearly, this was not a realistic possibility and would have shown a real lack of political courage, leadership and forward thinking on behalf of legislators. "Second, new rules could have been devised that resulted in "technology specific" legislation, for instance treating phone, fax, email, SMS, MMS, instant messaging (IM) etc in different ways. The problem is that such an approach would have made it impossible to adapt these new rules to convergence. How, for instance, could it deal with cross-platform communications such as Instant Messaging to SMS, or email to SMS? What's more, with the time it takes for legislation to get from the drawing board to the statute books, it's likely that new technologies would have developed in the time it takes to create any technology specific legislation. The adoption of technology specific legislation is therefore not the answer. "The third option, which is the one that Europe took, was to introduce legislation which is "technology neutral". By that, I mean that it establishes principles for electronic communications generally, rather than referring to specific technologies. The idea is that the legislation should be "future proof" and not tied to individual communications methods. This is the approach of the European legislation, which deals with mobile messaging, Instant Messaging, cross-platform communications AND email - rather than just email.
It's not just a European problem - the Internet is borderless

"Even if you accept that Europe was right to adopt a "technology neutral" approach in a bid to cover all electronic communications, there are still plenty of critics who say that spam isn't just a European problem because the Internet is, by its nature, without borders. "According to some estimates, more than half of all spam originates from outside the EU, way beyond the jurisdiction of the EU legislation. The point, argue critics, is that we're powerless to do anything about it on our own. If one assumes that the spam problem is never going to change in any way, then this is correct. However, what of the spam that does originate from within the EU - or could originate from within the EU if some sort of legislation were not in place? "Necessity is the mother of invention and spam is the last refuge of the lazy marketer, more willing to abuse the medium, potential customers and email infrastructure than to use a more acceptable means of communication. By removing the temptation of the "lazy option" (spam), the new rules can only have the effect of pushing European marketers to greater innovation. There is no doubt that unsolicited email is counter-productive, consumer-unfriendly and, crucially, undermines the whole concept and credibility of email as a marketing tool. However, there are numerous perfectly legal forms of direct marketing still available within the EU. The advertising embedded in some free mailing lists is just one example. At least within the borders of the EU, email as a marketing tool can use the new rules to develop a new credibility, and marketers can use this to develop original and innovative ways of using the array of legal online advertising options at their disposal. "After all, under the old system (before the new rules were introduced) things were really confusing and messy.
We had eight countries with "opt-in" and seven countries with "opt-out" approaches to spam, leaving consumers with no idea of whether the email marketing they received was legal, illegal or credible, and legitimate online marketers struggling to distinguish themselves from spammers. "Now, we have a harmonised approach where the law is approximately the same in all EU countries, where the law is clear, where the "lazy option" is eliminated and where marketers have both the legal certainty and the incentive to develop new and innovative ways of marketing using messaging systems.

What's the point in a law which covers an infringement which is largely outside our jurisdiction?

"Even so, much of the debate surrounding the issue keeps coming back to the same core argument - what's the point in a law which covers an infringement which is largely outside our jurisdiction? "Well, Europe is the world's largest international trading bloc, soon to count 25 countries from Portugal to Estonia. The new legislation creates the most effective global example of best practice, enabling the EU to fight credibly for better spam laws elsewhere. Europe is the best-placed trading bloc in the world to establish best practice and, thankfully, took the opportunity to do so. "Critics of the new law also assume that a spam problem emanating from Europe was mysteriously never going to develop - an assumption that does not appear to have a very solid basis. Without harmonised and clear laws, lazy and/or unethical companies would always have the temptation to use spam. "Of course, legislators could have just turned a blind eye to the problem, assumed that it was never going to take hold in Europe, turned down the opportunity to establish global best practice and done absolutely nothing. "Instead, they've pre-empted the problem of unsolicited electronic communications developing in Europe.
Not only will this prevent the mobile marketing market from being damaged by the growth of mobile spam, but the EU is also taking a stand for fair and responsible handling of the legal framework for marketing via electronic communications. "As far as I can see, there is only one realistic option - and that's the one we have in place. "There is no doubt that legislation is not a complete answer to the problem of spam. There is no doubt that issues such as jurisdiction are significant and problematic. But the new EU legislation is about far more than this and the wider picture must be understood to make a fair appraisal of the legislation." ® Related Stories Danish spammer fined £37k UK anti-spam law goes live
The two big computing ideas of the twenty-first century - grid computing and web services - were brought closer together by an announcement this week at GlobusWorld, the grid conference run by the Globus Alliance, writes Bloor Research analyst Peter Abrahams. The Globus Alliance is a research and development project focused on enabling the application of grid concepts to scientific and engineering computing, and has developed the toolkit used by most scientific and university grid computing projects. It is an alliance between several universities and laboratories in the US and Europe, including Edinburgh. Web services have concentrated on creating an environment for 'applications on-demand' whereas grid has concentrated on providing 'computer resources on-demand'. The announcement is made up of two new web services specifications that are necessary to make grid computer resources available to applications without the applications having to be grid aware. The two are the WS Resource Framework and WS Notification specifications. Note the careful use of the word specification, not standard: they are specifications put out for public discussion before they are submitted to a standards body. The authors of the specifications say that they will submit them to a standards body soon. A word of caution: IBM and Microsoft said the same when they announced their WS Reliable Messaging specification nearly a year ago, and there is still no indication when WS-RM will be submitted. The new specifications use WS-RM as a building block, so I await movement in this area. IBM is one of the main authors of the new specifications whereas Microsoft is not involved, so movement may be faster. The WS Resource Framework specification was contributed to by IBM, Globus and HP. It defines how to access a WS Resource through a web service.
A web service is stateless whereas a WS Resource is stateful; the state could be such things as data in a purchase order, the current usage agreement for resources on a grid, or metrics associated with work load on a Web server. The framework describes a standard way to access a WS Resource by referencing it through an endpoint reference made up of the web service's address and a resource id. It also defines how a web service can access or create new WS Resources. The WS Notification specification was contributed to by IBM, Globus, Akamai, HP, SAP, Tibco and Sonic. It provides a publish-subscribe messaging capability for Web Services. It enables changes or events to be published in a standard way and notifications then to be sent directly or via a broker. This brings some of Sonic's Enterprise Service Bus (ESB) concepts into a standards specification. WS Notification is important for grid computing because it provides a standard way for grid resources such as web servers to notify grid schedulers of changes in state. Changes in state include coming on and off stream, being fully loaded or having spare capacity, having a fault that needs attention, etc. All of this information is required for a grid scheduler to make informed decisions on what resources to use next. With well-defined WS Resources being accessed through well-defined Web Services, it is possible for grid resources to be scheduled without the application having to be grid aware. If IBM, HP and Globus can take these specifications quickly to a standards body then there is a high chance that the rest of the industry, including integration, system management and application systems suppliers, will support the standards in their products. © ENN
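WS Notification itself is a SOAP-level specification exchanging XML messages, but the broker-based publish-subscribe pattern it standardises can be sketched in a few lines. The class, topic and resource names below are invented for illustration; this is a model of the pattern, not the specification's wire format.

```python
# Minimal sketch of the publish-subscribe pattern WS Notification
# standardises: grid resources publish state changes on a topic, and
# subscribers (such as a grid scheduler) are notified via a broker.

class NotificationBroker:
    def __init__(self):
        self.subscribers = {}  # topic -> list of callback functions

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Fan each published message out to every subscriber on the topic
        for callback in self.subscribers.get(topic, []):
            callback(message)

broker = NotificationBroker()
seen = []

# A grid scheduler subscribes to resource-state notifications...
broker.subscribe("resource/state", seen.append)

# ...and a web server on the grid publishes its changes of state.
broker.publish("resource/state", {"resource": "webserver-7", "state": "spare-capacity"})
broker.publish("resource/state", {"resource": "webserver-7", "state": "fully-loaded"})

print([m["state"] for m in seen])  # ['spare-capacity', 'fully-loaded']
```

The scheduler never polls the web server; it simply reacts to the stream of state changes, which is exactly the information the article says a scheduler needs to decide what resources to use next.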
Online voting is fundamentally insecure due to the architecture of the Internet, according to leading cyber-security experts. Using a voting system based upon the Internet poses a "serious and unacceptable risk" for election fraud and is not secure enough for something as serious as the election of government officials, according to the four members of the Security Peer Review Group, an advisory group formed by the US Department of Defense to evaluate a new on-line voting system. The review group's members, and the authors of the damning report, include David Wagner, Avi Rubin and David Jefferson from the University of California, Berkeley, Johns Hopkins University and the Lawrence Livermore National Laboratory, respectively, and Barbara Simons, a computer scientist and technology policy consultant. The federally-funded Secure Electronic Registration and Voting Experiment (SERVE) system is currently slated for use in the US in this year's primary and general elections. It will allow eligible voters to register to vote at home and then to vote via the Internet from anywhere in the world. The first tryout of SERVE is early in February for South Carolina's presidential primary and its eventual goal is to provide voting services to all eligible US citizens overseas and to US military personnel and their dependents, a population estimated at six million. After studying the prototype system the four researchers said that from anywhere in the world a hacker could disrupt an election or influence its outcome by employing any of several common types of cyber-attacks. "Attacks could occur on a large scale and could be launched by anyone from a disaffected lone individual to a well-financed enemy agency outside the reach of US law," state the three computer science professors and a former IBM researcher in the report. A denial-of-service attack would delay or prevent a voter from casting a ballot through a Web site. 
A "man in the middle" or "spoofing" attack would involve the insertion of a phoney Web page between the voter and the authentic server to prevent the vote from being counted or to alter the voter's choice. What is particularly problematic, the authors say, is that victims of "spoofing" may never know that their votes were not counted. A third type of attack involves the use of a virus or other malicious software on the voter's computer to allow an outside party to monitor or modify a voter's choices. The malicious software might then erase itself and never be detected, according to the report. While acknowledging the difficulties facing absentee voters, the authors of the security analysis conclude that Internet voting presents far too many opportunities for hackers or terrorists to interfere with fair and accurate voting, potentially in ways impossible to detect. "The flaws are unsolvable because they are fundamental to the architecture of the Internet," said David Wagner, assistant professor of computer science at UC Berkeley. "Because the danger of successful large-scale attacks is so great, we reluctantly recommend shutting down the development of SERVE and not attempting anything like it in the future until both the Internet and the world's home computer infrastructure have been fundamentally redesigned, or some other unforeseen security breakthroughs appear," states the report. There is no way to plug the security vulnerabilities inherent in the SERVE on-line voting design, according to the report's authors. The Internet voting plan and touchscreen equipment not linked to the Internet are part of a general move in the US toward greater use of computers, provoked in part by the problems associated with paper ballots during the 2000 presidential election. But the authors of the SERVE analysis conclude that opportunities for tampering are being overlooked in the rush to embrace new election technology.
"Voting in a national election will be conducted using proprietary software, insecure clients and an insecure network," concluded report author and former IBM researcher Barbara Simons. The full security analysis of the SERVE system can be viewed online at servesecurityreport.org. Detailed information about the SERVE system is at serveusa.gov/public/aca.aspx. © ENN
Letters The speed camera issue is certainly emotive, if the response to our article this week on Peter O'Flynn and his amazing 406mph Peugeot is anything to go by. Many of the emails we received chronicled scandalous miscarriages of justice such as the chap clocked at 83mph while pedalling a bicycle down the A12. Obviously a technical cock-up in the camera department is to blame for that particular outrage, but in the case of Mr O'Flynn, are there other, darker, forces at work?: I'm probably stating the obvious, but maybe there's been a mix-up between the type of the Peugeot (406) and the alleged speed of the car (406 mph)? But I'm still curious on what charges Peter O'Flynn will be brought up... Sincerely Jan-Joost van Kan So are we. For the record, we did notice the chilling numerical coincidence, but decided to suppress it for fear of provoking the black helicopter brigade. Our Pete should just be grateful he wasn't driving a Vauxhall Corsa 1700... As to the matter of how fast our speeding knight of the road was in fact travelling, various readers were keen to take us to task on the finer points of metric etiquette. Perry Newhook took considerable umbrage thus: In an article in 'The Register' you state that the motorist was "clocked at an impressive 406mph (653kmph)". What an abomination of the metric system. The correct spelling should have been 653 km/h. Here is a website showing correct SI usage from (believe it or not) the Americans: http://lamar.colostate.edu/~hillger/correct.htm It's been over 30 years since the UK started its conversion to the metric system and the fact that people like you can't even correctly spell the units is just ridiculous. Metric? Never heard of it mate. Now if we had a pound for every email like the following from Robert Moore, we would currently be living it up with a handsome €1.44: I realize that in the states we know nothing about the outside world....
But in regards to the article mentioned in the subject, the link provided had a copy of the offending ticket. The speed was quoted in MPH. Wouldnt it have been recorded in KPH being over there across the pond... ? Nah. We still have miles over here in Blighty, one of which equals (and we offer these equivalents purely for the benefit of our European neighbours) a healthy 1.60934 km, or 1.70111e-13 light years or 13,746 French Golden Delicious apples laid end-to-end. Having straightened that out, we now come to another apparent factual crash-and-burn: Sorry to point out a minor flaw in your article, the speed of sound (at sea level at least) is 761 mph. It means the 406 in question would have to be faster than it was clocked at to make a sonic boom! http://www.aerospaceweb.org/question/atmosphere/q0112.shtml Dave McK Well, we did only say the man was attempting to break the sound barrier. And a very good effort it was too, although we are clearly not the only nation to be pushing back the envelope of ground speed. Thomas Conway writes from Australia: Here in Melbourne there has been lots of hoorah over much the same issue. A truck driver was booked doing 180kph though the Burnley Tunnel, which is equaly laughable. Shouldn't that be km/h? For shame. Stuart Lamble offers further evidence of Antipodean speed camera madness: It will no doubt come as no surprise to the more cynical amongst us that this sort of thing is by no means uncommon. Witness the following three articles in the Melbourne (Australia) Age: http://www.theage.com.au/articles/2003/10/30/1067233322213.html http://www.theage.com.au/articles/2003/11/10/1068329487082.html http://www.theage.com.au/articles/2003/12/23/1071941726867.html and consider that there's been a massive backlash in Victoria against speed cameras, on the basis that it's more about revenue raising than safety. 
And there's the rub: No matter how much the government protests to the contrary, many drivers see these cameras as nothing more than a nice little earner for the powers that be. The fact that they regularly malfunction serves only to rub salt into the wound. All of which rather scotches the idea that you only have something to fear if you're exceeding the speed limit. Still, it provides plenty of entertainment, and we await the first recorded transonic Skoda - driven by an octogenarian member of the Women's Institute on her way to a flower-arranging demonstration - with considerable relish. ®
I recently visited an outsourcing company in the Boston area that provides a back-up and disaster recovery service, writes Bloor Research analyst Robin Bloor. The company was Live Vault, although you're unlikely to hear the name directly as they sell through the channel. There were two interesting points that the company had to offer, beyond the fact that it offers a quick-to-install service. The first is that the cost equation is just about ready to squash the use of tapes for back-up. I've heard this from other sources too, so I suspect that it is true. The problem with tapes is that, in practice, many of them fail. The statistics seem to suggest that in well-managed data centers the failure rate of tapes is in the region of 20 percent, while in less centralized environments (distributed organizations or SMBs) the failure rate is more likely to be 50 percent. It is easy to understand why data centers get a lower failure rate: they probably buy more reliable media and they also probably test the media. However, I'm not sure I believe the figures - at least not on an individual tape basis. It would suggest that any back-up that involved more than five tapes would be likely to fail no matter what the site was. However, I can believe that in a given percentage of back-ups there are tapes that have failed. Anyway, to some extent it doesn't matter. The situation is that disk is fast becoming a better back-up option from the point of view of cost, and outsourced back-up is also looking like a good business area because if the service provider has enough customers, economies of scale kick in. It seems that, for the moment, the cost equation for this service favours mid-sized organizations that do not have sophisticated IT departments. The larger companies tend to have some kind of full-blown disaster recovery capability which is likely to be well organized and likely to achieve significant economies of scale. 
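The "more than five tapes" scepticism is just compound probability. A back-of-the-envelope sketch (the assumption that tapes fail independently is ours, for illustration, not a claim from the article):

```python
# Chance that a backup set contains at least one failed tape, assuming each
# tape fails independently with per-tape failure probability p.
def backup_failure_prob(p: float, tapes: int) -> float:
    return 1 - (1 - p) ** tapes

# Per-tape rates quoted in the article:
print(backup_failure_prob(0.20, 5))  # data centre, five-tape set: roughly 0.67
print(backup_failure_prob(0.50, 5))  # distributed/SMB, five-tape set: roughly 0.97
```

Which is exactly the point made above: taken literally, a per-tape failure rate of 20 per cent would doom most multi-tape backups, so the figure more plausibly describes the share of backup runs containing a bad tape.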
But as the cost of disk continues to fall, and as bandwidth becomes both cheaper and more widely available, an outsourced back-up service becomes more compelling. This trend is interesting in respect of the move to utility computing. Many companies outsource web site management and nowadays there is a good business in some outsourced applications (note the success of Salesforce.com). If outsourced back-up becomes more attractive, it becomes yet another move in this direction. As with all such changes in infrastructure policy, utility computing is destined to happen by increments. First one thing, then another, until outsourcing becomes the preferred solution for most problems. It is becoming a well-established trend and it will alter the face of computing - probably taking root first among mid-sized businesses. © IT-Analysis.com
IBM’s big push at Linux World is to announce new programs to help customers dump Windows NT and move straight to Linux. IBM’s logic is that by the end of 2004, Microsoft will discontinue support for the Windows NT operating system and discontinue the availability of security patches, which will require up to two million customers to develop a migration strategy. The message is crystal clear: IBM wants to intercept the jump up to Windows 2003, and bring all those customers to Linux. IBM says that Linux is gaining popularity in government, retail, finance and manufacturing because of its reliability and low cost, and it anticipates an increase in customer migrations this year. IBM has put in place a series of programs for business partners that includes free Linux education, tutorials and customer scenarios that illustrate typical migrations, including pre-tested workloads, which will enable IBM’s partners to know which system to quote to offer sufficient performance. Part and parcel of this service will be a Move2Lotus program offering messaging and collaboration products to replace Microsoft Exchange running on Windows NT, and the Migrate Now! offering, including the DB2 Universal database to replace Microsoft SQL Server, complete with the DB2 Migration Toolkit, Tivoli systems management tools and Linux-based Websphere. © Copyright 2004 News IS News IS is a weekly newsletter published by Rethink Research, a London-based publishing and consulting firm. News IS covers the news announcements, business transactions and financial statements of the top 150 or so IT vendors, along with other news of interest to the modern senior IT manager working within data centre technologies. Subscription details here.
Microsoft's recently released Blaster clean-up tool was downloaded 1.4 million times during the first few hours of its availability earlier this month. The strong demand for the tool makes a case for greater automation of viral removal, according to Microsoft. The tool, which disinfects machines infected with either the Blaster or Nachi worms, was released by Microsoft five months after the original worm hit the Net, in response to pressure from ISP and enterprise customers. Normally, such clean-up technology is left to AV firms. But this isn't a normal viral epidemic: ISPs say the worm is still generating malicious traffic, months after its first appearance. Having bought into the AV market last summer, Microsoft can no longer say the issue is somebody else's problem. So Microsoft released a Windows Blaster Worm Removal Tool through Windows Update. Stuart Okin, Chief Security Officer at Microsoft UK, explained that the tool would only be downloaded onto machines that were infected with the prolific worm. Even so, there were 1.4 million "distinct downloads" of the tool within the "first few hours" of its availability, according to Okin. Users can force downloads of the tool, but in the vast majority of cases the Blaster removal tool was automatically downloaded through Windows Update onto machines that were patched but still infected with the worm. Patching against security vulnerabilities and removing viral infection are separate processes. Okin said the success of the Blaster clean-up tool illustrates the need for greater automation in the removal of viral infection. He defended the five-month wait for a Blaster clean-up tool from Microsoft: "We were watching the market and it became obvious there were continued network problems and people were not using cleaning tools. After it became obvious our user base was still affected we responded accordingly." 
"The tool had to be tested before we could put it on Windows Update," Okin told The Register, adding that it would be unfair to accuse Microsoft of tardiness. Microsoft shouldn't get too much credit for cleaning up its own mess, but the experience of the Blaster clean-up tool calls into question current AV approaches. Microsoft isn't ready to disclose what it intends to do with last year's GeCAD acquisition, according to Okin. But he said the company is entering the AV market because a large percentage of its user base lacks protection. Its objective is not to go gunning for dominant market share, he added. Okin agreed that Windows is far more commonly afflicted with worm infections than Linux, but argued that Microsoft offers greater accountability and support than open source alternatives. ® External Links Blaster Worm Removal Tool for Windows XP and Windows 2000 Related Stories Microsoft releases Blaster clean-up tool Blaster rewrites Windows worm rules Nachi worm infected Diebold ATMs Telia blocks spam-sending Zombie PCs The trouble with anti-virus
More than £9 million is being pumped into the North East of England to wire up all the exchanges for broadband. BT will use its own dosh to fund the conversion of 24 exchanges to ADSL as part of its successful pre-registration scheme. A further 87 exchanges will be wired up to broadband using £4.7 million from the local regional development agency (RDA). The announcement by RDA One NorthEast that BT is to ensure that every exchange in the region gets broadband by March 2005 coincided with a visit by PM Tony Blair to see his own local exchange being upgraded for ADSL. Said the PM: "Information technology is transforming our world and broadband is at the forefront of this revolution. By speeding up communication, broadband is opening up new opportunities in almost every area of our lives." Yesterday, BT announced that Needham Market in Suffolk has become the 1,000th exchange to be converted to ADSL as part of BT's broadband registration scheme. The monster telco reckoned it was a "great day for everyone involved with the registration scheme", an initiative that enables exchanges to be converted to ADSL once a set number of people register their interest in broadband. ® Related Story 1,000th BT exchange triggered for broadband
Smart small and medium-sized businesses can nab themselves some serious software bargains this year if they negotiate hard with vendors currently bending over backwards to break into the SME market. Research unveiled today by META Group advises small firms that the time is ripe to nail vendors of enterprise resource planning (ERP), supply chain, and infrastructure offerings to the floor in negotiations over software and service purchases. According to the analyst firm, the opportunity to haggle will last throughout 2004 and 2005 as software vendors ramp up their efforts to attract small and medium businesses by modifying their technology, product lines, selling techniques, and pricing models. "Software vendors are continuing to revamp the products they offer SMEs - along with their marketing, sales, and product development strategies - to better meet the buying styles of SMEs," said Carl Lehmann, vice president with META Group's Technology Research Services. "As vendor business models and selling strategies change, SMB buyers will have to improve their negotiating skills." But META points out that the opportunity for SMEs to secure discounts will be realised only if their negotiating skills are up to scratch. Firms must adapt their haggling techniques to show they understand vendors' pricing practices, so they can get the best deals from vendors offering software, services, and maintenance packages. Many SMB buyers focus on software functionality and fail to adequately address the implementation services or maintenance charges associated with an IT project, META's study warned. For example, small firms should be aware that vendors often try to base maintenance contract prices on the 'list price' of software licenses (usually 20 per cent to 22 per cent) when dealing with SME buyers. 
Lehmann advised: "It is important for SMB buyers to leverage the interest in their market and negotiate maintenance agreements based on 'as sold' prices, or contract deliverables in return for their business." The study predicts that, using this newly found leverage, SMEs could negotiate maintenance charges down to or below the 18 per cent of list price average, or they could add contract extras such as support and free future software releases to sweeten their deals. ®
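The gap between the two pricing bases is easy to put numbers on. A hypothetical illustration (the prices below are invented; the 22 and 18 per cent rates are META's figures quoted above):

```python
# Annual maintenance charge as a percentage of a chosen price base.
def annual_maintenance(base_price: float, rate_percent: float) -> float:
    return base_price * rate_percent / 100

list_price = 100_000  # hypothetical software list price
sold_price = 70_000   # hypothetical 'as sold' price after a 30% negotiated discount

print(annual_maintenance(list_price, 22))  # vendor's preferred basis: 22000.0 per year
print(annual_maintenance(sold_price, 18))  # 'as sold' basis: 12600.0 per year
```

On these made-up figures the buyer nearly halves the annual maintenance bill, which is why META urges SMEs to anchor the contract to the 'as sold' price rather than list.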
If you're the proud owner of a Dell PowerEdge 1650 server and have noticed it has just shut down in a puff of smoke, then rest assured that help is at hand. Dell is well aware of the inflammatory tendencies of its rack-mounted server, apparently provoked by an overheating inductor. Speaking for Dell, Bruce Andersen has assured customers that there is no safety issue, because "When the overheating occurs, the server shuts down". Good job too. The problem may affect all PowerEdge 1650s sold from January-May 2003, and Dell will contact owners to offer a free motherboard replacement. In the meantime, may we remind readers that pouring black coffee over flaming computers is a bad idea, and that beer is the only truly effective solution. ®
Will Apple introduce a 20th Anniversary Macintosh* on Monday, two days after the platform's 20th birthday? Rumblings from some quarters suggest it might well do so, and it's hard to imagine the company not wanting to commemorate this significant milestone. Today's Macs are arguably very different from the one Steve Jobs pulled out of a bag during Apple's annual stockholders' meeting, held in the Flint Center at DeAnza College on 24 January 1984. Originally conceived by Jef Raskin as a computer for the person on the street - an easy-to-use, complete, ready-to-run machine with as few cables as possible - the original Mac was genuinely different. Since then, the platform has evolved into a variety of standard desktops, towers and notebooks. It's some testament to the iconic status of that original, compact all-in-one case that its descendant, the first iMac, pulled Apple back from the brink in the mid to late 1990s. The first Mac's suitability for desktop graphics saved the company from the inevitable death of the Apple II at the hands of the IBM PC and its clones, so it's appropriate that the iMac should perform a similar trick, this time winning the hearts of consumers rather than publishers. It may not do so next time. The iPod has arguably become the product that the Mac was back in the mid-1980s - the icon that defines how people view the company behind it. It's not hard to foresee a time when Apple is known more for mobile multimedia devices - not to mention selling the content they play - than for making computers. In apparent acknowledgement of the fact, Apple recently reposted its famous 1984 Superbowl ad on its web site, tinkering with the original footage to allow the hammer-wielding girlie to be seen wearing an iPod - an act of artistic vandalism exceeded only by the special editions of the first Star Wars trilogy. Beyond their unique styling, construction materials and superior operating system, what really distinguishes Apple's computers from their Wintel rivals? 
Some fans will point to their PowerPC processor, but a computer's CPU is really just a means to an end: providing a user experience. Apple remains ahead of the pack on both user interface and hardware styling, but the gap is once again narrowing. No matter how many unusual-looking machines it comes up with - the 20th Anniversary Mac among them, no doubt, alongside the Cube and the anglepoise iMac - they're unlikely to have quite the immediate attraction that the original Mac and the first iMac had. Or the iPod. Indeed, the iPod's styling recalls the original Mac - the compact case, the display dominating the top half of the front of the shell, the connection points smoothly moulded into the casing. We hope the Mac is around in another 20 years. This piece was typed on one - the fourth in this writer's ownership, and just one of many I've used since the mid-1980s. More than a few of the other Vulture Central scribes use Macs too. Mac OS X provides a firm foundation for the platform's evolution toward that time. But we have a sneaking suspicion that the next big event Apple celebrates will have more to do with its compact, user-friendly music player than the Mac, by then just one hardware line among many. ® *The second of that name: the first was released to celebrate Apple's own 20th birthday.
TV boss from hell David Brent lives on in a set of two training videos made for Microsoft UK. Comedian Ricky Gervais, star of hit sitcom The Office, recorded the videos at Microsoft's Reading campus last month. Posters inviting staff to enter competitions to appear alongside Brent in the training videos - titled I'm Back, and this time it's Personal Development - are still on display in MSN's central London offices. Microsoft is staying schtum on details of the deal, saying it is an internal matter. Gervais has resisted pressure to make a third series of The Office, the hit BBC series he co-wrote with writing partner Stephen Merchant. The Christmas special episodes shown last month are expected to be the last outing for the Brentmeister, whose hilarious lack of self-awareness has had Brits in stitches for the last two years. A US remake of the series is in development. The Microsoft videos are unlikely to be released externally. Unlike Wernham Hogg, the fictional company ridiculed in The Office, Microsoft is reckoned to be one of the best places to work in the UK, according to a list compiled by The Times newspaper. However, there are some disturbing similarities between Brent's infamous dancing techniques and Steve Ballmer's equally jaw-dropping 'Monkey Boy' antics. Scary. ® External Links The Office online guide
BT's dial-up service is being bugged by an intermittent network problem that is causing frustration for a couple of thousand punters. A number of readers have complained that they've experienced email problems and difficulty accessing certain sites. One reader told us how he'd been experiencing problems for the last week or so. "Like others, I've had a lot of problems accessing some Web pages, email, and Internet banking. Google has been causing significant problems but I have had problems on many other sites. The sites just lock up and fail to download/display. "The main grumble on the newsgroup is lack of information and weak Tech Support. A number of people have stated on the newsgroup that we will be leaving BT if this [problem] persists." Another explained how, while he was able to send email, he hadn't received any for more than a week. A statement on BT's status page confirms that the telco is experiencing problems on its dial-up service. It reads: "We are aware of an issue that may be preventing customers from accessing their email via an email client such as Outlook Express. Please be aware that your mail should still be available via webmail, available from the BT OPENWORLD homepage. "We are also aware that some customers have difficulty in accessing the btopenworld and bttogetherinternet home pages. Users may also not be able to access secure web sites such as certain banking sites," it said. BT denied that the problems - which it claims have been around for the last couple of days, although some readers insist it's been longer than that - are related to the migration of punters to its BT Yahoo! dial-up service. A BT spokesman played down the problem, insisting: "We haven't got major problems but we do have some network problems that are affecting a couple of thousand of our dial-up customers. "BT apologises for this and is working round the clock to put the problems right," he said. ®
Windows NT and 2000 customers should move to Win 2003 as soon as possible to take advantage of lower support costs, according to Microsoft. Stuart Okin, Chief Security Officer at Microsoft UK, said that the cost of supporting earlier Microsoft platforms has gone up because of the expense of applying security updates. But if customers invest to modernise their systems and move on to XP and Windows Server 2003, then their support costs will eventually fall to a level lower than they ever had with earlier versions of Windows, according to Okin. "If we can persuade people to move over the hump then they will enjoy lower support costs eventually. Along with this they'll move to an OS with a fundamentally different, more modern architecture that's more built for Web services," he told The Register. Windows Server 2003 comes with many services turned off by default, so the impact of any security vulnerability tends to be less serious on that platform. Okin said any decision to move operating system platform should be based on a solid business case; security is only one of the elements to be considered. Security depends more on people and process than OS platform, but upgrading technology can improve an organisation's security stance. Recently, Microsoft extended Windows 98 support until the end of June 2006. Giving security coverage to remaining Windows 98 users - particularly consumers - undoubtedly played a part in these decisions, according to Okin, who said the decision will enable users to migrate over a longer period. Okin's comments come in the run-up to a visit to London by Microsoft chairman Bill Gates on Monday, Jan 26. Also, the battle for hearts and minds between Redmond and Linux advocates for control of the multi-billion dollar OS market is intensifying. This week IBM announced a marketing programme to help customers dump Windows NT and move straight to Linux. 
® Related Stories IBM targets 2m customers with expiring NT support MS Win98 support reprieve was move to block Linux, says Gartner The world shudders as Win98 gets support reprieve Long goodbye begins for Win2k Server Cost of securing Windows Server 2003? Nearly $200m Security 'impossible' for Win9x, buy XP now, says MS exec First Win 2003 patch is really for IE
Bill Gates will be making one of his bigger sales calls on Monday morning. He's over in the UK for Chancellor Gordon Brown's entrepreneurs' summit, and will also be speaking at a Microsoft Longhorn developer conference later in the day. But squeezed in between he'll have a meeting with Gordon Brown himself, Peter Gershon of the Office of Government Commerce, and National Health Service IT head Richard Grainger. It will no doubt be convenient to have the man with the money, the man whose job it is to save the money, and the man who might turn out to be a really big problem all in the one room, but does Sun know about this? The NHS is currently in the throes of a massive IT overhaul, and Grainger has been talking tough on pricing and performance. Last month he announced a trial of Sun's Java Desktop System, observing pointedly that if this were successfully implemented across the NHS' 800k-plus desktops it would save millions. The OGC meanwhile has announced a series of trials of open source desktops, to be held in conjunction with Sun this year. Gershon has come under fire from MPs because of his slowness in levelling the government procurement playing field, and in investigating open source. The IBM OSS trials he announced last autumn seem to have been something of a damp squib, with Newham, now firmly back in the Microsoft camp (see story), cancelling its trial before it actually started. So there's a certain amount of OGC credibility at stake as regards its Sun trials - if they don't prove that the OGC is making a serious attempt to assess alternatives to Microsoft, it could spell trouble. Either of these two trials could result in a key breakthrough on the UK public sector desktop for Sun, so it's not exactly a surprise that Gates is in there selling. But will they buy? On the subject of Microsoft's government sales efforts, incidentally, The Register couldn't help noticing a couple of familiar names holding forth this week. 
In an email to his staff leaked to eWeek, MS VP Orlando Ayala told them to "separate hype from reality" when it came to selling against Linux. This email does not mention the availability of bags of gold, but we suspect this Orlando Ayala could well be related to the one who told his staff "under NO circumstances lose against Linux", and who revealed the existence of a special fund to combat open source breakthroughs in government. Meanwhile, speaking to CNet, Microsoft senior vice president of business strategy Maggie Wilderotter downplayed the importance of open source government breakthroughs. "I really believe it's more about publicity than government moving faster than anyone else," she said. Could this unperturbed Maggie Wilderotter possibly be related to the Maggie Wilderotter we hear took a personal interest in the negotiations between Microsoft UK and humble, poverty-stricken UK council Newham? We suspect she could. Considering this close interest, one would expect Wilderotter to be able to explain Newham accurately to the press. But apparently not - according to the CNet report: "Microsoft's Wilderotter said that many of the factors that prompt governments to look at open-source software turn out to go in Microsoft's favor once a government does an impartial investigation. She pointed to the London Borough of Newham, which engaged in a high-profile project a few years ago to promote open-source development. The local government body, in London in Great Britain, engaged an outside consultant to help direct long-range IT planning, and the decision went to Microsoft. "'They looked at TCO (total cost of ownership), security and other issues, and based on a number of those factors, they chose Microsoft,' she said. 
'It was very surprising to a lot of people they chose Microsoft, given the stance they'd taken before, but the facts were there.'" As regular Reg readers will be aware, the consultants were engaged and paid for by Microsoft, with the brief to show how Microsoft software could help Newham achieve lower TCO. To claim that it was an impartial play-off between Windows and open source is simply not true, and we fear Wilderotter must know this. Newham itself was so impressed, as we pointed out yesterday, that it promptly made an in-principle decision to implement an open source desktop system, and Microsoft subsequently won Newham by slashing its own prices. To, we suspect, levels that might even undercut Sun's Java desktop rates. Get the facts, as the saying goes. ®
Does anyone have a direct line to the Pope? We've witnessed another Itanic miracle and think it's time Intel's server processor chief Mike Fister began his walk toward sainthood. Joe Clabby, of analyst firm Clabby Analytics, is the third man to undergo the miraculous transition from Itanium basher to EPIC worshipper. In a recent note, Clabby professed to having "seen the light" after Intel executives spoke to him during a server processor presentation. Intel cleansed Clabby of all his Itanic wickedness and, in so doing, managed to elevate the Itanium processor to a religious status that far exceeds that of mortal chips. "Only a year ago Clabby Analytics would have classified EPIC/Itanium architecture as a failure," Clabby writes. The analyst points out that it took Intel close to ten years to bring Itanium to market, the chip cost billions in research and development, it had poor yields and poorer adoption and failed to capture the hearts of developers. And then on the 3,700th day of Itanium's creation, Fister placed his hand on Clabby's head and all became clear - in an instant. "But what a difference a year makes! In 2003 Intel got its Itanium act together, shipping 100,000 units while expanding its ISV offerings to over 1,000. And, in 2003, Intel managed to capture most of the leading performance benchmarks with EPIC/Itanium architecture. After almost a decade of research and development; after a re-spin of Itanium to produce Itanium 2; and after a major campaign to capture ISVs (including “seeding” the ISV community with funds to port applications to Itanium), Itanium finally established a solid foothold in the 64-bit computing market-place." Praise the Lord! The first man touched by "The Fist" was Merrill Lynch's server analyst Steve Milunovich - known at El Reg as "The Loon." 
Miloonovich once favored RISC processors from Sun, IBM and HP, but as the analysts around him began hyping Itanic, Merrill's top dog jumped to Intel's side, calling Itanium an "industry standard" when it held less than one percent of the server processor market - far less than one percent. The man who once favored differentiation succumbed to Intel's grace. The second healing took place in Berkeley. For it was there that UCB graduate student Nick Weaver went from Itanium hater to Itanium lover overnight. Of all the Itanic miracles, this one was the most spectacular. To fully understand Clabby's transition, we must travel back to July of last year. At that time, the analyst issued a scathing critique of HP's planned Itanium migration. HP would make life very difficult on its customers by shifting them to an immature processor that lacked the necessary software for serious business use, Clabby said at the time. Clabby maintains his concerns about HP. The company's plan "to bet the farm" on Itanium is risky. How could shifting thousands of customers onto a new architecture be easy? But for Intel the story is much brighter. Clabby rightly points out that Intel has destroyed the competition on most benchmarks with IBM's Power processor the only rival even close to keeping pace. In addition, Intel has delivered a rich set of porting and migration tools to help customers move from RISC to EPIC. And last, Microsoft finally delivered code for Itanium in 2003. Most importantly, customers have started picking up the third generation Madison chip. It's this processor that pushed Intel to 100,000 unit shipments and prompted the company to declare 2003 "The Year of Itanium." The 100,000 figure, however, ignored the fact that Itanium server shipments have yet to break the 5,000 unit per quarter barrier, relegating the chip to a rather insignificant place in the overall high-end server market. Clabby is also bullish on where Intel plans to take the Itanium architecture. 
By the time Tukwila rolls out in 2007, Intel will have multiple cores per die and multithreading on the processor. This should give the chip about 7 times the performance of Madison and outpace performance gains that trend alongside Moore's Law. But while Intel bills itself as being on the cutting edge of multicore design, the reality is that it is woefully behind IBM and even trails Sun. IBM paved the way for dual-core processors with Power4 and is ready to deliver Power5, and Sun plans to have chips with tens of cores by 2007. Clabby, however, notes that Intel is paying special attention to using EPIC to its advantage: "Intel's competitors are also building multi-core, multi-threading processors, but do not preprocess instructions before handing those instructions to a CPU for processing," he writes. Er, we'll see. Clabby is right that Intel finally started to show some momentum with Itanium in the latter half of last year. This helped breathe life into a processor many people in the industry had given up on. But other analysts have noted that this success may be short-lived. Intel is already facing competition from AMD's x86-64-bit Opteron processor and may well start competing with itself should the fabled Yamhill come to market. Based on the emergence of x86-64-bit chips, IDC has lowered its 2007 forecast for Itanium sales to $7.5 billion from $8.7 billion. To add a bit of perspective here, IDC once forecast that 2004 Itanium sales would reach $28 billion. It seems a little premature to celebrate the second coming of the Itanic just yet. ®
Kyocera has issued a recall today for 140,000 batteries used in its 7135 PalmOS smartphone after a customer received "a minor burn injury". "The recalled batteries can short-circuit and erupt with force or emit excessive heat, posing a burn hazard to consumers," the company says in a statement. Owners should stop using the phone immediately and call Kyocera for a free replacement, the company said. Customers should look for a product code ending with -05 printed on the underside. For details, go here. A spate of exploding cellphone batteries in Europe recently were traced to fake Nokia batteries. ® Related Stories Woman burned by exploding cellphone Nokia phone explodes - again Another Nokia phone explodes Nokia phone explodes in Finland Nokia batteries not safe either - Belgian watchdog Belgian watchdog reconsiders 'unsafe' Nokia battery claim Unsafe Nokia batteries - or counterfeits? Trading Standards seizes fake Nokia gear