If an Itanium engineer whimpers in a lab, does anyone hear her? Not if it's Jerry Huck, HP's lead Itanium engineer, sitting alone in a dark data center, smoking an endless stream of cigarettes and wondering what went wrong. Because if rumors that HP has decided to ship servers based on AMD's Opteron processor are true, the Itanic experiment may well be coming to an end just as the bacteria were beginning to make their way around the petri dish. CNET reports that HP plans to announce a line of Opteron-based ProLiant systems next month, although official delivery of the product may take "a while." Similar rumors have circulated for months, but there appears to be some weight behind this one, with industry sources confirming HP has an announcement of sorts on tap. The Opteron boxes would, of course, join HP's Itanium Integrity servers as 64-bit options. If true, HP would follow IBM and Sun Microsystems as an Opteron backer. But HP's acceptance of the 64-bit processor would shake the server landscape in a fashion its rivals' endorsements have not. "The deal could be seen as a setback for Intel," writes CNET's Michael Kanellos, redefining the word understatement. HP and Intel gave birth to the Itanium idea more than ten years ago. And, in the past decade, the 64-bit processor has been billed again and again as a RISC killer destined to do for high-end servers what Xeon did for the low-end and now midrange. This vision, however, never approached reality, with slow sales making Itanium something of a laughingstock among silicon geeks. It's only this year that Itanium sales finally picked up a bit - sort of. No company bet on Itanium's success more than HP. The company is in the midst of killing off both its own PA-RISC processor and the Alpha chip inherited from DEC/Compaq. HP and Intel have spent years trying to prepare customers for "the great migration" by helping out software makers and end users with sweetened porting pork.
But, if HP should decide to go the Opteron route, it is sure to undermine part of this massive Itanium investment. It's unlikely that HP would make HP-UX available on the Opteron systems, as it has not pushed the Unix OS on x86 chips. So, for the RISC customer base, it's full steam ahead on the Itanic. But another part of the Itanium project revolved around pushing Xeon customers up the food chain. For those users, a shift to the x86-64 Opteron requires far less work than a move onto Itanium's EPIC instruction set, and allows them to take advantage of a rich x86 ecosystem. Beyond this, HP would be validating the Opteron processor against Itanium in a way many analysts did not think possible. HP would be confirming that the x86-64 path is viable and maybe the better way to go. Try explaining that to the already committed in the Itanium camp. Interestingly, Sun is the only company at this time promising to bring Unix to the Opteron processor, via a Solaris x86 port. Won't that make things interesting for the abandoned Tru64 clan? You almost have to hope CNET is right on this one, if for no other reason than to see what happens at Intel's developer conference next month. Can St. Fister cure the wicked en masse? ® Related Story HP to ship Opteron - report
Campaign 2004 Last week we noted how "empowering the edges of the network" had become a mindless mantra for techno-utopian pundits eager to profit from Howard Dean's presidential campaign. As we wrote then, this kind of New Age cobblers did a huge disservice to both Dean and his supporters. But it looks even less clever now than it did a week ago, when Dean's campaign stalled badly in the Iowa caucuses. As it turns out, Dean was doing more to advocate locking down the "edge of the network" than any other Democratic candidate. And the finger of suspicion for feeding the Presidential Candidate this line of argument points firmly to his campaign manager, Joe Trippi. Trippi was a stockholder, employee and booster for Wave Systems, the company contracted by Intel to implement TCPA (Trusted Computing Platform Alliance) specifications. Microsoft's implementation of this architecture was unveiled as 'Palladium' two years ago; now it's called NGSCB, and is slated to ship in the next major version of Windows, Longhorn. Viewed by copyright holders as the ultimate silver bullet, TCPA turns the open PC into a locked-down system where software can't be executed and media can't be played without the rights-holders' permission. As Ross Anderson explains here: "The music industry will be able to sell you music downloads that you won't be able to swap. They will be able to sell you CDs that you'll only be able to play three times, or only on your birthday. All sorts of new marketing possibilities will open up." So TCPA represented a dramatic shift in control from end users (at the "edge of the network") to centralized copyright holders, spawning sites such as Against TCPA and No TCPA. "TCPA will set standards for the OEMs in June," vowed Trippi three years ago, as proof of his affection for Wave Systems stock. [Thanks to Gary Wolf for unearthing that gem.] Trippi continues to list Wave Systems as a client of his marketing consultancy, Catapult Systems.
Dean himself enters the picture with a speech that he gave to a conference co-sponsored by Wave Systems in March 2002, entitled "Workshop on States Security: Identity, Authentication, Access Control" and reported by Declan McCullagh at CNET today, on the eve of the New Hampshire primary. In the speech, which you can read on, uh, Wave Systems' website, Dean describes privacy as an "urban myth" and explains "little has been spent to secure the most vulnerable part of the network - the PC, the laptop, the government and corporate desktop computers - all at the perimeter of the computer network system." Yes, it's the national security angle that TCPA vendors have been peddling, with the active encouragement of the law enforcement lobby. Open PCs are dangerous, Dean argued. "This is a mistake because the computing power at that perimeter can be used - Napster style - to take the entire network down," said Dean, according to the transcript. Dean suggested the cure should be interoperability between states' ID cards. "We must move to smarter license cards that carry secure digital information that can be universally read at vital checkpoints." Reinventing the Internet? McCullagh's entry into the 2004 Presidential campaign has been eagerly anticipated. In the 2000 Presidential race his coverage of a claim by Al Gore to have 'invented the Internet' reached national notoriety. "If it's true that Al Gore created the Internet, then I created the 'Al Gore created the Internet' story," McCullagh boasted. Although technical luminaries such as Vint Cerf came to Gore's defense ("It is very fair to say that the Internet would not be where it is in the United States without the strong support given it and related research areas by the vice president in his current role and in his earlier role as senator," said Cerf) the coverage made Gore the butt of jokes nationwide. "We don't need 'Dean is Big Brother'," a consultant to the Dean campaign told The Register today.
"'Al Gore invented the Internet' still won't go away." McCullagh doesn't pass up the opportunity to moralize. "It's possible that Dean has a good explanation for his uniform ID card views, and can account for how his principles apparently changed so radically over the course of just two years," writes McCullagh. "Perhaps he can't. But a refusal to answer difficult questions is not an attractive quality in a man who would be president." And moralizing isn't always an attractive quality in a man who would be pundit either, Declan. So it's worth parsing what Dean really said, and on what basis McCullagh formed his stentorian, five-cigar conclusion, before we can judge either party. Omitted from McCullagh's CNET commentary is Dean's plea to preserve privacy. "We will not, and should not, tolerate a call to erode privacy even further - far from it," said Dean. "Americans can only be assured that their personal identity and information are safe and protected when they are able to gain more control over this information and its use." Dean pointed out that privacy was already compromised as vast amounts of personal information are already shared between financial corporations and logged by Internet companies. (And lest we forget, harvested by social networks like Friendster). He wasn't advocating a national ID card, and said that public trust depended on Chinese walls built into the card. Privacy advocates are mistrustful of such Chinese walls, believing that the benefits of data sharing are too tempting for corporate and federal interests to resist. There's also plenty of suspicion that local, or function-specific, introductions of smartcards will morph into all-purpose 'Big Brother' cards. But Dean is clearly well aware of the privacy concerns, and his advocacy leaves him guilty of little more than naivety.
And on that count, Dean can justifiably question the advice of his campaign manager, who was more interested in serving his stock portfolio (and marketing clients) than the Candidate. What a long strange Trippi it's been. So there we have it: Dean wasn't advocating a national ID card, nor was he blithely inviting smart card vendors to breach citizens' privacy even further. However, it was remarkably ill-advised of him to advocate locking down the PC "at the edge of the network" without examining the implications for the consumer, or even the software industry. Only that wouldn't be a story now if it hadn't been for the techno-utopian pundits getting carried away with an almost religious belief in power "at the edge of the network". What does this Forrest Gump-style fortune cookie mean, exactly? As far as we can tell, it describes one characteristic of one model of collective behavior. 'Collective' is a word you don't hear too much nowadays, but Microsoft Corporation is one form of collective organization, as are the Teamsters, the Catholic Church, and the Santa Fe Institute. When people unite around collective action, the results can be very far reaching. But the word has been deprecated in favor of much more fashionable rhetoric usually touted by supporters of "emergent" capers such as Poindexter's Terror Casino. Dean supporters will hardly be thanking these commentators and experts for this foolish flirtation with New Age rhetoric, which has handed Dean's opponents an unexpected PR opportunity. It certainly wasn't sought. But the 'blogosphere' may soon want to 'self-correct' this unwanted mini-industry of pundits and 'consultants'. ® Related Stories Of TCPA, Palladium and Wernher von Braun Techno utopians' Net Candidate falters Meet the 'transhumanists' behind the Pentagon terror casino Kill a Middle East head of state, win prizes! - Pentagon shows how
Going by the name of 'Novarg' or 'MyDoom', the latest mass-mailing worm to infest your in-tray is spreading at the same rate as SoBig, according to the anti-virus industry. The worm also targets the SCO Group's corporate website, according to a Symantec alert. On Windows PCs, the worm creates 64 threads that hit www.sco.com with GET requests between February 1 and 12. It also copies itself to KaZaA's download directory, masquerading as a software executable: names include icq2004-final and winamp5. Windows users should check for a file named shimgapi.dll in the system directory, and update their AV software. For most users the major inconvenience will be the bandwidth consumed: the executable attachment is 22KB in size. For SCO, it represents a different problem. When the Blaster worm was primed to attack Microsoft's Windows Update servers, the company simply shut the targeted address down for the duration of the attempted denial of service. ® Related Stories Blaster rewrites Windows worm rules [full coverage] Sobig-F is dead And now we are One. Many unhappy returns to SoBig Sobig-F blamed for massive increase in spam Sobig beats Blaster in Top of the Viral Pops Sobig-F is fastest growing virus ever - official Yahoo! variant! of! Microsoft! support! worm! spreading! rapidly! Heavy squalls of blended worms to hit next year
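For admins scripting that check, the indicator is simple to test for. A minimal sketch, assuming Python; the function name and the commented call are our own illustration, while shimgapi.dll and the system directory come from the alerts above:

```python
import os

def mydoom_indicator_present(system_dir):
    """Return True if MyDoom's backdoor component, shimgapi.dll,
    is present in the given system directory."""
    return os.path.isfile(os.path.join(system_dir, "shimgapi.dll"))

# On a Windows box this would typically be invoked as:
#   mydoom_indicator_present(os.path.join(os.environ["WINDIR"], "system32"))
```

A hit is only an indicator, of course - updated AV signatures remain the proper clean-up route.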
The IT recruitment market showed spotty signs of growth between October and December last year, with the number of advertised permanent and contract techie jobs showing modest increases. The latest statistics from CWJobs' Quarterly IT Skills Index found that permanent IT vacancies rose by an average of 4 per cent between October and December 2003, while IT contract jobs increased by just one per cent. The improvement bucks the seasonal trend which usually sees December being one of the slowest IT recruitment months of the year, according to the Internet job site. The study predicts that the increases, although small, will mark a turning point for beleaguered IT pros, as they indicate that firms are again confident enough to recruit staff for what they expect will be a busier new year. The numbers of advertised contract vacancies have consistently increased over the last 12 months; however, the rate of growth slowed to just 1 per cent in the final quarter of 2003, with a total of 14,847 jobs advertised. Outer London, the research found, was the only region to see growth in both contract (16 per cent) and permanent jobs (14 per cent). The only other area where there was a growth in the number of advertised contract positions was West & Wales, and there the increase was only one per cent. In fact the study makes grim reading for contractors outside of London, noting that contract positions dropped most dramatically in the East Midlands (21 per cent). However, while contractors were left with slim pickings, most regions enjoyed growth in permanent vacancies, with the biggest rises to be found in Outer London (14 per cent), West Midlands (10 per cent) and Inner London (9 per cent). These increases in the number of available permanent IT jobs marked the first signs of growth since the beginning of 2003, CWJobs noted.
The total number of permanent IT jobs advertised between October and December 2003 was 48,975 - 97 per cent of which were advertised online. The only regions found to experience a decline in permanent vacancies were: West & Wales - dropping seven per cent - and Scotland & Northern Ireland, which fell by 14 per cent. From a vertical viewpoint, the research found that IT jobseekers in the manufacturing industry and public sector saw the numbers of available contract and permanent positions plummet. Contract jobs in manufacturing dropped by over a fifth and permanent vacancies by only slightly less. In the public sector there were 12 per cent fewer contract jobs up for grabs and 5 per cent fewer permanent placements. The finance sector, CWJobs said, was largely responsible for keeping the IT contract market buoyant between July and September 2003, with an increase in both contract (seven per cent) and permanent (11 per cent) vacancies. CWJobs also notes an upturn in demand from the software, media and consultancy sectors. Microsoft Office was identified as the most popular skill for IT contractors during the final quarter of 2003, followed by SQL and Oracle. The top three skills demanded for permanent staff were SQL, Unix and C++. ®
The name means "to cum" in Finnish slang - literally, to orgasm - but the Orkut social networking service that Google launched on Friday was utterly spent by Sunday afternoon. Members of the Friendster-clone were greeted with a message that faded stylishly into view - the fade-in courtesy of some piece of fancy script that only a web designer would think was important. "We've taken orkut.com offline as we implement some improvements and upgrades suggested by users. Since orkut is in the very early stages of development, it's likely to be up and down quite a bit during the coming months." Ah, just like a Microsoft program, we thought. "None of the information you've entered will be deleted, and none of the connections you've made will be lost. And, if all goes well, you should see some significant improvements when we come back online. "We'll send an email once everything is ready and running again. Thanks for your feedback and for bearing with us as we work our way up the learning curve." What did the Orkut team learn? Well, it wasn't to do with scalability, as many of the wildly erroneous rumors flying around the Net at the weekend suggested. The Orkut programmers can put away that copy of "In Search of Clusters". Scalability - the art of keeping going while millions of new users arrive - is not trivial, and it has plagued social network companies, which have to draw maps of these social relationships. Friendster itself, for example, has given up drawing these complex webs in real time, and instead caches them overnight. No, the problem was security. Sources close to Google suggest widespread XSS (cross-site scripting) hacks forced the closure of the service. It isn't clear how much personal data or communication was disclosed. The major problem facing social networks is that they scarf up personal information far more efficiently than a Carnivore system.
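For the curious, the classic defence against the kind of XSS hole alleged here is to escape user-supplied text before echoing it back into a page. A minimal sketch using Python's standard library - the function name is our own, and this is an illustration of the general technique, not Orkut's actual fix:

```python
import html

def render_profile_field(user_supplied):
    """Escape user-supplied profile text so any embedded markup is
    rendered inert rather than executed by visitors' browsers."""
    return html.escape(user_supplied, quote=True)

payload = "<script>steal(document.cookie)</script>"
print(render_profile_field(payload))
# &lt;script&gt;steal(document.cookie)&lt;/script&gt;
```

Escaping on output is the point: the raw payload can sit in the database all it likes, so long as it never reaches a browser as live markup.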
People really aren't going to trust them if they view these start-ups as honeypots for future marketroids to reap everything we didn't want them to know - let alone if a passing hacker can scarf up this potential archive of great exploitable value. And not to bore you, but you've got to love how that recursive Privacy Statement shimmers into view. How do they do that? While the service will no doubt splutter back into life, it's left many wondering whether Google was entirely serious about the exercise. Maybe you can figure out what's wrong with this story... Google allows staff to spend twenty per cent of their time on their own projects. Having failed to acquire Friendster last year, Google was under some media pressure to create its own offering in a space that "experts" were telling us was "hot". An area that, marketing consultants would advise, adds "stickiness" to the site - if it wanted to be one of those "sticky portals" we read about years ago, in the late 1990s. But for some strange reason, Google entrusted the Friendster-buster to a junior programmer to implement on a Windows system. Why on earth the proven masters of open source scalability chose to allow this to happen puzzles us all. Maybe Orkut will come again - as a full fledged Google offering. Or perhaps the name is a clue. Perhaps Google really doesn't take "social networking" very seriously at all, regarding it as a business concept that deserves nothing closer than arm's-length circumspection. And perhaps that's a very good call. ® Related Stories Google debuts Friendster-clone Orkut Friendster bubble 'has peaked - will pop'
Analysis There is an increasingly smug feeling among the big record companies and their various agents, brought about by the supposed demise of music piracy. There are three important, related points to establish about music piracy. Firstly, it is illegal, and also morally wrong, to obtain copies of music or film or other entertainment without some form of payment. Secondly, the process has been stimulated by the perception that record companies do not have the interests of the artists at heart and have in the past made huge profits by overpricing music. Thirdly, music companies have been slow to realize that digital music requires a rights regime that allows various personal copies of music to be moved between different types of personal players. Quite simply, young people would rather break a law than pay the current prices, and certainly break the law to copy tracks they have already bought for one player, which won’t play on another. It is only one third of this equation that may have been addressed so far, that of stopping piracy, and Faultline would argue that this has not been finally or successfully addressed, despite claims to the contrary made throughout this week. An annual online music report from the International Federation of the Phonographic Industry (IFPI) said this week that the arrival of legitimate online music services will take significant market share from illegal peer-to-peer file sharing networks like Kazaa during the coming year. There are two pieces of evidence that it cites. It says that the imminent arrival of iTunes, Rhapsody and Napster, the more successful of the online music services offered in the US, will see a healthy rise in paid-for online sales. IFPI also says that US buyers have bought 30 million songs through these services, and while Europe has only, so far, purchased 3 million, this is set to rise.
Online music supplier OD2 said this week that its sales are rising at the rate of 25 per cent per month. IFPI points out that there are already 30 existing online services and credits part of their growing success to growing awareness that not only is file sharing of a copyrighted work illegal, but it is now becoming obvious that people who do it can get caught. The number of music files available on the Internet has fallen by 20 per cent to 800 million over the last year, after peaking at one billion at the start of 2003. “For everyone working to create a successful legitimate online music business, this report reflects a new sense of optimism and evidence of real change,” said IFPI chairman and CEO Jay Berman. He added: "We believe the music industry's Internet strategy is now turning the corner, and that in 2004 there will be, for the first time, a substantial migration of consumers from unauthorized free services to legitimate alternatives." However, three million downloads in a year are not enough to turn the tide, even if those downloads are growing at 25 per cent a month (as OD2 claims in a statement this week), because there have been four years of declining CD sales. During 2002 EMI lost 11 per cent of its revenue, down by some $460 million, and 2003, although the final figures have yet to be published, is likely to have been flat at best. At the current price for European downloads (generally 99p a track, almost double the US price for the same music) three million downloads yields £3 million, or $5.4 million. Even if 50 per cent of that money was due to EMI’s catalog, this would only be $2.7 million, and after the take of delivery network OD2 and of etailers such as HMV this would be reduced to about $1.3 million. It is hardly going to replace $460 million in lost revenues.
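The revenue arithmetic is easy to check. A quick sketch, in which the $1.80-per-pound rate is implied by the £3m-to-$5.4m conversion above, and the two 50 per cent figures are the rough assumptions just stated:

```python
downloads = 3_000_000
price_gbp = 0.99
usd_per_gbp = 1.80  # implied by the GBP3m -> $5.4m conversion above

gross_usd = downloads * price_gbp * usd_per_gbp  # roughly $5.3m gross
emi_gross = gross_usd * 0.5                      # if half were EMI catalogue
emi_net = emi_gross * 0.5                        # after OD2 and etailer takes

print(round(gross_usd), round(emi_net))  # nowhere near $460m either way
```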
If these three million paid-for downloads were increasing at 25 per cent per month through 2003, then they began on about 55,000 downloads in January 2003 and ended on 640,000 in December 2003. Taking the number on from there, growing at 25 per cent per month, this 3 million will go up to 43 million downloads in 2004, and if that same growth continued through 2005, this would take paid music downloads in Europe to 631 million, grossing some $1.1 billion. Once again, if half of that belonged to EMI (and it won’t) and half of that was taken by the distribution process (which it will), then that might add up to $250 million, about half of what EMI has lost. Hardly time for dancing in the streets at the EMI shareholders meeting. And we have more problems with this. No market, not even the original take-up of browser technology on the internet, sustains a growth rate of 25 per cent per month (the world wide web once grew at 15 per cent a month for about two years, but nothing else has come close), so this growth rate will not be sustained. Nor was the growth gradual and linear in the first place: reading between the lines of OD2’s statements, it came mostly in one huge leap when Microsoft added the service to MSN, with more modest movement month to month thereafter. So expect perhaps 25 per cent growth per month for a few more months, then seasonal variation and a tailing off to perhaps 10 per cent growth for the rest of the year. Also expect a price war. This is a land grab market, and Coca-Cola has already announced it is entering the online music wars, again off the back of the OD2 delivery network, with its own music site launching next month. While they all use the same delivery mechanism, prices are likely to stay much the same.
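Those projections follow from a few lines of compound-growth arithmetic. A sketch of the back-calculation; the results land within rounding of the 55,000 / 640,000 / 43 million / 631 million figures above:

```python
growth = 1.25            # 25 per cent per month
total_2003 = 3_000_000   # paid downloads in Europe over 2003

# January 2003's figure falls out of the geometric-series sum over 12 months:
# jan * (growth**12 - 1) / (growth - 1) = total_2003
jan_2003 = total_2003 * (growth - 1) / (growth**12 - 1)
dec_2003 = jan_2003 * growth**11

# Carry the same monthly growth through 2004 and 2005
total_2004 = sum(jan_2003 * growth**n for n in range(12, 24))
total_2005 = sum(jan_2003 * growth**n for n in range(24, 36))
```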
But yet to come to Europe are music services from Apple, RealNetworks’ Rhapsody, Walmart, Musicmatch - the company that powers both Dell and Hewlett-Packard’s forthcoming online music web sites - and of course Sony, which will tightly integrate its site with its own electronic players. It will be strange if all of these charge almost double in parts of Europe what they charge in the US, given that many of the artists are European and, in EMI, Universal and Bertelsmann, several of the major music companies are European-owned. Online services generally tend to flatten global pricing, and the resentment of the European youth will continue until they can see that they are being charged the same as their US counterparts; while they are resenting overcharging, they will continue to indulge in piracy. So either price cuts, or more piracy. Already Playstation and Xbox games, often written by European companies, are sold online in the US by Amazon, for instance, for a fraction of European prices but cannot be shipped over to Europe. DVDs are in the same boat, with a grey importer of Far East DVD movies being forced last week, in a settlement with US studios, to charge more in Europe than in either the US or the Far East. It is these inequalities that bedevil the operations of the troubled music companies, as much as piracy. There is a feeling still in the minds of aging executives that one day the industry will all go back to being the same as it was before, once they’ve “fixed” the piracy question. But the internet has changed forever the global cross communication over pricing, and this has still to be addressed. And yet EMI’s share price has gained over 40 per cent since 2004 started. Surely for EMI to recover we must see a return to the retail environment to buy CDs, not just an acceptance that online music is here to stay.
More likely to put the recording studio mess to rights is an announcement from Macrovision this week that it has a new release of its CD copy-protection system. This version will use the full Microsoft DRM system; Macrovision says the CDS-300 will offer multi-level protection and rights management for music CDs. CDs made with this system can seamlessly create playlists, export to portable devices and make authorized burns to another CD, and offer one-click access to bonus content on the disc, or premium content via web links. The company claims that the average consumer should not even notice that their CD is copy-protected, and says that content owners can set usage rights - for example, allowing consumers the ability to export to compliant portable devices (with a specified number of exports), as well as burn CDs (with a specified number of burns). Copied files will not play if emailed or distributed via the Internet. But will the content companies actually set these kinds of rights? We don’t think so. They have shown themselves to be solely interested in returning the industry to where it was before piracy took off, and will only use innovative and intricate rights regimes to combat other music companies that offer the same, not to tempt people away from piracy. At least not so far. Almost all the OD2 tracks for sale in Europe cannot be burned to CD or freely played on a PC, and yet this system uses the very same DRM protection that Macrovision is just introducing, and the facility to do so is supported by OD2. It will be a long battle before music companies start experimenting with attractive rights, long after piracy is under control. But this is a chicken-and-egg argument. No rights experiments, and people will continue to pirate. In the meantime the main game in reducing piracy - the messy public legal actions designed to scare customers out of file sharing - continues.
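To make the "intricate rights regimes" point concrete, a usage policy of the kind described amounts to little more than a handful of counters per disc. A hypothetical sketch - the type and field names are our own illustration, not Macrovision's or Microsoft's actual API:

```python
from dataclasses import dataclass

@dataclass
class UsageRights:
    """A hypothetical per-disc rights policy of the kind described above."""
    device_exports: int      # remaining exports to compliant portable players
    cd_burns: int            # remaining authorized burns to another CD
    plays_when_shared: bool  # whether copies play away from the original PC

def consume_export(rights: UsageRights) -> bool:
    """Spend one device export if any remain; refuse otherwise."""
    if rights.device_exports <= 0:
        return False
    rights.device_exports -= 1
    return True

disc = UsageRights(device_exports=3, cd_burns=1, plays_when_shared=False)
```

The technology is the easy part, in other words; whether labels ever set the counters generously is the open question.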
In Europe we are told that the first suits are about to be filed, but as yet there are no named individuals coming to trial, while in the US the Recording Industry Association of America (RIAA) filed another 532 suits this week. These new filings serve two purposes: first, to keep up the pressure on file sharing; second, to make clear that the RIAA has not given up on its legal actions just because Verizon, just prior to Christmas, convinced a US judge that the fast subpoena process the RIAA was pursuing was not legal. Now every suit has to be filed as a “John Doe” action against someone with no known name, and will have to go to a judge in order to force the ISP to name its customer. Will that put the RIAA off? Will the judge always agree to grant a motion to discover the name? These two questions need answering quickly, and in the RIAA’s favour, and the answers need to be made public very loudly, or file sharing will immediately go back on the increase. November figures released last week show that file sharing has already risen by 1 million individuals from October. Another rise and it could mean that all of the RIAA’s prior legal work would be wasted. Finally, the RIAA will find itself on the receiving end of a copyright legal action any day now, a US federal judge having given Kazaa leave to sue it. The Kazaa case rests on the fact that the RIAA is alleged to have used unlicensed versions of Kazaa to monitor the internet for copyright infringers. This amounts to a violation of the Kazaa license, and Kazaa wants it stopped. And if the RIAA is blocked from using this software to find pirates, then it is back to the drawing board for the RIAA, EMI and the IFPI, and a move back to rampant piracy once again. Like we said, it is far too early for the music companies to be smug about the demise of piracy. ® © Copyright 2004 Faultline Faultline is published by Rethink Research, a London-based publishing and consulting firm.
This weekly newsletter is an assessment of the impact of events that have happened each week in the world of digital media. Faultline is where media meets technology. Subscription details here.
IBM's processor for the Xbox 2 will be fabbed at 65nm, and experimental versions of the chip have already popped off the end of the company's production line. So claims TeamXbox, citing confirmation from unnamed sources that Xbox 2, like the next generation of the PlayStation, will be based on 65nm technology. "IBM has already taped out experimental samples at its East Fishkill fab but it will take between 12 and 18 months for them to deliver commercial parts," the site's deep throat alleges. "Anyway, they're way ahead of Intel." That schedule would put the processor's availability in a mid to late 2005 timeframe, which is a reasonable estimate. Certainly 18 months is a more likely deadline than 12. Both Intel and AMD are expected to debut 65nm chips in the second half of 2005, and while IBM may be "way ahead", we doubt it's that far. The 90nm PowerPC 970 - aka the G5, officially the 970FX - isn't expected to ship in significant quantities until the middle of next month, six to eight weeks after Apple announced the Xserve G5, which uses the new CPU. By then, Intel should also be shipping the 90nm Pentium 4 - 'Prescott'. AMD's 90nm chips are due next quarter, or early Q3. So the major chip makers are running broadly neck-and-neck. Where IBM is ahead of Intel is in thermal characteristics, with the 970FX consuming less power, clock for clock, than the 130nm 970. Prescott, by contrast, consumes more power than the previous, Northwood generation of the P4 (though to be fair, it also sports considerably more circuitry than its predecessor). TeamXbox's source suggests 65nm will allow IBM to ramp up clock speeds: "The 65nm technology will allow them to break the 3GHz barrier for sure and get closer to the 5GHz mark," (s)he says. Apple's Steve Jobs has promised 3GHz G5-class CPUs next summer, so this doesn't seem too unreasonable.
We reckon the next generation of G5, believed to be called the PowerPC 980, may be the part to deliver that clock speed, not least by offering the longer instruction pipeline necessary to deliver higher clock frequencies. The 980 is expected to be fabbed at 90nm, however. ® Related Stories IBM claims massive power cut for 90nm G5 Sony and Toshiba close to sampling Cell technology Prescott pipeline longer than Northwood's - Intel
Intel has entered into a three-year, $20 million deal with lithography optics specialist Cymer to fund the development of extreme UV (EUV) light sources. Intel believes EUV lithography will play a key role in the construction of 32nm transistors and processors, which it expects to put into production in 2009. But as fabrication processes shrink, the equipment needed to make chip construction not only possible but financially viable becomes more complex and expensive. Building suitable light sources is only one part of the process. "Accelerating EUV technology development to enable its successful implementation in high volume manufacturing for the 32nm node in 2009 is a critical mission at Intel," said Peter Silverman, Intel Fellow and director of Intel's Lithography Capital Equipment Development, in a statement. "This agreement will further enable Intel and Cymer to concentrate on the critical technology challenges and on delivering a cost-effective, commercial EUV source solution to produce development tools in 2006 and meet the industry's 2009 production timeline." ®
Intel has quietly cut prices across its Centrino range. The cuts amount to little more than a buck off the price of each Centrino package - which comprises a Pentium M CPU, i855 chipset variant and a Pro Wireless 2100 mini-PCI card - yielding a fractional percentage drop: between 0.2 and 0.36 per cent. So no wonder Chipzilla didn't make a song and dance about it. The cuts only apply to Centrino bundles, not to individual Pentium Ms. They were applied at the end of last December, but it's taken Intel a month to update its public price list.

Intel Centrino price cuts

Processor                           Prev. price   New price   Change
1.7GHz Pentium M + 855GM chipset    $497          $496        -0.2%
1.7GHz Pentium M + 855PM chipset    $494          $493        -0.2%
1.6GHz Pentium M + 855GM chipset    $368          $367        -0.27%
1.6GHz Pentium M + 855PM chipset    $365          $364        -0.27%
1.5GHz Pentium M + 855GM chipset    $315          $314        -0.32%
1.5GHz Pentium M + 855PM chipset    $312          $311        -0.32%
1.4GHz Pentium M + 855GM chipset    $283          $282        -0.35%
1.4GHz Pentium M + 855PM chipset    $280          $279        -0.36%
1.3GHz Pentium M + 855GM chipset    $283          $282        -0.35%
1.3GHz Pentium M + 855PM chipset    $280          $279        -0.36%

LV/ULV Centrino
1.2GHz Pentium M + 855GM chipset    $358          $357        -0.28%
1.2GHz Pentium M + 855PM chipset    $355          $354        -0.28%
1GHz Pentium M + 855GM chipset      $336          $335        -0.3%
1GHz Pentium M + 855PM chipset      $333          $332        -0.3%
900MHz Pentium M + 855GM chipset    $315          $314        -0.32%
900MHz Pentium M + 855PM chipset    $312          $311        -0.32%
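For the curious, the fractional drops in the table follow directly from the list prices. A quick sketch (Python, using figures taken from the table; the bundle labels are just shorthand):

```python
# Percentage change for a one-dollar cut at various Centrino bundle prices
# (old/new prices taken from Intel's published list, per the table above).
cuts = {
    "1.7GHz + 855GM": (497, 496),
    "1.4GHz + 855PM": (280, 279),
    "900MHz + 855PM": (312, 311),
}

def pct_change(old, new):
    """Return the percentage change from old to new, rounded to 2 decimal places."""
    return round((new - old) / old * 100, 2)

for name, (old, new) in cuts.items():
    print(f"{name}: {pct_change(old, new)}%")
```

The same $1 cut is a smaller relative drop at the top of the range (-0.2% on a $497 bundle) than at the bottom (-0.36% on a $280 bundle), which is why the table's Change column varies despite a uniform cut.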
Adaptec likes Elipsan's storage virtualisation skills so much that it is buying the company. Terms of the proposed acquisition of Bristol-based Elipsan are undisclosed. Adaptec is to integrate Elipsan's virtualisation and business-continuance technology into its RAID systems. Upshot: no more dedicated server or third-party management software needed. So that should deliver an immediate cost saving. Also, tight integration means faster data backup and recovery, Adaptec says. Adaptec is a major supplier of SCSI RAID storage subsystems. The acquisition of Elipsan shows its commitment to be a big player in end-to-end storage. Storage virtualisation occupies an important enabling position in the storage jigsaw. This technology controls all network storage capacity, wherever it's located within the enterprise, from a single management interface. Now for a quote from Adaptec CTO Mark Delsman: "As the market continues to migrate from internal to external, and from direct-attached to fabric-attached storage systems, Adaptec is committed to providing the highest levels of functionality, performance and reliability while simplifying storage management and keeping costs down." Which is nice. Adaptec yesterday published its fiscal '04 Q3s, pumping out sales of $115m (Q3 FY2003: $109m) and a net loss of $3m (Q3 FY2003: -$3.5m) for the quarter ended 31 December, 2003. ®
Fujitsu this week brought fuel cell technology one step closer to commercialisation when it announced it had developed a new material that allows these power sources to be made smaller and more energy efficient. Fuel cells generate electricity as a by-product of a controlled chemical reaction. The fuel is a solution of methanol in water. When particles of a metallic catalyst are added to the fuel, the methanol breaks down. A membrane allows the waste products to pass through, maintaining the methanol concentration necessary for the reaction. The higher the methanol concentration the better, but at too high a concentration some methanol seeps through the membrane, which limits the cell's power generating efficiency. Fujitsu's new material allows methanol to be stored in a 30 per cent solution without leakage. That's enough, the company claims, to allow 300ml of solution to power a notebook PC for 8-10 hours. Previous notebook-oriented fuel cells have had to work with lower concentrations of methanol and thus bigger quantities of fuel to provide sufficient power for PC operation. Fujitsu's prototype cell, which uses the new membrane material, provides 15W of power yet is just 15mm thick, the company said. Fuel cells are seen as one possible successor to today's Lithium Ion and Lithium Polymer batteries. Not only are they potentially more efficient, but they are less harmful to the environment and easier to recycle. ® Related Stories Hitachi readies fuel cell for PDAs Toshiba demos mobile phone fuel cell NEC, Hitachi prep notebook, PDA fuel cells Toshiba boffins tout laptop fuel cell Fuel cell to power notebooks and mobile phones Motorola claims alcoholic PC breakthrough
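Fujitsu's numbers can be sanity-checked with a bit of back-of-envelope arithmetic. Assuming (as the article states) a 15W draw, 8-10 hours of runtime and 300ml of fuel, the implied effective energy density works out as follows:

```python
# Back-of-envelope check of Fujitsu's claim: 300ml of fuel powering a 15W
# notebook for 8-10 hours. Figures taken from the article; the notebook's
# actual average draw is an assumption on our part.
power_w = 15.0    # prototype cell output
volume_l = 0.3    # 300ml of methanol solution

for hours in (8, 10):
    energy_wh = power_w * hours          # total delivered energy
    density = energy_wh / volume_l       # effective energy density, Wh per litre
    print(f"{hours}h -> {energy_wh:.0f} Wh, ~{density:.0f} Wh/l")
```

That implies roughly 400-500 Wh per litre of fuel, which is comfortably in the territory of the Lithium Ion cells it would replace, before even counting the convenience of a refill over a recharge.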
Philips subsidiary Polymer Vision has unwrapped - literally - what it claims is the world's widest yet thinnest, most flexible TFT LCD panel. The monochrome 4.7in, 320 x 240 display is thin enough to be rolled into a curve with a radius of 2cm, the company said. PV's trick is to build the display's 76,800-odd transistors onto a plastic base rather than glass, the material used in almost all LCDs currently found in phones, PDAs and notebook PCs. The active matrix component is just 25 microns thick. Above that is a 200 micron 'electronic ink' panel from E-Ink, a non-volatile display system that contains charged black and white particles. Controlled by an electric field, the particles adhere to the panel, allowing them to stay there when the current is removed. Power is only needed to change the image, not to maintain it, making the technology suitable for very low power applications. PV's display is still in the research stage - the company doesn't expect to begin pilot commercial production until next year. Right now, PV says it can churn out around 5000 displays a year, enough to allow customers to begin experimenting with the technology in their own products. ®
Wi-Fi providers who redirect users' web browsers to their own log-in page may soon have to cough up cash if they want to continue using the technique - US network access software company Nomadix has patented it. The patent, number 6,636,894, was granted on 21 October last year, but is applicable right back to 8 December 1999. It essentially describes a system that redirects portable-computer users who access a public network to the host's home page, irrespective of the user's browser settings and transparently to the user. The system covers both wired and wireless access. It also discusses the authentication and authorisation system that maintains user accounts and interacts with the billing system when network access is not provided free of charge. Almost all public Wi-Fi networks - and pretty much every one that charges users for access - operate such a system. No wonder Nomadix describes the technique as "fundamentally essential to the success of the rapidly growing Wi-Fi market". Nomadix customers will inherently have a licence to use the technique, but WISPs who have developed their own redirection code, or have acquired it from other companies, will need to ensure they have permission to use it. Nomadix will certainly be expecting them to. "Some [companies] copied what we've done," said Nomadix CTO, co-founder and senior VP, Joel Short, according to a Wi-Fi Networking News report. "We stand behind our intellectual property and now we're going to encourage those folks who provide that method to license the technology from us." ®
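The technique at issue is simple enough to sketch. A captive-portal gateway answers any request from an unauthenticated client with an HTTP 302 redirect pointing at the operator's log-in page, whatever URL the browser actually asked for. This is a minimal, hypothetical illustration of that behaviour - not Nomadix's implementation, and the gateway address is invented:

```python
# Sketch of the redirect technique described in the patent: unauthenticated
# clients get bounced to the portal's log-in page regardless of the URL
# (or browser home page) they requested. Hypothetical logic only.
LOGIN_PAGE = "http://gateway.local/login"  # assumed portal address

def handle_request(client_authenticated: bool, requested_url: str):
    """Return an (HTTP status, headers) pair for a captive-portal gateway."""
    if client_authenticated:
        return 200, {}  # known client: pass traffic through normally
    # Unknown client: redirect to the log-in page, remembering the original
    # destination so the user can be sent on after authenticating.
    return 302, {"Location": f"{LOGIN_PAGE}?next={requested_url}"}
```

Because the redirect happens at the gateway, it works "irrespective of the user's browser settings" - which is precisely the breadth that makes the patent uncomfortable for rival WISPs.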
A two-day conference exploring how rural areas can benefit from high speed Internet access kicked off today at Cisco Europe's HQ near Heathrow. Organised by Access to Broadband Campaign (ABC), the conference has attracted 250 people from government, industry and the regional and rural broadband community to address how innovation can bring more affordable broadband to parts of the UK currently not served by ADSL or cable services. Delegates are expected to discuss the issues affecting the development of broadband in the UK. They will also explore the scope for innovation and growth with all the competitive business and consumer advantages it brings. In a "video presentation" Rural Affairs Minister Alun Michael MP told delegates: "ABC has acted as an excellent catalyst bringing together public sector, industry, community and consumer interests in the broadband debate. [It] illustrates again how important it is to take a multi-stakeholder approach to overcoming problems of the digital divide." E-minister Stephen Timms and Shadow Minister for Economic Affairs Michael Fabricant are also expected to address the bash. ® Related Story Eminister to address broadband conference
CD burning software developer Optima's patent infringement allegations against Napster owner Roxio will be heard by the US District Court of Central California on 19 April, papers seen by The Register reveal. The case will be heard by Judge James V Selna. Optima claims Roxio's own CD burning code violates intellectual property granted it in patent number 5,666,531, which details a "recordable CD-ROM accessing system". Essentially, it describes the technique used by many CD burning apps and utilities of creating an image of the disc in memory or on the hard drive which appears to the user as a CD. The virtual CD's contents can be updated at will, until the user is ready to burn the contents onto the disc, at which point the information can no longer be changed. Software released by Optima in 1995 utilised this technique, which it says ended the need to pre-plan how and where to burn data directly to the CD. Optima offered to license its intellectual property to Roxio, but its rival refused. Now it wants Roxio to cough up damages, unpaid royalties and lawyers' fees. And not just Roxio. "Optima believes most every company in the CD burner industry may be infringing," the company's attorney, Robert Lyon, a partner at Holland & Knight, said when the legal action was announced last December. Optima claims that the patent is infringed by now standard ways of burning CDs as laid down by the CD-R and CD-RW technology guardian, the Optical Storage Technology Association (OSTA). Roxio is an OSTA member, as are Sony, HP, Imation, Microsoft, Pioneer, Ricoh, Toshiba and Verbatim. Associate members include Apple, Eastman Kodak, Epson, Fujitsu, Iomega, JVC, Plasmon and many more of the CD-R industry's leading lights. Optima CEO Robert Adams told The Register: "Get ready for more" - a hint that further legal action may be in the offing. For its part, Roxio disputes the allegation, claiming that none of its products utilise techniques described by Optima's patents. 
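The staging idea at the heart of the patent can be illustrated in a few lines: hold the disc's contents somewhere mutable, let the user revise them freely, then lock everything at burn time. This is a toy sketch of that pattern, not Optima's or Roxio's actual code:

```python
# Toy illustration of the patented staging technique: a virtual disc whose
# contents can be edited at will until it is "burned", after which any
# further change is refused. Hypothetical sketch only.
class VirtualCD:
    def __init__(self):
        self.files = {}     # filename -> bytes, staged in memory or on disk
        self.burned = False

    def write(self, name, data):
        """Add or replace a file in the staged image."""
        if self.burned:
            raise IOError("disc already burned; contents can no longer change")
        self.files[name] = data

    def burn(self):
        """Commit the staged image to the physical disc (simulated)."""
        self.burned = True
        return sorted(self.files)
```

The user-visible benefit - and what Optima says its 1995 software pioneered - is that nothing need be planned in advance: the image is freely editable right up to the moment the laser fires.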
® Related Story Roxio first target as CD-R patent owner threatens industry
The wheels of European Commission decision-making grind exceedingly slowly. But it appears that, after three-and-a-half years, the Brussels bureaucrats have reached a decision over Microsoft's past behaviour within the EU. Granted, it's a preliminary decision and granted the ruling is not expected to be published until March, but EC officials feel confident enough to brief favoured news outlets that, yes, Microsoft broke Community competition law by abusing its dominance of the PC market. Microsoft will of course challenge any such finding in the EU's Court of First Instance. At stake will be a big fine that the European Commission will no doubt seek to levy. More importantly, Microsoft will defend itself against any attempt by the EC to enforce the separation of Windows Media Player from Windows. Last, there will be a squabble over what and how much information Microsoft will be ordered to share with its rivals. The unbundling of Windows Media Player is a hobby horse of the European Commission and it is where it parts company from its US anti-trust counterparts. ®
BT is facing a fresh complaint concerning allegations about the way it is trying to dissuade customers from switching phone providers. Tele2 UK boss, Bill Butler, told the FT that he has proof that BT is phoning up punters who are looking to switch to the rival service and "falsely inflating" Tele2's tariffs in a bid to get them to stick with the dominant telco. The matter has now been passed to communications regulator Ofcom. A spokesman for BT dismissed the allegations but insisted that the telco would cooperate with any resulting investigation. If all this sounds familiar, then you're right. In November, former telecoms regulator Oftel upheld a complaint from Thus and Broadsystem Ventures Ltd preventing BT from using information about the transfer of customers to alternative telecoms suppliers such as One.Tel, Tiscali and Tele2. Oftel found that BT - which was calling customers who had decided to move to rival operators - was using this information to try to convince punters to stay with the monster telco. At the time Oftel said: "Until now, BT has passed this [transfer] information to its marketing department, which has then contacted the customer to try and persuade them to stay with BT. Oftel has today ordered BT to stop carrying out this practice, on the grounds that it is forbidden under the new EU Access and Interconnection Directive that came into force in the UK in July 2003." BT insists that it is doing nothing wrong by contacting punters looking to leave its service. In fact, the monster telco reckons that unless it can contact punters, it could open the floodgates for 'slamming', a process where customers are switched between phone providers without their knowledge or consent. Earlier this month BT lodged a formal appeal against the ruling with the Competition Appeals Tribunal (CAT), the UK's highest specialist competition law court. ® Related Stories BT appeals 'dirty tricks' banning order BT ordered to stop 'dirty tricks' Tele2 unveils UK phone service
Directory enquiries (DQ) outfit The Number - whose ads featuring two moustachioed 1970s-style runners have been elevated to cult status - has been slapped for using the image of former British athlete David Bedford. Communications regulator Ofcom today upheld a complaint from Mr Bedford concerning the "118 118 Runners" featured in ads for the DQ service. In a statement the regulator said: "The Content Board found as a fact that the 118 118 Runners featured in The Number's TV advertisements do caricature David Bedford by way of a comically exaggerated representation of him looking like he did in the 1970s, sporting a hairstyle and facial hair like his at the time, and wearing running kit almost identical to the running kit that was distinctively worn by him at the time, including red socks, sky-blue shorts with gold braiding and a vest with 2 hoops." It went on: "The Number concedes that it neither sought nor obtained David Bedford's permission to be caricatured." Caricature without permission is a breach of advertising rules. Despite the ruling, Ofcom stopped short of banning the ads because it ruled that Mr Bedford had not "suffered actual financial harm as a result of the caricature". As a result, The Number will continue to use its runners. In October last year Mr Bedford wrote to The Number demanding compensation after he alleged that the DQ outfit had ripped off his image without his permission. At the time a spokesman for The Number dismissed the allegations as "ridiculous", insisting that the joggers were not modelled on the image of David Bedford. Instead, he explained that The Number wanted a "1970s retro style" and that the company had actually loosely based its characters on Steve Prefontaine - the US athlete who died in 1975. Although legal letters have been exchanged, The Number has still not received a formal writ from Mr Bedford, a spokesman said today. ® Related Story David Bedford gets the hump over 118 118 ads
BOFH 2004: Episode 3 So I'm relaxing in the office when the Boss has a loneliness attack and decides to come visiting. I know it's loneliness because he hasn't brought the wadge of paper he generally carries with him to remind him of what he came for. "Just... ah... checking to see how things are going at... er... Mission Control. As it were..." he says, gesturing expansively about the room. "Fine," I respond, "business as usual." "And your assistant?" he asks as he sits in the PFY's vacant chair. "Oh, he's out and about getti--" "My, these are nice chairs!" he sighs, getting comfortable, just as the PFY arrives. "Yes, they're the new Ergo 3000s," the PFY comments. "Full lumbar, thoracic and cervical support, built-in infrared linked multimedia speakers in the headpiece, servo-assisted adjustment, and full recline. This model even has the servo interface to your desktop to allow it to tilt, roll and rise in response to computer control. They market it as the ultimate in gaming chairs, but we needed them because... uhhhhhm... BECAUSE they could proactively put you into micropause position!" he adlibs. "Do you mind?" "No, not at all, don't want to interrupt your work! So where did they come from?" he asks. "Might grab myself one!" "Dunno who the vendor is, but the secretary's got the catalogue in her admin folder," I respond, to the boss' departing back. . . . Two minutes later . . . "THEY'RE BLOODY 2500 QUID EACH!" the boss gasps. "They're REAL LEATHER! You used the WHOLE of last year's furniture and fittings budget on a couple of chairs!" "Well technically, we used the whole of last year's and the whole of this year's as well," the PFY corrects. "For the chairs we use at home." "YOU BOUGHT CHAIRS FOR HOME!?!" "Of course! Wouldn't want to put my back in jeopardy by working remotely on a substandard item - that could cost you a stack in health penalties. 
It makes financial sense, because if we had to come in to work to use a proper chair to reboot a server - with a three-hour minimum call-out, overtime, plus travel expenses - it'd only take about five call-outs and the chairs would be paid for." "But you've used the entire furniture and fittings budget. What happens if someone else's chair breaks?" "Get it fixed under the maintenance budget?" the PFY suggests. "I'd use the training budget myself," I suggest. "And justify it by buying an ergonomic chair and saying that you're 'training' their posture." "No, say it's a Health and Safety item!" the PFY cries. "That's centrally funded and there's always a stack of Health and Safety money for that sort of thing." "There's no Health and Safety budget left this year, I checked - Sharon says that the money all went into building electrical safety after some incidents last year." "Oh right - before your time," I concur. "Nasty business. Had to buy a huge box of warning labels to put on most of the building's power points to indicate electricity is harmful and that it's dangerous to put foreign objects into them." "What, people put all those things into power sockets?" "Hard to believe, isn't it? Some even said that we'd TOLD them to do it!" "Did you?" "Of course not. They were just in shock - jumbles the mind, you know." "So anyway, there's not enough budget for another chair," the Boss says, getting back to his favourite topic, himself. "Yes, we know," I add. "We wanted one for the Computer Room Console desk, but the cupboard was bare. Still, can't you use the Management Innovation Budget?" "Hmm?" "The MIB - it's the slush fund for Company Managers to invest in 'Innovative' technology." "I hardly think a chair counts as innovative." "Neither's a GIS unit for your private car, but the Head of IT got one last week!" the PFY notes. "Why?" "Because it's a slush fund - they're always tapped out within weeks of the New Year by people wanting new gadgets!" 
"What was the model number again?" the Boss burbles quickly, penny dropping. I write the model number down, adding an "X 2" to the bottom of the page. "Times TWO?!?" the Boss asks. "You don't want to put yourself at risk when working from home now, do you?" "I don't work from home. I haven't got a machine there." "And you've never taken a work-related phone call?" "Well, a couple of times..." "And you sit down sometimes when you're on the phone?" "Wellll, it's possible..." "There you go then!" Two days later, I notice the TWO chairs arrive in their spanky new plastic wrapping, and wait at my desk for the inevitable phone call. > Ring, Ring < Told you so. "How do you hook these things up again?" the Boss asks. "Plug the chair into the charger for four hours..." I sigh. "The interface is infrared, so no wires needed after that." "And what do I do with the coiled wire? "The coiled wire?" "A long green curly wire connected to the arm rests. It says it's an... uh... 'antistatic safety earthing flylead'." I cast a quick glance at my chair and notice the lead in question still in it's plastic bag, taped under the armrest. Woopsy. "Just plug it into the earth pin of any power point." "And which one's the earth pin?" "Uh..." I say, thinking of how a good console chair would be good. And two, even better... . . . "Any of those warning labels left?" I ask the PFY minutes later, as a high pitched scream punctuates the building... ® BOFH: The whole shebang The Compleat BOFH Archives 95-99 BOFH is copyright © 1995-2004, Simon Travaglia. Don't mess with his rights.
Virus writers and hackers are helping Microsoft to develop more secure products, Bill Gates claimed yesterday. Speaking at the Developing Software for the future Microsoft Platform event in London yesterday, just hours before the MyDoom virus began spreading like wildfire across the Net, Gates reiterated that security remains a key priority for the software giant. He acknowledged that better security is vital if its .NET strategy is to succeed. Microsoft would lose out, as would businesses, if customers resisted moves to put their businesses on the Net because of security concerns, he said. He said Microsoft wanted to make sure viral epidemics cease to happen. Gates did not say how this might happen beyond noting that the software giant had learned from hackers and recent viral outbreaks. Microsoft has improved its inspection techniques, emphasised the value of fewer lines of code in software development and developed firewall technologies for PCs. Internet worms have also spurred improvements in auto-updating technology, according to Gates. Because the smartest hackers targeted Windows, Microsoft could improve the security of its platform more rapidly than OS rivals, he argued: hackers are "good for the maturation" of the platform. "It would be wrong to say an operating system is more secure because nobody is attacking it," said Gates, in a clear dig at OS rivals such as Apple and Linux. Getting customers to apply patches - vital in cutting down routes of viral spread - is a thorny issue for Microsoft. Only one in five (20 per cent) customers are up to date with patches, Gates said. Gates's perspective on hackers fits fairly closely to their own frequently-cited view that they are acting in an attempt to force Microsoft to improve the security of its products. Unlike his colleague Steve 'Sheriff' Ballmer, Gates isn't inclined to draw analogies between hackers and bank robbers. 
® Related Stories Ballmer to crackers: this PC ain't big enough for the both of us Latest Email worm has SCO-facing payload
More than 115 million people will pay mobile phone networks for data services around the world this month, market watcher EMC forecast this week. And a billion of us will make phone calls on a GSM network this quarter, according to the GSM Association, a worldwide trade organisation. EMC's figures take in active users of GPRS, i-mode, CDMA 2000 and multimedia messaging services, including ringtone downloads and the like. Yes, some of these applications leverage data networks, but we'd argue that they're not 'true' mobile data applications but extensions of the voice market. Either way, there are a lot of folk using data networks. January 2004's forecast represents a 13 per cent increase over September 2003's total of 100 million users or thereabouts, itself 14 per cent more than the number EMC recorded for June 2003. That growth has come from Asian users, most of them in Japan and South Korea, where online gambling and "adult content" appear to be driving the popularity of mobile data services. Email and messaging are popular applications, too. Europe has a long way to go to catch up. EMC calculates there were some 16 million users of European GPRS networks last September. That compares with 38.5 million i-mode subscribers in Japan alone. GPRS sits on top of existing GSM networks. Globally, over 970 million people were using GSM at the end of last month, 180 million of them using it for the first time in 2003. That, says the GSM Association, represents 80 per cent of the 227 million new digital mobile phones sold around the world in 2003. Some 70 million of those new users live in the Asia Pacific region, with 42.8 million of them in China - just ahead of Europe, which saw its userbase rise by 42 million. The North American GSM market, by contrast, grew by just ten million users, as did the Indian market. Russia managed to provide 16 million new users. ® Related Products Search for your next phone in The Reg mobile store
Fujitsu has emerged the winner of the final contract to run the NHS Care Records Service. It beat shortlist contenders EDS and SchlumbergerSema/Cerner to win Local Service Provider status for the south of England. The gig runs until 2013 and is worth £897 million. The NHS CRS programme will see an integrated patient record, prescribing and booking system installed in England. It is probably the biggest civilian IT project in Europe. The government split the country into five areas and invited IT firms to bid for each regional franchise. The winners are Accenture, which won the North East and Eastern Regions, BT (London), CSC (North West and West Midlands region) and Fujitsu. These are the leads: dozens of subcontractors also benefit. BT for example is to supply infrastructure services for the southwest and for the north east. Fujitsu's winning consortium includes PwC. BT also beat C&W and EDS to look after the central care record. On the vendor side, Sun has done very well, supplying the hardware and software for BT's central spine. And let's not forget British firm iSoft, which was a member of three of the winning consortia, and US rival IDX, which picked up two wins. ® See also: Gov't press release Gates to meet Brown, OGC and NHS chiefs - Sun, OSS in crosshairs? CSC, Accenture win NHS Care Records contracts UK NHS trials Sun Linux, threatens 800k user defection from MS NHS wants another £2bn for IT mega project NHS patient privacy? What patient privacy!
We thank reader Colin Swan for the following 419 email, which we believe is a first. It contains the bog-standard Liberian connection, as is the local custom, but this particular advance fee fraudster appears to have ten rather than the traditional two legs: Dear Sir/Madam, I am deeply sorry for the embarassement this mail will cause you, I did not mean to be a crab. I am Harrison Karnwea, a liberian by birth and the cousin of John Yormie deputy national security minister who was murdered along with Isaac Vaye deputy minister for public works Sir/Madam, this is highly confidential, We have united state dollars, and we are looking for a capable individual who can assist in safe keeping it, we are ready to negotiate on the percentage that will be due for you. This is not childs play, if you are capable, kindly get back to me for more explanitions/details. I look forward to your response, Regards, Harrison Karnwea. We're sure poor old Harrison didn't mean to be crab, although this just goes to show that just about everyone - and everything - in sunny Liberia is in some way related to someone who just happens to have access to vast reserves of illicit currency, courtesy of the fall of the late, lamented, Charles Taylor. And, as the idea that foolish Westerners can easily be fleeced of their cash continues to filter down the food chain, we can only hope that the next missive is not from a herd of Thompson's gazelle which, having escaped war-torn Zimbabwe in a Red Cross aircraft, urgently need help in relocating $35,000,000 (THIRTY-FIVE-MILLION-DOLLARS) left by their white farmer owner after he succumbed to an assault by the Zanu-PF. In the meantime, we're checking out the official Jacques Cousteau site for tips on how to handle illegal money transfer deals with crustaceans. All subaquatic suggestions are, as ever, welcome. ® Bootnote The real Harrison Karnwea is in fact superintendent of Liberia's Nimba County. 
According to this report he's in a bit of a scrap at the moment with Deputy Minister for Administration at the Ministry of Internal Affairs, Chief Jerry Gonyon. Doubtless his eight extra legs will come in useful when the shooting inevitably starts.
There are murmurings of discontent within the UK's ISP industry following the publication of the finalists for this year's industry awards. As one senior industry source put it: "The list seems a bit…well…odd." The reason for the grumbling is the omission of a number of high profile operators from the coveted "best" ISP awards named by trade body ISPA last week. For instance, despite its dominance in the sector, BT barely gets a look in. And there was no AOL, Demon, Eclipse, Nildram, PlusNet, Virgin or Zen - all major players that many expected to make the shortlists. Of course, some of these, including AOL, decided not to take part in the awards. Others simply didn't make it past the new vetting process and onto the shortlist. And just to confuse matters still further, there will always be some punters who reckon certain ISPs shouldn't make it to the shortlists at all. That aside, when El Reg called a number of ISPs to canvass opinion, it became clear that the raised eyebrows were not isolated to a few operators sore at not making the finals. Many ISPs didn't want to talk publicly for fear of tarnishing their reputations. Privately though, they all expressed surprise at the shortlists. Even those that had made it to the final stage were surprised at the ISPs that had failed to make the grade - and some of those that had made it. As one insider put it: "These awards should be representative of the industry. Except the list of finalists doesn't reflect the industry. It doesn't tell consumers or businesses who are the best ISPs." Another said: "It looks a little odd that the list didn't have the usual names." So, is it sour grapes on behalf of those who didn't make it or is a fresh line-up indicative of changes happening within the UK's ISP sector? Nick Lansman, Secretary General of the Internet Service Providers Association (ISPA), defended the awards and insisted that everything had been done to make them fair. 
"These are prestigious awards and ISPs want to win," he told us. "We try our best to make sure they're fair. But we can only assess ISPs on the criteria set out in the rules. You're always going to get some companies that are upset." He added that if companies did feel aggrieved then ISPA would look into the matter. Curiously, this year's nominations were drawn up after each ISP that entered the awards had its service tested for between a month and six weeks over December and early January. ISPA insists that this is an improvement on previous years. But others argue that it is also flawed, since it only takes a snapshot of an ISP's performance over a month or so. Is this really adequate for awards that are, by ISPA's own admission, so keenly fought for and "prestigious"? If this is the approach ISPA plans to take, and if the industry body accepts that a lot is riding on these awards, then perhaps it should consider monitoring ISPs all year round. According to one well-respected industry commentator, such an approach would even out the ups and downs that ISPs sometimes suffer, providing a more accurate measure of their performance. Not only would this provide useful information for consumers looking to choose an ISP, it should also result in less grumbling next year. ® Related Story UK Internet awards nominees named
Virgin Mobile clocked up more than half a million new customers at the end of last year, making it the most successful three months ever for the mobile telco. Cheap texts and a bundle of other offers have helped increase Virgin Mobile's customer base to more than 3.6 million punters, the company said in a statement. It's the biggest quarterly sales period in the company's history, and boy, isn't it letting people know. Said Virgin Mobile's Sir Richard Branson: "This is a glittering result for Virgin Mobile. In the run up to Christmas, more than half a million new customers chose us in the belief that we have the best tariff, the best value, the best marketing and the best customer service in the industry - and they are right! "We have grown faster than any of our rivals over the past two years, and we have absolutely no intention of slowing down." The numbers for the last three months of the year are up 37 per cent on Q4 2002 and up by more than 50 per cent on the year. Today's record figures will, no doubt, provide welcome relief for a company embroiled in a messy relationship with joint venture partner T-Mobile. ® Related Story Virgin and T-Mobile to co-habit after divorce
The third quarter looks set to be the turning point for WiMAX, seeing the release of a new version of the standard that will significantly boost silicon roll-out. Carrier interest, critical to success, is rising and British Telecom is the latest major telco to say that it will carry out trials, following in the footsteps of AT&T and Nextel. But significant spectrum and interference issues need to be addressed if WiMAX is to reach its full potential. With two major events focused on the standard running last week, an increasing number of equipment companies, including Siemens Mobile, were also laying down roadmaps. Siemens will work with Intel on base stations and on future mobile devices incorporating WiMAX. Intel, as usual, dominated the agenda at both meetings – the WiMAX Forum’s own summit and the WCA Annual Symposium, both in San Jose. Broadband wireless will bring the next five billion users to the Internet, said Sean Maloney, the high profile general manager of Intel’s Communications Group and WiMAX’s chief ambassador. The cost effectiveness of 802.16 means that "WiMAX-certified systems will provide the building blocks truly to usher in the broadband wireless revolution." "We see a three-phased deployment of 802.16 technology that will begin with fixed outdoor antenna installations, quickly bringing wireless to emerging markets and speeding the installation of broadband services without the need to lay wire or cable," said Maloney. "The technology will then rapidly progress to indoor antenna installations, broadening its appeal to carriers seeking simplified installation at user sites. Finally, in the third phase, WiMAX certified hardware will be available in portable solutions for users who want to roam within or between service areas." 
At this stage, Intel will have WiMAX Centrino-style chipsets for handsets that connect directly to the antenna, but for the coming 18 months it will focus its marketing activities on infrastructure rather than client systems – though increasingly, its R&D will be working on the WiMAX portable. Chip and equipment makers are starting to roll out ‘pre-WiMAX’ gear that is ready to migrate to the standard, as well as hardware based on the first iteration, 802.16a. However, the real turning point will come with the finalization of 802.16d, a revision to ‘a’ that brings together the original line of sight 802.16 standard, its non-line of sight successor 802.16a, and the 802.16b and 802.16c extensions for quality of service, testing and interoperability. This specification is more advanced than ‘a’, especially because it allows for smaller, cheaper power amplifiers, bringing down the cost of implementation, and because it supports smart antenna schemes such as Multiple In Multiple Out (MIMO), which help maximize real-world range and power. The new variant is now solid enough for first chips to be designed in time for third quarter roll-out. The revision was finished at the IEEE meeting in Vancouver, Canada last week and a test suite will be ready in September. Wavesat and Atmel are already working on 802.16d silicon and UK-based Airspan Networks says it aims to be the first to ship fixed wireless systems based on ‘d’. Its kit will be based on Intel’s upcoming Rosedale WiMAX chipset and Intel is also working with Alvarion, Aperto and Redline. Fujitsu has also begun development of an 802.16d baseband in conjunction with Wi-Lan and two Taiwanese chipmakers are expected to bring out early WiMAX devices. Mohammad Shakouri, VP of business development at Alvarion, says the critical next step is to get the certification process up and running. 
This is essential to win operator trust after various high profile broadband wireless failures of the past, such as Teligent, put many companies off this sector. He points out that companies are now shipping products that are compliant with 802.16a – although he agrees that ‘d’ adds significantly to the attraction of the standard – but they are not yet officially certified as such, and so will be treated with caution by possible telco customers, who would otherwise find the low costs of WiMAX ($100-$150 per home for residential services, and $5,000 to $30,000 per base station, Alvarion says) attractive. Although there is still considerable operator suspicion to allay, more and more carriers are assessing WiMAX as an alternative or parallel to wired or cellular networks. Three operators, AT&T, Covad and PCCW, joined the WiMAX Forum last week and more are expected to follow shortly. BT and UK Broadband in the UK, Iberbanda in Spain, MVS Net of Mexico, Brazil’s Neotec and Reliance Infocomm in India are all planning or “seriously considering” WiMAX trials, according to Intel, which takes a strong role itself in evangelizing WiMAX to operators and governments worldwide. BT is one of the largest telcos so far to take a public interest in WiMAX and is expected to roll out trials in rural areas in the near future. The WiMAX Forum admits that most of its work in 2003 was focused on chips and equipment and that there was very little input from carriers. It aims to rectify this with the formation of its Service Provider Working Group, to encourage contributions from the carrier community and to influence spectrum regulators. This should make it easier for operators to influence the development of the WiMAX system profiles, a process from which they have complained of being excluded in the past, presumably one factor in their slowness to become actively involved until this month. 
Most importantly, the new group will develop and promote the business case for service providers to deploy 802.16; will focus on real world multimedia applications; and will create standard network management interfaces. The aim is to gain greater carrier input at an early stage in order to shorten the trial and review process and reduce time to market. To improve carrier confidence, it is essential for WiMAX to work with all the major radio standards and their governing bodies. Action to prevent collision with other wireless technologies and to ensure that WiMAX can be rolled out consistently in different countries is becoming urgent as products approach the market. This is the remit of the WiMAX Forum’s Regulatory Taskforce, which also exists to lobby for allocation of spectrum for WiMAX applications on a country by country basis. One key task is to ensure interworking where WiMAX shares spectrum with other protocols, such as Wi-Fi at 5GHz and 3G in the MMDS spectrum (2.5-2.7GHz). For all the official line that WiMAX and Wi-Fi are complementary, unless their two bodies work closely together, there is potential for interference in the 5GHz band where 802.11a operates – especially for outdoor Wi-Fi products, which tend to work higher up the band than indoor ones, and therefore closer to WiMAX. Guidelines are essential to prevent chaos from “serious contention”, says Paul Senior of Airspan, head of the Regulatory Task Force, especially as both technologies develop. For instance, 802.16 transmits control data every 2.5ms, blocking other users from its channel, and the Wi-Fi QoS extension, 802.11e, has similar functionality, making it behave more like WiMAX. This will be the first priority of the Task Force, working with the Wi-Fi Alliance and IEEE. More ambitious is the need to work with regulatory bodies to try to obtain the “best deal” for WiMAX operators – for instance, lobbying countries where 802.16 frequencies are reserved for other uses. 
One obstacle is that WiMAX has both fixed and mobile aspects, but these tend to be handled separately by regulatory bodies, with mobile technologies limited to specific bands. For instance, some countries treat 3.5GHz as fixed only, which could be a major restriction on 802.16e, and an issue that needs to be sorted out before that version of the standard appears in products in about 18 months’ time. And others allocate 3.5GHz only for satellite, although this can share happily with broadband, as it does in the UK. There are also limitations on usage of the 5.8GHz band, particularly in Europe, that could constrain the range and capacity – and so the appeal – of WiMAX if negotiations with the region’s regulators are not successful. In summary, like most wireless bodies, the WiMAX Forum Task Force is pushing for consistency of spectrum allocation across different regions and for cooperation among standards groups to minimize the risk of interference as frequencies become increasingly crowded. Such work will be as important as the technical developments of the vendors to convince the operators to ramp up their trials in the coming months and so create the expected boom in WiMAX-based services at the turn of the year. © Copyright 2004 Wireless Watch Wireless Watch is published by Rethink Research, a London-based IT publishing and consulting firm. This weekly newsletter delivers in-depth analysis and market research of mobile and wireless for business. Subscription details are here.
Nokia and Sun are building defenses against Microsoft with new developer programs for Series 60 and Java, aimed at expanding the range of applications for the smartphone, especially in the enterprise. By the end of 2004, Nokia says it will have radically increased the appeal of its Series 40/60/90 development environments to a broad base of programmers, using Java and other languages. In any showdown between Nokia’s operating systems and Windows Mobile, the vast developer and software base of the latter will always be its prime advantage. Now the Finnish giant aims to simplify developer access to its technologies in order to stimulate the creation of applications. It has introduced a new mobile scheme including ‘tiering’, which will give certain developers – those who pay higher fees – earlier and more in-depth access to the Series 40/60/90 development environments for Symbian OS and other Nokia systems. It will support a wider range of languages on these platforms in order to make programming simpler and to fill gaps that Java leaves. But Java remains the dominant language, hence the cooperation with Sun. Nokia has been active for the past two years in making its platforms more appealing to the 3m-strong base of Java programmers. It has been working with Sun since 2001 on various schemes to support and encourage these developers, and the latest move is to cooperate on improving the application programming interfaces within Nokia OSs. Sun, which has missed many chances in the server world to capitalize on its ownership of Java technology, sees the mobile world as a huge opportunity to push Java against Microsoft Windows, and one where it is starting in a far stronger position than on the PC platform. 
Sun’s determination to make allies of the smartphone makers is further illustrated by its involvement in a new roll-out of centers where developers can get their applications certified for Symbian OS platforms not just from Nokia but also its rivals Sony Ericsson, Siemens and Motorola, with a single set of tests. Moves to strengthen the appeal of Nokia platforms to developers cannot come soon enough for the Finnish company. Although Microsoft’s progress in the cellphone world has been slow, its operating system is starting to make limited inroads. Not enough to worry Nokia in the cellphone mainstream, but the danger lies in the enterprise market. Here, Visual .Net programs are well established and interoperate easily with software on other platforms; and as companies start to move from PDAs to smartphones as their mobile client of choice, they may find it a logical step to stick with the same environment. Palm, of course, hopes to capitalize on this trend by adding smartphone capabilities to its systems; but Microsoft-based devices such as the HP iPaq are in an even stronger position once they start to introduce cellphone functionality. As Nokia pushes into the enterprise, it needs to establish its client platform among business-focused developers in order to take away the key advantage of the incumbent PDAs – an ambition that, of course, chimes in with Sun’s aim of chipping away at Windows with Java. Nokia is drawing on the experience of Palm, whose PDAs are well established in the enterprise and which made its initial fortune based on the way that it opened up its (at the time) innovative platform to a broad developer community. The Finnish company has tapped a vice president of developer relations, Lee Epting, from Handspring (and before that, Palm), to help revamp its ability to attract a broad base of programmers as Series 60 – which has had impressive early take-up – starts to gain critical mass and become the dominant environment for high end handsets. 
Epting admitted this week that Nokia’s developers have demanded a more varied range of options within Series 60, and that the company is responding to the main requests as quickly as possible. For instance, the company will offer support for Perl and Python scripting on Series 60, adding to the only two choices currently available – native Symbian C++ APIs, which are tough to master; or Java, which does not always allow access to native resources such as SMS. Support for features such as Perl opens up Series 60 to non-specialists or even non-technical people looking to create a very simple forms-based application, something that is a strength of Microsoft tools. Last year, Nokia took its first step towards making Series 60 more friendly to novices when it agreed to support Psion’s old language OPL, which is similar to Basic and has a loyal programmer base. Symbian open sourced the 20-year-old language, though Epting concedes that it needs to gain some buy-in from various parts of Nokia. She seems to be facing a common challenge of executives charged with modernizing a well established platform – resistance or apathy from some parts of the company, and a consequent battle for profile and funding. Other items that the Nokia developers’ community, Forum Nokia, has demanded, and which Epting aims to address in the coming months, include improved documentation of APIs; better integration and support for ‘smart downloading’; and the ability to support ‘DRM forward lock’, which allows the owner of an application to forward it in trial format to a friend. Nokia says that 1.3 million developers have downloaded software tools for its platforms, and that its new moves are based on a poll of their most common complaints. But it is not just responding to gripes – it is also seeking to convince non-Series 60 developers that it is creating an attractive and lucrative market for applications using the platform. 
The enterprise push helps with this, by focusing on higher margin products than the typical consumer cellphone apps, which are often downloaded for a few dollars. Also important will be support for simpler distribution methods, particularly enabling business users to download complex Series 60 software more easily over the air. Epting says this is a priority for 2004 and Nokia hopes to build on some early consumer-oriented pilots in Asia, where apps can be downloaded from kiosks over Bluetooth or infrared links. Microsoft is also expected to shake up its software programs this year to appeal to mobile developers. It has two objectives: to encourage Visual .Net programmers to support the mobile platforms, and to draw non-Microsoft developers to its platform, an ambition that will be severely hampered by the lack of Java. The greater the efforts that Sun and its handset friends make to simplify Java and the various frameworks that support it, the harder it will be for Microsoft to make any inroads into the consumer cellphone applications sector, even with the appeal of Visual .Net’s simplicity. © Copyright 2004 Wireless Watch Wireless Watch is published by Rethink Research, a London-based IT publishing and consulting firm. This weekly newsletter delivers in-depth analysis and market research of mobile and wireless for business. Subscription details are here.
One of the flies in the ointment for open or IP-based video on demand is that Netflix has set it a moving target. Netflix needs little or no technology - depending on whether you are one of its customers or Netflix itself - while broadband-delivered films need more than one technology breakthrough. The Netflix process is simple: bulk-buy DVD films from the moment they go to DVD, hold them in shipping centers around any given country (the US in this case, although the company has said it will hit the UK and Canada this year), let people order them online, use the post to ship them both ways, and guarantee delivery in one to three days. Extra attractions are charging no late fees, unlike most video rental stores, allowing each household to have a set number of DVDs out at any one time, bundling the price into a single monthly subscription, and shipping DVDs to and from the home as part of the price. If a video on demand service cannot beat that, it cannot take off. At least it can’t take off in the US, where Netflix has done a good job and where it has a head start. It is conceivable that outside the US, where access to video rental is perhaps less widespread, a broadband service could lead a video on demand service, but it would have to be a download service - and why would Hollywood give such a service access to films? No, it is a simple fact that at the moment Netflix takes between one and three days to get any given film to a home in the US, and this is largely because the US is such a large geography. If a similar service were launched in parts of Europe, even the local postal service could deliver packets next day. Compared with both Amazon and Kazaa, that speed holds up well. Amazon usually manages roughly to tie with it, and a file-sharing service such as Kazaa, even given the fact that many of its customers swap illegal and therefore free files, would find it hard to beat. 
If someone left their broadband line on all night, or all day while they were at work, perhaps next day is a reasonable speed for a full film download. Faultline has been asked before whether films can be delivered to a home, over a broadband line, in under three hours. We pointed out that the H.264 codec and the ADSL2 chips due this year are two ways of getting this time way down, and even new versions of the IP protocol are rumored to be set to make an appearance to improve speed. But as things stand, three uninterrupted hours is what you probably need to download a film on a dedicated broadband line. Why wait any of this time - why not drive two miles to the local video rental store and rent a film? We all know the answer to that: it won’t have a choice of 15,000 films as Netflix has; and even if it did, can you imagine how long it would take to find just the film you wanted, especially if you insisted on finding one with actors you knew, a director whose work you admired, in a genre you enjoyed? All of these obstacles to watching DVDs any other way have propelled Netflix to a $270 million business, in profit, with 1.8 million customers and on a growth curve of 80 per cent. This will see it become a $1 billion organization by 2006 or 2007. Its penetration of the US market is under 2 per cent, while its original market in the San Francisco Bay area is now running at 5.9 per cent. It reasons that it can reach this penetration, perhaps more, across the US, and it is this calculation that initially led it to say at this week’s unveiling of its fourth quarter figures that it will make the $1 billion mark early. For 2004 Netflix is expecting to go over $450 million and get its subscriber base up to 2.5 million. CEO Reed Hastings said at the results conference that it is “DVD acceptance that is the rocket propelling studios’ profits,” and he said he was now buying one million DVDs every quarter from 100 film studios and sending two million packets out every week. 
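That three-hour figure is easy to sanity-check with some back-of-the-envelope arithmetic. The sketch below is ours, not Faultline's: the film size and line speeds are illustrative assumptions chosen to show how H.264 and ADSL2 each attack the problem.

```python
# Rough download-time arithmetic behind the "three hours" figure.
# File size and line rates are illustrative assumptions.

def download_hours(file_gigabytes: float, line_mbps: float) -> float:
    """Hours to move a file of the given size over a line of the given
    speed, assuming the link is dedicated entirely to the transfer."""
    bits = file_gigabytes * 8 * 1000**3        # decimal GB -> bits
    seconds = bits / (line_mbps * 1000**2)     # Mbps -> bits per second
    return seconds / 3600

# A ~1.5GB H.264-encoded film over a 1Mbps ADSL line:
print(round(download_hours(1.5, 1.0), 1))      # ~3.3 hours

# The same film over an 8Mbps ADSL2 line:
print(round(download_hours(1.5, 8.0), 1))      # ~0.4 hours
```

Which is why the pairing matters: better codecs shrink the file while ADSL2 fattens the pipe, and either alone only gets you part of the way down from three hours.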
But the driver behind the Netflix business cannot just be a good idea, and Hastings is keen to place the credit with his company’s execution. With at least 100 copycat organizations operating in the US, some of them run by Walmart and Blockbuster, the original Netflix still claims to take 95 per cent of the revenue in this market. Perhaps he is right, and perhaps those bigger competitors have not yet had a chance to shine, but Netflix is certainly not under threat from the maze of smaller organizations that have copied the original idea. Web advertising and word of mouth drive most of Netflix’s new customer growth, but the heavy TV advertising must give new customers a warm feeling of having a brand behind the business. Perhaps it is because of this that Netflix has managed to drive its churn below 5 per cent for the first time in the quarter just gone, and it intends to keep it there. For competing businesses this will be poison: to be forever compared in their execution to the market originator, and to have to pay more to acquire customers, only to lose them in a shorter period of time. Part of Netflix’s future plan is to continue to put more pressure on that window of opportunity for IP-delivered video on demand. It will keep driving up the percentage of households in the US that can get films from it next day, and it should achieve this for 80 per cent of customers sometime during 2004. But if driving to the shops for a poor selection of DVDs at the rental store is one step too far for most people, then there is something appealing about not even having to remember to post the DVDs back, and about being able to select programs on your return home from work that are watchable that same night – and that’s video on demand. Netflix would give no more than a few tiny clues that this is in fact the company’s destiny. 
It said that its business would continue to be DVD based for the foreseeable future, but that it was working on a film download service that it would test in 2004 and deliver during 2005. The idea would still be to order films online, but then opt for physical DVD delivery or online download, whichever the customer chooses. The company feels this would have a modest uptake during 2005 and then grow over the next 10 years. The company frequently points out that the data which Netflix ships on a busy day on plastic discs is about half the total US Internet capacity, at around 5,000,000 gigabytes per day. But what it doesn’t point out is that Internet capacity is virtually doubling every year, and that with newer compression, in a year or two the entire Netflix shipments might be only a tiny fraction of US Internet capacity - and anyway, we have always expected video to be the killer app for broadband, so it should be a high percentage of what travels over the net. Hastings also pointed out that Netflix is already talking to studios about this and that, given his $100 million a year spend, they were listening, but he declined to say more about the exact nature of the Netflix future VoD offering and said the company will not discuss it again until it is ready for launch. There is one irresistible gain for his company in all of this. When a company rents out DVDs, and when it boasts that it has 15,000 films in stock, it has to lay out cash for a huge number of films just to have them sit idle for much of the time. With 1.8 million customers, most of them holding three DVDs at any one time, that’s a minimum of 5.4 million DVDs either on the way to, with, or on the way back from, clients at any one time. But online downloaded films are copies, and there’s no postage and no up-front purchase of all the hard DVDs. Or is there? 
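The plastic-disc arithmetic is simple enough to sketch. The busy-day disc count below is our assumption, chosen to land near the company's stated figure, not a number Netflix has published; the second half shows how quickly yearly-doubling Internet capacity would erode that "half of capacity" claim even with shipments flat.

```python
# Back-of-the-envelope check on the "5,000,000 gigabytes per day" claim,
# and on how fast yearly-doubling Internet capacity erodes it.

GB_PER_DVD = 4.7               # capacity of a single-layer DVD
busy_day_discs = 1_000_000     # assumed discs in the post on a busy day

data_shipped_gb = busy_day_discs * GB_PER_DVD
print(f"{data_shipped_gb:,.0f} GB")   # roughly the 5,000,000 GB figure

# If Internet capacity doubles yearly while shipped volume stays flat,
# Netflix's share of that capacity halves every year:
share = 0.5                    # "about half" of US Internet capacity today
for year in range(1, 4):
    share /= 2
    print(f"year {year}: share of capacity = {share:.4f}")
```

Add better compression on top of that halving and the "half the Internet" boast becomes a rounding error within a few years, which is exactly the company's point.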
If Netflix could lose those costs, its yields would skyrocket and the company’s net income of just 2.5 per cent of revenue could double or treble overnight. The image of a re-entrant film copy that sits in a machine’s memory and can be sent to any number of customers at any one time is how we all view video on demand. It may not have to be viewed that way if you are Netflix. Could Netflix have in mind a system where the DVD itself is the copy of the film being played on a remote device, with that device reliant on remaining connected to the DVD via Netflix in order to initiate the playing of the film? If such a system were built, it might mean that there was no residual online version of the film remaining on the remote machine that could be file-shared around the planet. This might mean that Netflix would still have to buy the DVDs, but at least they could sit in server farms, always ready to be used. But really this is fanciful. Good digital rights management on a server can emulate that effect, and there is no special edge that owning the DVDs themselves can give to Netflix. And anyway, what device at the other end of the broadband line is the DVD downloading to? A PC? Well, that’s no good if Netflix customers want to watch on a large TV screen. The number of TV households that already have DVD players is high in the US (and elsewhere), but the number with TVs set up to receive from broadband is not - at least not yet. So what special edge can Netflix bring apart from its buying power at the studios? Let’s see: could it download its films to a set top? Well, companies like Motorola and Scientific Atlanta, which lead the set top market, have devices that terminate cable TV systems, and both have models that can act as an advanced Digital Video Recorder, with smart card based conditional access systems in them to protect content. 
But neither of these two companies is going to feel too well disposed towards Netflix, which is single-handedly denting the number of TV programs viewers are watching and so driving down the advertising revenues of their biggest customers, cable companies like Comcast. If Scientific Atlanta befriended Netflix, would it suddenly lose out on the next juicy batch of set top contracts? Then perhaps it is the independent DVR makers that might make good partners for Netflix, and a good place for its downloaded films to play and even be temporarily stored? And who better than the market leader TiVo? TiVo has over one million customers in the US, a similar number to Netflix. It has a monthly subscription model just like Netflix. TiVo is supporting movie studios with its new long form, voluntary, DVR advertising. And TiVo is not best of friends with the cable TV companies, given that it is a lead supplier to DirecTV, a company that has come out recently in praise of DVRs. And TiVo technology based DVRs are already in the stores from some of the consumer electronics giants. And it just so happens that Mike Ramsey, the CEO of TiVo, sits on the board of Netflix. Just an idea - not so much a conspiracy theory as perhaps the equivalent of Bill Gates of Microsoft and Larry Ellison of Oracle sitting on each other’s boards and hatching plots. Whatever the IP VoD system that Netflix comes up with, it must be confident that it currently sits in pole position to control the switch from DVDs to IP TV VoD films, and that gives it the potential to be the iTunes of the film industry - as long as it can find its iPod. © Copyright 2004 Faultline Faultline is published by Rethink Research, a London-based publishing and consulting firm. This weekly newsletter is an assessment of the impact of events that have happened each week in the world of digital media. Faultline is where media meets technology. Subscription details here.
Cash'n'Carrion We're pleased to announce to all those readers desperate to get their hands on our instant classic My job went to India and all I got was this lousy t-shirt, that said item is now back on the shelves and ready for immediate dispatch. Which is all very timely, given that the outsourcing of callcentre jobs to India continues unabated. Still, you've got to laugh, haven't you? Or maybe not, according to Steve who posed this email question: er... where is your new pride and joy made exactly?? Harry Mantheakis thinks he can answer that one: I suppose there's a fair chance your 'lost my job to India' t-shirts are made in India :-) Sorry, Harry, but you're way off the mark. In fact, all of our t-shirts are made in Indonesia by eight-year-old children working 27-hour days for a mere $10 per week. It's not much, but it's all they can afford, as the old joke goes. Boom boom! Upon arrival in the UK, our printers - highly-skilled Albanian gypsies on day release from the local immigration centre - lovingly handpaint each and every shirt using a single badger bristle. A single garment can take up to eleven months to complete, and our enthusiastic artists welcome the tin of powdered milk they receive for every 100 shirts successfully bagged and boxed. Finally, crates packed to bursting with top-quality apparel are carried across country by unemployed, barefoot Liverpudlians who work for no more than a can of strong lager, a packet of fags and a Lotto scratchcard a day. Which is why we can offer you our shirts at an unbeatable £14.99 rather than the £437 they would cost were they manufactured in the UK by British workers. You see, if you add up the cost of premises, wages, benefits, taxes, etc, etc, you will see that it is simply not economically viable to keep these jobs in the UK. Nothing to do with profit. No, really. ®
Letter We don't typically spend a lot of time debating benchmarks here at Vulture Central, primarily because of all the work vendors pour into tuning their systems to perform well. You often find servers with unusual features such as excessive memory or you see tweaks to processors that would not likely occur in the real world. One Reg reader took the time to have a close look at some of the most popular benchmarks and show how this vendor-tweaking plays out - at least in the case of Intel's Itanium processor. In your recent article "Analyst sees St. Fister in Itanium wafer," you included the statements "in 2003, Intel managed to capture most of the leading performance benchmarks with EPIC/Itanium architecture" and "Clabby rightly points out that Intel has destroyed the competition on most benchmarks with IBM's Power processor the only rival even close to keeping pace." This primarily proves that if one sprays around enough Kool-Aid it will touch even the lips of those who don't actually care for it: Intel has had considerable success in doing this for Itanic, and HP perhaps even more so. Let's look at the record. SPECint (the processor metric of most interest to most server users): By virtue of its monstrous on-chip L3 cache (rather than anything intrinsic to its EPIC architecture), Itanic briefly grabbed partial SPECint leadership (in base but not peak scores) over its RISC competition when the McKinley product was nominally launched in July 2002. But its base scores still fell significantly behind those of its P4/Xeon sibling and AMD's Athlon, which without question also qualify as server competition, so no brass ring there at all. When Madison was launched in June 2003, it did somewhat better: doubling the size of the already-monstrous L3 cache gave it at least a marginal SPECint win over *all* competition for a while. 
But again there are qualifications to note: 1) this SPECint leadership was only obtainable using HP's zx1 chipset and its HP-UX compiler (performance in other configurations only matched that of IBM's new POWER4+, and fell short of the fastest P4/Xeons and new AMD64 products), and 2) AMD64 and P4 have since forged ahead of even those leading HP platform results. So Itanic not only failed to 'destroy' the competition in SPECint, it isn't currently even a leader at all, and from published roadmaps seems unlikely to retake the lead any time soon.

SPECfp (the processor metric of most interest to many HPTC users): This is the *one* area where Itanic can legitimately boast of a clear lead, and can actually credit that lead to its EPIC architecture rather than to other chip or system features. However, there's a good reason why HPTC users haven't jumped on the Itanic bandwagon to the exclusion of all other platforms: they're also sensitive to things like performance per Watt and performance per dollar, and Itanic does not lead in either of these.

TPC-C (a server metric of significant interest to database users, at least in its unclustered form): Itanic's performance here (again) in no way 'destroys' - or even clearly leads - its competition. 1) HP's zx1-based 4-processor platform has a clear TPC-C lead at that node size, but not due to any EPIC features: 4-way Xeon systems have only 1/3 the on-chip cache, 1/3 the maximum RAM, and 1/3 the aggregate FSB bandwidth to work with, IBM hasn't submitted any top-of-the-line systems at this system size, and the cost of setting up a full-bore TPC-C test seems to have prevented any serious AMD64 submissions since the initial one nearly a year ago (its submission date was changed last summer due to insignificant revisions). 
2) HP's 64-processor Superdome indeed posts the top TPC-C score, but IBM's POWER4+ p690 attains 76% of that score using only half as many processors (and only 1/4 as many processor chips, since each is dual-core - you mention future Itanic multi-core chips later in your article, so this might be of interest): on a per-processor basis POWER4+ delivers 33% higher TPC-C performance, and on a per-processor-chip basis it delivers 167% higher TPC-C performance (these numbers based on comparison with Itanic's best 32-processor TPC-C result to avoid apples-to-oranges issues: relative per-processor performance in the 64-processor Superdome system listed is even worse).

SAP SD 2-tier (another server metric of significant interest to database users): IBM simply hasn't bothered to submit top-of-the-line configurations here, but their previous-generation 1.3 GHz POWER4 32-processor p690 system beats the best current-generation top-of-the-line 1.5 GHz Itanic 32-processor score (so, for that matter, does the 32-processor 1.15 GHz Alpha GS1280 system - again using previous-generation chip technology), and the score obtained by IBM's mid-range 1.45 GHz p650 8-way server suggests that its high-end model would approximately equal the top-of-the-line Itanic 2 at the 8-processor node size - so no 'destruction' (or even clear leadership at all) by Itanic here, either.

Oracle Applications Standard Benchmark (yet another significant database benchmark): No direct comparison with POWER4+ is available, and AMD64 systems haven't been submitted here. But the 16-processor POWER4+ score is over 3.4 times the 4-processor Itanic2 score, so given the significantly less than linear scaling up from 4 processors that HP's systems have demonstrated in other database-related areas POWER4+ seems likely to hold at the very least per-processor equality here. 
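The per-processor and per-chip figures above follow directly from the quoted ratios. A quick check of that arithmetic (the absolute tpmC scores below are illustrative placeholders, normalised to 100 - only the 33% and 167% ratios come from the letter):

```python
# Verify the per-processor and per-chip TPC-C arithmetic from the letter.
# Scores are normalised placeholders, not the published tpmC results.

def per_unit(score: float, units: int) -> float:
    """Benchmark score divided by the number of processors (or chips)."""
    return score / units

# Assumed: a 32-processor Itanium 2 result normalised to 100.
itanic_score, itanic_procs, itanic_chips = 100.0, 32, 32   # single-core chips
power_score = 133.3                                        # 33% higher overall
power_procs, power_chips = 32, 16                          # dual-core POWER4+

per_proc_gain = per_unit(power_score, power_procs) / per_unit(itanic_score, itanic_procs) - 1
per_chip_gain = per_unit(power_score, power_chips) / per_unit(itanic_score, itanic_chips) - 1

print(f"per-processor advantage: {per_proc_gain:.0%}")  # roughly 33%
print(f"per-chip advantage: {per_chip_gain:.0%}")       # roughly 167%
```

The doubling from 33% to 167% is just the dual-core effect: the same total score spread over half as many chips doubles the per-chip figure, so 1.33 × 2 = 2.67, i.e. 167% higher.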
SPECweb99_SSL (an important web-serving benchmark): HP initially seemed to have chosen to submit a SPECweb99_SSL Itanic test instead of one using its more popular SPECweb99 sibling because there was no serious competition there, but times have changed: in the hotly-contested 4-processor node size AMD64 now holds a modest lead, with Itanic2 and POWER4+ in a tie for second.

SPECjbb2000 (an important enterprise server benchmark): Itanic holds a slim (under 3%) lead over AMD64 at the 4-processor node size and another slim (under 4%) lead over POWER4+ at the 32-processor node size - hardly 'destroying' the competition, once again. And while Superdome's support for up to 64 processors lets Itanic exceed the p690's absolute headroom, the top score in the benchmark is held by a 112-processor Fujitsu SPARC64 (gasp!) system.

So there you have Itanic's benchmark 'dominance', using about the same set of benchmarks that HP uses to try to prove it (but without the apples-to-oranges spin that they add): one clear win (SPECfp), three approximate ties (OASB - maybe, SPECweb99_SSL, and SPECjbb2000), and three (TPC-C, SPECint, and SAP SD 2-tier) where Itanic is clearly behind at some system sizes and/or apparently mostly benefiting from the lack of competing submissions elsewhere. POWER4+ is not some struggling also-ran as your quotes above suggest: on average, it's measurably superior to Itanic 2 (and should increase that lead significantly when POWER5 appears this year), and where AMD64 has shown up to compete it's on average at least Itanic 2's equal (and with comparable benchmarketing effort would likely be shown to be its superior). 
Alpha would be putting Itanic to shame if Compaq hadn't decided to start de-emphasizing Alpha development over 4 years ago, and PA-RISC might be as well, had HP not made a similar decision even longer ago - but Itanic still has plenty of competition left (as noted above, even SPARC64 can't be ignored), and another serious threat may develop if Intel decides that it has to counter AMD64 with an x86 64-bit extension of its own. Bill Todd ®
Microsoft will probably ship the client version of Longhorn before its server counterpart, says Bill Gates. This tentative prediction was made in Gates's rallying call to software developers to get behind the next version of Windows, at the Developing Software for the Future Microsoft Platform event in London yesterday. Even though Longhorn isn't due until 2006, Microsoft used the conference to build developer enthusiasm ("prime the pump") for the next version of Windows, which promises substantial architectural changes. Longhorn’s major features include a revamp of the presentation layer (codenamed Avalon), a radical restructuring of the file system (WinFS) and far deeper integration of Web services technology (Indigo). Altogether the enhancements amount to the greatest step change for Microsoft since it moved from NT to Windows 2000 four years ago. Gates said Microsoft has been trying to develop a unified file system for some time. WinFS promises a file system based on a relational database derived from SQL Server and the ability to search and manipulate data according to XML tags. Documents could be organised based on a keyword, author or some form of user-defined criteria. With Longhorn it’s also goodbye to the Registry and hello to richer replication.

Middleware becomes underwear

The OS is crammed to the brim with middleware so that only "high end transaction processing" technology will have a life outside the platform. Microsoft argues this approach is necessary to build a better platform for e-commerce and to improve efficiencies within organisational boundaries. Gates told developers that Microsoft was re-architecting the internals of the OS - everything from modelling to device drivers - with Longhorn. For developers, Microsoft promises that the OS will offer faster, easier development. 
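The WinFS pitch described above - file metadata in a relational store so documents can be found by author or keyword rather than by folder path - can be sketched with an ordinary SQL table. Everything here (the schema, paths and names) is invented for illustration; this is not the WinFS API, just the idea behind it:

```python
# Sketch of relational file metadata a la WinFS: query files by attributes,
# not by where they happen to sit on disk. Schema and data are hypothetical.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE documents (
    path TEXT, author TEXT, keyword TEXT, modified TEXT)""")
db.executemany("INSERT INTO documents VALUES (?, ?, ?, ?)", [
    ("/docs/q3-report.doc", "gates",   "finance",  "2003-11-02"),
    ("/docs/longhorn.ppt",  "allchin", "longhorn", "2004-01-10"),
    ("/mail/budget.msg",    "gates",   "finance",  "2003-12-18"),
])

# One query spans documents and mail alike, regardless of directory.
rows = db.execute(
    "SELECT path FROM documents WHERE author = ? AND keyword = ? ORDER BY modified",
    ("gates", "finance")).fetchall()
print([r[0] for r in rows])  # all of this author's finance files, oldest first
```

The point of the design is that "folders" become just one more queryable attribute: the same store answers "everything I touched last week" or "everything tagged finance" without any directory traversal.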
Longhorn will allow clients and servers to run in a stateless mode - where data and applications run concurrently on both a server and a PC running Longhorn - to exchange information and complete a task. "We are fudging the line between client and server," said Gates.

Egg timer

Tom Ilube, CIO at online bank Egg, demonstrated an application developed using Longhorn to show how Egg's online banking service is evolving towards a real-time system with more features to differentiate itself from online competitors. The demo showed how a customer's PC could be running a Web banking application constantly in contact with Egg's back-end systems. In this way it is possible for users to be notified about questionable transactions more quickly, for example. Although this prototype went down well with developers, their real enthusiasm was reserved for the ability to do "rich search and views" which WinFS enables. Quite a few developers were comparing the ability to put together a document with embedded video clips, links, pictures and data to the way data could be manipulated in the futuristic film Minority Report. The interface is snappy but the Web-services hooks aren't finished yet, according to the demos we saw yesterday. So let's not get too far ahead of ourselves.

Future perfect

Gates said Longhorn will provide greater support for technologies such as instant messaging and VoIP. However greater support for speech synthesis and recognition will probably have to wait until Blackcomb, the successor to Longhorn. An audience member asked Gates about his predictions for the future, cheekily pointing out that apart from his forecast that PCs would only ever need 640K of RAM most of his predictions had been close to the mark. Gates said he'd been misunderstood about the 640K prediction, which only referred back to the maximum address space of Intel's 8086 processor. 
Looking towards the future, Gates said he didn't foresee the industry running out of the address space possible with 64-bit computing "in the foreseeable future". ®
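For scale, the arithmetic behind that remark is easy to check (plain Python, nothing Longhorn-specific): a 64-bit address space covers 2^64 bytes, about four billion times the 4 GiB ceiling of 32-bit addressing.

```python
# Back-of-the-envelope on 64-bit vs 32-bit address space.
addr_64 = 2 ** 64          # bytes addressable with 64 bits
addr_32 = 2 ** 32          # bytes addressable with 32 bits (4 GiB)

print(addr_64)             # 18446744073709551616 bytes
print(addr_64 // 2 ** 60)  # 16 EiB (exbibytes)
print(addr_64 // addr_32)  # 4294967296x the 32-bit space
```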
IBM will next month roll out one of its beefiest NAS (network attached storage) systems to date, pumping the kit full of technology typically used on high-end Unix servers. The TotalStorage NAS Gateway 500 box should arrive by the first week of February at a starting price of $60,000. With a price tag like that, it's clear that the product is for serious networked storage customers. It's designed to link a number of boxes through a central point, giving it the Gateway name. IBM proudly says it's taking aim at EMC and Network Appliance with the new kit. The system runs on IBM's own Power4 processors and AIX operating system - the two keys to the pSeries line of Unix servers. With the Power4 chip on its side, IBM claims it has boosted overall performance of its NAS systems by 150 percent. The NAS Gateway 500 will serve up files to IBM's servers, TotalStorage FAStT products and the Enterprise Storage Server - aka Shark - box. The gateway also supports Unix, Linux and Windows clients and can support servers from other vendors when used in conjunction with the IBM SAN Volume Controller product. IBM will bundle agents for Tivoli Storage Manager, Tivoli Storage Resource Manager, and Tivoli SAN Manager software to help manage the kit. But one of the unheralded features of the NAS Gateway is the clustering technology used to connect two systems and supposedly lessen the effect of failures. The problem with the clustering, however, is that IBM bills it as a way to "maximize system downtime." See for yourself on this Web site. That is until IBM wakes someone up in the UK to fix the marketing slip. ®
Pay no attention to external surveys, HP told its staff yesterday in a fresh memo leaked to The Register. HP wants to emphasize its own internal staff polls, which are closely monitored by the company to ensure they produce the desired amount of Happy Talk. The Pink Slip Princess Carly Fiorina is doing her best to spin HP's exclusion from the Fortune 100 Best Places to Work list into something positive. Fiorina is attempting to convince staff that the Fortune snub was part of her fanciful plan to boost morale at HP. Shortly after our story on HP's precipitous slide down the Fortune rankings appeared, HP went into spin control with a fresh internal memo. The document was penned by an HP staffer and is littered with quotes from Fiorina. "CEO Carly Fiorina addressed HP's absence from the 2004 Fortune Magazine 100 Best Companies to Work For list at the January 14 Senior Leaders Meeting", the memo begins. "In 2003, shortly after launching the Best Work Environment initiative, HP decided that participating in 2004 Fortune survey would help us know where we stand. 'Clearly, we do want to be on this list. We want our people to feel excited and engaged about who they work for and the environment that we create for them,' Carly told HP's senior leaders. 'We knew, given the timing of the survey, that we probably weren't going to be pleased with the results. But 'facts' are our friends, and we wanted to know what the survey yielded.'" What HP probably did not know was that consultant Debbe Kennedy would slip up when preparing an assessment of the Fortune study results and let the sucker scramble unprotected across internal HP message boards. Kennedy's work was meant for the top brass but alas, made its way down to the disgruntled plebs as well. Here's the Official Party Line on how to think about the Fortune results. Read and obey: "The way to think about the Fortune 100 survey is that we did it to learn. 
We have learned and now we need to integrate those learnings into our Best Place to Work efforts, as well as our own surveys going forward," we learn. "Carly went on to say that, ultimately, the best judgment around our success will be what our employees tell us, with the best measure of success being the Voice of the Workforce (VoW) survey. The company considers the data and candid comments that emerge from the VoW survey as an important reflection of employee perspectives and a critical tool for guiding action." Ah, so it's settled then? VoW is the way to go. Well, not quite. A man we'll call Bangalore Bill sent word of just how this survey works. "Your participation in the survey is tracked - it's actually an objective on your performance plan - and the results are widely understood *not* to be anonymous - as they are claimed to be," Bill said. "Consequently, many folks just simply put the 'happy' answers on this survey, and check it off. It's not even remotely independent or honest." Come on, it can't be that bad. "As you can imagine, this in turn encourages employees not to identify any negative 'issues' - lest they be drafted into mock efforts to resolve them. "Independent, external surveys are always going to turn up the true picture. And I don't expect that picture to change in a year or two; these changes in HP are permanent, and they are matters of policy." So there you have it. HP failed the Fortune test on purpose so it could refine an in-house study to make sure results improve as planned. Fly safe. ® Related Stories 'Fear and Loathing at HP' - say internal docs How HP invented the market for iPod resellers HP declares war on sharing culture HP would be better off without Compaq drain - Analyst I'm Carly, Fly Me