19 January 2005 Archive


Intel revamps Centrino

Updated Intel launched the second generation of its Centrino notebook platform today, as expected. But how will prospective buyers know whether they're buying a machine based on the new version or one of the many old-style Centrino boxes Intel expects vendors to continue to offer?

Certainly, there will be a diverse array of machines based on second-generation Centrino - codenamed 'Sonoma' - from basic machines through to high-end executive-oriented thin'n'light systems, with widescreen multimedia-friendly consumer boxes somewhere in between. In all, Intel anticipates the announcement of 80 different Sonoma-based models this month, with 150 on store shelves by the summer. From the start, the mix should be around 50:50 consumer:business, the chip giant said. Intel expects the Centrino brand to cover an increasingly broad line-up of notebooks, ultimately making it harder for buyers to decide what the brand actually means.

Clearly, a Centrino notebook will - for now - be based on a Pentium M processor, with an associated chipset and wireless functionality. In the Sonoma generation, the chipset will be a 915PM, 915GM, 915GMS or 915GML, all members of the 'Alviso' family, which, as the '915' and 'M' naming scheme suggests, is a mobile version of the desktop 915 series, aka 'Grantsdale'. Alviso, as we've noted before - Sonoma's details being among Intel's worst-kept secrets, in addition to the many beans the chip giant has spilled itself - provides a "power-optimised" 533MHz frontside bus, PCI Express, Serial ATA, 400MHz and 533MHz DDR 2 SDRAM, Intel's Hi-Def Audio, dynamic screen backlight power management, and - in the case of the GM and GMS chipsets - its DirectX 9-compatible, Hi-Def video-enabled Media Graphics Accelerator 900 integrated imaging engine.

Mix'n'match

How many of these features make it into notebooks remains to be seen.
Certainly the MDA900 will, alongside more gamer-oriented products using the discrete 915PM chipset and a PCI or PCI Express standalone graphics chip, but Intel said it expects plenty of PCI-based machines to appear, particularly at the low end of the consumer and business arenas. Notebook-friendly Serial ATA hard drives are a wee way off too, so it's a good job Alviso supports IDE storage. Sound support is likely to extend from AC'97 levels up to fully-featured Hi-Def rigs with multiple microphones for better voice recognition and Dolby 7.1 Surround Sound, but most machines will tend to the lower end of the scale. DDR 2 is crucial to the consumer, not necessarily because of the battery life benefits its lower power consumption (than plain DDR) brings, but because, unlike Grantsdale, Alviso will not support the older DDR specification, an Intel spokesman told The Register.

One of the latest Pentium M processors is crucial too. Intel launched the 730, 740, 750, 760 and 770, all of which support a 533MHz frontside bus, include 2MB of on-die L2 cache and incorporate the 'no execute' Execute Disable Bit system used by Windows XP Service Pack 2 to block certain viruses. You'll note there are no 533MHz FSB Pentium M variants without EDB support, hence the lack of the 'J' suffix usually used to indicate the incorporation of this technology. The five chips are clocked at 1.6, 1.73, 1.86, 2.0 and 2.13GHz, respectively. Intel also launched today the low-voltage 1.5GHz LV Pentium M 758 and the ultra-low-voltage 1.2GHz ULV Pentium M 753. Both CPUs go with the 915GMS chipset. The aforementioned 915GML chipset is pitched at the new 1.5GHz Celeron M 370 and the 1GHz ULV Celeron M 373.

Last on the list of Sonoma components are Intel's ProWireless 2915ABG and 2200BG Wi-Fi chipsets, which offer 802.11a/b/g and 802.11b/g support, respectively. Again, expect vendors to mix and match both WLAN adaptors into a range of Sonoma-based machines.
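As an aside on the Execute Disable Bit mentioned above: on Linux the feature shows up as the 'nx' flag in /proc/cpuinfo, so checking for it takes only a few lines. A minimal sketch - the sample string below is illustrative, not read from a real machine:

```python
def has_nx(cpuinfo_text: str) -> bool:
    """Return True if an 'nx' flag (Linux's name for Intel's Execute
    Disable Bit) appears in /proc/cpuinfo-style output."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # the flags line is "flags : fpu vme ..."; check its tokens
            return "nx" in line.split(":", 1)[1].split()
    return False

sample = "flags\t\t: fpu vme de pse msr pae nx sse2 ss tm"
print(has_nx(sample))   # True
```

On a real machine you would pass in the contents of /proc/cpuinfo rather than a sample string.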
Both adaptors are already widely available, shipping with Intel's ProWireless Wi-Fi connectivity software. Speaking of software, Intel re-iterated its commitment to Linux, but could not provide a timescale for adding support for Sonoma's new features to the open source OS. Specific prices were not available as we went to press, but Centrino bundles incorporating the new CPUs and chipsets, and existing Wi-Fi adaptors, range from $270 to $705 in 1000-chip batches. The two new Celeron M parts cost $134 and $161, respectively. ®

Related stories
Intel 2.13GHz Pentium M 770 arrives
Sony unveils 'Centrino 2' notebook family
Boxed 533MHz FSB Dothans seen on sale
Intel's record Q4 run ends with profit drop
Toshiba announces Sonoma-based notebook early
AMD unveils Centrino spoiler
Intel demos 65nm dual-core mobile CPU
2004 in review: processors and semiconductors
Tony Smith, 19 Jan 2005

Google's No-Google tag blesses the Balkanized web

Karl Auerbach's prediction that the internet is balkanizing into groups of people who only accept traffic from each other took another step closer to reality today. The veteran TCP/IP engineer and ICANN board member has warned of the effect for years. "The 'Net is balkanizing. There are communities of trust forming in which traffic is accepted only from known friends," Auerbach told Wired last year.

The trend can be seen at various levels: at the user level, where we see bloggers repeating each other in an echo chamber and reinforcing their views; in the middle of the network, where Verizon recently blocked off inbound email from Europe; and deep down at the packet level too, as a result of the net's background radiation. But all these may look like an innocent prelude. Google said today that its search engine will respect a new link attribute, "rel=nofollow", which means its algorithms will not give weighting to the target URL. MSN, Yahoo! and blog vendors said they'll follow suit. It's effectively declaring PageRank™ dead for weblogs, in an attempt to stem the problem.

The problem is the explosion of comment spam, whereby spammers use the open comment sections of weblogs to promote their wares in the major search engines' rankings. It marks the passing of the PageRank™ era, too. Google owed its success in part to the early effectiveness of link maps, but it has since demoted the factor after widespread criticism, and some embarrassing incidents. [More on that at The New York Times and Le Monde]. It's also a major blow to the 'Religion of the Hyperlink', faith in which you can see expressed in phrases like "the uniquely democratic nature of the web", coined by Google. Obviously this doesn't refer to spammers, who are voting early and often, and in ever greater numbers. Comment spam has increased exponentially since last November. (We'll explain why later this week). Like email spam, it's a classic tragedy of the commons.
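The mechanics of the new attribute are trivial: a blog engine simply adds it to any link a commenter submits before the page is served, and the search engine then ignores that link when weighting the target. A minimal sketch - the function and markup are illustrative, not any vendor's actual code:

```python
import re

def nofollow_links(html: str) -> str:
    """Add rel="nofollow" to every anchor tag that doesn't already carry
    a rel attribute, so search engines give the target no ranking weight."""
    def patch(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # already has a rel attribute; leave it alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", patch, html)

comment = 'Nice post! <a href="http://example.com/pills">cheap pills</a>'
print(nofollow_links(comment))
# Nice post! <a href="http://example.com/pills" rel="nofollow">cheap pills</a>
```

A production comment filter would parse the HTML properly rather than using a regex, but the principle is the same: the link still works for human readers, it just carries no vote.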
But other options are available, which have more predictable consequences. One such is verifying the user via a "Captcha", a challenge-response system which presents the user with a graphic of a distorted word or sequence of letters, which a human can interpret but a bot generally cannot.

Not everyone believes the benefits of the nofollow link attribute outweigh the potential side effects. "This will do very little to cull comment spam," notes ThreadWatch's Nick W. "Spammers will just redouble their efforts to hit blogs without the plugin... It could skew the web." "Am I the only one to think that a search engine actively trying to encourage people to hide their content from it, isn't going to flaw their main aims?" observes one member of the Search Engine Watch Forum. "If such a tag were used widespread against comments and trackbacks, then wouldn't this end up kneecapping blogs, by killing their intricate networks of interlinks?" he adds.

Other forum members renew the call for blogs themselves to be removed from the main index and placed in a separate part of Google, much as Usenet forms Google Groups. The idea was first floated by a reader more than two years ago, and is a very popular solution amongst regular Google users. That would ensure the main content of Google consisted of material edited by humans, rather than the wasteland of abandoned sites, wide open to spammers, that spammers naturally abuse. "The Spammers have Won," reckons blogger Andy Wismar. "The best and brightest at Google, MSN, Yahoo!, and blog tool creators have gathered to say 'In your face, we'll just devalue EVERY link, crippling your model and our own in the process!'"

Displaying not a little control-freakery, some thin-skinned bloggers - who notoriously shun dissonant views - were quick to welcome the move. "Now I can link to things I don't like without sending Google (er, or MSN or Yahoo) love over to them," writes an excited Robert Scoble on his Microsoft blog.
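The Captcha idea mentioned earlier boils down to a simple challenge-response exchange. A toy sketch of the server side, leaving out the actual image-distortion step (all names here are illustrative; real Captcha systems render the word as a warped graphic and store or sign the answer server-side):

```python
import hashlib
import hmac
import random
import string

SECRET = b"server-side-secret"  # hypothetical key; a real site keeps this private

def new_challenge(length: int = 6):
    """Pick a random word and derive a token the server can later use to
    verify the user's answer without storing per-user state. Rendering the
    word as a distorted graphic - the actual 'Captcha' part - is out of
    scope for this sketch."""
    word = ''.join(random.choices(string.ascii_lowercase, k=length))
    token = hmac.new(SECRET, word.encode(), hashlib.sha256).hexdigest()
    return word, token

def check_response(answer: str, token: str) -> bool:
    """A response passes if it hashes to the same token the server issued."""
    expected = hmac.new(SECRET, answer.lower().encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)

word, token = new_challenge()
assert check_response(word, token)           # a human who read the word passes
assert not check_response("1234567", token)  # a bot guessing blindly fails
```

The security rests entirely on the distortion step: the scheme only works if reading the word is easy for a person and hard for image-recognition software.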
"It's one of those moments that's purely good," said his former employer, Dave Winer yesterday. Winer recently suggested that 'podcasts' should be submitted to him advance so he could determine whether they contained jokes that undermined their 'authenticity'. A sort of blog version of the FTC. ® Related stories Harvard Man in lesbian mix-up wants satire clearly labeled Malware, spam prompts mass net turn off Emergency fixes for blog-clogged Google Blog noise achieves Google KO Google bug blocks thousands of sites Blog noise is life or death for Google
Andrew Orlowski, 19 Jan 2005

Katie.com lawyer to host cyber-bullying conference

Cyber-lawyer and "national expert on cyber-bullying" Parry Aftab is to host a conference on cyber-bullying in Westchester, New York. Particularly attentive Register readers may remember Aftab for her involvement in the saga of Katie.com, the web's weirdest domain name dispute.

The argument centred on a book called Katie.com. It told the story of Katie Tarbox, a young woman who was molested by a paedophile she met online while he was posing as a teenage boy. Trouble was, the domain name was already registered to another Katie, Katie Jones. Once the book was published, Jones' life was "completely invaded". Penguin, the publisher, backed down and renamed the book, but not before an extended campaign on Jones' part that continued for several years. (See our original coverage from 2000 here.) Penguin says the whole escapade was an oversight, and that the domain was brought to its attention after publishing.

But before the dispute was resolved, Parry Aftab got involved: she contacted Katie Jones and tried to persuade her to hand over the Katie.com site. Jones wrote at the time: "She tried to convince me that I should donate the domain name to them. Secondly, she tells me that they're planning on launching some school curriculum thing to teach kids about online safety - and they're calling it Katie.com. Are they insane? No wonder they want me to hand it over." Aftab said she was not working with Tarbox, and accused Jones of having a hidden agenda. Jones told The Register: "When I wrote about the call on my blog and the whole thing got into the news and slashdotted again, they very quickly went into 'damage control' mode. Tarbox denied to anyone who emailed her that Aftab had called me on her behalf, and Aftab became extremely quiet on the subject."

Now that Aftab is lecturing on how to deal with cyber-bullying, the irony is not lost on Jones: "A summit to tell people how to bully others?" she suggests.
®

Related stories
Penguin backs down on Katie.com
Penguin and the great katie.com hijack
Penguin sticks head in the sand
Lucy Sherriff, 19 Jan 2005

McDATA gobbles up CNT

McDATA is back on the acquisition trail, with a $235m all-stock deal to buy the number four SAN director supplier CNT. The merged company will dominate the high-end Fibre Channel storage networking business; the question is whether it can reverse the steady climb of its rivals Brocade and Cisco.

It's just under two years since CNT bought its way into the director (or chassis switch) business, spending around $190m to acquire InRange Technologies. Since then, CNT has failed to make much headway, taking just 7.4 per cent of the director market, while McDATA has been losing share to Cisco and Brocade.

The deal was announced the same day that McDATA finally released the 256-port Intrepid i10000 core director that it's been talking up for over a year now. Based on technology it acquired with Sanera, the i10k competes head-on with the only other 256-port director on the market, namely the well-regarded UMD from - yes, you guessed it - CNT. Part of the i10k announcement involved differentiating the two. For example, the i10k has iSCSI capability and 10Gig Fibre Channel, both of which the UMD currently lacks, plus the ability to create physically separate network partitions with dynamic resource reallocation. The i10k also has more buffer credits, meaning it can use far longer dark fibres - over 190km for 10Gig and 2000km for 1Gig. However, the UMD is already on the market, whereas the i10k won't ship until some time over the next three months.

When the McDATA-CNT deal was rumoured last week, one disbelieving analyst commented that a merger of two shrinking companies wouldn't mean that they'd stop shrinking. Other sceptics now highlight the amount of product overlap between the two - the acquisition will give McDATA three different director families. It also produces overlaps in long distance data replication and WAN extension.
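On the buffer-credit point above: the reason more credits buy longer dark fibre can be illustrated with a common rule of thumb (our assumption, not a McDATA specification) that a full-size Fibre Channel frame spans roughly 2km of fibre at 1Gbit/s, shrinking in proportion to line rate, so keeping a link of a given length full needs about distance × speed ÷ 2 buffer-to-buffer credits:

```python
def bb_credits_needed(distance_km: float, speed_gbps: float) -> int:
    """Rough buffer-to-buffer credit estimate for a Fibre Channel link.
    Assumes a full-size frame occupies about 2km of fibre at 1Gbit/s,
    scaling linearly with line rate - a rule of thumb, not a vendor formula."""
    return int(distance_km * speed_gbps / 2)

# The distances quoted for the i10k imply credit pools of around a thousand:
print(bb_credits_needed(2000, 1))   # 2000km at 1Gig -> 1000 credits
print(bb_credits_needed(190, 10))   # 190km at 10Gig -> 950 credits
```

Note that the two figures McDATA quotes come out at roughly the same credit count, which is what you'd expect if a single credit pool is being stretched across different line rates.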
CNT has valuable expertise here, but so does McDATA, having bought Nishan in 2003 (at the same time as Sanera) for the Fibre Channel over IP technology that it now sells as its Eclipse family. McDATA's EMEA marketing director, Dave Slater, claims that although their products might compete, the important thing is that the two companies have few shared customers. "There's an obvious overlap between the 10k and the UMD, but it wasn't a show-stopper," he says. "Most of the combined market is additive."

Integrating the two companies is clearly going to be hard work and painful, but with Sanera and Nishan already absorbed, McDATA boss John Kelley clearly figures he has the expertise to do it. The acquisition is due to complete around mid-year. ®

Related stories
IBM signs up CNT's mega-director
McData: buying its way into SAN niches
Sanera ready to beta SAN switch
Bryan Betts, 19 Jan 2005

AMD profits disappear in a Flash

After dropping a bombshell on investors last Monday that its fourth quarter of 2004 would be a disappointment, AMD formally announced the numbers today. AMD lost $30m on sales of $1.26bn in the final quarter of the year (and of AMD's fiscal year). Despite processor sales rising 26 per cent over the corresponding period in the previous year, AMD's CEO, Hector Ruiz, said the Flash memory business, on which AMD relies for almost half of its earnings, was "freaking dismal". The CPU side saw sales of $730m and profits of $89m.

The entire year saw losses stabilize and sales grow after a 2003 characterized by heavy losses. Sales in FY2004 were up from FY2003's $3.52bn to top $5bn, with $91.16m net income. AMD shares crashed from $20 to $14.6 on the revelation last week of the poor performance of its flash memory division. AMD blamed intense competition with Intel, which has cut the prices of NOR memory aggressively in the past two quarters. ®

Related stories
Investors gut AMD on memory slip
Euro AMD Opteron server demand slows
Intel restructures around platforms
Intel's record Q4 run ends with profit drop
AMD unveils portable video player chip
Andrew Orlowski, 19 Jan 2005

Rambus income slides despite revenue gains

Rambus's income slid during Q4 FY2004, despite a double-figure jump in year-on-year revenue. Net income for the memory technology developer for the three months to 31 December 2004 was $6.5m (six cents a share), down 24.4 per cent on the $8.6m (eight cents a share) recorded this time last year and 37.5 per cent lower than the $10.4m (ten cents a share) recorded in Q3 FY2004.

Revenues for the quarter topped $38.6m, 19 per cent higher than Q4 FY2003, but down fractionally on Q3 FY2004 ($38.8m). Contract revenues contributed $6m to the total during Q4, up 31 per cent year on year but down 27 per cent sequentially. The "increase in contract revenues over the fourth quarter last year primarily reflects revenues from contracts signed in 2003 for XDR memory and Redwood interface technologies," Rambus said. "The decrease in contract revenue from the previous quarter primarily reflects a decrease in revenue recognized on one serial link contract."

The balance of Q4 revenues came from royalties, up 17 per cent year on year and seven per cent higher than the previous quarter's figure. Both gains were mostly due to "an increase in SDRAM and DDR royalties". Costs rose both year on year and sequentially, with litigation accounting for a big chunk of the increase. Rambus ended the quarter with $236.4m in cash and investments. ®

Related stories
Rambus board plays musical chairs
Infineon accuses Rambus of 'litigation misconduct'
Rambus sales, earnings rise on royalties
Rambus stock falls 13% on appeal failure
Rambus offers DDR controller cores
Rambus sues for $1bn
FTC outlines appeal against Rambus ruling
Judge throws out FTC case against Rambus
Tony Smith, 19 Jan 2005

Ad confidence spurs Yahoo!

Yahoo! posted its best ever numbers again today. Quarterly revenue topped a billion dollars for the first time and, with traffic costs excluded, Yahoo! grossed $785m in income. Net income for the quarter was $373m, or $187m with the gain from an investment sale removed from the numbers - more than double the $75m recorded in the same period last year. Over the year, Yahoo! made $840m profit on sales of $3.575bn. The company does not break down income into banner ads on its own site and the income from classified ads via Overture.

Overseas ad revenue saw the fastest growth, with income outside the US up from $118.4m a year ago to $302.6m in Q4 FY2004, despite the defection of AOL Europe to arch ad rival Google. US revenues grew from $545m to $775m. The books value the company at $9bn, and it has $4bn in assets, of which $1.4bn is cash.

Both Google and Yahoo! have seen explosive growth from their ad brokerage businesses, although Google recently said that it expects the growth of classified ad revenue to slow. Google's CFO, George Reyes, told a conference recently that the sector remained under a cloud unless solutions are found to combat click fraud. "I think something has to be done about this really, really quickly, because I think, potentially, it threatens our business model," he said. ®

Related stories
MSN signs up Overture for another year
Google! Licenses! Yahoo's! Secret! Sauce!
Yahoo! shows paid search pays
Google shares fall on less-than-stellar growth fears
Couple names baby Yahoo
Andrew Orlowski, 19 Jan 2005

Flagship NHS project in danger

The UK Department of Health (DoH) will miss the December 2005 deadline set for the roll-out of its Patient Choice electronic referrals system if it does not address low levels of GP support, the National Audit Office (NAO) has warned. Other threats to the project include problems with the IT system underpinning it, a rapidly approaching deadline, and slipping targets. As things stand, around 30 per cent of the country will not have access to e-bookings by the deadline, NAO officials told the press this week.

GPs refer 9.4m British residents to specialists every year. The Choose and Book system is supposed to give patients more control over when and where they are treated. The roll-out of the e-booking system is well behind schedule, a fact clearly illustrated by the DoH's performance against its own targets. The department said it planned to have 22 hospitals signed up to the system by now. In fact, just seven institutions are fully compliant. The DoH expected over 200,000 e-bookings would have been made by the end of 2004. The actual figure was 63 referrals. And no, we didn't miss any zeros off the end of that number. This is because of an intermittent fault with user authentication - NAO officials said that this problem is now all but solved - a reluctance among GPs to engage with the system, and the fact that the system is not as widely available as anticipated at this stage.

News that yet another government IT project is facing difficulties will no doubt prompt a few raised eyebrows among Register readers. In this case, there is no suggestion that the supplier has not met the conditions of the contract. Indeed, quite the opposite - the NAO stressed that Atos Origin has fulfilled its contract, and has been paid. The central system itself works perfectly well, officials said, but at the moment, GPs and hospitals can't connect to it. Instead, manual alternatives to e-booking are being implemented as interim solutions.
Confused contract

How did this happen? Well, the contract for the central booking system did not require that it be integrated with the GPs and hospitals it is supposed to connect. This is dependent on GPs upgrading their systems at a local level, as part of the National Programme for IT (NPfIT). For this reason, the NAO said, it does not expect GP compliance to rise above 90 per cent.

"We are not criticising the Department of Health," said Chris Shapcott, NAO director of health value-for-money studies. "We are highlighting it as an issue they need to address, very vigorously, if they are to meet their target." When pressed, however, Shapcott acknowledged that responsibility for any problems with organisation and co-ordination did stem from the Department of Health and the NPfIT.

The lack of support from GPs is attributed to the DoH's deliberate strategy of withholding information during the planning stages, so that GPs could be shown a working final product rather than a pipe-dream. However, even among GPs who are familiar with the project, two-thirds report negative feelings about its implementation. The NAO said that steps have been taken to address this problem.

It's not all doom and gloom, however. The NAO stressed that this is a progress report, not a final conclusion. There is still a year to go, and the DoH should still meet its deadline if it follows the main recommendations of the report:

- Address the lack of GP support
- Ensure that interim solutions don't distract from deploying the full e-booking system
- Accelerate the roll-out of either the full system or interim solutions

The NAO's full report, Patient Choice at the Point of Patient Referral, is available here. ®

Related stories
NPfIT must win medical hearts and minds
BMA tells doctors: avoid NPfIT's flagship project
BMA calls warning on NHS IT
Lucy Sherriff, 19 Jan 2005

Rambus poo-poos Hynix Euro patent victory claim

Rambus has accused Hynix of making "misleading statements" about the two companies' legal spat, calling the South Korean DRAM maker's victory claims "outrageous and irresponsible". What's so annoyed Rambus is Hynix's comment this week that a recent European Patent Office (EPO) decision related to one of Rambus' European patents "effectively clears the company of infringement charges". Hynix reported that the EPO has accepted a motion filed by Infineon and Micron that Rambus had unfairly widened the scope of its European patent number 1,004,956. Consequently, Rambus' allegations in the US hold no water, Hynix said.

Not so, said Rambus, re-iterating its plan to see Hynix in court in March. The move to trial follows last week's ruling by San Jose District Court Judge Ronald Whyte that the patent infringement allegations made against Hynix by Rambus can be tested in court. Hynix had asked Judge Whyte to dismiss the suit. "The EPO decision is a separate and limited matter, and we respectfully request that Hynix correct the statements attributed to it," Rambus counsel and senior VP John Danforth said in a statement.

Hynix sued Rambus in 2000, aiming to pre-empt legal action from Rambus by asking the court to rule that its products do not infringe Rambus patents. Rambus sued anyway, alleging that Hynix had indeed violated its patents. Rambus is also suing Infineon, Siemens, and Micron. ®

Related stories
Rambus income slides despite revenue gains
Rambus board plays musical chairs
Infineon accuses Rambus of 'litigation misconduct'
Rambus sales, earnings rise on royalties
Rambus stock falls 13% on appeal failure
Rambus offers DDR controller cores
Rambus sues for $1bn
FTC outlines appeal against Rambus ruling
Judge throws out FTC case against Rambus
Tony Smith, 19 Jan 2005

Lastminute.com takes ad rap

Lastminute.com has been given six of the best for failing to respond to complaints about two saucy ads plugging its online holiday deals. The ads featured close-up shots of the bottom and breasts of a bikini-clad woman on a beach. Both ads were accompanied by suggestive headlines. Eleven people contacted the UK's Advertising Standards Authority (ASA), complaining that the posters were offensive, demeaning to women, unsuitable to be seen by children and "irresponsible because they could distract motorists". Some of the complainants also said the ads were "particularly offensive" because they were placed in areas with a large Muslim population, while another was sited near a mosque.

The ASA rejected these complaints, insisting that the ads were not demeaning to women and "unlikely to cause serious or widespread offence". And while it recognised that the images might cause offence to some Muslims, the ASA ruled that this was not enough for it to take action. However, Lastminute.com's downfall, disclosed this week by the ASA, was its failure to respond to the complaints. Said the ASA: "The Authority was concerned by the advertisers' lack of response and apparent disregard for the Code." As a result of that failure, both complaints were upheld. ®

Related stories
Madasafish 'Churchill' ad banned
BT savaged for 'poorly run' free flights promo
OFT rattles sabre over 'free flights' web offer
V Two One told to pull 'UK's cheapest ISP' claim
Tim Richardson, 19 Jan 2005

Job cuts hit Freescale earnings

Freescale, Motorola's one-time semiconductor division, yesterday said it just about made a profit during its fourth quarter of fiscal 2004, as sequential revenue growth collapsed and redundancy costs slashed the bottom line. The chip company also announced the loss of its manufacturing chief and of the executive who steered Freescale to independence from its parent.

Q4 FY2004 yielded revenues of $1.43bn - exactly the same as Q3 FY2004, but four per cent up on Q4 FY2003's $1.37bn. Net income for the quarter was a paltry $5m (one cent a share), well down on the $57m Freescale earned last quarter and the $98m it made in Q4 FY2003. Freescale blamed the fall on restructuring costs of $79m and a further $5m arising from its split from Motorola. Some 1000 jobs were axed during the quarter, costing more than anticipated - it previously forecast the move would require a $65m charge. It will spend another $10m on restructuring in Q1 FY2005, it warned.

Casualties of that restructure include the company's COO and president positions, both held by Scott Anderson. He says the time is right to "move on [now that] Freescale is firmly established as a fully independent company". Freescale is not seeking a replacement. Anderson spent 27 years working for Motorola and Freescale, and was instrumental in steering the latter from its role as a Motorola division to separate entity. Initially president and CEO, he took the COO role in May 2004 after appointing IBM Microelectronics' Michael Mayer to the CEO post. Gone too is Chris Belden, Freescale's manufacturing chief, to be replaced by Alex Pepe, currently head of the 32-bit embedded controller division of the company's transportation and standard products group.

For the year as a whole, Freescale achieved net income of $211m (62 cents a share), compared to 2003's $366m loss. Revenue was $5.72bn, up 17.7 per cent from 2003's $4.86bn.
For Q1 FY2005, Freescale expects revenues of $1.38-1.47bn, with pre-tax separation expenses of around $10m and new restructuring charges of approximately $10m. ®

Related stories
Is IBM PC sell off preparation for a Power chip attack?
Intel to retain top chip maker title on 04...
Dual-core IBM PowerPC 'to ship in single-core form'
Freescale 1000-worker cull to cost $65m
Motorola taps Big Blue talent for chip biz
2004 in review: processors and semiconductors
Tony Smith, 19 Jan 2005

Mobile DRM levy hits operators where it hurts

News that the MPEG Licensing Authority had reached a royalty policy for the patents in the Open Mobile Alliance DRM 1.0 specification will have come as a shock to many operators and handset makers that had been led to believe that OMA DRM was to be royalty free. At the level they have been set, the royalties are likely to run into billions of dollars over the next few years. The innocuous-looking statement, issued last week, finally put a tally on the cost of adopting the technology, with operators having to pay a royalty of $1 for every device issued which uses the OMA spec, and a further one per cent of any transaction in which an end user pays for delivery of a digital asset using OMA DRM.

According to ContentGuard CEO Mike Miron, speaking to Faultline last week: “The OMA didn’t choose to use our technology for implementing its Digital Rights Language for OMA 1.0, and instead chose to use a system developed by IPR Systems in Australia. We told them that this wouldn’t mean that they could escape our patent portfolio and we’ve been telling them that all along.

“It shouldn’t be a surprise that suddenly the MPEG LA has issued a joint patent covering OMA DRM 1.0, but OMA has been strongly suggesting to its members that its standard would be royalty free.” Miron added: “We’ve heard two opinions, with some people welcoming it and others saying that the royalty is too high, but that’s just because they thought it was free, and any charge is too high once you think something is free.”

MPEG LA announced that an initial group of essential patent holders including ContentGuard, Intertrust, Matsushita, Philips and Sony will license the portfolio of patents collectively, with MPEG LA collecting the royalties. The five companies reckon they have all the patents necessary for implementing OMA DRM 1.0 and most of those needed for DRM 2.0, although that hasn’t been finalized yet and could potentially mean involving technology from one or two more companies.
But MPEG LA is offering the royalty payments for both DRM 1.0 and 2.0 for now, and is prepared to backdate coverage of the technology to January 2004, with payments only needing to be made from January 2005 onwards. Intertrust is known to be a world leader and patent holder in trust models and architectures for moving encryption keys around within a digital rights management environment, while ContentGuard produced the work that the ISO and MPEG21 have standardized on for expressing rights in an XML-like meta language. The MPEG LA initially called together patent holders in October 2003, and a first meeting was held in secret. Later, in October 2004, MPEG LA said that its DRM Reference Model version 3.0 was released to specifically cover OMA 2.0, but no terms had yet been decided upon at that time.

Although the patent royalties could hit both handset makers and operators, the payment of the $1 device charge is to be made by the company selling its handsets directly to the public - in many cases the operator rather than the handset supplier. With major cellcos like Vodafone using OMA to provide protection for content on its Live! system, this means that one per cent of Live! content revenues could be up for grabs just as every competitor that Vodafone has is trying to launch a similar service. Over the next few years some 2 billion phones are likely to end up with DRM on them, making a basic handset revenue of $2bn among these patent holders, without the charges for service revenues, which we would expect to be higher.

Vodafone itself uses CoreMedia's OMA DRM-compliant software, based on Java, ODRL and XML, to securely distribute multiple content formats such as music tracks and videos. OMA DRM 1.0 has also been implemented by virtually all phone makers and operators, and even by conditional access suppliers such as News Corp’s NDS.
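The royalty arithmetic above is simple enough to set out explicitly. This sketch merely restates the MPEG LA terms as reported ($1 per device, one per cent of paid content transactions); the function name is ours:

```python
def oma_drm_royalty(devices_shipped: int, paid_content_revenue: float) -> float:
    """Total royalty bill under the reported MPEG LA terms: $1 for every
    OMA DRM device shipped, plus 1 per cent of revenue from paid
    deliveries of DRM-protected content."""
    return devices_shipped * 1.00 + paid_content_revenue * 0.01

# The article's back-of-envelope figure: 2bn handsets, service revenue
# ignored, hands the patent holders $2bn from the device levy alone.
print(oma_drm_royalty(2_000_000_000, 0))   # 2000000000.0
```

The one-per-cent transaction cut is the term operators will feel most, since it scales with content revenue rather than being a one-off per handset.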
Now all of their customers will find themselves hit with a royalty bill where they thought there would be none. And there’s probably no getting round it. Even if they stick to older proprietary technology instead of adopting OMA, there’s no certainty that these won’t also require royalty payments. ContentGuard is now owned by a triumvirate of Microsoft, Time Warner and Thomson, while Intertrust is owned by two of the other patent holders, Sony and Philips. MPEG LA also licenses patents for MPEG-2, Firewire, Digital Video Broadcasting Terrestrial, MPEG-4 Visual (Part 2), MPEG-4 Systems and AVC/H.264 (also known as MPEG-4 Part 10) standards and is working on a similar patent pool for Microsoft’s VC9 (also called SMPTE VC-1) video codec, the ATSC standard, and the DVB Handheld standard. Copyright © 2004, Faultline Faultline is published by Rethink Research, a London-based publishing and consulting firm. This weekly newsletter is an assessment of the impact of the week's events in the world of digital media. Faultline is where media meets technology. Subscription details here. Related stories Phone biz agrees on $1 DRM levy Guilty until proven innocent - DRM the mobile phone way Java and DRM key to mobile ambitions
Faultline, 19 Jan 2005

ContentGuard talks DRM futures

ContentGuard has followed the traditional development path of most intellectual property businesses. First it thought it was a product company with dreams of building a DRM monopoly on the back of technology leadership; then it found the going tough and the pickings too small; and finally it dropped back to pushing intellectual property with a handful of key staff. Today it is a 32-man operation across two locations, selling IP and software tools, as well as standing behind the standard for a rights expression language called XrML, backed by both MPEG and ISO. The company is now owned by Microsoft, Time Warner and Thomson, the French CE manufacturer. Michael Miron, the CEO, admits to not being one of the technology developers, but has been associated with the business since 1999, when Xerox asked him to build it an internet business. Back then the DRM group was 150 people strong; he pulled this and some other segments of Xerox that looked like an internet play together, and then told them that he didn’t think it was going to work as an internet business. About that time Microsoft made enquiries about a license, and Miron suggested that instead it take a shareholding; by spring 2000 ContentGuard was born as a separate entity. If you changed the company name to Intertrust, put Sony and Philips in place of Microsoft and Time Warner and changed a few dates, you would have a virtually identical story. So it is no surprise that these two IP companies' technology is at the heart of this week’s move by the MPEG LA to offer a patent pool for DRM offerings such as OMA DRM. Blunt talk When Miron talked to Faultline this week, it became clear pretty soon that he is a blunt speaker about DRM and that he sounds not a bit like a Microsoft puppet, despite the software company’s influence: “Content owners don’t want to be constrained and neither do end users, but everyone in the middle wants DRM to be proprietary. 
And that’s one of the reasons that we don’t see a single trust model used for DRM," he began. “Software vendors especially want to keep the trust chain and rendering proprietary. “Many people confuse DRM with security and privacy, but when you end up with a lousy customer experience, end users vote with their wallets and their feet, and they leave. “What DRM is really about is a combination of technology and a fair offer. It’s not there so that you can shut everything down and stop people from copying content; DRM is there so that you can run a business, containing the number of breaches with a combination of technology and a friendly experience. “DRM must always be distinguished from perimeter security, which is about letting no one get into a system. DRM is far more complex. “The basic construct behind DRM is that there is a source for some data, and the data is provided along with a license, each delivered separately. The asset can come from one source and the license from another, and they may go to different destinations, but they are bound together. The license contains authorizing information for the asset, and that needs to be expressed using a Digital Rights Language, which describes what the destination can do with the asset and under what conditions. “The reason that we say that there will never be a completely standardized trust model is because a PDF that contains a newspaper article, for instance, needs very different protection from a PDF that contains US military secrets. “We believe that web services will define a lot of trust models, which is why we have tried to work alongside web services standards and get our Rights Expression Language adopted there. It is very hard to get vendor systems to trust one another, but a web service can manage interoperability for you, and it can be made to do some quite complex workflows, which you can never completely work out in advance. 
As web services they can be changed.” Miron adds that ContentGuard's XrML has been proposed as part of the security token in the Web Services Security standard. He then offered the concrete example of a Movielink-hosted film being downloaded to a home server: “Here the home hub is not the end user, but a distributor. Sometimes the data source, for instance a studio or Movielink, will never know who the eventual end user is. "The system has to cater for a new license being issued by the hub to a PC or a DVR or even a mobile phone or some future device we haven’t thought of." Miron’s point is that DRM should produce a system where there is never any need for each of these devices to talk directly all the way back to Movielink. Miron and ContentGuard propose a model of DRM that relies on five things: formats, digital identifiers, meta data, rights expression and trust model. Under MPEG-21 this is a little more complicated, with a transaction language and automatic content discovery added. But in the end Miron believes that we can agree on the standards for format, identifiers and meta data, or freely translate between different choices, leaving the Rights Expression Language and the trust model as the only two things that need to interoperate for DRM to become interoperable. And of course if everyone uses the ContentGuard REL, that leaves only the trust model needing gateways. Given that Miron doesn’t think there can be a universal trust model (although there might be a common one for a single function, like delivering movies), it is easy to see why he is so fanatical about ensuring there is only one Rights Expression Language, and why he is so against the one chosen from IPR Systems in Australia for the OMA DRM reference model. “The more of it you standardize, the easier it gets to do,” he says. 
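Miron's asset-plus-license construct can be sketched in a few lines of toy code. This is our illustration of the idea, not ContentGuard's design; the class and field names are invented:

```python
# A toy sketch (ours, not ContentGuard's) of the DRM construct Miron
# describes: an asset and its license travel separately and are bound
# together by a shared identifier; the license carries the rights.
from dataclasses import dataclass

@dataclass
class Asset:
    content_id: str
    payload: bytes           # the (typically encrypted) digital asset

@dataclass
class License:
    content_id: str          # binds the license to the asset
    rights: set              # e.g. {"render"} - expressed via an REL in practice
    conditions: dict         # e.g. {"expires": "2005-12-31"}

def may(action, asset, lic):
    """The destination checks the separately delivered license."""
    return lic.content_id == asset.content_id and action in lic.rights

movie = Asset("urn:asset:42", b"...")
lic = License("urn:asset:42", {"render"}, {"expires": "2005-12-31"})
print(may("render", movie, lic))   # True
print(may("copy", movie, lic))     # False
```

The point of the separation is visible in `may()`: the destination never needs to talk back to the asset's source, only to hold a license bound to the same identifier - which is exactly the home-hub-as-distributor arrangement Miron describes.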
Trust model In the end he wants to bring DRM interoperability down to one variation - the trust model - with each web service that runs a particular form of delivery selecting its own appropriate trust model. For media, models such as Microsoft WM DRM, Intertrust, Apple’s Fairplay and Sony’s MagicGate, and perhaps the DRM interoperability effort of Macrovision, are the only ones that matter. Get their trust models to interoperate and you have the potential for global DRM. “We think that Apple will have to re-architect Fairplay away from being operating system based before it can take input from CDs and from other DRMs,” said Miron. And he adds that his own XrML can do far more than simply express rights, insisting that it can be used as the main common architecture for DRM interoperability. “We are working with the Open eBook standard, e-learning standards and even taking a look at DVB (Digital Video Broadcasting). We have suggested the notion that they look at this work and consider it for broadcasting. At the moment DVB has just got as far as broadcast flags, but they are in their information gathering phase. They could do so much more if they implemented a full REL. “XrML can turn a DVR into a networking hub, and we are talking to the TV Anytime Forum about that. Here we have met two major philosophies. On the one hand you have the US conditional access players that just want to enhance what they’ve got already, and on the other you have the more enlightened Asian CE companies that want to do something more advanced. “We are coming out of technology that began in 2000, that was basically a set of proprietary DRMs, where one vendor supplied an end-to-end solution and their aim was entirely defensive, to combat piracy. “And we are headed in 2005 for interoperable standards built around componentized DRM, which facilitate new business models that are more offensive and less defensive. 
“At business school I studied the Crisis Change Model, where an industry faces a new technology, moving from ignorance to denial to active combat and then sinks into defeat, resignation and disillusionment. Eventually new ideas are embraced and then leveraged. That’s where we are with DVDs, leveraging them. “I would say we are at the combat part of the cycle right now on internet delivery, moving on towards defeat. Eventually someone shows the way like Steve Jobs and Apple did for online music distribution," Miron concludes. Copyright © 2004, Faultline Related stories Mobile DRM levy hits operators where it hurts Phone biz agrees on $1 DRM levy Europe pauses Microsoft DRM probe EC objects to MS - Time Warner ContentGuard takeover
Faultline, 19 Jan 2005

Microland goes bust

Manchester-based PC builder Microland Technology has gone bust. Administrator Unity Corporate Recovery has scheduled a meeting of creditors for 31 January at its Bolton offices in order to put the Internet-based retailer into liquidation. Unity was appointed on 6 January after it became apparent that Microland was insolvent. David Edgehill, an administrator at Unity, estimates that Microland went under with approximately £350,000 in debts. A report on Microland's financial position will be presented at the January meeting. Microland employed ten people in addition to three directors, all of whom are almost certain to lose their jobs. ® Related stories Paradise Computers goes titsup Highberry fails to force Colt into administration £100 PC business wound up by High Court
John Leyden, 19 Jan 2005

Gloves off as ISPA gong nominees named

The UK Internet Services Providers' Association (ISPA) has received a record number of entries for its annual awards this year, with 180 entries across 17 categories, the trade body said today. Fighting it out for Best Consumer ISP are freenetname, Pipex, UK Online, Virgin.net and Wanadoo. Those keeping their fingers and toes crossed for the Best Business ISP gong are Datanet, Mistral, NDO, Pipex and PlusNet. The tussle for Best Portal this year is between Blueyonder from Telewest Broadband, MSN, Yahoo! UK & Ireland, Tiscali and Wanadoo. There are a stack of other award categories, which can all be found here. ISPA has already announced the short-list for the most interesting awards of its annual industry get-together. BT, Carol Vorderman and the European Union (EU) are all in the running for the dubious honour of being named Internet Villain of the Year. Former ecommerce minister Stephen Timms and sprawling communications regulator Ofcom are in the running for the title of Internet Hero. The winners of all awards will be announced at a ceremony to be held in London on 24 February. ® Related stories BT wrestles with Carol Vorderman for 'net villain' award ISPA bigwig resigns over support for UKIF Bulldog airbrushes 'Best Broadband ISP' logo
Tim Richardson, 19 Jan 2005

Intel 'Smithfield' to run 130W hot

Intel's upcoming dual-core 'Smithfield' desktop processor will dissipate up to 130W of power - 13 per cent more than today's Pentium 4 chips - it has emerged. The revelation comes from Tom's Hardware Guide (THG), which says it has seen internal Intel documentation covering Smithfield's thermal design power (TDP) characteristics. According to the document, Smithfield will consume up to 130W and draw 125A of current, up from the 115W and 119A specifications of the 5xx and upcoming 6xx series of single-core 90nm 'Prescott' P4s. Smithfield will require more power than Intel's hottest-running CPU so far, the 1.6GHz Itanium 2, which has a TDP of 122W. The 3.46GHz P4 Extreme Edition consumes 116.7W. THG assumes that Smithfield will be fabbed using a 65nm process and thus attributes the increased power requirement to leakage. However, it's almost certain that Smithfield is a 90nm part - Intel's first 65nm chip is expected to be the dual-core Pentium M, 'Yonah', and that's not due to appear in volume until Q1 2006. Indeed, Smithfield may not even be a true 'two cores on one die' processor, but may instead simply feature two separate Prescott cores wired together in a single package. Certainly, comments from Intel last year suggest that this may be the case. In a discussion with The Register at the time, Steve Smith, VP of Intel's Desktop Platforms Group, would not confirm the exact nature of Smithfield's dual-core status. ® Related stories Intel restructures around platforms Intel delays death of 100MHz Pentium Intel Smithfield chipsets said to support SATA 2 Intel invests in three digital home firms Intel 'to cut' Celeron D, Grantsdale prices Intel confirms dual-core desktop 'Smithfield' Intel demos 65nm dual-core mobile CPU
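Dividing the quoted TDP by the quoted maximum current gives a rough implied core voltage. This is a simplification on our part - TDP and maximum current are not specified at the same operating point - so treat the result as indicative only:

```python
# Rough implied core voltage from the TDP and current figures quoted
# above. A simplification (ours): Intel specifies TDP and maximum
# current at different operating points, so this is indicative only.
chips = {
    "Smithfield (per THG)": (130.0, 125.0),   # watts, amps
    "Prescott 5xx/6xx":     (115.0, 119.0),
}
for name, (tdp_w, icc_a) in chips.items():
    print(f"{name}: ~{tdp_w / icc_a:.2f} V implied core voltage")
```

Both ratios land close to the roughly 1V core voltages typical of 90nm parts, which is consistent with THG's figures describing a 90nm rather than 65nm chip.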
Tony Smith, 19 Jan 2005

AAS: astronauts not robots should fix Hubble

The American Astronomical Society (AAS) has added its voice to calls for a manned mission to carry out essential maintenance on the Hubble Space Telescope. The AAS endorsed the National Research Council's recommendation that the telescope be serviced by astronauts using the Space Shuttle rather than NASA's suggested robotic mission. The three main recommendations of the NRC's report are that NASA should commit to the servicing mission; that the mission should be carried out on the Shuttle; and that a robotic mission should only be considered to "de-orbit" the telescope at the end of its life. This, the NRC says, will allow "time for the appropriate development of the necessary robotic technology". Dr Robert Kirshner of Harvard University, president of the AAS, said that Hubble was "one of the best things NASA has ever done". He forecast that the NRC's recommendation - that a manned service mission is the least risky option - would be accepted by NASA and Congress. Service Mission Four - or SM4, as it's known - will include repairs to Hubble's gyroscopes and batteries. These would extend the life of the 'scope to 2013. SM4 will also see the installation of a new wide-field camera to make observations in the infrared, ultraviolet and visible wavelengths, and a spectrograph tuned to faint UV light, in order to seek out trace gases from the early universe. However, since the Columbia Shuttle disaster, NASA's chief administrator, Sean O'Keefe, has been adamant that the 'flying brickyard' would only be used for flights to the International Space Station. He proposes sending a robotic mission to repair the telescope, but there are concerns that the technology will not be ready in time to save Hubble. ® Related stories Deep Impact en route to Tempel 1 Extra-solar planet snapped by galactic paparazzi Scientists watch matter fall into black hole Work begins on Hubble's replacement
Lucy Sherriff, 19 Jan 2005

World PC sales still growing

Global PC shipments grew 14 per cent in the last quarter of 2004, according to research from market watcher IDC. The small and medium business market and Christmas sales were credited with driving the expansion. Total shipments grew to 51.5m units in the quarter, an increase of 13.7 per cent - better than IDC's prediction of 13 per cent growth. For the whole year, the industry shipped 177.5m machines - 14.7 per cent up on 2003. IDC predicts the market will grow by about ten per cent in 2005 before growth shrinks to single-figure increases from 2006. Loren Loverde, director of IDC's Worldwide Quarterly PC Tracker, said: "Business demand and growth in key regions like EMEA continue to drive the market. Although we saw a seasonal rise in consumer shipments, particularly in EMEA and the rest of world [outside the US], business remains a larger market and has been growing faster since mid-2004. Ongoing PC replacements and new investment should continue to drive commercial growth at least through the end of 2005." Dell shipped just under 8.8m machines in the quarter, giving it a 17 per cent share of the market, one percentage point ahead of HP. Apple and Gateway both showed strong growth. Rival analyst house Gartner also saw a growing PC market, but credited increased laptop sales for the figures. Gartner said sales grew 11.8 per cent in 2004 compared with 2003. It estimates the industry shipped 189m machines worldwide, up from 169m in 2003. It also has Dell in the top vendor spot with a market share of 16.4 per cent - two points ahead of HP. More details available from IDC here. More from Gartner here. ® Related stories Gartner buys research rival META for $162m Analyst lifts 2004 chip capex forecast... Replacement kit dominates world PC sales
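The two research houses' figures are easy to reconcile with a little arithmetic. A sketch, using only the numbers quoted above:

```python
# Cross-checking the growth figures quoted by IDC and Gartner above.
# All inputs come from the article; the baselines are derived.
idc_2004, idc_growth = 177.5, 0.147        # million units, year-on-year growth
gartner_2004, gartner_2003 = 189.0, 169.0  # million units

# IDC's 2004 total and growth rate imply its 2003 shipment estimate:
idc_2003_implied = idc_2004 / (1 + idc_growth)
print(f"Implied IDC 2003 shipments: {idc_2003_implied:.1f}m")

# Gartner's two annual totals reproduce its quoted growth rate:
gartner_growth = gartner_2004 / gartner_2003 - 1
print(f"Gartner 2004 growth: {gartner_growth:.1%}")   # ~11.8%, as quoted
```

The gap between the two houses' 2003 baselines (roughly 155m against 169m) reflects differing definitions of what counts as a PC shipment, not an error in either set of figures.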
John Oates, 19 Jan 2005

AOpen i855GMEm-LFS desktop Pentium M mobo

Review With more computers making their way into the living room, consumers are demanding quiet systems that don't spoil the relaxed atmosphere. Trying to address this point, AOpen has produced a motherboard that should become very popular with anyone looking to build a low-noise PC, writes Lars-Goran Nilsson.
Trusted Reviews, 19 Jan 2005

Tapping the telecoms barometer

Reg Reader Studies The year end is traditionally ‘out with the old, in with the new’, but from the results of The Register/Quocirca year-end barometer survey, it looks like the ins and outs from 2004 into 2005 for the IT and telecoms industries will again include plenty of ‘shaking all about’. The Register and Quocirca ran this survey to examine a number of key IT industry pressure points. We had almost 6,000 respondents, split 50/50 between vendors and end users of IT, from organisations of all sizes. We call it a barometer survey because we will continue to gauge how the market is changing by asking these questions again at regular intervals. 2004 was a recovery year from an IT and telecommunications perspective, as many suppliers saw the decline that followed the dot com boom give way to a levelling out and even slight improvement in some sectors. The ‘shaking all about’ of ‘convergence’ - or perhaps more accurately ‘collision’ - of networking, both fixed and mobile, with IT continues apace, and there appears to be no let-up in the physics of speeds, feeds and the sheer volume of digital data. With data protection, governance and security concerns growing, keeping information in the office is proving almost as challenging as making it mobile. Once out of the office, all bets are off. Mobility is rapidly moving from a new concept to a way of life, as many realise that out-of-office access to in-house applications is just an extension of business as usual rather than a radical shift. Few regard remote access as a bad thing. In 2004 mobile email moved from needing to be explained to something most people have seen or used. GPRS became a standard feature of all new corporate mobile phones in 2003; then in 2004 cheap, robust and usable smartphones became available for the first time, and corporate decision makers became comfortable with mobile email as a concept. 
Although this makes business justification easier in some respects, costs still kept 2004 deployments down somewhat, but mobile email usage alone was noted by almost a quarter of respondents, with a slightly higher percentage in smaller companies. These users are not only gadget gatherers, but also black-collar marketing roadies, industry groupies such as analysts and journalists and, most obvious of all, the BlackBerry executives. In 2005, mobile email will become less of a badge of importance, and more the opposite – a tool for the mainstream. Tri- and quad-band cellular devices proliferated in 2004, but 2005 will bring more multi-radio devices and 3G, Wi-Fi and Bluetooth with everything. Great news for frequency hoppers and spread spectrum betting, but the last thing users want is to have to make decisions about connection methods. Over half of respondents are currently using Wi-Fi somewhere, meaning 2004 was the year for putting Wi-Fi networks and supporting back office systems in place. Now public Wi-Fi suppliers must increase the ease of access to services, and 2005 will be the year when the roaming agreements and value chains are worked out so that businesses can procure services fitting their requirements for coverage, tariff and support. The difficult question is: who will deliver mobile solutions to the enterprise? After many false starts and a lot of poaching of good people from the IT sector, 2004 was the year when mobile operators finally started to understand what their corporate customers want from them. Will 2005 be the year they actually start delivering? Quocirca expects the wireless and fixed aggregation and roaming issue to rise significantly in importance, as companies need to simplify all forms of remote or mobile connectivity. The hotspot-to-mobile-operator roaming agreements of 2004 are good indicators of the mature thinking beginning to emerge. Finally, voice and data convergence. 
BT is pouring £10bn into replacing all non-IP functions with pure-IP ones, and every service provider is making a play for running voice services over its data networks. When asked about voice over IP, our survey resoundingly showed this isn’t just wishful thinking on the part of suppliers - it’s really gathering speed. Very few seem keen to bury their heads in the sand and say convergence will not happen. Grasping this opportunity for greater efficiency and effectiveness in the short term makes sense - and Quocirca believes that companies that embrace it will soon reap the benefits. Mass mobile mail, the aggregation of access methods and the convergence of voice with data - it might all seem messy and complex, but the good news for end users is that after a few years of focus on individual technologies, vendors are finally getting the message that users want solutions that add value or efficiency to their business. Your industry needs you Sign up here to become a permanent member of our Reg Reader Studies Survey Panel. You'll get the occasional email alerting you to a new survey and may even get the chance to win Reg goodies. Lovely. ®
Quocirca, 19 Jan 2005

An MP3 player you can talk to...

A US firm specialising in metadata for music files is working with a voice recognition company to enable voice-controlled music devices for use where hand control is impractical. Gracenote is working with voice recognition specialist ScanSoft and early products will be aimed at Japan. Ross Blanchard, a Gracenote veep, said voice recognition would make devices easier to use: "For example, these applications will radically change the car entertainment experience, allowing drivers to enjoy their entire music collections without ever taking their eyes off the road." Alan Schwartz, vice president of SpeechWorks, a division of ScanSoft, said speech is a "natural fit for today's consumer devices, particularly in mobile environments, and the increasing portability of large libraries of music and video files make speech a necessary interface for safety and convenience for entertainment devices." Apart from allowing navigation around songs the technology will also allow you to ask for music by genre, ask for more information about a song or even hear similar tunes. The firms hope to have products ready to market by the end of 2005. Gracenote works with new media firms such as RealNetworks and iTunes as well as consumer electronics brands like Clarion, Kenwood and Pioneer. In June last year Apple did a deal with BMW linking its iPod music player to the car's existing stereo controls on the steering wheel. ® Related stories Samsung launches speech-to-SMS phone Speech recognition 'on a chip' in three years Boffins test voice-activated secure credit card
John Oates, 19 Jan 2005

Unified messaging – voicemail for 3G?

Voicemail is regarded as a killer application for telephony - and some would say a communications killer too, with too many people hiding behind a voicemail screen until mobile phones became the prevalent business communication tool. This is changing: Quocirca’s recent research among 150 decision makers in enterprises in the UK, Italy and Germany indicates a significant proportion of employees using a mobile in preference to a desk phone. So how does that change the voicemail requirement? Stored messages are genuinely useful for mobile communication, particularly when the recipient is out of coverage or on other calls. But the various message formats, such as SMS, MMS and even mobile email, are stored, then forwarded or otherwise pushed onto the handset in varied and complex ways. Third-generation mobile networks and video telephony add the further dimension of stored video messages. Each of these services sits inside its own silo with an isolated message store, so the complexity of dealing with the total set is passed straight to the user. Often it’s handled badly on the device, and there’s little commonality and no shared inbox. Enter the return of unified messaging: one central store with multiple access methods. Why now? When unified messaging was first touted, the technology wasn’t quite up to it, mainly because the gaps between the silos could only be bridged with case-by-case integration. Arguably it was also a techno-nirvana that didn’t really solve specific user communications needs. Now IP is moving to the core of telecoms as the unifying glue, and users have become familiar with, and somewhat dependent on, the individual stored messaging services - voicemail, SMS or email. One further thought. Unified messaging was sold as a solution by telecoms equipment companies, because its impact was deep in the network. With IP, it rises to the surface as an application, placing it more in the realm of IT companies and services. 
There is a big concern surrounding the cost of legacy voicemail, and many existing systems are nearing their end of life. Voicemail storage systems attached to the core of the network are expensive to provide and take up valuable real estate, as they’re often in city centre locations close to major network switch points. IP solutions can be positioned anywhere, even outsourced or hosted outside the operator. Larger enterprises are already pushing email to the network edge, and smaller companies are using email hosting, in order to give easier access to remote and mobile employees, but few have contemplated the consequences of IP applied to other messaging formats. Hosted voicemail and messaging services permit users to switch more readily from one mode of access to another – mobilising the office worker without mobilising the entire office. External message store management might faze larger organisations for a while, but for others with minimal IT or communications expertise, hosted unified messaging might prove rather attractive. For the coming generation of 3G users, voice message boxes alone will be inadequate; people need to store and send video, image, voice and text mail – so unified messaging might come back into fashion. However, it probably needs a new name to shake off the legacy of past failures – Multimedia Unified Mailboxes, anyone? Copyright © 2004. Related stories Operators wake up to mobile enterprise needs Is enhanced voice the new mobile data?
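The "one central store, multiple access methods" idea reduces to something like the toy sketch below. This is our illustration with invented names, not any vendor's design:

```python
# A minimal sketch (ours) of unified messaging: every message format
# lands in a single inbox rather than per-format silos, and every
# access method (phone, web, email client) filters that one store.
from dataclasses import dataclass, field

@dataclass
class Message:
    kind: str        # "voicemail", "sms", "email", "video"
    sender: str
    body: bytes

@dataclass
class UnifiedInbox:
    messages: list = field(default_factory=list)

    def deposit(self, msg):
        self.messages.append(msg)     # one store for all formats

    def fetch(self, kind=None):
        """Any access method filters the same central store."""
        return [m for m in self.messages if kind is None or m.kind == kind]

inbox = UnifiedInbox()
inbox.deposit(Message("voicemail", "+44 20 7911 0000", b"..."))
inbox.deposit(Message("sms", "+44 20 7911 0000", b"running late"))
print(len(inbox.fetch()))       # 2
print(len(inbox.fetch("sms")))  # 1
```

The contrast with the silo model is that `deposit()` is format-agnostic: adding stored video for 3G means adding a new `kind`, not a new store.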
Rob Bamforth, 19 Jan 2005

Tsunami spam scammer cuffed

A man from Pittsburgh has had his collar felt by the FBI after allegedly sending out 800,000 bogus tsunami appeal emails, Silicon reports. Matthew Schmieder was quickly traced by UK spambusters Spamhaus after attempting to fleece netizens of cash with his "fundraising messages". Spamhaus' Steve Linford said that the swift apprehension of the "rank amateur" ne'er-do-well was due to his failure to cover his tracks. Linford noted: "He had very little in place by way of defences and upon first seeing the emails we were able to very quickly track him down to Pittsburgh. He lived right around the corner from the FBI offices there and they were able to leave the office and pick him up right away." ® Related stories UK's biggest spammer in court VXers hit new low with tsunami-themed worm Texas sues student 'spammer' for $2m
Lester Haines, 19 Jan 2005

Ebbers fraud trial kicks off

The trial into the financial collapse of WorldCom in 2002 got underway in New York yesterday, with US District Judge Barbara S Jones ruling that the defence is allowed to probe the private life of the prosecution's star witness. The "star witness" in question just happens to be former WorldCom CFO Scott Sullivan. Last year he admitted "engaging in a fraudulent scheme to conceal WorldCom's poor financial performance" and agreed to co-operate with the investigation against former WorldCom boss Bernie Ebbers. His testimony is widely believed to be crucial to fingering Ebbers as the brains behind the $11bn (£5.8bn) accounting scandal. Ebbers denies any wrongdoing in the demise of WorldCom. In pre-trial motions yesterday the defence team was buoyed by news that Judge Jones had agreed to let it quiz Sullivan about alleged "marital infidelities". It's understood that the defence is likely to use this to undermine the credibility of the former CFO. Today the court is expected to begin selecting a jury. The trial is expected to last between four and eight weeks. ® Related stories Ebbers faces WorldCom court showdown Former Worldcom directors cough up $18m WorldCom gets sums wrong by $74bn Bernie Ebbers faces criminal charges
Tim Richardson, 19 Jan 2005

IBM earned $3bn in Q4

IBM has posted its best-ever fourth quarter results, reporting profits of $3bn for the period. The company said revenues had risen seven per cent on the same period last year, to $27.7bn, boosted slightly by the weak dollar. The company also announced total revenue for 2004 of $96.5bn, and said that 2005 would see that figure rise to $102bn. Mark Loughridge, Big Blue's CFO, said that the market trend of streamlining supplier bases had seen many corporations looking for one company to provide hardware, software and support packages. This, he said, has boosted IBM's performance. Analysts had no trouble containing their praise, describing it as a "fairly matter-of-fact quarter" for the company. An analyst at Sanford C Bernstein & Co told The New York Times: "It kind of speaks to IBM's strength and size." Meanwhile, Goldman Sachs put a "buy" rating on IBM's stock, saying that the company had got a lot of things right in 2004. Selling off its loss-making PC division is almost certainly one of them. All told, IBM is handing Lenovo a business with nearly $1bn in losses in recent years. Shares rose slightly on the news, gaining 80 cents for the day to close at $94.90 per share. The full earnings release can be found here. ® Related stories Pricier-than-planned job cuts hit Freescale earnings Vendors target IBM ThinkPad market share IBM shunts NTL tech support jobs to India Lifting the lid on IBM's Cell chip IBM pledges 500 patents to OS developers IBM exports Liverpool jobs to India IBM hands Lenovo billion-dollar PC loser Dell to build second factory in Europe
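A quick check of the quarter's arithmetic, using only the figures above:

```python
# Checking IBM's quarterly figures quoted above: Q4 revenue of $27.7bn,
# said to be up seven per cent year on year, with $3bn in profit.
q4_2004 = 27.7                        # revenue, $bn
growth = 0.07
q4_2003_implied = q4_2004 / (1 + growth)
print(f"Implied Q4 2003 revenue: ${q4_2003_implied:.1f}bn")   # ~$25.9bn

margin = 3.0 / q4_2004                # $3bn profit on $27.7bn revenue
print(f"Net margin for the quarter: {margin:.1%}")            # ~10.8%
```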
Lucy Sherriff, 19 Jan 2005

Legal downloads jumped 900% in 2004

More than 200m songs were downloaded from legal online music stores around the world last year - a 900 per cent increase over 2003's total, the music industry organisation IFPI said today. By our estimation, based on Apple's publicly provided figures, Apple accounted for 90-95 per cent of the market. Welcoming the brave new world of digital downloads and DRM restrictions, IFPI released its Digital Music Report 2005 today. Its conclusion: digital music is proliferating, but plenty more needs to be done to raise awareness of legal download services and to stamp on unauthorised ones. According to the IFPI, 2004 was a watershed in online sales. By the end of the year, some 230 legal download services were operating in 30 countries, compared to around 50 at the end of 2003. The recording companies saw their first significant revenues from the digital market, "running into several hundred million dollars". IFPI cited market watcher Jupiter's estimation of the value of the digital music market in 2004: $330m. If the recording companies are getting more than "several hundred million dollars" of this, it leaves little to share among 230 legal download companies. Still, Jupiter estimates 2005's total will be more than double 2004's, as more punters choose to buy downloads rather than CDs or pinch stuff from the likes of Kazaa and Grokster. Is there really a move away from free downloads? It's hard to say, but a survey conducted in the six biggest European music markets on IFPI's behalf shows that 31 per cent of music downloaders claim they will buy from legal services in 2005, up from just over one in five (22 per cent) this time last year. Campaigns to promote legal downloads help, but more needs to be done to raise awareness of the availability of legal services among the key 16-29 age group, among whom only half have even heard of legal download sites, IFPI claims.
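As a quick sanity check, the headline numbers above do hang together arithmetically - a 900 per cent increase is a tenfold rise. The sketch below uses only figures quoted in the article; nothing in it is an independent estimate:

```python
# Back-of-the-envelope check of the figures quoted above. All inputs
# come from the article itself; nothing here is new data.

downloads_2004 = 200_000_000   # "more than 200m songs" in 2004
growth = 9.0                   # "a 900 per cent increase" over 2003

# A 900 per cent increase means 2004 = 2003 * (1 + 9), i.e. tenfold:
downloads_2003 = downloads_2004 / (1 + growth)
print(f"implied 2003 total: {downloads_2003 / 1e6:.0f}m tracks")  # 20m

# Apple's estimated 90-95 per cent share of the 2004 total:
low, high = 0.90 * downloads_2004, 0.95 * downloads_2004
print(f"implied Apple share: {low / 1e6:.0f}m-{high / 1e6:.0f}m tracks")
```

Which puts 2003's legal download market at roughly 20m tracks, and Apple's 2004 slice at 180m-190m - leaving precious little for the other 229-odd services.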
We argue that more needs to be done to balance the price of downloads with the restrictions and audio quality limitations imposed upon the tracks by the music industry, particularly given the trend toward ever cheaper CDs. Positive reinforcement Making punters care will be even harder. IFPI reckons 70 per cent of music downloaders are aware that it's illegal to download from unauthorised sites, but there's little sign that downloading this way has lessened. There may be fewer songs being shared, in part thanks to the Recording Industry Ass. of America's high-profile legal campaign, as well as more restrained moves made by the RIAA's European counterparts; but anecdotal evidence suggests 'illegal' downloading is taking place more than ever. Anecdotal evidence also suggests that active downloaders are downloading tracks they wouldn't otherwise have purchased, and also tend to be high-volume music buyers. Still, plenty use it as a way of avoiding handing the pigopolists and artists their due, so the illegal download arena isn't entirely a picture of a misunderstood and victimised majority. "The biggest challenge for the digital music business has always been to make music easier to buy than to steal," says John Kennedy, IFPI's chairman and CEO. True - so take note, IFPI members, and start licensing greater volumes of content, on better terms. And work to eliminate some of the more unnecessarily restrictive DRM limitations you place on your songs. Remember, a happy customer will be a customer again. You know you can do it. ® Related stories An MP3 player you can talk to... ContentGuard talks DRM futures Mobile DRM levy hits operators where it hurts Sunncomm gets a nod and a wink from Universal Napster subscriber tally hits 270,000 Macrovision gives forth on DRM Apple brings discord to Hymn Apple: iPod domination - or just another fad? 
iPod surge boosts Apple earnings SightSound looks to shut down Napster - again Napster UK pares prices Napster trades on Nasdaq 2004 in review: downloading digital music
Tony Smith, 19 Jan 2005

Oracle boss forecasts PeopleSoft customers' future

Larry Ellison, the combative boss of Oracle, struck a more conciliatory tone yesterday talking about the future roadmap for PeopleSoft products and what support customers could expect now he has bought the company. Oracle has kept 90 per cent of PeopleSoft's engineering and development staff so product lines will continue to be "enhanced". Ellison said: "By retaining over 90 per cent of PeopleSoft's development and support organisation we can deliver on our commitment to all of our applications customers." Oracle paid $10.3bn for PeopleSoft after a long and ill-tempered takeover battle. Observers are concerned that it will be difficult to merge two companies with such different cultures. Oracle is seen as an aggressive firm while PeopleSoft had a gentler reputation. Ellison rejected this analysis, pointing out that PeopleSoft had to be aggressive to survive in the market, and that Craig Conway, ex-PeopleSoft boss, had been one of Oracle's "most aggressive executives ever" when he worked under Ellison. PeopleSoft product lines will be supported up until 2013, and JD Edwards products will also be supported until the same deadline - longer than the one promised by PeopleSoft. PeopleSoft bought JD Edwards days before Oracle launched its takeover bid. Project Fusion is the next stage - to bring together Oracle, PeopleSoft and JD Edwards product lines. This will bring all three product sets together using Java and HTML standards to make the new software easy to integrate into other systems. Developers will continue to work on PeopleSoft 8.9 and then the next upgrade 9.0. John Wookey, VP of application development at Oracle and the man overseeing future joint developments, claimed: "Oracle and PeopleSoft are truly better together. The new combined organisation, comprised of the best talent in the enterprise software industry, will provide customers with greater innovation, support and expertise across industries." 
Talking about Project Fusion, Wookey gushed: "The new architecture and the results companies will achieve will be truly revolutionary, but the path to the new successor product line will be evolutionary." Wookey provided a roadmap of future releases: PeopleSoft 8.9 will be released in 2005; version 9.0 and JD Edwards' EnterpriseOne 8.12 will both be released in 2006. The start of 2006 will also see the first components of Project Fusion - data hubs and transaction bases. The first Project Fusion applications should be ready in 2007 and a full suite by 2008. Oracle hopes to keep 95 per cent of PeopleSoft customers, partly by promising that maintenance and support costs will be at or below current levels. Oracle will also publish a standard price list. Wall Street analysts get their own briefing on the financial implications of the deal on 26 January. More details on the Oracle website here. ® Related stories Lawson lures Peoplesoft punters Oracle and the culling of the 5,000 Oracle finally puts PeopleSoft in its pocket The PeopleSoft vs. Oracle clash
John Oates, 19 Jan 2005

UK gov ready to u-turn on passport-ID card link?

As the UK's ID cards bill charges through Parliament, signs are starting to emerge that the Home Office's dubious packaging plans might be coming apart at the seams. Asked earlier this week to provide a timescale for the addition of fingerprints and iris scans to passports, Immigration Minister Des Browne said a decision had yet to be made, and seemed to leave scope for this never happening. Slight scope - Europe is currently committed to the mandatory inclusion of facial and fingerprint biometrics on passports, and while dab-happy Britain isn't being allowed to play with that trainset directly, we're still very enthusiastic about it all. Browne's answer, to one of a battery of questions from Liberal Democrat home affairs spokesman Mark Oaten, is however intriguing. He said that production of passports with a facial biometric would commence in the final quarter of this year, but that: "We are at present considering the benefits and impacts of the possible introduction of fingerprint and/or iris biometrics. Unlike the facial biometrics, inclusion of either or both of these biometrics will require the personal attendance of all passport applicants. I have announced that adults applying for a passport for the first time will have to apply in person from the last quarter of 2006. However to require all applicants to apply in person and to record and store their additional biometrics would have a significant impact on UKPS operations and processes and a decision will not be taken on this until later in 2005." The response is interesting because of the amount of manoeuvring space it leaves when measured up against previous announcements on the matter. The timescale itself has not been fixed, but ID cards will, the Home Office's 'concessions' announcement said, be "issued alongside passports." 
Obviously when this starts happening everybody will have to show up in person anyway, so there should be no question about that "significant impact on UKPS operations and processes" - if we're still issuing ID cards alongside passports, then the only question is over when the UKPS is going to have to sustain the impact. You could argue that this is what Browne must have meant, but if it's what he did mean he'd surely have found it a lot simpler to put it that way than the way he did put it. Some skulduggery, readers, is definitely up, and if we consider the Home Office's stated reasons for doing various ID-related things alongside the real reasons and the logistical pressures, we might be able to figure out what. Ministers have said, over and over again, that US requirements for biometric passports mean we've got to implement biometric passports, so we must shoulder the costs, and once we've done that we'll have most of the ID card scheme already built, so issuing biometric ID cards will be comparatively cheap. As The Register has repeatedly argued, this is nonsense because there's biometrics and biometrics, and schemes and schemes. The US simply requires that visitors' passports have an ICAO compliant facial biometric on them, and that's precisely what British passports issued from the tail end of this year will have. The US and ICAO don't require iris, fingerprint or the all-seeing, all-knowing database at the back end; you can do that if you like, but it's not compulsory and by the end of this year we will have done everything we need to do for the moment, no ID scheme or national identity register necessary. The EU invention of a passport standard including facial and fingerprint biometrics complicates matters, as does pressure from some people in the US, e.g. outgoing Homeland Security chief Tom Ridge, for fingerprints to be added to US passports. 
Ridge argues this has to be done in order to keep pace with Europe, while in Europe it has been argued that Europe needs to implement biometric passports in order to keep pace with the US. Which would be funny if it weren't so shameful - it's quite clear what they're all up to. If the European fingerprint requirement remains, and if (a bigger if) the US decides to add fingerprints to its own passports, then we'll be well on the way to the addition of fingerprint biometrics to international standards and the UK will probably have to join in, but it's not urgent, not something we have to do now. Over at the UK Passport Service the prospect of having to do the ground-breaking in terms of biometric recruitment for the whole ID scheme has surely been concentrating minds. We do not at the moment have a passport renewal system that is in chaos, and implementing the two changes of adding the facial biometric, which merely needs a compliant picture, and requiring first time applicants to show up in person needn't throw it into chaos. Do we want to bugger everything up again (we did this quite recently, we didn't like it, the voters didn't like it) by making everybody show up and then faffing around with fingerprints and iris scans? No, we really don't want to do that. Of course, if we had fingerprint and iris scans for most of the population on the database already, then all the UKPS would need in order to add them to a passport would be a design that would accommodate them. People therefore wouldn't have to show up to have them collected and UKPS would not face "significant impact." But, erm, that's the wrong way round, isn't it? You could add all the stuff to the passport on the basis of 'might as well, got it already anyway, won't cost anything', but if you've been claiming that we're going to have all of this stuff because of the passports, the reality of your not actually needing it for the passports puts you in something of a pickle. 
Mightn't people say that perhaps you haven't been entirely truthful? So we think the airbrush artistes in the Home Office intend to handle this approximately as follows. Having got Parliament to accept that the ID scheme should be implemented on the basis of the biometric passport requirement and its various other dubious advantages, New Labour will be returned at the general election and can begin the implementation of the ID scheme. Passports with a facial biometric will roll out as planned from Q4, and the ID scheme's biometric collecting capability will be rolled out across the country in accordance with the ID scheme's rollout, somewhat later. Once it is possible for this infrastructure to handle a reasonable throughput, passport applicants can be required to report to have their fingerprint and iris scans done for the ID scheme. This doesn't mean that these biometrics have to be put onto the passport immediately, or indeed ever, nor does it mean that the issuing of passports need be screwed up by the need to collect them. It does mean that the eventual need of most people for a new passport will allow the Government to force people to volunteer to hand in their biometrics and be added to the national database which, as we've also kept saying, is the whole point of the packaging shenanigans. What you actually get in terms of bits of plastic is neither here nor there, so long as you've been logged. Once people have been logged they'll presumably get a card (they'd smell a rat if that didn't happen), and if they've had to be logged because they applied for a passport then in principle they'd likely be getting an ID card in approximately the same timeframe as their new passport. But they needn't come at the same time, in the same envelope, via the same infrastructure. 
In response to those who might try to cry foul and claim it's all a big fiddle intended to lumber us with a ridiculously expensive, liberty-eating and ineffective scheme we don't need, the Government can point out that the adjustments are simply fine-tuning, being carried out for logistical reasons. Obviously, they'll tell us, everybody having voted for an ID scheme, we should implement it in the most efficient way possible. And there's no sense in the UKPS duplicating ID scheme infrastructure, although clearly the security of the passport system can benefit from the ID scheme. If by then you've pretty much forgotten what it was they said in the first (or second) place, then the logic will be compelling. Your loopholes tonight Those of you roughing out plans for escape committees will have spotted the useful passport scheduling information straight away. If one's current passport became unusable ('oh dear - bit scorched, innit?') then a new facial biometric passport without the additional stuff would leave you a theoretical ten years with some country-fleeing capability. This will put you a little further down the queue for the ID scheme call-up papers, but on its own isn't enough for you to escape them, unless the whole thing gets junked in the interim. But other possibilities may be opening up. Mark Oaten also asked Browne "what arrangements he plans to make to allow British citizens living overseas to register for identity cards", and the answer is that there aren't any. "There will be no requirement on British citizens resident abroad to register for an identity card until such time as they come to take up residence here." That's certainly helpful to refuseniks who're already living and working abroad, and might also be useful for those considering skipping off to a handy European country. It does however hinge to some extent on how the Home Office proposes to define resident abroad. 
It's perfectly possible to be living abroad while not being resident abroad in accordance with the Treasury's rules, but if the Home Office proposed to simply adhere to these it would find itself requiring people who're not living in the UK to attend for biometric enrolment. Your mum forwards the call-up papers to wherever you actually are, you write back saying you live in Australia now, what happens then? Much scope for chaos here, watch this space. ® Related Stories: Consultant 'army' already busy on UK ID card scheme Labour's Zombie Army clinches ID card vote for Clarke Need a job? Get a card - arresting ID pitch to business
John Lettice, 19 Jan 2005

Panix.com hijack: Aussie firm shoulders blame

An Australian domain registrar has admitted to its part in last weekend's domain name hijack of a New York ISP. Melbourne IT says it failed to properly confirm a transfer request for the Panix.com domain. Ed Ravin, a Panix system administrator, says the Melbourne IT error enabled fraudsters using stolen credit cards to assume control of the domain. Thousands of Panix.com customers lost email access for the duration of the occupation, and many emails will never be recovered. The mistake was compounded by the unavailability of Melbourne IT staff over the weekend - the company rectified its mistake late on Sunday evening, US time. Speaking to ComputerWorld Ravin said he was unable to contact anyone in support at Melbourne IT until the company's offices opened on Monday morning. Bruce Tonkin, Melbourne IT's CTO, offered the following explanation: "In the case of Panix.com, evidence so far indicates that a third party that holds an account with a reseller [UK based Fibranet] of Melbourne IT, fraudulently initiated the transfer. The third party appears to have used stolen credit cards to establish this account and pay for the transfer. That reseller is analysing its logs and cooperating with law enforcement." The loophole that allowed the error has now been closed, he added. The roots of the affair lie in new rules governing the transfer of domain name ownership. These rules, which came into effect last November, mean that inter-registrar transfer requests are automatically approved after five days unless countermanded by the owner of a domain. When ICANN proposed the new procedures, many in the industry warned that as well as making it easier to move domains around, the change would make it easier for people to hijack domains. Network Solutions, for example, took the precautionary step of locking all its customers' domains. Panix.com says its domain name was locked, and that despite this, it was still transferred. 
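The transfer rules at the heart of the affair can be sketched in a few lines. This is an illustrative model only - the function name and structure below are our own, not any registrar's real code - but it captures the policy as described: a transfer request is automatically approved after five days unless the owner countermands it, and a locked domain should never move at all.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the post-November-2004 transfer policy
# described above; the names here are illustrative, not real APIs.

AUTO_APPROVE_AFTER = timedelta(days=5)

def transfer_outcome(locked, requested_at, denied_at=None, now=None):
    """Return 'rejected', 'denied', 'pending' or 'auto-approved'."""
    if locked:
        # A registrar lock should block the request outright - Panix
        # says its domain was locked yet was transferred anyway.
        return "rejected"
    deadline = requested_at + AUTO_APPROVE_AFTER
    if denied_at is not None and denied_at <= deadline:
        return "denied"        # the owner countermanded in time
    if (now or datetime.utcnow()) < deadline:
        return "pending"       # still inside the five-day window
    return "auto-approved"     # silence is treated as consent

t0 = datetime(2005, 1, 10)
print(transfer_outcome(locked=True, requested_at=t0))  # rejected
print(transfer_outcome(locked=False, requested_at=t0,
                       now=t0 + timedelta(days=6)))    # auto-approved
```

The last branch is the one critics warned about: an owner who misses, or never receives, the confirmation request loses the domain by default.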
® Related stories Panix recovers from domain hijack Katie.com lawyer to host cyber-bullying conference eBay domain hijacker arrested
Lucy Sherriff, 19 Jan 2005

BT, Kingston face EC illegal state aid probe

BT and Kingston Communications are facing a European Commission (EC) investigation over allegations that they may have benefited from illegal state aid. The investigation centres on a property tax relating to the telecoms infrastructure of both companies. The base of the business rates tax is worked out for each telco by the UK's Valuation Office Agency (VOA), which applies various valuation methods to assess the economic value of telecoms networks. The EC wants to find out why different valuation methods are applied to BT and Kingston than to other telcos. Said the EC in a statement today: "The VOA applies a certain asset valuation method to BT and Kingston, while it applies other methods to their competitors. The application of different methods may favour BT and Kingston resulting in a disproportionate tax burden for other companies competing in the market for electronic communications services." The European inquiry is to run alongside a similar investigation by UK's Department of Trade and Industry (DTI). The investigation was prompted by optical network company Vtesse Networks which claims that BT has benefited from a whopping £12bn in tax relief since 1995. In an outspoken attack Vtesse Networks chief exec Aidan Paul said: "Why does BT pay tax equivalent to a rental value of £16 per annum per local loop line, but has subsequently charged other operators a rental of £122 per annum for the same line, when both rentals should have been based on fair market values? By our calculations, this amounts to a tax disparity of up to £12bn. "Did BT lie to the Government when setting the original rental tax, or did it lie to [former telecoms regulator] Oftel when setting local loop 'unbundling' prices during the deregulation process? If it didn't lie to either, then why are these two numbers so different? Was this a deliberate tactic to keep competitors out whilst minimising its tax bill, or just gross incompetence in the finance department? 
And why did neither the Inland Revenue, the VOA nor Oftel spot this disparity? Did it never occur to either group of civil servants to check with the other? "We want the answers to these questions, and so wholeheartedly welcome today's announcement by the European Commission. This action is the culmination of over a year's work with the Commission relating to what we maintain is the illegal and discriminatory application by the Inland Revenue, the VOA and the Office of the Deputy Prime Minister of non-domestic business rates to the telecommunications sector. This results in BT not only paying a lower level of taxation in relation to its business than any other telecoms operator, but also having a higher level of assets...than any other operator." Paul reckons the situation has arisen because of the "difficulties of ensuring transparency within a large vertically integrated company" - something that could have been avoided if BT had been split. "We now therefore call on the Government to work towards splitting BT up, as it is clear that neither BT nor the Government can be trusted to properly police and regulate a company of this size and power," he said, calling on rivals including Cable & Wireless, Easynet, Energis and Your Communications, among others, to help the EC in its investigation. BT said in a statement: "BT is surprised that the European Commission is to investigate the UK government over the property rates that BT has been paying. In BT's view, any allegation of state aid would be groundless as BT has received no benefit from the UK government. BT is confident that the UK government will demonstrate the fairness of the UK ratings system." BT is already facing a shake-up of its business following a wide-ranging telecoms review by regulator Ofcom. A spokeswoman for Kingston said: "Until we know the outcome of the investigation we cannot comment further." 
® Related stories BT faces EC investigation over business rates MPs to scrutinise Ofcom's telecoms review BT wrestles with Carol Vorderman for 'net villain' award Competition Tribunal rules against BT 'save' calls Tough-talking Ofcom boss slaps BT Ofcom tells BT: shape up, or split up
Tim Richardson, 19 Jan 2005

Home Office tackles ID fraud. By hiring one

The Home Office has gone that extra mile to prove the true costs of identity fraud to us all - it's been conned, big-time. Confronted with a fake doctor one would ideally stride smartly off in the other direction, but in the case of fraudster Barian Baluchi the Home Office opted for funding his clinic, using him as an asylum-seeker health policy adviser and letting him be an expert witness in 1,500 immigration appeals tribunal cases. Baluchi's earnings from his imaginative career in medicine are reported to be in the region of £1.5 million, and of course (you can almost hear the Home Office saying this) it couldn't have happened if we'd had ID cards. He acquired indefinite leave to remain status in the early 1980s through marriage, but his big career move came in 1998 when he registered as a doctor with the General Medical Council under EU procedures. The GMC, bless, is claimed to have done this on the strength of his qualification as a doctor and psychiatrist in Madrid in the 1980s, under the name of Antonio Carrillo-Gómez. One can see the clear similarities between the names. But a Madrid doctor of that name does indeed exist. Wouldn't have happened if we'd had ID cards. Except of course a Spanish doctor would have had a Spanish ID card. And a British resident (i.e. Baluchi) would have had an ID card, and would just have had to convince the GMC that this genuine identity had for some reason qualified under an entirely different and Spanish name. We have no idea how you'd do that either, but we doubt it'd have been harder than what he did do. Presumably he must have relied, as is the way of good fraudsters, on there being one born every minute. Which is what the Home Office and sundry Government departments and charities proceeded to prove. According to the report Baluchi "used a string of fake qualifications to set himself up as a leading clinician". The fake qualifications bit is easy, any fool can get these, but it's the second bit that must have taken some talent. 
Within something like five years he had parlayed no track record at all into "leading clinician." He would have been helped to some extent by the legal system's need for expert witnesses, and to some considerable extent by the immigration legal industry which has been almost entirely generated by the Home Office. This fast-growing business needs experts, many more experts than it needed in 1998, so it's entirely appropriate that the Home Office itself made a significant contribution to strengthening the man's fraudulently-obtained credentials; it needed to believe in him. Meanwhile the case of Michael Edwards-Hammond, arrested recently for impersonating a police officer at Windsor Castle, shows us how the police handle ID fraudsters. Give him a motorcycle escort? Search and detain people for him? Yes, 'fraid so. Edwards-Hammond, reports the Telegraph, has been something of a specialist in impersonating a police officer, to the police, and has a string of scalps and previous offences. He has claimed to be a surgeon on his way to save the life of a child and been given a motorcycle escort by the Met. He has had police search innocent members of the public, and an Asian family taken into custody. And he has had several harmless pedestrians held at gunpoint near Downing Street. That one must have taken confidence. How, under any circumstances, you stop the forces of law and order saluting people they believe to be their superiors and doing what they say is a puzzle. "Yes chief constable, and I'm the Lord Lyon King of Arms. I'm just going to take your fingerprints now, sir..." That's really going to happen, isn't it? ®
John Lettice, 19 Jan 2005

Sun's Solaris for hippies to arrive next week

After years of hype, it looks like Sun Microsystems will finally unveil its open source version of Solaris at an event next week, or at least make some new rumblings around the project. Sun is holding a conference call with reporters and analysts on Jan. 25 to discuss "the company's Solaris open source initiative." Sun will most likely package the core parts of Solaris under its new Common Development and Distribution License (CDDL), which was approved last week by the Open Source Initiative's (OSI) board. Sun is hoping the open source move will inspire more developer interest around Solaris, particularly its version of the operating system built for chips from Intel and AMD. The company won't say exactly what will be discussed next week, and a final, open source version of Solaris could still be months away. Insiders, however, indicate that Sun will at least talk about how the CDDL will come into play and how certain parts of Solaris might be governed by the license. Sun has spent the last couple of years debating the idea of this Solaris for hippies. Its largest customers like the tight controls Sun maintains over the version of Solaris for SPARC processors. This, at times, made the company reluctant to open up Solaris to non-Sun coders. In addition, Sun's legal staff has had a hard time stomaching the idea of an open source OS in the wild. Sun, however, needed a bold move to spur interest in Solaris x86 and sees the open source model as one way to attract outsiders to its version of Unix. Developers should think of OpenSolaris as a new distribution of the OS. The CDDL will likely be used to cover the Solaris kernel. Sun will then wrap other licenses around the various packages that plug into the OS. This will let the company protect the millions in research and development it pours into Solaris, while still giving developers a chance to make their own additions. 
One source described the new CDDL as an "exemplary copyleft license that is a good, functional replacement for the MPL (Mozilla Public License)." The OpenSolaris effort is often billed as Sun's attack on Linux. Many Sun insiders, however, are hoping to avoid this discussion, billing OpenSolaris as just a customer-friendly product. In the most practical of terms, opening up Solaris provides a way for Sun to get advocates of the OS to work for free. There are a number of developers out there writing drivers to help Solaris x86 run on a wide range of hardware. Now they can all rally around the same product and have a peace, love and Solaris call to cheer. ® Related stories Red Hat Q3's 'validate' Linux subscriptions Sun must acquire Red Hat or Novell - analyst Sun shooting for double-digit piece of the x86 market Sun begs partners to sell more Opteron servers It's do or die time for Sun and Solaris x86
Ashlee Vance, 19 Jan 2005

Opportunity sniffs out meteorite on Mars

NASA scientists yesterday confirmed that the Mars Rover Opportunity has discovered a meteorite on the surface of the red planet. The rover spotted the rock in early January. When its thermal signature suggested it was metallic, researchers realised it could be alien to the planet. Subsequent investigations with the Rover's spectrometer confirmed that the rock contained mainly iron and nickel, a composition characteristic of meteorites. Mission scientists told New Scientist that the discovery was a wonderful surprise. Steve Squyres, the Rovers' lead scientist, based at Cornell University, New York, commented: "I didn't see this one coming." He explained that further analysis would be difficult because mission designers had not expected to find any meteorites on Mars, so did not send the Rovers equipped to take samples from them. Testing by the original equipment manufacturer revealed that, sadly, the rover's existing grinding tool was not up to the task of drilling into a meteorite. After an hour of grinding in a test lab, a quarter of the drill head had worn away. It is possible that both Rovers have unknowingly passed meteorites on their travels across the Martian plains and mountains. The vast majority of meteorites on Earth are just rock. Assuming the same is true on Mars, (and why wouldn't it be?) they would not be visually distinguishable from the rest of the rocks scattered on the surface. Spirit and Opportunity have been on Mars for over a year, well beyond their expected life span. In September, NASA agreed to fund a further six months of study. ® Related stories NASA celebrates martian Spirit of adventure Mars awash with evidence of water Six more months for Mars rovers
Lucy Sherriff, 19 Jan 2005

German national library busts copy protection

Germany's National Library is now fully licensed to duplicate copy protected electronic books and other digital media such as CDs and CD-ROMs. The library signed an agreement with the German Federation of the Phonographic Industry and the German Booksellers and Publishers Association after it became clear that its legal mandate to collect, process and index important German and German-language works would be hampered by the European Copyright Directive. This directive makes it "a criminal offence to break the copy protection or access control systems on digital content such as music, videos, e-books and software". It is a big question whether the Library needed the authorisation. The same Directive clearly states that "member States should be given the option of providing for exceptions or limitations for cases such as educational and scientific purposes, and for the benefit of public institutions such as libraries". The object and scope of the agreement seems pretty limited, anyway. The National Library, established in 1990 through the merger of the Deutsche Bücherei Leipzig (founded in 1912) and the Deutsche Bibliothek Frankfurt am Main (1947), is responsible for the collection, processing and indexing of all German and German-language publications issued since 1913. So, the ever anxious Recording Industry Association of America (RIAA) doesn't have to worry that the Germans will legally multiply Michael Jackson's latest CD. Not until he starts singing Rex Gildo's Greatest Hits. ® Related stories Germany preps second basket of copyright laws It came from the vaults! Google seeks to open the library OpenOffice CDs live for lending in Scottish libraries Vatican Library adopts RFID e-Minister will make every public library a Wi-Fi hotspot Law seeks deposit of websites with UK libraries
Jan Libbenga, 19 Jan 2005