2 May 2005 Archive


Papal succession fuels April religious spam blitz

Junk mail relating to religion soared during April, accounting for one in 10 spam emails. Porn, medicines and financial scams - spam's unholy trinity - each lost ground to religious junk mail last month, according to a study by email management specialist Email Systems.

The global interest in religion that accompanied the death of Pope John Paul II and the naming of Pope Benedict XVI as his successor captured the imagination of spammers, who used it to develop a string of dubious offers. These spam emails promoted allegedly free biographies and audio books about the late Pope John Paul II, for example. Others promised to build a cathedral in his honour in exchange for a donation.

Overall, spam made up an average of 91.8 per cent of the messages scanned through Email Systems' filtering service in April. Although many hundreds of thousands of viruses are still being sent each month, only 0.42 per cent of traffic monitored by Email Systems was virus-infected. ®

Related stories
Tech blogger cybersquats God's Rottweiler
Pope gets emailed up
Netizens learning to tolerate spam - study
John Leyden, 02 May 2005

Outsourcing: prepare to renegotiate your deal, says Gartner

Nearly 80 per cent of all outsourcing relationships will be renegotiated, according to a new survey by Gartner. Lack of flexibility is the main issue leading to renegotiations, followed by a need to improve the relationship between the supplier and customer.

The research firm found that 15 per cent of all contracts had been renegotiated within what it calls the "honeymoon period" - the first 12 months of the contract - leaving little hope for the long-term success of the relationship. The survey of almost 200 executives from midsize and large companies in Western Europe also found that 55 per cent of all enterprises with existing IT infrastructure outsourcing arrangements have renegotiated the terms of the relationship within the lifetime of the contract.

Only 23 per cent of survey respondents did not expect to enter into renegotiations, meaning that nearly eight in every 10 outsourcing relationships will go through renegotiations at some stage. Only six per cent were planning renegotiations to rescue existing deals, confirming Gartner's view that relatively few companies are actively looking to bring outsourcing back in house.

Problems leading to renegotiation

Lack of flexibility was the biggest issue for 50 per cent of respondents, closely followed by the need to improve the supplier/customer relationship and to reduce costs. Forty per cent of service recipients believed they were paying too much for their outsourced capabilities.

"Over the past four years companies have entered into outsourcing agreements based on cost savings and short-term return on investment, with little thought given to their sourcing strategy," said Gianluca Tramacere, senior sourcing analyst at Gartner. "The majority of enterprises established long-term sourcing relationships based on current short-term cost-cutting imperatives. These agreements usually lacked the flexibility to accommodate the dynamic nature of the business environment, and we have warned enterprises that this inflexibility would end up costing businesses more in the long term," he added.

According to Claudio Da Rold, VP for sourcing at Gartner, renegotiations should "become part of a proactive cyclical process used to maintain alignment of expectations, reach the right balance between service delivery and pricing and focus on end-user satisfaction". This is particularly so because, as outsourcing has become more established, enterprises have gained experience and a better understanding of their business requirements. Consequently, they are now more mature and more confident in their own ability to select and manage the most appropriate vendor for each of their IT requirements.

"Recognising the importance of flexibility, some of the more mature enterprises are now proactively incorporating a mid-term review as part of their contract," said Da Rold. "This should be standard practice. While it is somewhat disappointing that only 16 per cent of survey respondents have currently taken this step, it is nevertheless encouraging and we expect that in the future this number will be much higher."

Gartner believes that the evolving approach to selective 'best of breed' outsourcing is making, and will continue to make, life more difficult for service providers. They will need to be ready for an ever-evolving environment in which they will be expected to support the transition of specific service management competencies to a new provider, warns the research firm.
"Service providers must understand that renegotiation will become an integral part of relationship management," added Da Rold. Due to the increasing level of maturity within the outsourcing market, Gartner advises companies to place greater emphasis on managing their relationship with the service provider. Forty-five per cent of survey respondents said that communication with the service provider was the most important area in which they would like to see improvements. Governance — specifically clarification in terms of roles and responsibilities — was highlighted by 29 per cent. Gartner also warns that as internal IT departments become responsible for acting as a broker of services between business units and service providers, more emphasis should be put on correctly acquiring the right mix of internal skills. The research firm put forward some advice for service providers: Business and Market Dynamics: Ensure that propositions are compelling: either low cost or high value. Don't get trapped in the zone between being a niche and a one-shop-stop service provider. Expect prices to fall, ability to continually reduce cost will be paramount. Automation and the right onshore/offshore balance will be key. Expect new competition. Relationship Management: Expect demands for increasing openness and transparency. Culture: Develop world-class co-operation skills. Make relationship management a differentiator. And some advice for service recipients: Plans: Regularly review your sourcing strategy. Keep exit management plans and inter-vendor transition provisions up to date. People: Plan to spend at least four per cent of your IT budget on the right internal team. Controls: Establish clear and strong sourcing governance – know who is in control. Access existing contract structures and measurement systems – ensure contract flexibility and alignment to business needs. Copyright © 2005, OUT-LAW.com OUT-LAW.COM is part of international law firm Pinsent Masons. Related stories Outsourcing UK public sector services - the moral hazards Swansea IT workers lose outsourcing fight Swansea IT staff warn of 'done deal' Indian call centre staff nicked for fraud US still in midst of hi-tech powered super boom - pundit Corporates tackle security in-house Fujitsu exposes Eastern talent pool Outsourcing more expensive than in-house service
OUT-LAW.COM, 02 May 2005

HSDPA is faster 3G - in theory (but where are the phones?)

Smoke and mirrors are often cheaper than masts and cables, and the mobile carriers have found something that may keep their investors happy in high-speed downlink packet access, or HSDPA. It makes 3G run faster, just by installing equipment in the server... but not this year, says Informa.

The hype machine is in full swing for HSDPA, and only last week Novatel Wireless announced embedded wireless cards for laptops using HSDPA. "The Expedite EU730 and Expedite EU740 are designed for easy integration into multiple laptop platforms and other mobile devices to provide wireless high-speed broadband access on HSDPA networks at speeds up to 1.8 Mbps," it said.

The reality is that these products won't be shipping any time soon, and phones will come even later. "HSDPA will only become a reality in the mass market in 2006, due to the late arrival of enabled handsets," says the latest in Informa Telecoms & Media's "Insight Report" series. The report, "HSDPA Status Update", warns that "Many operators have already declared their intentions to launch HSDPA, a faster version of 3G (WCDMA), before the end of the year - but these launches are set to be confined to datacard users in 2005, with a lack of enabled handsets ruling out a launch to the mass market until mid-2006, at the earliest."

The sad story is strangely familiar: "After having to repeatedly delay the full commercial launch of their WCDMA networks in 2003 and 2004 due to a shortage of handsets, these same operators are going to endure an exact repeat of the situation when it comes to HSDPA," said John Everington, senior research analyst at Informa.

The industry is talking up a big fight for HSDPA - it has to. Shareholders in both networks and handset companies are counting on success in 3G soon, and HSDPA is a good excuse - for the time being. Informa isn't much impressed: "Despite early predictions from Samsung, LG and NEC of handsets becoming available from the end of 2005, HSDPA-enabled handsets are only likely to appear commercially in large volumes from mid-2006 onwards, forcing operators to limit their launches to datacard users in the initial stages," Everington said.

Ironically, it looks as if the US will see HSDPA earlier, because the mobile data market there already has pretty good coverage from the CDMA 2000 and EV-DO standards, and operator Cingular Wireless "is set to become the first major operator to launch HSDPA on its network in late 2005", claims Everington. This comes after NTT DoCoMo announced several delays to its deployment plans, with the Japanese operator now forecast to launch HSDPA in the second half of 2006.

In Europe, O2 has already announced that it is committed to HSDPA, and has been pre-empting the decision by buying base stations which require only a software update to start running it. It has reported stellar performance on downloads during its Isle of Man trials, which began earlier this year.

Informa Telecoms & Media's HSDPA Status Update covers:
A summary of the advantages and capabilities of HSDPA technology
A detailed analysis of the different business cases being adopted by operators looking to roll out HSDPA, including case studies on Cingular, NTT DoCoMo and O2
A vendor-by-vendor analysis of the HSDPA infrastructure market
A review of developments and challenges in the HSDPA handset market.

© NewsWireless.Net

Related stories
HSDPA - Data gets faster, acronyms get longer
Super 3G group flexes its muscles
Using 3G wireless for domestic broadband
Guy Kewney, 02 May 2005

Europe uniting against Schmidt's Google print project

This story has expired from The Register's archive. You can now find it at its original location on the Forbes.com website: http://forbes.com/facesinthenews/2005/04/29/0429autofacescan08.html?partner=theregister.
Chris Noon, 02 May 2005

Brocade blames March for Q2 shocker

Brocade has fingered a dip in storage sales during March as the main factor behind a second quarter that came in well below previous forecasts.

In a statement revealing preliminary results, the switch maker said second quarter revenue will likely come in between $144m and $145m. Back in February, Brocade predicted a much healthier range of between $155m and $161m in sales. In last year's second quarter, the company reported $146m in revenue. So, Brocade will come in flat year-over-year.

While EMC has reported strong storage sales, companies such as IBM and HP have struggled recently to show consistent results. Brocade pointed to disappointing sales from its customers in March to explain the revenue shortfall. "As reported recently by several industry observers, the storage environment in the last two weeks of March was weak and while April was a good month, it was not enough to offset the weakness in March," said Brocade CEO Michael Klayko. The company also noted weaker overall enterprise spending on storage and said that some deals were pushed out to the third quarter.

Companies typically put out preliminary results when early checks show that a quarter has come in materially below or above expectations. This seems to be the case with Brocade, as its latest revenue forecast is close to 10 per cent lower than the high end of its previous prediction. Brocade will report full second quarter results on May 18. ®

Related stories
Hungry Sun customers must wait to feast on 'product hog heaven'
SGI sees red
EMC delivers again with strong Q1
IBM spooks market with dismal Q1
Sun's Q3 revenue down. Again
Seagate promises perpendicular drives
BMC apologizes for poor Q4 with massive job cuts
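For readers who want to check the "close to 10 per cent" claim, here is a quick back-of-the-envelope sketch using only the figures quoted above; the variable names are ours, not Brocade's:

```python
# Rough check of Brocade's shortfall against its earlier guidance (values in $m),
# using the numbers reported in the article.
old_high_end = 161   # top of February's $155m-$161m forecast
new_high_end = 145   # top of the revised $144m-$145m range
last_year_q2 = 146   # revenue in last year's second quarter

shortfall_pct = (old_high_end - new_high_end) / old_high_end * 100
print(f"Shortfall vs previous high end: {shortfall_pct:.1f}%")      # ~9.9%
print(f"Change vs last year's Q2: {new_high_end - last_year_q2}m")  # roughly flat
```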
Ashlee Vance, 02 May 2005

Verizon knocks out Qwest with revised offer

Qwest has conceded defeat in its battle for MCI. Verizon, its bigger and richer competitor, returned to the table yesterday with a revised offer worth $8.8bn.

Verizon reclaimed its position as the "superior" suitor of MCI after raising its acquisition bid to at least $8.45bn. As has been the case since this process began, Verizon's bid was still well short of Qwest's offer for MCI - now at $9.9bn - but MCI's management has picked Verizon as the more suitable partner. Verizon has offered up $26 per MCI share, up from its previous bid of $23.10 per share. The deal on the table from Qwest basically stands at $30 per share.

"MCI today announced that its Board of Directors has unanimously determined that a revised offer from Verizon Communications Inc. is superior to the offer received from Qwest Communications International Inc. on April 21," MCI said. In response, Qwest said it was withdrawing as it did not want to "continue in a process that seems to be permanently skewed against Qwest".

Verizon may have had the lower bid, but it's seen by many as the safer bet for MCI. In its statement, MCI noted that some large customers have threatened to terminate their contracts if Qwest ends up buying the company. In addition, a combined Verizon/MCI would be in a healthier competitive and financial position than a combined Qwest/MCI, according to MCI.

"From the standpoint of risk versus reward, Verizon's revised offer presents MCI with a stronger, superior choice," said Nicholas Katzenbach, chairman of MCI. "Shareholders receive enhanced value with greater assurance that the transaction will create additional shareholder value."

MCI still needs the approval of shareholders and regulatory bodies to move forward with Verizon. Qwest could also choose to make another bid, although it termed its previous one its "best and final" offer.

After enduring a huge accounting scandal and bankruptcy, MCI must be enjoying this bidding war and positive press. It's good to feel wanted. ®

Related stories
MCI dumps Verizon, cuddles up to Qwest
Qwest makes final offer for MCI
Our phones don't work - Verizon boss
MCI wants $30 a share
Verizon buys slice of MCI
MCI rejects Qwest, cuddles up to Verizon
Crunch time for MCI/Verizon/Qwest lurve triangle
MCI mulls latest Qwest offer
Qwest ups bid for MCI - yet again
Ashlee Vance, 02 May 2005

Insiders reveal SCO's Monterey disarray

Letters First the sideshow, then the scoop. A former SCO insider puts the relationship with IBM over Project Monterey in an entirely new light.

But first, we're delighted that, after a long correspondence over a fortnight with Groklaw editor Pamela Jones, a potentially damaging series of articles about SCO and Project Monterey has been corrected. SCO has made an absurd claim that it didn't know IBM planned to port Monterey to POWER - and yet that was a publicly stated goal of the project from the outset.

But just to clarify one issue, Peter Houppermans writes -

I think the worst aspect of your article is the "retrospective updates" that you allege Groklaw to have put in as a response to your article.

We're happy to clear that up. The Groklaw articles were corrected before we published our account of the saga. As Pamela herself wrote [April 29, 2005 4:00:29 PM PDT] -

I read it again, and I see what you are saying. So I changed the stories to reflect what I now understand, and I'll do an article to highlight it too. I want to thank you very much for bringing this to my attention.

No problem. For some reason, or combination of reasons, The Mystery That Never Was diverted the community's energies for almost a month. We're glad to help it back on track - and add something new to the discussion. A former senior employee at The Santa Cruz Operation during this period has stepped forward to shed light on the project. It's an angle we haven't heard before.

Your article is a very good rejoinder, but there is even more to the story.

Firstly, SCO considered themselves a big player. Late in the 1990s Compaq was selling in the region of $1 billion a year of hardware to run SCO operating systems. IBM's number wasn't as large, but it wasn't trivial either. SCO figured you could add the hardware and software (around $200 million) numbers together to get a true appreciation of the size of the company and compare it to others such as Sun.

The important thing to remember, which Groklaw has overlooked or doesn't understand, is that kernel and userspace are separate. OpenServer had done well, but the kernel was limited to two processors and getting very long in the tooth. UnixWare's base (SVR4) had gone truly multi-processor in 1989. That is why SCO acquired it in 1995. The plan was to add "the enterprise" to SCO's customer base, and to do that you needed serious software that could run on serious hardware. There was also a datacentre acceleration project that some vendors threw money into that was supposed to bring some enterprise functionality early (IIRC Unisys and ICL parted with a few million for this). UnixWare used to do really well in AIM/database benchmarks on x86 hardware.

The idea behind Monterey was to combine the best pieces of the AIX and UnixWare kernels. For example, the AIX memory manager had a good reputation. This would result in a new, combined kernel for the new enterprise hardware. SCO was expecting to migrate its OpenServer customers to UnixWare and, especially, to allow OpenServer binaries to run on UnixWare. Some of the tools, especially the management tools, would also be ported. (Monterey was always going to run on 32-bit and 64-bit Intel chips, with IBM separately having it on PowerPC, as you pointed out.)

Ultimately it wasn't Linux that killed Monterey. SCO simply couldn't put the number of engineers on the project that it had originally agreed to (it was going to be 50/50). So slowly, over time, IBM shouldered more and more of the burden until it all fell away.
The processor being repeatedly delayed didn't help. I have always wondered why IBM never sued SCO for breach of contract!

[name withheld on request]

So SCO couldn't afford to keep its part of the bargain? Reader Skip in Exeter writes, rather sarcastically -

You will not question The Groklaw. Groklaw only wants what's best for you. Groklaw is your friend. All who oppose The Groklaw are servants of the SCOX, and will be cast out to the wilderness. Assist the Groklaw and report unbelievers, fabulous prizes to be won...

Touchscreen pioneer Gene Mosher writes -

Very good work on today's Monterey story. From early '95 to late '97 my own software, ViewTouch, only ran on AIX. Then we ported it to Red Hat 5. It was barely an hour's worth of work - just a few library path adjustments. The move from AIX to a GPL'd OS, Linux, was clearly a liberating step.

AIX was a totally professional OS and was virtually flawless as a technical achievement, as far as it went. The graphic tools for installing and managing it could be handled easily by anyone who was neither a system administrator nor a command line master. AIX was proprietary, however, and it did cost serious money - it was nearly equal to the cost of the hardware, at a time when hardware was getting less expensive but was still far from the commodity that it is today. AIX updates were not quite frequent enough and the news about AIX was, by today's standards, sparse. The community around the OS barely existed.

Every piece of software that extended the usefulness of AIX (X, Motif, debuggers) also cost serious money and was proprietary. X and Motif were pretty good layers for making all the UNIX OSes graphical, but they were nearly dead from a lack of fresh air. They were proprietary and stuffy - absolutely nobody was doing anything with them except using them to build widget toolkits. That was, apparently, the extent of the plan by those who controlled them. Linux, the GPL workalike, needed equivalent extension workalikes for itself - Lesstif, for instance.

As the various software components died, or were recast in GPL versions, so, too, were the companies and organizations behind the software fated to die unless they could recast themselves, as IBM did, and as SCO refused to, hanging on to the ugly, bitter end. For SCO it is indeed ugly, and bitter.

For those of us who were there, as you and I were, knowing the difference between what happened and what somebody says happened is easier. While we can, we have to tell it the way it actually was, and we can only hope that doing so will be enough to keep the truth separate enough from the myth to remain the truth. They can't pay you enough for what you do.

Gene Mosher

And thanks to Peter Norman for pointing to many unanswered questions.

Yes, PJ is a bit off base about IBM's Linux plans, but you left out a few important things that are every bit as serious. Nothing is ever completely straightforward with a company as big as IBM. IBM has always had different parts of the organization working for conflicting goals. Look at the FS vs. 360 wars or the development of VM/CMS.

First, the research showing Monterey was always intended to run on PowerPC was aimed at SCO's claims that they never knew IBM was using certain System V code in AIX. It wasn't intended to be a great discovery. It was intended to be a refutation. Caldera knew IBM was using the System V print code in AIX 5L.

Second, there is the Trillium project. All of the parties - IBM, Caldera and SCO - had a foot or more in the Linux camp.
This should surprise no one, since IBM is famous for hedging its bets. When the Itanium flopped, IBM had plan B ready to go.

Since you claim knowledge of the circumstances, if you want to contribute something new, here are some unanswered questions.

Was there ever a formal cancellation of Monterey, or did IBM just renege on its marketing commitments on the grounds that they were pointless?

What compensation did IBM offer Caldera (Ransom Love made this claim in an interview), and why did Caldera refuse it? Did they refuse compensation in order not to lose their right to sue (Ransom Love implies they considered it)? This indicates a great deal more continuity of planning than SCO have admitted. There would be a number of unpleasant legal consequences should Caldera have planned to sue a year earlier than Darl has said they did.

At the end of May 2001, after the Santa Cruz deal closed, Caldera announced they would ship Monterey in the second half. Did they do so? If not, why not? If so, why did they stop selling it?

What is the relationship between Unix 8 and Monterey? Why did Unix 8 get rebranded?

The Monterey contract is incomplete by itself, since it refers to work scope documents to be created during the project. What is in those documents?

Did IBM ever exercise, or threaten to exercise, the change of control clause in the Monterey contract?

Peter Norman

To which we can add one more question: does the statute of limitations allow IBM to sue SCO for not fulfilling its part of Project Monterey? Or perhaps this is so damning that IBM is saving it for the trial. ®

Related story
SCO, Groklaw and the Monterey mystery that never was
Andrew Orlowski, 02 May 2005

My 96-processor Linux cluster is smaller than yours

Personal computer. Personal cluster. Call it what you will. Orion Multisystems is pretty sure it has the fastest PC around, no matter the moniker.

Last week, Orion started shipping its long-awaited 96-processor (Transmeta Tinside) deskside cluster. This box is the follow-on to a 12-processor desktop system that has been on the market for several months. With the new, larger system, customers get pretty much the most powerful computer around that can plug into a standard electrical socket.

"I was very pleased to see this thing ship," said Orion's CEO, Colin Hunter.

The first two 96-processor boxes went to an unnamed US defense contractor. The computers were assembled by hand at Orion's Santa Clara, California HQ and sent on their way. "About half the company came out and watched them leave the building on the FedEx truck," Hunter said. "It may sound silly, but from an internal morale perspective, it was a big moment."

The big daddy box starts at close to $100,000 and will likely make or break Orion. It's this system that will gauge how much designers, engineers, artists and scientists are willing to pay for their own, easy-to-use Linux cluster. In one of computing's great traditions, Orion has wrestled a high-performance system - in this case a cluster - away from centralized technicians and put it in the hands of any employee with a desk and a power outlet. This is really the natural extension of a pattern that saw supercomputer-class machines travel down from national laboratories to large companies, and then saw those same machines break up from single, hulking systems into clusters that even more customers could afford. Now, Orion has delivered a bite-sized cluster for personal use.

Orion has already moved to answer some questions about the noise of its boxes. New clusters ship with software that matches the speed of the fan with the amount of work the system is doing. Previous clusters ran the fans at full blast all the time. Hunter downplays the importance of such aesthetic features, saying most engineers will run the Orion boxes in rooms already full of noisy computers. If, however, the company is going to claim to have true desktop and deskside computers, it should place a premium on making sure the boxes will eventually run quietly in the average office.

So far, the US and Canada have accounted for 50 per cent of Orion's orders. Another 35 per cent come from Europe, with the UK, France, Germany and Switzerland leading the way. Asia fills out the rest of the deals. "We have a wide variety of accounts," Hunter said. "There are many millions of dollars worth of deals out there in various stages."

You can have a look at the old 96-noder here. ®

Related stories
Engineers welcome era of 96-chip PCs
RLX quits blade server biz, whacks most of staff
Grizzled blade server vet shows 64-bit kit
Ashlee Vance, 02 May 2005

HP becomes EMC software reseller in $325m settlement

EMC and HP have devised one of the weirder patent settlements you'll ever see to get rid of a four-year-old dispute over storage intellectual property (IP).

On the surface, the deal seems pretty straightforward: HP will pay EMC $325m over the next five years. Such an agreement, however, would be too simple for these vendors, who have been squabbling over storage patents since 2000. In actuality, HP will pre-pay $65m to EMC at the start of five consecutive "purchase periods", otherwise known as "years". During these periods, HP can hand EMC cash or it can put that money toward the purchase of EMC's hardware, software and services. HP can use the EMC gear it buys in-house or resell it.

Many of you will see where this leads. HP is basically being, er, encouraged to buy software such as EMC's VMware server partitioning products or Legato backup products and wrap these applications with its own hardware sales. Such a move opens a new channel to EMC and possibly a longer-term partnership with HP.

"By expanding our relationship with EMC's various software divisions, HP will be able to deliver a more formalized approach to selling these solutions, and explore new ways to integrate and leverage our complementary offerings," said HP vice president Joe Beyers.

Also on the friendly front, the two vendors have signed a five-year patent cross-license agreement. This agreement, along with the settlement, erases three separate lawsuits between the vendors. Their dispute dates all the way back to a 2000 lawsuit over replication software patents between EMC and StorageApps. HP later acquired StorageApps and, in 2001, inherited the EMC lawsuit. HP went on to countersue EMC but appears to have lost out in the end.

EMC and HP already do business together. HP has been one of VMware's most consistent customers, and HP likely resells EMC hardware or software from time to time when it needs to close a large deal. It's unlikely, however, that HP sells $65m worth of EMC kit per year. So, EMC's software unit has won itself a major hardware partner. HP can either give money straight to EMC or find a way to sell EMC's software to mutual customers. Which option do you think it will pick?

The folks at Veritas must be seething just a bit about this settlement. Last year, HP agreed to use Veritas's file system and volume manager software at the core of its HP-UX operating system. This deal seemed to put Veritas in prime position to sell HP customers lots of storage software. Now it will have to compete with EMC on at least $65m worth of deals. ®

Bootnote

Earlier this year, HP and EMC agreed to settle this matter out of court via private arbitration. For those interested in the real fine details of the deal, here's a note from an 8-K filing HP made about the agreement:

"If EMC purchases HP products or services during the Purchase Periods, HP will be required to make an equivalent amount of additional product or services purchases from EMC of up to an aggregate amount of $108 million over five years, with caps for each Purchase Period as follows: $10,830,000 for the first Purchase Period, $21,660,000 for each of the second, third and fourth Purchase Periods and $32,490,000 for the final Purchase Period. If HP does not make the required amount of additional purchases of EMC products and services attributable to such Purchase Period, HP will be required to pay the difference to EMC. For purposes of computing the amount of HP products that EMC purchases, hardware products shall be deemed to have been purchased for 50% of the actual purchase price."

Related stories
HP and EMC ready to settle ancient storage dispute
HP and Falconstor team on fancy storage kit - report
HP fires patent lawsuit, EMC fires back
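The per-period mechanics are easier to follow as arithmetic. Here is a minimal sketch using only the dollar figures quoted from the 8-K excerpt above; the function names, variable names and the worked example are our own illustrations, not anything from the filing or from HP:

```python
# Illustrative model of the reciprocal-purchase terms quoted from HP's 8-K.
# Only the dollar figures come from the filing; everything else is assumed.

PERIOD_CAPS = [10_830_000, 21_660_000, 21_660_000, 21_660_000, 32_490_000]

# The five per-period caps add up to roughly the $108m aggregate ceiling
# mentioned in the filing ($108.3m, to be exact).
assert sum(PERIOD_CAPS) == 108_300_000

def hp_obligation(period_cap, emc_hardware_buys_from_hp, emc_other_buys_from_hp):
    """HP's required additional purchases from EMC for one purchase period.

    Per the excerpt, EMC's purchases of HP hardware count at 50 per cent of the
    actual purchase price, and the resulting obligation is capped per period.
    """
    counted = 0.5 * emc_hardware_buys_from_hp + emc_other_buys_from_hp
    return min(counted, period_cap)

def hp_cash_due(obligation, hp_purchases_from_emc):
    """If HP's purchases from EMC fall short, it pays EMC the difference in cash."""
    return max(0.0, obligation - hp_purchases_from_emc)

# Hypothetical example: in the first period EMC buys $30m of HP hardware and $5m
# of HP services, while HP buys $8m of EMC software. HP's obligation is capped at
# $10.83m, so it would owe EMC the $2.83m difference in cash.
obligation = hp_obligation(PERIOD_CAPS[0], 30_000_000, 5_000_000)
print(hp_cash_due(obligation, 8_000_000))
```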
Ashlee Vance, 02 May 2005

Inside Sun Labs - the best and the 'bots

Sun showcased a selection of its research efforts at the Computer History Museum in Mountain View last week. Established by Ivan and Bert Sutherland and Bob Sproull in 1990, Sun Labs has an enviable reputation for pursuing difficult problems and investing in long-term research.
Andrew Orlowski, 02 May 2005