15 November 2002 Archive
P.Eng Mail

The vast majority of readers who responded to our story about Canadian engineers objecting to vendor exams conferring "engineer" status support the Canadian stance. The Canadian professional engineers' association has asked Microsoft not to describe Microsoft Certified Systems Engineers - people who have passed the vendor's exam - as engineers. That needs a full P.Eng, they say. Such requirements are common in some American states, too, you tell us, and in Germany, France, New Zealand and South Africa.

"Engineers develop new products and any holder of a P.Eng is legally liable for any damages that may result from using those products, be it a toaster that catches fire or a bridge that collapses. This responsibility to the public good is what sets engineers apart," writes John Kuhne from Toronto. "In comparison, someone who holds an MCSE is merely a technician. They know how to use a specific set of tools, and can troubleshoot computer systems that other, more capable people have designed for them. Calling such a person an engineer truly is an affront."

Mike Dixon thinks it's "completely ridiculous and typical of those idiots at Microsoft to just take any term they feel like and embrace and extend it as if they just invented something new". He adds: "I've worked with a whole lot of 30-day 'MCSE' wonders that couldn't even format a floppy disk."

Frank Shute points out the same code of ethics is upheld by "chartered engineers" (C.Eng): "You only become a chartered engineer after some years of supervision following getting your degree, be it in electrical, civil, mech, mech/man."

"Calling MCSEs engineers is no worse than my washing machine repairman calling himself a Service Engineer (like lots of other repair technicians). Then we have Sales Engineers. The conclusion is always that we need a new word which doesn't have the oily rag associations. Now there's something for your readership to consider...."
writes Rob Clive (CEng MIEE), adding "sorry, couldn't resist it".

Texans must fulfill the Texas Engineering Practice Act (thanks to Andrew Mattei and Tom Tiller for the links), Floridians a similar act (thanks to David Dean), and even in The Beast's home state, it is:-

"...unlawful for any person to practice or offer to practice... engineering... or to use in connection with his name... use, or advertise any title or description tending to convey the impression that he is a professional engineer... unless such a person has been duly registered under the provisions of chapter 18.43 RCW," notes a reader who must not be named. A PE must have four years of supervised training, while:-

"Engineers do not build collapsing structures or exploding machinery then call it a bug and carry out running repairs until the next attempt," notes T Rutherford acidly. "If people who produce software are seeking a generic title to hide their true profession, they could do worse than look to the men who dug the canals and who were given the ironic nickname of navigators, subsequently shortened to navvies. I anticipate a roar of protest from railroad and construction workers everywhere at being bracketed with bunglers and snake oil men."

Closed shop

But a handful of MCSEs object to being victimized by (in the words of Stefan Banda) "pansy-ass P.Eng cry-baby canucks" engaged in some "little semantic warfare".

"I happen to be an 'MCSE' in Canada, and the fact that some damn group is claiming basically copyright on the word "engineer" is complete BS in my opinion. I don't say 'I'm a Microsoft Certified Systems Engineer' - I say 'I have an MCSE'... it's easy," writes Steven Burtt. "They have some pine cone stuck up their butt, quite frankly this is one battle I would like to see M$ win."

The most persuasive letter comes from Herman Oosthuysen.
"I'm an Engineer with a degree from an accredited university which I earned almost 20 years ago: Bachelors in Engineering (B.Eng.(E)) and have been a member of the Institute of Electrical and Electronic Engineers (IEEE) for more than 10 years, but according to Canadian law I may not call myself an engineer. "With this 'non engineer' engineering status, I happened to have worked on military contracts in 3 countries, for different governments, including Canada, but I may not call myself an engineer. "According to the Canadian engineering associations, Eiffel, who built many roads and bridges all over the world, including such fun projects as the Eiffel tower and Statue of Liberty, may not have called himself an engineer either. "The trouble is that the Canadian professional associations, have laws behind them, which supposedly protect the 'public', but in practice, it is merely a bunch of guys who want to protect their own jobs - trade unions, by a professional name, who hijacked the word 'Engineer'. Also, the Engineering Professions acts are in conflict with for instance the Universities Acts. Try to tell a University Professor that they may not award a degree in 'Software Engineering' and see the sparks fly... "As I have a distinct dislike of all things union, I will stay one of the many renegade non-engineer, engineers. It never bothered me in my job and there may very well be more 'non-registered engineers', than 'professional engineers' and the more the professional trade unions rally and rave against non-registered engineers, the more ridiculous they look, so they are digging their own hole. I have considered calling myself a "professional non-engineer" and ask the IEEE to rename itself to the "Institute of Electrical and Electronic non-Engineers"... So, as far as I'm concerned: "Go Microsoft! Up and at them!". Well, Herman - you're obviously entitled to call yourself an engineer. 
But the point about professional trade organizations and unions is that if you don't hang together, you'll hang separately. Jeremy Silver agrees with the last correspondent. "The CCPE is claiming sole rights over the term 'engineer' - much as a copyright holder would. Is every use of that term granted to them by the Canadian government? In the US, I am pretty sure the AMA and ADA do not claim sovereignty over the term 'doctor', which can apply to people with professional as well as academic credentials (from theology to physical education). Maybe the CCPE's problem is that the MCSE is more widely recognized than the P.Eng."

Several MCPs say Microsoft was only following the example set by Novell, and one offers this tale:-

"Now back then, when Microsoft Product Engineer actually meant something, those people were/still are, damn fine engineers, who could build servers, design networks, find bottlenecks, fix problems, roll out a secure infrastructure. Those people are still with us."

"Now, are the exams hard enough to classify someone as an Engineer? NO. Sat down in front of my first NT4 Workstation with a problem, might as well have used the book / exam, as a paper weight," he confesses. He describes an adaptive TCP/IP exam "so the more wrong answers you get, the easier the questions you get."

"Would I call myself an Engineer? Maybe now, with 3 good years employment in the industry.

"Would I call others with MCSE, an Engineer? Some of the people I've had in to do work, have quite literally, shocked me. Crap. Bona fide crap. But they got those letters after their names."

One "Thingy" writes:- "I busted my ass to be a thingy. No boot camp, no dump memorization. I know my shit because I studied and practiced. I am a Business Grad as well. I've seen College engineers who can walk the talk and those that can't and the same is true for MCSE's. There needs to be a vendor/university neutral evaluation of computer/network engineering skills.
Fining those who go after M$ certifications is a different approach to straightening out the certification process."

"Before I started down the M$ trail I knew it was more about $ than supporting the best OS that could be made. So what about them there college bred Canadian Engineers? Are they not chasing $ like us Thingys? Look in the mirror when you answer," writes Joe Fohner, who adds: "I don't know if I'll call myself an "Engineer" but I will call myself an MCSE for now. At least until it expires. Then I'll be a Thingy for sure."

A witty rejoinder from Steven Franklin, "Senior Thingy" at Maryland Public Television, who has rolled his sleeves up for the doity work:-

"You learn something new everyday. I'm both a Micro$oft Certified Systems Engineer (a weird one who uses nothing but Linux and FreeBSD) and a Certified Video Engineer (by the Society of Broadcast Engineers, http://www.sbe.org). I thought I was an engineer but I'm glad to find out that I'm just a "thingie."

"This means that all I really need to keep complex television systems on the air is an oily rag and a set of spanners. That should make the digital conversion go much smoother in this country. I'll notify the FCC at once that they should change all mention of television engineers in their documents to "thingie" and issue some spanners and oily rags. Who knows, it might help "lubricate" the switch to digital. Nothing else is working so why not try this?

"I'd elaborate further but our broadcast automation systems are not working correctly with our GPS time system and I need to wipe it down with the oily rag. (If that doesn't fix it I'll give it a jolly good whack with a spanner.)"

Finally one reader, Ralph Grabowski, says that a former Chinese engineering classmate was called P.Eng, and wonders if his business card reads P.Eng, P.Eng; while Jan Van Der Post says:- "I always wanted to become a member of the Meat Institute so I could put MInst Meat after my name."

Groan. Thanks for all your letters.
®
According to a report published November 12 by Aberdeen Group, "Security advisories for open source and Linux software accounted for 16 out of the 29 security advisories - about one of every two advisories - published for the first 10 months of 2002 by Cert (www.cert.org, Computer Emergency Response Team)."

Aberdeen says Microsoft products have had no new virus or trojan horse advisories in the first 10 months of 2002, while Unix, Linux, and Open Source software went from one in 2001 to two in the first 10 months of 2002; that in the same 2002 time period "networking equipment" (operating system unspecified) had six advisories; and Mac OS X had four. In other words, all except Microsoft had increases in reported vulnerabilities this year.

"Contrary to popular misperception," the report says, "Microsoft does not have the worst track record when it comes to security vulnerabilities. Also contrary to popular wisdom, Unix- and Linux-based systems are just as vulnerable to viruses, Trojan horses, and worms. Furthermore, Apple's products are now just as vulnerable, now that it is fielding an operating system with embedded Internet protocols and Unix utilities. Lastly, the incorporation of open source software in routers, Web server software, firewalls, databases, Internet chat software, and security software is turning most Internet-aware computing devices and applications into possible infectious carriers."

The report lauds Microsoft for having overhauled its development process in an attempt to fix security problems, and says, "Perhaps it is time for some of the suppliers of open source and Linux software to take similar measures." (You'll need to register with Aberdeen to read the rest of the report -- it's one of their free ones -- but I believe I've covered the Linux-relevant high points here.)
And yet, here I sit with my virus-free, trojan-free Linux box, receiving tons of viruses and trojans from Windows users (that don't affect me), watching news item after news item about sites run on Windows servers getting defaced and broken into. According to what I've heard from my many sysadmin and network security specialist friends, no OS or network-connected software is secure unless it's administered properly and security patches are applied as soon as they are available.

And then, after I started writing this story, a ZDNet article with the headline Linux utility site hacked, infected came across my monitor, and I started wondering, "What if these Aberdeen people are right? What if this isn't just Microsoft-sponsored nonsense?"

A look at CERT's 2002 Advisories and Incident Notes pages was not overly reassuring. Yes, I saw some Microsoft vulnerabilities there that Aberdeen apparently missed, and one for Oracle. I also think we have enough Microsoft viruses left over from last year that we don't need any new ones this year.

But the real issue is that we all need to be more security-conscious. The Aberdeen report points out that the system with the most reported vulnerabilities can change from year to year, but that the overall vulnerability and incident trend is up. Way up. In other words, whatever operating systems we use, we all need to watch out more for security flaws than we have in the past, and work harder to protect ourselves from them. © Newsforge.com

Related stories
Linux security self-censorship ominous
Linux update withholds security info on DMCA terror
Linux attacks on the rise?
The World Wide Web Consortium (W3C) has released a standard that will make it easier for Web pages to be viewed on a variety of devices. The Internet standards organisation said it was recommending its XForms 1.0 specification as an upgrade from HTML forms.

Ten years after their introduction, HTML forms can no longer cope with the array of devices such as mobile phones and PDAs that are being used to access the Web. As such, said W3C, XForms presents a more flexible option. According to the consortium, this is because XForms allows form authors to choose the mark-up language of their choice. Being able to pick from XHTML, SVG (Scalable Vector Graphics) and XML will make it easier for implementers to deliver functionality to users and devices that was previously not possible, said W3C.

"Practically speaking, XForms technologies make it possible to use forms from a PDA, mobile phone, screen reader or conventional desktop device without loss of functionality for the end-user," remarked Steven Pemberton, co-chair of the W3C XForms working group.

W3C went on to say that splitting the traditional HTML form into the three-part XForms model means that XForms modules can be re-used independent of the information they collect, user interface controls can easily be re-presented on different devices with different capabilities, and the separation of presentation from content leaves information more readily available for users of assistive technologies.

The new standard is based on XML, which allows information to be moved between different devices and is rapidly replacing HTML for the building of Net-based platforms. Doing this, said W3C, will make it easier for developers to feed form data into databases and other applications.
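The three-part split W3C describes can be sketched in markup. Below is a minimal, hypothetical XForms 1.0 fragment hosted in XHTML: the model (instance data plus a submission) lives in the document head, while the controls in the body merely bind to that data. The order structure, field names and submission URL here are invented for illustration.

```xml
<!-- Hypothetical order form as an XForms 1.0 sketch; element and
     namespace names follow the XForms 1.0 Recommendation, but the
     instance data and URL are made up for this example. -->
<html xmlns="http://www.w3.org/1999/xhtml"
      xmlns:xf="http://www.w3.org/2002/xforms">
  <head>
    <!-- Part 1: the model - data and submission, kept out of the UI -->
    <xf:model>
      <xf:instance>
        <order xmlns="">
          <customer/>
          <quantity/>
        </order>
      </xf:instance>
      <xf:submission id="send" method="post"
                     action="http://example.com/orders"/>
    </xf:model>
  </head>
  <body>
    <!-- Part 2: abstract controls bound to the instance data;
         each device renders them in its own idiom -->
    <xf:input ref="customer"><xf:label>Customer</xf:label></xf:input>
    <xf:input ref="quantity"><xf:label>Quantity</xf:label></xf:input>
    <!-- Part 3: presentation stays separate - a desktop browser,
         phone or screen reader presents the same intent natively -->
    <xf:submit submission="send"><xf:label>Order</xf:label></xf:submit>
  </body>
</html>
```

Because the controls carry only intent (an input bound to `quantity`, a label) rather than widgets, the same form can be re-presented on devices with very different capabilities, which is the point Pemberton makes above.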
Although Sean McGrath, chief technology officer of Irish XML company Propylon, said the specification was long overdue and would make it easier for Web pages to be displayed on different devices, he cautioned that the lack of Microsoft backing for XForms may restrict its widespread adoption.

"XForms will need to be bundled directly into the browser if it is to be well supported, but the problem is that Microsoft has developed its own specification, XDocs. Without Microsoft on board, it will be difficult for XForms," commented McGrath.

However, Dr Bob Sutor, director of Web services strategy at IBM and one of the editors of the new specification, said that XForms has a great chance of becoming the standard, open, non-proprietary technology that will help people access information online on any device. "This is an important step toward establishing a true electronic forms standard which will be critical for cross-industry interoperability," said Sutor in a statement.

The W3C XForms working group includes the likes of Adobe, AOL/Netscape, Computer Associates, IBM and Xerox. The W3C is an international industry consortium that develops interoperable technologies with the aim of improving the Web. © ENN
With tech budgets under intense scrutiny and vendors waiting with bated breath for a surge in spending, there's probably never been a better time to look at project success.

According to a recent poll of IT directors from medium and large businesses, individually they will spend £37.7 million a year on some 45 annual IT projects for their business, but 80 per cent of them don't believe that these solutions will provide a competitive advantage to their firm. There is an obvious question that begs to be answered: what is the definition of competitive advantage? Does it include profitability and efficiency gains, for instance? We don't know, and the study, undertaken by Winmark Research, doesn't seek to answer it.

Putting that aside, CIOs still aren't very confident about their work. 80 per cent of them expect that their email and intranet systems will crash when they launch a new application, and only 6 per cent think that the rest of their critical systems are safe from failure in that situation. It's a sorry state of affairs, but it seems that CIOs are fairly pessimistic about the likelihood of failure with their projects and systems. 65 per cent of those asked said that their projects achieve less than 75 per cent of their expected value. This could be because of a lack of understanding of how to measure that value, of course - those contentious soft benefits can still irk.

Failure to reap all of the benefits from a project and failure of an entire system are completely different things. But each IT director polled said that they thought the average cost of system failure to their business was approximately £129,000 annually. That's just the cost of the failure itself, and doesn't include potential lost revenues or damage to reputation. For the IT directors, it's the lost revenues that concern them.

The study needs to be taken with something of a pinch of salt: it was commissioned by a change management software firm, Serena.
Still, it highlights the problems and concerns to be found amongst tech buyers. IT projects have always been a controversial subject for the business. High failure rates, project over-runs, budget slippage - you name it, the tech project has had it. In today's tough economic climate, it seems IT directors are still unsure about their returns, and until that improves, spending will remain subdued. © It-Analysis.com
"Where's Larry?" Carly Fiorina asked at the opening of her OracleWorld keynote speech yesterday. Her question was meant in the spirit of metaphor than genuine enquiry, though, Gavin Clarke writes. That's because Redwood Shores, California-based Oracle's chief executive had been - as OracleWorld delegates knew - ensconced aboard his yacht and chasing the Americas Cup around New Zealand, rather than attending his company's annual user conference. Fiorina chose to answer her own question, rather than wait for a heckled answer. Larry was indulging his other great love - the Americas Cup, she said. His first love, incase you didn't know it, is Oracle. Seconds later the full glory of Fiorina's majestic metaphor came sailing into plain view: "Why do I ask that question, other than to explain his absence," she asked with a slight hint of disapproval in her voice. "It's because HP put the wind in Larry's sails." That wind, apparently, were the HP servers, PCs and laptops used to run software - from Oracle and other vendors - that helped engineers to design Larry's yacht. And, in case you didn't know it, that hardware - and HP's accompanying software portfolio which includes OpenView for systems management - can also put wind into the sales of your business. Fiorina's metaphor had berthed in all its glory. It's arrival helped to divert attention from this week's sudden resignation of company president and former Compaq Computer Corp chairman and chief executive Michael Capellas. In her speech, Fiorina, chairman and chief executive of HP, failed to address Capellas' jumping overboard. Instead, Silicon Valley's most powerful woman executive instead embarked on a 45-minute pitch for HP hardware and software, emphasizing how her company's end-to-end product portfolio can help reduce customers' total cost of ownership (TCO) and return on investment (ROI). 
Fiorina informed Oracle's users, gathered in San Francisco, California, that prevailing economic conditions meant TCO and ROI are more important to them than technology innovations. Recycling a speech she gave analysts last month, Fiorina said: "Customers are no longer after the fastest, hottest box, the next killer application or the next coolest piece of technology. Customers are focused on ROI, real value and how do I deal with ongoing cost of ownership."

Fiorina said she believed the focus on ROI and TCO is forcing the industry's shakeout, as customers seek to wring the last drop of value from their IT. As such, she added, the shakeout is not your typical cyclical economic event and will have long-lasting consequences. "People misunderstand and still think this has been driven by cyclical economic requirements, but this has long-lasting structural changes for the IT industry," she said.

Attempting to ensure HP's ship remained afloat despite the drying sea of IT investment, Fiorina fired a broadside at her Palo Alto, California-based company's competitors. Defending Superdome and HP-UX 11i in the wake of comments made against Unix systems by Michael Dell earlier in the week, she said: "No, I don't think Unix is dead. Unix is not growing as fast as Linux or NT, but it's clear there's a real requirement for high-end systems."

During an OracleWorld keynote speech earlier this week, Dell consigned "proprietary" Unix hardware and software to the dustbin of history. He instead championed Linux and Oracle running on his company's Intel Corp-based servers. "We respect Dell as an example of very efficient distribution... but in real innovation and service, we have the real advantage," Fiorina said.

Fiorina said HP also beat systems rival Sun Microsystems Inc. Santa Clara, California-based Sun's chief executive Scott McNealy is renowned for jibing at HP during his own keynote speeches.
"Sun can be innovative, but when it comes to standards-based approaches, ability to implement and their customer support... we beat them," she said. She finished by saying that some people had mistakenly compared her company with HP's systems and consulting competitor IBM. "We are not trying to emulate IBM. You can look at IBM, and our product portfolios are different in many areas. They are doubling down in areas we haven't. In services they bought PriceWaterhouseCoopers, where we prefer to partner," Fiorina said, overlooking HP's own failed attempt to buy PCW for $20bn in November 2000. © ComputerWire
The future is bright, the future is Oracle Corp - and other large software vendors - which will offer increasingly diversified product portfolios, company chairman Larry Ellison said yesterday. The future does not consist of small or niche vendors who, Ellison claimed, will struggle against drying IT budgets and industry consolidation.

Ellison singled out Ariba Inc, Commerce One Inc, I2 Technologies Inc and Siebel Systems Inc as companies which he said will "never come back". These companies are each struggling with quarterly or annual losses and lowered revenues. Oracle, though, has not been immune from the spending slow-down: the company last month reported first quarter sales of $2.03bn, down 10.5% year on year. Ellison is likely to have singled out Ariba, Commerce One, I2 and Siebel because they are the companies which compete with Oracle in customer relationship management (CRM) and other areas of e-business software.

In response, Ariba CEO Bob Calderoni said his company has been winning deals in Oracle's "back-yard" while rejecting Ellison's claims of Ariba's demise. "Larry said the same thing about IBM ten years ago and all IBM has done since then has taken significant market share from Oracle in their core business. He has been wrong for 10 years and I expect to keep that streak alive," Calderoni said in a statement. Commerce One, I2 and Siebel were unable to provide comment by time of going to press.

Ellison made the predictions while noting that the industry is suffering from a post-bubble correction in stock market valuations and IT spending levels. "The computer industry is maturing, and an awful lot of companies will never come back," he told delegates via satellite at his company's annual user conference in San Francisco, California. Ellison chose not to attend the event in person, as he was in Auckland, New Zealand, participating in the America's Cup race.
Larger software vendors, he said, will fill the vacuum by offering broader and richer product portfolios. Ellison claimed Oracle, for example, has successfully plugged holes, such as order automation, in its 11i eBusiness suite, while the 9i Application Server includes advanced features like clustering. "There will be much smaller number of companies with more diversified product lines. That's what Oracle has attempted to become by offering a rich set of functionality in a smaller number of products," Ellison said.

Ellison, adept at public speaking, appeared to struggle only during the question and answer session following his OracleWorld keynote, when one attendee asked him to explain his reasoning for participating in the America's Cup instead of attending OracleWorld. Ellison chose not to answer, and instead talked about whether he'd been on the joint Oracle/BMW racing boat. © ComputerWire
Deutsche Telekom AG has written €18bn ($18.2bn) off the value of its US cellular operation as it paid the price for its previous management's reckless acquisition spree, and posted a record loss for a European company of €24.5bn ($24.7bn) for its first nine months. As expected, the German incumbent has appointed Kai-Uwe Ricke as CEO, despite fears that the former COO was too much a protege of former boss Ron Sommer to give the company the radical change in direction it needs.

Deutsche Telekom has ruled out the sale of the assets of VoiceStream Wireless Corp, for which it paid $34bn in 2001. Instead, it is still pursuing a possible merger with AT&T Wireless or Cingular Wireless LLC as a way of stemming its losses in the US. Ricke said: "We do not want to, nor will we, remain the number-six in the US mobile communications market in the long term. We view the possibility of a merger, not as a means of reducing debt, but rather as an opportunity to maximize the value which currently resides in T-Mobile USA."

After looking at the shrunken value of its assets, Deutsche Telekom has written off €22bn ($22.2bn) in total, leaving its figures in a dismal state. For the nine months to September 30, it posted a net loss of €24.5bn ($24.7bn), up from a loss of €1bn ($1bn), on revenue up 12% at €39.2bn ($39.5bn). It expects to close the sale of its cable television assets in the first quarter of 2003, according to chief financial officer Karl-Gerhard Eick.

The company's major priority is its net debt of €64bn ($64.6bn), which it aims to reduce to between €49.5bn and €52.3bn ($49.9bn and $52.8bn) by the end of 2003. The immediate prospects for raising cash are the sale of its remaining cable interests and non-strategic interests such as real estate, which will raise between €6.2bn and €8.5bn ($6.3bn and $8.6bn). In further bleak news for equipment suppliers, Deutsche Telekom will cut capital spending next year.
Though job cuts are a sensitive issue in recession-hit Germany, Deutsche Telekom plans to take the axe to its 254,806-strong workforce, and 42,500 jobs are to go in Germany, as well as 12,200 at subsidiaries overseas. Ricke said that a properly managed Deutsche Telekom should be a "cash machine" and its future lies in debt-reduction and growth. An enormous task lies ahead as the company switches direction from expansion overseas to squeezing more value out of its core operations. © ComputerWire
Dell Computer Corp set aggressive targets for the fourth quarter yesterday as it delivered on its earlier raised guidance for the third quarter. The Round Rock, Texas-based company turned in sales of $9.1bn for the quarter ending November 1, up 22.5% on the year. Operating income was up 39.3% to $758m, while net income was up 31% to $561m. This resulted in earnings per share of $0.21. The figures chimed in with the raised guidance Dell gave at the beginning of October. As usual, the company contrasted its success with the industry as a whole, saying industry revenues were flat to slightly down, and that the company had gained 2.3% of share, while the rest of the top ten had lost more than two points of share combined. Looking ahead, the company said it expects fourth quarter revenue to be $9.7bn, up 20% on the year. Net earnings per share should be $0.23. Wall Street currently expects $0.22 per share on revenues of $9.5bn. Despite Dell's raised guidance, CFO Jim Schneider was downbeat on the state of the market as whole. "It's too early to call a rebound in IT spending," he said. "Demand hasn't changed. It's stable in the US, but some other markets show softness." He added that the company had seen an increase in aggressive pricing from its rivals. President and COO Kevin Rollins added that while there were a lot of new technology trends that customers would like to buy into, "Technology transitions are not the issue in stalling or picking up growth. It's an economy and capital spending issue." For the year to date, Dell's sales were up 11.1% at $25.7bn. Net income for the year so far is $1.5bn, up 92.2% on the year. © ComputerWire
If there is one thing that Seattle-based Cray Inc wants to do besides make a lot more money in the supercomputing market, it is to live up to the engineering genius of Seymour Cray, arguably the best HPC computer designer and visionary the world has ever seen, writes Timothy Prickett Morgan.

A Cray computer used to be a serious sign of status among spook agencies, government research facilities, and those few commercial entities that could afford such exotic gear. The world has changed a lot since the heyday of the vector supercomputer two decades ago, but the corporate and national pride that comes from having a unique approach to supercomputing that gives one company an edge over others is one of the constants in the computer business. If the Cray X1, which was announced on schedule yesterday, performs as designed - and the early indications seem to be that it indeed does - then the battered company that is an amalgam of the former Cray and Tera Computer will once again be able to stand tall, chase hundreds of millions of dollars of HPC budgets, and live up to the high expectations that Cray, the man, always had for Cray, the company.

To be sure, Cray does not have all of its eggs in one basket. It is designing a 40 teraflops Linux cluster called "Red Storm" for Sandia National Laboratories, based on AMD 64-bit Hammer processors, that is supposed to be able to scale to 100 teraflops. It also has a reseller agreement with Japanese rival NEC Corp to sell its SX-6 vector machines in North America, a reseller agreement with Dell Computer Corp to peddle Linux-based clusters based on Dell's PowerEdge servers, and has the multithreaded Tera MTA and the Cray T3E massively parallel supercomputer as alternative platforms.
But the X1, formerly known as the SV2, is the crown jewel, and it is the platform on which Cray is staking its reputation as it promises to build a system capable of delivering petaflops - that's millions of gigaflops - of aggregate processing power by 2010.

The Cray X1 is a massively parallel supercomputer that is based on a variant of the vector processors that Cray is famous for. The X1 marries a MultiStreaming Processor (MSP) - Cray's name for a collection of CMOS-based vector processors that are linked to create a virtual and much more powerful vector processor - to a distributed shared memory architecture. Each X1 cabinet has four MSP nodes, each comprising four 800MHz processors, and from 64GB to 256GB of memory that is distributed to each of the processors. This memory is actually Rambus DRAM, manufactured by Samsung. Each 800MHz processor in the MSP is rated at 12.8 gigaflops, which suggests that each processor can process a peak 16 floating point operations per clock cycle. The I/O subsystem in the X1 is based on Sun Microsystems Inc's Sun Fire 6800 servers, oddly enough, and that I/O subsystem was tested more than a year ago at the Ohio Supercomputing Center at Ohio State University.

Each cabinet has a rating of about 205 gigaflops, and Cray says pricing starts at $2.5m for this base machine. A typical X1 configuration is expected to cost from $5m to $40m. A fully loaded X1 machine has 64 such cabinets, 1,024 MSPs, and 4,096 processors, with from 16TB to 64TB of shared memory. Such a monstrous Cray X1 would be rated at a whopping 52.4 teraflops of computing power, and if such a machine were built today, it would be the fastest supercomputer in the world in terms of peak processing power and probably in terms of actual processing power, since Cray's whole point with the X1 was to design a parallel supercomputer that was not as inefficient as RISC/Unix machines lashed together with high-speed switches.
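The stated peak ratings multiply out as a quick back-of-the-envelope check shows; the figures below are the article's quoted peak numbers, not measured performance.

```python
# Back-of-the-envelope check of the peak figures quoted for the Cray X1.

clock_hz = 800e6          # 800MHz processor clock
proc_flops = 12.8e9       # 12.8 gigaflops peak per processor

# Peak floating-point operations per clock cycle per processor
flops_per_cycle = proc_flops / clock_hz           # 12.8e9 / 800e6 = 16.0

# One cabinet: four MSP nodes x four processors = 16 processors
cabinet_gflops = 4 * 4 * 12.8                     # 204.8, i.e. "about 205 gigaflops"

# Fully loaded machine: 4,096 processors
full_teraflops = 4096 * 12.8 / 1000               # 52.4288, i.e. "52.4 teraflops"

print(flops_per_cycle, cabinet_gflops, round(full_teraflops, 1))
```

These are aggregate peak rates; as the article notes, Cray's argument is about sustained efficiency relative to clustered RISC/Unix machines, which no peak figure captures.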
Cray says that its system interconnect is faster than the alternatives in the Unix MPP market, but did not provide specifics on how it accomplished this feat. What it does say is that the interconnect is based on a modified 3D torus topology that has 400GB/sec of aggregate bandwidth on a 16 node, 64 processor X1 configuration. Each MSP is equipped with a system port channel, which has a 1.2GB/sec peak I/O bandwidth. Peak memory bandwidth is 38.4GB/sec and peak processor cache bandwidth is double that at 76.8GB/sec. A fully loaded X1 machine would reportedly cost in the range of $200m to $300m, which is a premium compared with Unix-based MPP machines like IBM Corp's pSeries 690 and HP's AlphaServers, but Cray is clearly expecting the actual performance of the X1 to justify the higher price. Five X1 machines have already been tested by various customers, including the U.S. government. Just last week, Cray won an $8.4m multi-year order from Spain's National Institute of Meteorology (INM) for a Cray X1, which will increase that country's weather forecasting processing capabilities by a factor of 255 when the X1 machine is installed in mid-2003. In the meantime, INM is taking a placeholder SV1. Cray says that it will ship the X1 machines to customers before year's end, and that it expects the machine to contribute mightily to its 2003 financial results. © ComputerWire
Looking for ways to increase the amount of revenue it gets from subscriptions and reduce its exposure to the volatile advertising market, Yahoo! Inc yesterday started to offer a paid-for version of its previously free web-based email service. Yahoo! Mail Plus has been priced at $29.95 per year. Users get 25MB of storage, the ability to send up to 10 attachments of up to 10MB total, POP access, hard-drive archiving and extra filters. © ComputerWire
Bill Gates' schmooze-cruise of India is working, reports an Associated Press eye-witness. Gates has been "handing out so many freebies to India's federal and state governments in the last three days that talk of open-source software [has] started annoying government officials." Step forward annoyed Karnataka state information secretary Vivek Kulkarni, who says: "You should not make accusations against a company because it is successful." Richard Stallman was in India earlier in the month arguing that India should treat free Windows like free cigarettes, but he does not seem to have been entirely successful. Karnataka's state government in Bangalore has just been given free .NET software to be used for e-government systems, and according to AP promptly asked for more money in order to computerise the state. Gates also announced a project to provide broadband for state schools, and provided a possible justification for Microsoft's largesse. Microsoft's prices could be "dramatically lower" for socially relevant projects. This perhaps is intended to provide some moral underpinning for Microsoft's recent and growing use of 'donateware' to build its presence in government. Not, of course, that the donateware programme is confined to the developing world - au contraire, and if people are willing to take the freebies, the only thing rivals can do is try to out-freebie Bill. The Indian tour has not been without its surreal moments. When we saw "Gates honored with big condom" on CNN last night, we assumed they'd been hacked. But no, it's there in all its splendour at Reuters, which originated the story, and Bill apparently smiled when he saw the giant air-filled condom in India's rising technology hub of Hyderabad. Words don't entirely fail us, but frankly, who needs them? ®
More and more spouses are blaming the Internet for the break-up of their marriages. Two-thirds of lawyers meeting at an annual conference in Chicago said the Internet has played a significant role in divorces they had handled during the past year. Meeting a new lover online and an "obsessive" interest in pornography were the top two problems cited in many Internet-related divorce cases. Other reasons that have led to the breakdown of marriages include excessive use of the Net and chat rooms. "The computer is a great communications device. But spouses need to remember to communicate with each other as well," said J Lindsey Short Jr, president of the American Academy of Matrimonial Lawyers. While the Net is being blamed for some divorces, people's activities online are also being used to support a case against a spouse. For instance, almost 80 per cent of those attorneys questioned said that incriminatory e-mails had been part of divorce proceedings, while 65 per cent said computer and financial spending records had been incorporated into divorce records. "While I don't think you can say that the Internet is causing more divorces, it does make it easier to engage in the sorts of behaviours that traditionally lead to divorce," said Short. ® Related stories Randy vicar exposed by emails Text messaging used to trap unfaithful partners SMS a sin, say Indian protestors I divorce thee, I divorce thee, I divorce thee e-Pen signs death warrant for bigamy
It's good to see that the UK's 1901 census site is finally up and running after its considerable teething problems. True, it's still in test mode, but there is already a wealth of fascinating information to be gleaned from this online resource.
Three in ten wired homes in the US access the Net using a broadband connection. So says a survey by Dataquest, which found that over the last two-and-a-half years the rate of broadband Net use in the US has nearly tripled. This works out at a growth rate of 9 per cent a month at a time when the total number of US households accessing the Net - either by broadband or dial-up - grew by just 1 per cent a month. The report, US Mass Market Loves Broadband More Than Ever, found that high speed Net access is a growth market despite the economic slowdown and the relatively high cost of DSL and cable modem services. Furthermore, there is still untapped demand for broadband in areas currently not being served by DSL or cable. As of June 2002, cable modem and DSL together represented almost 90 per cent of the domestic broadband access market in the US, up from 70 per cent of broadband access in 2000. This, claims the report, is due to a decline in use of ISDN services. Although DSL is gaining in popularity, cable modem services account for more than half of broadband services in the US, said the report. ® Related stories Cable modem users top 10m in US Half of US Net users to have broadband by 2004
Roxio is to buy the assets of Napster, the dead P2P music-swapping firm. In other words, it's not taking on liabilities or Napster's litany of legal battles with the music majors. The deal is subject to approval from the bankruptcy court and, presumably, to legal challenges from creditors and, err, lawsuitors. Roxio is offering $5m in cash and warrants to buy 100,000 shares worth another $300k, rather less than the $8m offered by Bertelsmann a few months ago. The Delaware bankruptcy court rejected that offer, agreeing with the fierce resistance from the creditors/lawsuitors. So what's in it for Roxio, best known for CD burning software? Well, the maker of Easy CD Creator is, it tells us, "the Digital Media company®". And it has a "strategic vision of how Napster will expand Roxio's role in the digital media landscape and enhance our offerings to consumers". In September, porn giant Private Media made a $2.4m stock offer for the Napster name. Sounds like Napster as a paid-for download distribution mechanism for burned CDs. But we're jumping the gun - all will be revealed after the deal closes (expected November 27). ® Roxio press release Related stories Porno firm wants Napster Bertelsmann to pull Napster plug Bertelsmann saves Napster - The Napster phenomenon
Security concerns are hampering the roll-out of remote access, particularly to those working for smaller firms. A survey from In-Stat/MDR, released this week, found companies are evenly split, more or less, between those who allow remote access to the corporate LAN and those that do not. In-Stat/MDR notes that larger companies are more likely to allow remote access than smaller concerns. Lack of need and security concerns were the main reasons stated for barring all remote access. More than one in three (38.1 per cent) of those quizzed cited security concerns as the primary reason why they balked at allowing any type of broadband remote access, whether from home or a public access location. Among the select few companies which allow public broadband access, hotels are the most popular venues. Airports were the next most popular location, In-Stat/MDR found. Mobile phone companies were the top choice among panellists as providers of hot spot access. Amy Cravens, an analyst with In-Stat/MDR, said the survey showed that suppliers need to "appease security concerns among corporations as well as develop a broader base of individual subscribers" to demonstrate demand for remote access services. Providers would do well to market remote access services to corporates - where more stable relationships can be established than is possible with the public access market, she advises. Copies of In-Stat/MDR's report Managing a Mobile Workforce: Broadband Access Policies for Remote Workers can be ordered here. ®
IBM today launched an ultra dense Unix server targeted at the high performance computing and supercomputer markets. The eServer p655 packs 128 POWER4 processors in a single rack and is available in four or eight processor building blocks. In its maximum configuration it delivers a peak performance of half a trillion operations per second. Four-way versions of the p655 are expected to begin shipping in volume later this year. The p655 will run the AIX 5L operating system, including Version 5.1, and Linux. IBM anticipates that one or more Linux distributors will support 64-bit Linux in the first half of 2003. The eServer p655 is designed for applications that require very large databases or massively parallel processing, typically needed by companies in sectors like digital media and life sciences. IBM also positions the eServer p655 as a suitable platform for running business intelligence applications. IBM says the eServer p655 offers higher density and superior price/performance compared to alternatives such as Itanium 2 server clusters. A single eServer p655 rack with 128 POWER4 processors occupies only one-sixth the floor space of an HP rx5670 Itanium 2 system with the same number of processors. Additionally, a four-way eServer p655 with 1.3 GHz POWER4 processors has a SPECfp_rate2000 of 51.7, offering 15 per cent better throughput than a HP rx5670 with four processors. In a measurement of sustained memory bandwidth, the eServer p655 hits almost 2.5 times the peak theoretical memory bandwidth of the HP Itanium 2 systems (6.4GB/sec) in a tuned version of the Stream benchmark. eServer p655 systems support clustering and logical partitioning. The product also incorporates IBM's Chipkill and bit-steering memory high availability technologies. Prices for the four-way version of the eServer p655 start at £57,359. 
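Two of those comparison figures are easy to unpack with back-of-envelope arithmetic. The implied numbers below are derived from the claims as quoted, not from published HP results:

```python
# Unpack the p655-vs-rx5670 comparisons quoted in the article.
p655_specfp_rate = 51.7          # four-way p655, SPECfp_rate2000

# "15 per cent better throughput than a HP rx5670 with four processors"
implied_rx5670_rate = p655_specfp_rate / 1.15   # roughly 45

# "almost 2.5 times the peak theoretical memory bandwidth of the
# HP Itanium 2 systems (6.4GB/sec)" in the Stream benchmark
itanium2_peak_gb_per_s = 6.4
implied_p655_sustained = 2.5 * itanium2_peak_gb_per_s   # 16GB/sec

print(round(implied_rx5670_rate, 1), implied_p655_sustained)
```

In other words, IBM is implying a four-way rx5670 score of about 45 and a sustained p655 bandwidth of about 16GB/sec.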
® Related Stories Cray flogs X1 supercomputer IBM cranks up pSeries Power 4+ chip speed 16,000 Hammers in Sandia supercomputer Sun blasts "1970s" Itanica SGI raises the Itanic UK boffins get supercomputer boost IDC takes a second pass at supercomputer rankings NEC captures supercomputing crown IBM builds absolutely super computer
The mysterious shroud surrounding Microsoft's revenues was dispelled yesterday, when the company revealed that it is losing shedloads of money on everything bar client Windows, server and Office software. In these, naturally, it's making even bigger shedloads, but it's abundantly clear who's paying the rent, and financing the assaults into new areas. The breakdown of financials by division was published for the first time in Microsoft's Form 10-Q filing to the Securities & Exchange Commission, presumably as a side-effect of corporate America's attempt at a post-Enron clean-up. For the period ended September 30th, the two cash cows of Client (i.e. Windows) and Information Worker (Office) produced operating income of $2.48 billion on revenue of $2.89 billion, and $1.88 billion on $2.38 billion respectively. Server Platforms performed modestly by these standards, but spectacularly by most other people's, chalking up an operating income of $519 million on revenue of $1.52 billion, but beyond that we have the basket cases. MSN lost $97 million on $531 million, CE/Mobility was out $33 million on $17 million revenues (always a good trick, this kind of stuff), and the home of Xbox, Home Entertainment, dropped $177 million on revenues of $505 million. Business Solutions, which includes Navision and Great Plains, and is a sector Microsoft hopes will contribute great things in the future, lost $68 million on $107 million. Trend-wise (the numbers for 2001 are broken out as well), the pain of MSN is easing while revenues are increasing. CE/Mobility only pulls in slightly more revenue and has slightly lower losses ($14 million in and $48 million out in 2001), and Xbox has resulted in a revenue boost plus a substantially increased loss for Home and Entertainment (the loss was $68 million on $236 million in 2001). 
Realistically, though, we should expect heavy losses in Xbox's first year, because the economics of this kind of business anticipate that revenues from games sales will follow as a long tail. But as we said, it's clear who's paying the rent for these expeditions, and it's also clear that Microsoft is the dominant force in the PC market, and only the PC market. It can afford to shoulder big losses in the areas where it wishes to be the dominant force for a very long time. Which is fortunate, because in several cases these look suspiciously like ventures normal businesses would be forced to put a bullet into. Now. ®
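The divisional figures make the cash-cow point starker when expressed as operating margins. A quick calculation over the numbers quoted above (in $million, as reported for the quarter):

```python
# Operating income vs revenue per Microsoft division, $million,
# as quoted from the Form 10-Q figures above.
divisions = {
    "Client":             (2480, 2890),
    "Information Worker": (1880, 2380),
    "Server Platforms":   (519,  1520),
    "MSN":                (-97,  531),
    "CE/Mobility":        (-33,  17),
    "Home Entertainment": (-177, 505),
    "Business Solutions": (-68,  107),
}

for name, (income, revenue) in divisions.items():
    margin = 100.0 * income / revenue
    print(f"{name:<18} {margin:>7.1f}% operating margin")
```

Client works out at roughly an 86 per cent operating margin, while CE/Mobility manages to lose nearly twice its revenue - the "good trick" noted above.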
There's good news this crisp autumnal Monday for all aficionados of O'Really. The latest additions to the Cash'n'Carrion O'Really range are the provocative Tracing Spammers and Windows NT User Obliteration. As ever, the message is delivered on a 100 per cent premium cotton t-shirt, available in Medium and Extra Large. They go out at £10.21, that's £12.00 inc VAT. So, if you'd like to make your feelings on spam known, or have a particular penchant for Windows NT, then these will help you get your point across. Remember, we still have the O'Really favourites Distributing Clue to Users, Snooping Email for Fun and Profit and Practical Unix Terrorism available to brighten up the office Xmas party. Enjoy the complete range here. ®
PC sales are seasonal, with the second half of the year doing better than the first. The last quarter is when the hay is made, for that's when consumers of the Western world and Japan make their big pre-Christmas purchases, and corporations which operate Jan-Dec financial years make sure they get rid of any surplus in their IT budgets. Or that's the way it used to be. A couple of years ago, the big holiday season PC buying surge simply didn't happen in the US, whose consumers are the engine of the world's economy. This confirmed 2000 as an annus horribilis for the PC industry. It didn't get any better in 2001, although there was an element of return of the US holiday season purchase. In January this year we reported a "surge in consumer PC sales in the US in December (2001). Fuelled by low-interest deals, US PC sales through the retail sector jumped 101 per cent each week after Thanksgiving, according to investment bank Salomon Smith Barney. This compares with rises of 40-60 per cent a week in the run up to Christmas 2000, a truly terrible period for the US PC industry". US PC sales in December 2000 were 24 per cent down on the previous year. So what is it to be this year - gruel, or gravy? Unfortunately for the PC vendors and their component suppliers, this holiday season looks set to offer slim pickings for the third year running. Gartner Dataquest does not expect a big boost this year, with "the gloomy economic situation ... likely to affect US and Japan holiday season purchases". Worldwide PC shipments should hit 35.1 million units in Q4, 1.5 per cent up on the same period in 2001. At least it's not as bad as 2000. Other electronics devices - DVD players, games consoles, and digital cameras - are competing for limited funds. And Gartner Dataquest complains of a "lack of compelling technologies" that will inspire the punters to buy a new PC. Of course, there is no magic bullet. 
But broadband access, file-swapping of pirated games, music and films, editing pics from digital cameras and funky LCD screens are all reasons that will encourage supposedly satiated consumers to upgrade. It will just take a little longer, that's all. In the meantime, the surge that will never be is good news for consumers who are thinking of upgrading their PCs this year. If vendors have got their forecasts wrong, then that means fire sales before Christmas, rather than January. If you really must buy a PC before Christmas, try and hold out for as late in December as possible. On a side note, what is one to make of the bee in Gartner's bonnet over the rewriteable DVD format wars? "Rewriteable DVD prices are anticipated to fall, but the ongoing battles between rewriteable DVD drive formats (DVD-RW or DVD+RW) continue. Uncertainty over the outcome of this format war is at least partially likely to undermine the effect of lower drive prices." We have never heard anyone say they are delaying buying a new PC because of the DVD format battles. Have we missed something? ®
We'd be the last people to accuse Orange of not having its act together as far as personal data is concerned, but at least some of the people who work for the company seem to be a little bit fuzzy on the subject. What, for example, would you think if a refurb phone you were given had all of the contacts of the previous user still on it? That's precisely what happened to a Register reader earlier today. He's not particularly happy about being forced to take a refurb Nokia 7650 in exchange for a defective one, but it was either that or a long trading standards battle (most mobile phone outlets in the UK seem to take a similarly pugnacious line), and anyway that's another story. In this story the replacement contained contacts, pictures and SMS messages. This could of course have been an isolated incident, but one might also speculate that the refurb departments of the phone outfits haven't yet grasped that phone handsets contain data in growing amounts, and that data erasure should therefore go along with the refurbishment process. Our informant called the helpdesk for advice, and was told to reset the phone. He responded that this would merely reset the phone, not delete the data. But that's not the point anyway - shouldn't they be just a little bit worried about having carelessly redistributed the personal data of one of their customers, and maybe a bit more worried that it could turn out to be more than one? We're pretty confident that this sort of thing happens frequently, and we're certain it'd be a good idea if it stopped happening, fast. So, anybody else get something interesting on their refurbished phone? And if so, which company did you get it from? ®
Linux will provide the brightest hope for server manufacturers in the next year, according to Gartner-Dataquest's crystal ball. The analyst firm reckons that although the server market will only grow by 1 per cent, Linux shipments will double to almost $4 billion, or 9 per cent of the market. The OS will expand into telecomms, and further into web applications. Gartner-Dataquest also reckons that x86-based servers will pip RISC in revenue terms for the first time. Analyst Jeffrey Hewitt predicts the RISC business will be worth $18.1 billion while Intel servers will top $20 billion. Naturally Intel is cock-a-hoop at the prediction. But there's little good news for Itanium, which will fail to make a significant market impact, and Intel doesn't have an evolutionary route to 64bits for its vast x86 user base. Yamhill is dead, and Microsoft killed it, according to the most recent rumors. ®
Letters Recently we invited ideas from readers on how the US could close its cellular wireless deficit. It's not impossible, we suggested: with the right infrastructure and a competitive market, there's no reason why North America shouldn't get the coolest phones and services first. As it is, it trails most of Asia and Europe, and our piece was prompted by a wireless data start-up bemoaning the lack of a CDMA 1X Bluetooth device. The reason for this, handset manufacturers will tell you off the record, is Qualcomm. The CDMA monoculture relies on chipsets from just the one supplier, and must move at Qualcomm's glacial pace. Most Americans don't really like or want getting their data "on the go". 99% of the people drive to and from work. Looking at a tiny screen on your cell phone or PDA to get an e-mail or browse the web while on the road. I know I wouldn't do it. Wireless web access or sending e-mails from cell phone is cumbersome and inefficient. I can wait 20-30 minutes and get to my PC at work or at home to get my e-mails. Plus a little "toy" on you 24/7 would mean that my boss can get to me 24/7. No thanks. America is not Europe or Asia. We have our things to be proud of, they have Wireless. Their perceived superiority in Wireless is just that. It's perceived. And it is developed out necessity rather than as a new opportunity. If you don't have a PC or Internet access is too expensive then why not use a small wireless device. You've got nothing else to get to Internet with. Alex Dukhon So there you have it: Americans don't want cool phones. Is there anything else Americans don't need? Color TV? Penicillin? Round wheels? Fortunately, many Stateside readers disagree. 
Although Alex's comment was typical of a vociferous fringe of QCOM shareholders, who devoted their emails not to answering the question but to spectral efficiency, W-CDMA and other red herrings right off the Qualcomm marketing sheet (and thanks to the reader who sent us one of these PowerPoint gems), readers have more practical suggestions. I am finally tired of waiting on the mobile pigopolists to, out of the goodness of their hearts, generously provide me with service common in the rest of the world. I am therefore changing my contract to a company that will provide me these services. I don't know the situation in California, but here GSM doesn't seem to be a "local monopoly" as you put it. Even if it were, I would still opt for that and support a local monopoly that brings me the technology and services that I want, rather than a national monopoly that doesn't. The only way that we can get the services that we want is by supporting those companies that provide it. Supporting GSM providers in the US and staying away from those that don't will have two effects. First it will push those that don't to upgrade faster due to customer loss. Second it will allow those that do to advertise the advantages and services of GSM more and convert even more people. The only reason we are still so backward in American wireless is that the vast majority of consumers don't have the faintest clue that there is better out there. Even Sprint and Samsung, who have huge advertising campaigns in the US for their technologies, don't stress the difference or new features they offer. The advertisements are either so "artsy" that they don't show anything other than blue haired people lounging around, or they stress that their coverage area/call quality is better. 
We need to educate our friends, family, and neighbors by showing off all the "gee whiz" features of our GSM phones and steadily convert them; only then, as demand increases, will the marketplace evolve to a higher standard. [Name and address supplied] Alas, even that isn't enough. The consumer freedom that's wired into the European market isn't present even amongst Stateside GSM carriers. Writes Maurice Hilarius:- The real problem with most phones over here is that the various service providers just "don't get it". Typically you get a GSM phone. You think "When I travel I can simply get a local SIM card and I am all set", or "When I go somewhere else I can subscribe to local GPRS providers". Wrong. The local providers want you to buy the phone with a service contract. So you get one of those. And it is SIM locked to their SIMs only. So you get that fixed. And the phone menu does not let you set up for a different GPRS provider. Why? Because the phone manufacturers get asked to make the firmware that way, and they are STUPID enough to comply. I bought a Motorola Timeport P280 last year, triband GSM, which should work anywhere except Japan. Wrong. The firmware in it is programmed for Voicestream only. I managed to get the code to unlock the SIM cards accepted. But to change the GPRS setup I had to send it back to Motorola and pay a fee for reprogramming. This is a phone advertised as a "World phone". Assholes, the lot of them. Maurice W. Hilarius If AT&T would get their GPRS network in gear nationwide I'd finally buy the T68i or the forthcoming P800. But since I travel throughout the US, most of the time outside AT&T's GPRS network, I need a phone that can call from anywhere in the US. Duane Pandorf It's getting harder and harder to care about anything that happens in or to the US. Boohoo - they don't have Bluetooth. I don't understand why most US citizens are getting shafted and they seem to like it. 
This coming from a country that promotes monopolies (Microsoft, IBM, Qualcomm) and values marketing hype and bullshit as more important than happy, buying customers. Sven Busselot This isn't an ohms and hertz war between GSM and CDMA. Here are a couple of modest proposals:- 1) Mandate interoperability between carriers. Give customers the same freedom to move between the CDMA networks as GSM customers have moving between their GSM networks. This means phones based on SIM cards, or a similar model: the point being that carriers have to compete harder on service, while customers have the freedom to choose the most attractive handset. 2) CDMA is dreadfully penalized by Qualcomm's grip on the chipset business. As CDMA doesn't have GSM's volumes, Qualcomm chipsets are typically much more expensive than their counterparts, and more so when they must support analog roaming. Little wonder, then, that manufacturers skimp on the extras Europeans take for granted, like Bluetooth. Fixing this isn't going to be easy: but mandating that there's more than one CDMA chipset supplier will break the monoculture, encourage competition, and should lead to lower prices. Of course we need to break the "General Motors" syndrome, the notion that what's good for Qualcomm is good for America; as our readers' responses show, not everyone has twigged that yet. Qualcomm is America's most pampered company: there's little prospect of the FTC mandating greater competition anytime soon. ® Related Story Qualcomm monoculture is 'killing American wireless'