Exclusive Intel appears to be moving in the direction of rivals Sun and IBM with a future version of the Itanium processor code-named Tanglewood, according to documents seen by The Register. Tanglewood will mark Intel's most serious foray yet into the world of multicore processors. It will follow the dual core Montecito Itanium that arrives in 2005 and have even more cores per die and low power characteristics, sources confirm. The chip is a significant departure from its beastly predecessors, including Madison and the low power Deerfield that arrive this year. Instead, Tanglewood resembles the dual core UltraSPARC IV due out from Sun this year or IBM's Power5 chip coming in 2004. All three companies are trying to keep a close eye on power consumption and hoping to take advantage of multicore chips that resemble mini-SMPs. Intel, however, has a long way to go before customers see it as a power-friendly chip supplier. At 130 watts, current Itanics push the limits for what you would want to squeeze safely into a server. By contrast, Sun and IBM have tried to keep their chips cool, saying they take the whole system design into account, and Intel doesn't do systems. Perhaps the old Alpha team had a hand in changing Intel's ways. Our documents show that several former Alpha engineers were working on technology for Tanglewood while still at Compaq. Their suggestions for some fault tolerant aspects to Tanglewood appear to have caught the eye of their current bosses at Intel. Coincidentally, both Intel and Compaq have offices near the scenic Tanglewood area back East in Massachusetts. If Tanglewood appears in 2006 or 2007, it will likely play a major role in Itanic's success or failure, that being the time period pegged by analysts for Itanic to make its move. But by that time, Sun and IBM will both have chips loaded with many processor cores, so it's hard to say exactly how Intel will stack up.
HP will also be relying on Tanglewood-powered servers to please old PA-RISC and Alpha customers forced to move onto Itanic. With both PA-RISC and Alpha on their last legs, 2006 should be a prime time to upgrade kit. An Intel spokeswoman declined to "comment on speculation about future products that may or may not be in development." ®
Sun finally appears ready to start shipping Jalapeno-based low-end systems, along with faster versions of its four and eight processor machines. Documents disclosed to The Register show the company gearing up its two-processor "Enchilada" server with 950MHz and 1.1GHz UltraSPARC IIIi 'Jalapeno' processors. Accompanying this box will be a new version of the eight-processor v880 with 1.05GHz to 1.2GHz UltraSPARC IIIs and a new four processor v480 with the same chips. The v880 and v480 currently ship with 900MHz UltraSPARC IIIs. Sun may announce the systems at its Sun Network conference in Shanghai next month, or it could prefer to stick to its new quarterly release schedule, and unveil the hardware in early June. A Sun spokeswoman declined to comment on the products' existence or timing for a release. The four processor Chalupa server - a thinned down version of the v480 - should arrive a little bit later on with 1.1GHz UltraSPARC IIIis. It will be joined by a tower version of the Enchilada box. Low end systems have been key for Sun during the economic downturn. The company touts particular success with the v480 and v880 systems. IBM hopes to steal some of Sun's thunder in the low end with the introduction of its Power5 chip next year. The company claims this product will be ideally suited for midrange and low end servers. It's unclear what has delayed the Enchilada and Chalupa systems, as they have appeared with frequency in Sun roadmaps for some time. Sun executives were showing off the UltraSPARC IIIi chips at their analyst conference last month, saying they have some versions of the chip running at 1.75GHz. ® Related Stories Sun goes the whole Enchilada Sun peppers low-end with McKinley-killer Jalapeno Sun revamps workgroup servers Missing Sun chip found in abandoned taqueria Sun's Jalapeno almost cooked Sun discloses UltraSPARC VI and VII, shows IV silicon Power5 boasts quadruple performance gain
Nvidia has signed a three-year chip-making agreement with IBM. The deal means that TSMC is no longer the graphics company's only foundry partner. IBM will start production this summer, punching out GeForce FX parts at 0.13 micron on 300mm wafers at its East Fishkill plant. The period of the agreement will allow Nvidia to utilise IBM's 90nm process once it becomes commercially viable. IBM will start producing its first 90nm parts later this year. Nvidia is likely to split the three-chip GeForce FX line between TSMC and IBM on a product-by-product basis. Nvidia was keen to play down any suggestion that it was unhappy with TSMC's 0.13 micron work - held by some observers to be the reason for the delayed introduction of the GeForce FX 5800. Indeed, today, the day after the IBM announcement, Nvidia and TSMC put out a release "reaffirming" the relationship between the two companies. Clearly someone has an eye on TSMC's share price. "We are going to continue to engage in that [TSMC] relationship to the fullest extent possible," an Nvidia spokesman said yesterday, cited by EE Times. Nvidia may not be parting company with TSMC, but that statement isn't exactly a ringing endorsement either - it implies there is a limit to what TSMC can do for Nvidia. TSMC has produced around 200 million graphics chips for Nvidia during the last five years, and done rather nicely out of it. Now it will be losing revenue to IBM. It can't be happy about that. ® Related Stories Nvidia brings latest GeForce FX chips to notebooks Nvidia unwraps expected GeForce FX chips
Danish security service outfit Secunia this week launched an independent mailing list for security vulnerabilities. Secunia makes no bones in saying that its Security Advisories mailing list initiative is a direct attack against competitor SecurityFocus. The Danes are highly critical of SecurityFocus and security clearing house CERT. And they hope that their Secunia mailing list will become the "one source of information regarding the latest vulnerabilities and the security patches released by vendors". The Secunia Security Advisories list is based on more than 200 different sources of security information, including VulnWatch and Full-Disclosure. All the advisories on the Secunia Security Advisories list are written, verified and qualified by Secunia staff based on security research made by the security community and Secunia's own security staff. Thomas Kristensen, CTO of Secunia, says: "At Secunia we feel that SecurityFocus has betrayed the community it used to serve so loyally, that's why we started Secunia". SecurityFocus and CERT deliberately delay and censor "the information disclosed on BugTraq and in their vulnerability database," Secunia alleges. Symantec acquired SecurityFocus last year in a move greeted by suspicion in some segments of the security community. SecurityFocus is run as a separate organisation, Symantec tells us (most recently when we quizzed it about its handling of early alerts on the Slammer worm). The reason for any delay is attributed solely to the time needed by the list's moderators to review information, Symantec says. In the case of CERT the more valid criticism appears to be that the organisation is not doing enough to keep sensitive information confidential. eWeek reports that CERT is to review its security disclosure policy following the leak of three (actually four) unpublished security advisories in recent days.
The leaked information, taken from advance copies of advisories on cryptographic weakness in the popular Kerberos protocol, OpenSSL vulnerabilities and a flaw in a Sun library, made its way onto full disclosure mailing lists ahead of patch availability. eWeek publishes a timeline on the premature disclosure of these serious vulnerabilities. The anonymous cracker had been in touch with us to say that he's since posted a fourth (less serious) flaw onto a full disclosure mailing list. His motives remain unclear. But back to the main point. Secunia's strident criticism is premised on the idea that there needs to be a single source for security information in order for security to improve. This ignores the point that people in the community get their information from numerous sources (BugTraq, CERT and, yes, Secunia, as well as security blogs, news sites, vendor sites and so on). If Secunia does a good job of informing people, then people will gravitate towards its free service. Meanwhile, let a thousand flowers of free information bloom. ®
Exclusive Microsoft South Africa last week pulled an ad, following a ruling by the Advertising Standards Authority that its claims could not be substantiated. Here Richard Clarke, a South African freelance journalist, explains why he made the complaint which set the ASA ball rolling. Microsoft ran an ad in a South African IT magazine called Brainstorm last November which misled the unsuspecting reader into thinking that the hacker is going to become extinct just like the dodo, woolly mammoth and the sabre-tooth tiger. There were pictures of all four of these creatures with the text below that read: "Not everybody benefits from our secure software." The small print at the bottom of the page read: "Microsoft software is carefully designed to keep your company's valuable information in, and unauthorized people and viruses out. Which means that your data couldn't really be safer, even if you kept it in a safe. Which is great news for the survival of your company. But tragic news for hackers." Yet Microsoft systems, servers and email are littered with vulnerabilities and there is a constant stream of viruses which hit their systems. One of the avenues open to me as a concerned consumer was to complain to the Advertising Standards Authority (ASA) here in South Africa. My complaint hinged on the twin issues of misleading the reader and having to substantiate the claims made in the ad. The complaint was lodged on 17 December 2002 and certain procedures were followed, with Microsoft being given time to respond to my allegations. The ad campaign had been planned for flighting in some financial publications, such as Business Day, that target business decision makers. One of my major concerns was the fact that many business decision makers read the magazine and would be wowed by the rhetoric. In fact there would have been many boardrooms and executive meetings where this ad would have been quoted without any substantiation added to the conversation.
The Advertising Standards Authority ruled in favour of my complaint on 17 February 2003. In their ruling the ASA highlighted the fact that documentation submitted by Microsoft was not "evaluated by a person/entity, which is independent, credible, and an expert in the particular field to which the claims relate". "The directorate notes that the documentary evidence submitted is internal documentation, which the respondents (Microsoft) did not have evaluated by an independent entity. The secure software claims are therefore ex facie unsubstantiated." I feel that Microsoft should then have been forced to retract the spurious statements made in the ad, in media targeting the same readers the original ad was aimed at. That, however, was outside the ASA framework, and I had to settle for the pulling of the ad. ®
A top EU commissioner has been banging on about the importance of eGovernment. Speaking last week in Barcelona Erkki Liikanen said that eGovernment is now a "central theme in information society policy at all levels" and that it should "help to deliver better government". He believes that the public sector can be "made more efficient by digitising information and processes" while also making "democracy function better". eGovernment is about "increasing democratic participation and involvement", he said, which should make decision-making "visible and transparent" leading to "increased transparency and accountability". Oh, and let's not forget that eGovernment should aim to deliver public services in such a way that they are "accessible and relevant" for all. "We should aim for all citizens to be able to use electronic government, whether they have less digital skills, are living in remote regions, have less income, or have special physical or mental needs," he said. But he warned that governments have a much more difficult task to fulfil than businesses. "They cannot choose their clients, they have to serve every one. Where business can focus on efficiency, public administrations need to pursue both efficiency and equity," he said. "In terms of technology this means that it would be insufficient to offer online services only on PCs. Even though PC Internet access has been rising rapidly and is now around 43 per cent, television reaches almost all households." And so it goes on. ® Related Link The full speech can be found here
Intel has been granted a patent outlining a technology designed to block attempts to over-clock its processors. Intel applied for the technology to be patented in September 1999. The patent, number 6,535,988, was granted on 18 March this year, as reported by HardOCP. The patent covers "a mechanism for detecting and deterring over-clocking of a clock signal for use in a processor, comprising: a detection circuit adapted to detect over-clocking of a clock signal for use in the processor based on a reference signal; and a prevention circuit adapted to prevent over-clocking of said clock signal by limiting or reducing performance of the processor in response to detection of said over-clocking of said clock signal". In short, the chip dynamically compares its current operating clock speed to a reference signal, generated by an oscillating quartz crystal, and automatically reduces its clock speed if it finds it's running faster than it should be. Interestingly, Intel didn't have enthusiast overclockers in mind when it filed the patent application. Its documentation refers to "resellers and/or distributors remarking processors at higher frequencies and then selling the processors as the higher speed part to charge for resale at higher prices". Of course, Intel only has itself to blame. It admits in the same patent documentation that "processor manufacturers may be very conservative when rating such a clock frequency. For example, a processor which successfully operates during tests at 333MHz may be only intentionally rated (marked) at only 133MHz, 150MHz, 166MHz, 200MHz or 250MHz for different market reasons". By the way, if those frequencies seem low, don't forget that this stuff was written when "speeds of host processors can vary from 66 MHz to about 500 MHz". HardOCP notes that an Intel spokesperson told them: "We have not made an announcement on any kind of implementation of this."
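The compare-and-throttle behaviour the patent describes can be sketched in a few lines. This is purely illustrative - not Intel's implementation - and the rated frequency and tolerance values below are hypothetical:

```python
RATED_MHZ = 200   # frequency the part was marked at (hypothetical)
TOLERANCE = 0.02  # small allowance for measurement jitter (hypothetical)

def check_clock(measured_mhz: float, rated_mhz: float = RATED_MHZ) -> float:
    """Return the clock speed the 'prevention circuit' would permit.

    The 'detection circuit' compares the measured operating clock against
    the crystal-derived reference (the rated frequency); on a mismatch the
    'prevention circuit' limits performance back to the rated speed.
    """
    if measured_mhz > rated_mhz * (1 + TOLERANCE):
        return rated_mhz      # over-clocking detected: throttle to rated speed
    return measured_mhz       # within spec: no intervention
```

So a 200MHz-rated part pushed to 333MHz would be pulled straight back to 200MHz, which is exactly the remarking scenario the patent documentation worries about.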
In other words, Intel may well be using this technology - it just hasn't formally admitted the fact. It's certainly been a while since we heard of anyone passing off low-speed Pentiums as higher-speed parts, and the presence of the technology outlined in patent 6,535,988 may be why. But we suspect not - particularly since Intel appears to have a more lenient attitude to overclocking these days, if rumours surrounding the upcoming Springdale and Canterwood chipsets are anything to go by. ® Related Story Intel Canterwood, Springdale to be easier to overclock? Related Links Intel's patent, number 6,535,988 The HardOCP story
BT Tower in central London - recently bathed in green light to launch a rival directory enquiries service - has been listed by the Government. The move means that the tower can't be demolished, and should ensure that any proposed alterations respect the particular character and interest of the building. The 640ft tower is one of seven structures - including the Equatorial Telescopes at Herstmonceux in East Sussex, BT's Earth/Satellite Station Antenna at Goonhilly Downs, Cornwall, and NTL's Broadcasting Tower at Emley Moor, Yorkshire - that have all been judged to be worth preserving. Apparently, each structure is a "landmark of its type" and illustrates the rise of British communications technology in the 1950s and 60s. Said Baroness Blackstone, Minister of State for the Arts, in a statement: "Our built heritage should be about much more than old buildings. "The best of our modern architecture also merits the recognition and protection that listing brings. "Structures like the BT Tower and the NTL Broadcasting Tower are cultural and architectural icons of Harold Wilson's 'white heat of technology'. "These buildings mark the early milestones of Britain's transformation into one of the most technologically advanced nations in the world today," she said. Whatever. The tower - formerly the Post Office Tower, before Margaret Thatcher split telecommunications away from snailmail - has been closed to the public for more than 30 years, after a terrorist bomb led to the closure of the observation galleries in 1971. Now that the tower has been preserved for the nation, will the nation once more be allowed to visit the site and take in its spectacular views of London? (weather permitting, ahem) We doubt it. ® Related Story BT Tower in St Patrick's Day green light stunt
O2 is to slash the number of free texts people can send from its Web site from 100 a month to just ten. The mobile phone company is expected to introduce the new measures from April 1. In a statement the company said: "In an increasingly competitive market it is necessary to review, from time to time, our offerings to our customer base. As a result of this the decision was made to reduce the number of free texts [that can be sent from the O2 Web site] to 10 per month." However, O2 maintains that it remains "an extremely competitive offer" and insists that ten free texts is "more than adequate for the vast number of our regular users". One irate Register reader who spotted the change told us: "I think this is pretty disgraceful, they are citing an 'increasingly competitive market' but surely this does not justify such a drastic reduction in service. "Sorry to sound like a Watchdog viewer but I can't even find an email address to send a complaint," he said. A spokesman for O2 said that the reduction to ten texts had "not been definitely agreed" as yet and suggested that the figure "could be slightly more than ten". A final decision is expected later this week. ®
A report on business attitudes to open source software published this week indicates steady progress in the UK, with a growing number of CIOs seeing OSS as a means to tackle Total Cost of Ownership, and indications that it is being used in more sophisticated roles. The study, conducted by Trend Consulting on behalf of OpenForum Europe and published this week in the IoD's Director magazine, reveals growing confidence in open source, and notes that avoidance of lock-in is as much a driver as TCO. Last year's study, conducted before Microsoft's Licensing 6 kicked in, was something The Register at least felt Microsoft should worry about. Licensing costs then were seen as a major TCO issue, and people prepared to grit their teeth and put up with it seemed relatively scarce. This year an almost negligible proportion think software licensing costs have no impact on TCO, and only 7 per cent propose to do nothing about it. 21 per cent say they're looking at open source, while 17 per cent say they'll try to negotiate a better deal. These two strategies, seasoned Microsoft licensing watchers will be aware, are sometimes intertwined, in that noisily appearing to do the first can often lead to successful negotiation of the second. Overall only 36 per cent of CIOs use open source in their organisation (one presumes most of the rest are outsourcing their web sites, rather than being completely barking), and this has changed little in the past year. The survey however notes that open source is no longer being deployed just tactically for infrastructure purposes, but is starting to be used in business critical applications, such as email, messaging and on the desktop. There might however be an element of whistling in the dark here - infrastructure and development platform are still overwhelmingly the major uses, although there does appear to be an intention to use open source more for business critical applications in the future. And confidence is growing.
30 per cent say they remain sceptical about open source, but 46 per cent have greater confidence, and only 2 per cent have lost confidence. Humorously, although most public sector respondents were aware of UK government policy on open source, the policy has had virtually no impact on decision-making; 31 per cent of them think the policy is confusing and that the government's attitude to it is lukewarm. So they read between the lines too - well done, people. OpenForum Europe launched last year with the intention of helping accelerate market take-up of open source software in business and the public sector. We've noticed it's been remarkably quiet between DTi-backed surveys, but in that case perhaps one can charitably presume that a lot of work went into them. You can get the full survey here. ® Related story Cost the key factor in pushing business to open source
Mobile operators are finally halting their long-term decline in average revenue per user, thanks mainly to non-voice services, a new report claims. Telecom consultancy and research company Analysys said on Wednesday in a new report, "Western European Mobile Forecasts and Analysis 2003-2008," that the major mobile operators in France, Germany, Spain and the UK all experienced growth in average revenue per user last year, as well as in their subscriber base. Falls were recorded in Sweden and figures were flat in Italy. Average revenue per user, or ARPU, is considered a crucial figure by mobile industry watchers and analysts, as it gives a clear indication of revenue levels for mobile operators going forward. With the market now saturated and voice services thought to have little room for growth, it's generally agreed that data services are the best way to drive ARPU. Analysys said that in 2002 ARPU in Western Europe averaged €31 per user per month, and that by the end of this year the number of mobile phone subscribers in the region will grow 5 per cent to 309 million. This means that if ARPU growth is flat this year, mobile operators in Western Europe will be collecting almost €9.6 billion from customers every month until the end of 2003. However, Analysys says that ARPU growth will not stagnate. In fact, the company says that in 2002, Western European operators drew just 12 per cent of their revenue from non-voice services, but by 2005 this figure will be closer to 24 per cent. By 2008, non-voice services should reach the 36 per cent of revenue mark, and growth in this segment will lead to higher ARPU numbers. "The main reason operators have been able to improve ARPU is because they are achieving big gains in revenue from non-voice services, which include person-to-person messaging and mobile entertainment services," said Katrina Bond, lead author of the Analysys report.
"Person-to-person messaging has been responsible for most of the growth to date, and these services will continue to be important, but mobile entertainment services such as downloadable games will become increasingly significant revenue earners," Bond added. Other details in the Analysys report included a forecast which said that person-to-person messaging will grow from €13bn in 2003 to €20bn in 2008 as this service extends from SMS to include large volumes of multimedia messages (MMS) and e-mail. Moreover, mobile entertainment services revenue will grow from less than €3bn in 2003 to nearly €11bn in 2008. © ENN
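Analysys's "almost €9.6 billion" monthly figure checks out against the numbers it gives - 309 million subscribers at €31 per user per month - as a quick back-of-the-envelope calculation shows:

```python
subscribers_end_2003 = 309e6   # forecast Western European subscriber base
arpu_eur_per_month = 31        # 2002 average revenue per user, per month

# Flat-ARPU scenario: total monthly revenue across the region
monthly_revenue = subscribers_end_2003 * arpu_eur_per_month
print(f"~€{monthly_revenue / 1e9:.2f}bn per month")  # ≈ €9.58bn, i.e. "almost €9.6 billion"
```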
The GameCube was the UK's biggest selling console last week, beating back both Xbox and PS2 with sales of 14,000 units, with Nintendo now hoping that its GBA SP rebate deal will keep sales high. The GameBoy Advance SP launches on Friday in Europe, and 400,000 units of the new console have been shipped throughout the territory, backed by a €30 million marketing campaign. Purchasers of the unit will receive a coupon offering €50 off a new GameCube. Last week's massive boost in GameCube sales - an increase of 60 per cent over the preceding week - came mostly thanks to price cutting at retailers including Argos and Dixons, and the launch of key software title Metroid Prime. Nintendo will no doubt be hoping that the success can be sustained in coming weeks with a little help from the GBA SP rebate offer, which will bring the effective price of the GameCube to under £100. However, Argos' offer - which saw the Cube sold at £78.99 including a game and memory card - will come to an end on April 1st, and with the next really major GameCube release (Legend of Zelda: The Wind Waker) not due until May 3rd, it seems entirely likely that the Cube could slip off the consumer radar for another couple of months. The price cutting of the past few weeks was an initiative that originated with retailers; it's now up to Nintendo to carry the flame forward and keep the GameCube going. It's likely that in order to achieve this goal, a strategy incorporating both a strengthened release schedule and a significant price cut will be needed from the console giant. © gamesindustry.biz Related stories Argos to retain Cube following massive sales increase MS takes axe to Xbox Japan Dropping GameCube: Argos' turn? Dixons ditches GameCube
Network Associates (NAI) is to postpone filing results for this year and will restate figures for three previous years as a result of government enquiries into its finances. The company, best known for its flagship McAfee AV software, said today it will restate financial results for 1998, 1999 and 2000 to iron out revenue accounting irregularities. The move comes in the midst of a previously announced, ongoing Securities and Exchange Commission and Department of Justice investigation into NAI's books. In a statement NAI said: "As a result of information obtained in connection with those investigations, the company has determined to restate the financial statements for those periods. The restatement will reflect revenue on sales to distributors on a sell-through basis (which is how Network Associates has reported sales to distributors since the beginning of 2001)." "The restatement may also include other matters addressed by the government investigations," it added. NAI added that it will postpone the filing of its 2002 Form 10-K, due on March 31, 2003, in order to restate its prior financial results. Network Associates shares fell from yesterday's close of $15.38 to $14.50 in early trading today on the news. ®
Project Orion, which Sun has implemented to unify its product release cycles, has been designed partly to appeal to independent software vendors. Sun wants to ensure that many companies are working with the Sun ONE applications and web services stack, offering specialist services that Sun cannot cover. Sun Microsystems' Project Orion, which seeks to unify software and release cycles, will assist partners developing products as much as it helps end-users planning installation. Software group executive vice president Jonathan Schwartz said independent software vendors (ISVs) would be able to plan and build products around a set of integrated software that conforms to predictable release cycles under Project Orion. "[ISVs] want to know when they can roll something out," Schwartz said during a recent interview. "Where it's not about the features, it's about the efficiency." Talk about Project Orion has largely centered on expected benefits to end-users, and the efficiencies they would likely gain from being able to predict upgrade cycles across Sun's software stack. Sun, like Corel, Novell and Ximian, seeks to exploit dissatisfaction among customers over Microsoft's licensing in areas where they compete, such as office suites or messaging. They face varying degrees of success, with Corel now seeking possible acquisition while Sun was last week dismissed by one Windows applications specialist as "irrelevant". Schwartz, though, re-iterated Sun's own focus on customers, saying, for example, that potentially problematic security changes planned for Microsoft's Office 2003 and Windows Server 2003 would provide a further opportunity for Sun to snatch customers from Microsoft. Sun's emphasis on partners, though, is designed to ensure that an ecosystem of companies lines up behind the Sun ONE applications and web services stack, offering specialist functionality where needed and filling gaps that Sun's own engineering efforts cannot cover.
This would potentially help drive Sun ONE against application and web services offerings from Microsoft, as well as Sun's numerous Java 2 Enterprise Edition (J2EE) rivals such as BEA Systems and IBM. © Datamonitor is offering Reg readers some of its technology research FOC. Check it out here.
Microsoft has come up with an innovative explanation for the number of pieces that are going to have to be retro-fitted to Windows Server 2003, after it launches with them missing next month. This is apparently a modular approach to software development which will equip the company better to respond to the speed of update delivery that characterises open source. An amazing number of birds are slain with this particular stone. Microsoft has been wrestling with the problem of a: How it delivers the Longhorn updates for Server 2003 without b: Dumping customers with the service pack from hell or c: Rolling out a 'Longhorn Server' version of the OS when it's supposed to be being good by d: Not forcing its business customers to upgrade servers every two minutes or thereabouts. At the same time e: It needs to get the updates out there and in use before its product roadmaps recede somewhere in the general direction of 2020. Which is what would happen if it only added new functionality with new server releases, not in-between, and only made those server releases at intervals sufficiently wide for businesses not to scream in pain and anger. Confused? So is Brian Valentine, who seems only to open his mouth to change feet these days. Valentine first said there'd be no Longhorn Server last year, but intimated that the delivery mechanism for updates might be something of a problem. Then last week he said there might be a server version after all. Then there mightn't. Or there might be a Limited Edition version. And now with the new "modular" approach there probably won't be a Longhorn Server after all, just lots of stuff to stick into 2003. On the other hand, perhaps we shouldn't altogether rule out the possibility of features being rolled up into a single, handy distribution at some point a year or two down the line. But that wouldn't be a new OS, no sir, that would be a handy roll-up. News of the modular development comes, sort of, in a Joe Wilcox CNET piece. 
And we say "sort of" here because Joe doesn't attribute much of the meat to Microsoft itself - it's analysts' comment that gives the story its legs. The Microsoft input relevant to modularity comes from Bob O'Brien, who appears to have spent the past week walking behind Brian Valentine with a dart gun: "These (non-shipping) capabilities we said from the beginning we would deliver post-delivery of the server platform. The things we're going to release later simply are going to add value to the product." O'Brien also describes the additional "components" as things that "add more capabilities - power tools - for our customers," and the announced ones at least won't be charged for. But it's not clear to what extent Microsoft itself is presenting this as a counter to open source, or how much of this can be ascribed to analysts' reading of the changes. That said, by introducing a delivery route that is not a whole new OS and at the same time does not involve rolling new features into service packs (it's promised businesses several times it won't do this), Microsoft will be running an update schedule which will inevitably be compared to open source ones. Is this a solution? Well, it'll involve businesses having to make a lot of changes in order to keep their servers up to speed, and if you consider that one of the points of Red Hat's "Advanced" line of product is that it's intended not to present businesses with changes at such high velocity, you might consider the possibility that Redmond is dealing with the wrong problem here. It's come up with a formula that will allow it to shove out stuff as and when it's ready, but it hasn't answered the question: "Jeez, why d'you keep dumping me with all this stuff?" The reverse, actually, we'd say. On which subject, Mary-Jo at Microsoft Watch has taken a shot at a roadmap of some of this stuff. Figuring this lot out then having to figure it out all over again six months down the line is a thankless task, so we're deeply grateful. 
You can find it here. ®
The Treasury is to introduce legislation "as soon as possible" to kill tax avoidance schemes developed in response to the government's 100 per cent first year tax allowance. This is designed for small businesses buying IT "for bona fide use for their businesses". But the provision of plant and software for leasing is specifically excluded under the scheme. According to the Treasury, these avoidance schemes "make use of the fact that, although first year allowances are not due where software is acquired for leasing, there is no such restriction where the software is acquired to be exploited by the granting of rights to use or otherwise deal with that software". The Treasury is to close this loophole from today. The press release is here. ®
House builders in the US are increasingly including broadband connections and IT-related wiring when building new homes. In-Stat/MDR found that it's not just homes at the top end of the market (more than $1m) that are getting the hi-tech treatment, but also houses selling for less than $250,000. In-Stat/MDR's research found that three quarters of those developers questioned had installed broadband into some of their new housing schemes, with half using broadband as a selling point. Indeed, in residential housing estates some believe a broadband network could be the modern answer to creating a sense of community. However, the research also revealed that less than five per cent of developers thought that the inclusion of broadband access in a new property would be a critical selling point for potential homebuyers. Said In-Stat/MDR analyst Amy Cravens: "Broadband...is being used to attract new residents with high-speed connections and robust service packages. "However, despite the numerous potential benefits to home buyers enabled through broadband connections, developers do not see broadband networks and advanced in-home wiring as being a critical decision criterion for the homebuyer. "Compared to other community and home features, such as home design and community location, broadband and networking technologies ranked considerably lower in importance when developers were asked what they believed was important to home buyers." Although this research is from the US, it could prove to be a useful pointer for house builders in the UK, where the Government has just begun a consultation into whether building regulations should be amended to help boost broadband take-up. The Government is asking the building and telecoms industry whether demand for broadband services is being hampered because of the inconvenience of installing advanced telecoms services. 
The consultation came about following work by the Government's Broadband Stakeholder Group (BSG), which suggested that Government should "...consult with the building industry and broadband service providers to identify the best approach to ensure cable ducting is installed in all new buildings". The concern is that some people might be put off because installing broadband and other IT services might mean having to drill holes in walls and lift floorboards to lay new cable. The alternative is to ensure that all new housing comes ready-fitted with ducting and terminal boxes that are ready to take cabling without too much effort. Of course, the Government has stressed that it will only include this in formal building regulations "if the consultation provides solid evidence for the benefits of such a change" and that it would lead to the increased take-up of broadband. Naturally, there are many "ifs" and "buts" with this whole issue - not least surrounding the question of how broadband technology will develop over the coming years and whether there will even be a need for cable ducting if it is bypassed by more affordable wireless technologies. However, in a move that mirrors what's happening in the US, it seems more and more house builders in the UK are including broadband and ducting in their new properties. A spokesman for the House Builders' Federation (HBF) told The Register that the provision of broadband and IT cabling can be a strong selling point for new homes - especially since more and more punters are demanding this kind of feature. The question is, can this be done without the need for more red tape and regulation? While the HBF appears to be all in favour of developing "smart(er) homes", it remains unconvinced - at the moment at least - that such measures should be made mandatory and believes it should be up to the market to decide. ®
UK registry Nominet is to hold off implementing a wait listing service - a system that would allow people to reserve domains put back up for sale - for six months. A recommendation to delay until other wide-ranging changes are made to the .uk domain system has been made by a sub-committee of Nominet's Policy Advisory Board following a public consultation earlier this year. It will be put forward at the first PAB meeting since its recent elections on 2 April. The PAB will vote on whether to adopt the report, and there would appear to be no reason why it shouldn't. The sub-committee makes five main recommendations:

That no recommendation be made to the CoM [Nominet's Council of Management] to proceed with a WLS service at this stage.

That the matter be reviewed by the PAB in six months' time, taking into account the results of the various Nominet projects, including the automaton anti-abuse provisions, the clearing of the backlog of detagged and expired domain names, the new renewals system, and the PRSS review.

That such a review should be done with regard to the points raised in submissions to this consultation.

That Nominet's executive should alert the PAB should a WLS be required for operational rather than policy reasons.

That Nominet's executive should bear in mind the likely design parameters of a WLS (should one eventually be determined desirable) when designing new systems.

So, basically, it's a matter of holding off until all the other changes in the system are in place. Is this wise? Well, it's pragmatic. Nominet is completely reorganising itself, and introducing a wait list service would vastly complicate matters since it impinges on most of the other changes. It is also fair to say that, given the responses to the public consultation, which saw people fairly evenly split between "Yes", "No" and "Not now", it was difficult to reach any other conclusion. 
The downside is that there will be no official system in place at a vital point in the .uk domain's history - namely, when the 750,000 domains currently in limbo in Nominet's system are released within the next six months. The domains - which represent 20 per cent of all UK domains - are part of a massive backlog created by a flawed initial system set up at the start of the .uk domain's life. This is good news, however, for the mini-industry that has evolved around getting hold of freshly released UK domains. The other risk is that, despite the recommendation that Nominet account for a possible future WLS in reorganising its systems, it won't - and so, if one is introduced at a later date, we can expect a year's worth of cock-ups. One major point that we would like to raise, however, is the degree of interaction between the PAB/Nominet and its members and Internet users. The report says the 22 responses it received with regard to the WLS were "encouraging in respect of prior Nominet PAB consultations" but "still represented only a small proportion of members, and a miniscule proportion of registrants". Nominet has been heavily criticised in the past for being a closed shop; however, there now appears to be a genuine move toward openness and inclusion in its decision-making processes. This is supported by the posting of all submissions on Nominet's website for public consumption. With two new members joining the PAB next month - both renowned for campaigning and telling it like it is - now is your chance to make the organisation running the .uk registry the sort of Internet democracy that was always envisaged in the early days of the Net - a model that has been so damaged by the actions of ICANN. Achieving that goal is simply a case of formally telling Nominet what your views are, no matter who you are. ®

Related story
Get hold of your dream .uk domain

Related link
WLS report (pdf)
Responses to WLS consultation
Opinion Until Unix and Linux programmers get over their macho love for low-level programming languages, the security holes will continue to flow freely, argues SecurityFocus columnist Jon Lasser. The last several weeks, as always, have brought a constant flow of security advisories. Perhaps not a torrent, but certainly more than a mere trickle. Most notable among these is the Linux kernel ptrace vulnerability, which allows local users to acquire root privileges. Next, there is a clever timing attack against OpenSSL that can reveal a site's private key and thus compromise all of its traffic. There is also the MySQL configuration file vulnerability, whereby a malicious user can write out a file that will allow him to acquire full privileges; a buffer overflow and local root exploit in the venerable lpr print daemon; a buffer overflow and potential root exploit in the Mutt mail reader's IMAP code; and a glibc integer overflow that allows remote code execution via RPC. Also reported in the last three weeks are perhaps a dozen more security holes in programs including file, ethereal, ircii, qpopper, Evolution, rxvt, Samba, and others. These are, by and large, holes discovered and reported by the good guys - there's no telling what black-hat hackers have discovered. Most of these bugs are buffer overflows, format string vulnerabilities and input validation errors. In short, these are the same sort of holes that we've seen over and over again for years. Format string vulnerabilities are relatively new, discovered circa 1999; the other two classes of bugs have been known and actively exploited on Unix for quite a while: the first Internet worm exploited a buffer overflow in the finger daemon in 1988. Why do we still see these bugs? In no small part, it's because programmers aren't using appropriate tools. In an age where processing power is cheap, there's no excuse for a mail client written in C or C++. 
For users accessing mail via IMAP or POP, network speed and congestion have a greater influence over performance than anything done on the client side; even for users with local mailboxes, I doubt that we're looking at a huge performance hit. Studies have shown that programmer productivity, measured by lines of code over time, varies little between languages. Languages that automate more of the low-level work allow a programmer to accomplish more in fewer lines of code and also, perhaps not incidentally, avoid certain types of security bugs: the low-level constructs that C and C++ programmers spend time managing are the same ones that can get them into trouble. To be sure, some software must continue to be written in lower-level languages: database servers such as MySQL will inevitably be written in lower-level languages for legitimate performance reasons. And it would be both unlikely and counterproductive for the Linux kernel or the system library to be rewritten in Perl, Java, or Python. But none of those concerns justify writing an IRC client in C. And if it seems unimaginable for a print server to be rewritten in a high-level language, the reality is that the benefit would be substantial and the performance costs negligible.

eXXXtreme Coding

I don't believe that software written in high-level languages is free of security holes: the number of bugs in Web applications written in Perl and PHP is astounding. Applications written in those languages have no immunity from data validation errors that can be abused to provide remote access to files or even remote execution capability. Perl's taint mode can reduce the risks from these bugs, at the cost of modest effort on the part of the programmer. If coders must use C or C++ for everything, there are tools to make these languages a little less dangerous: WireX's StackGuard and FormatGuard come immediately to mind, as do various high-level string libraries. Why are these tools not used more widely? 
FormatGuard and StackGuard are relatively simple to implement, and the performance penalties are typically in the single-digit percent range. However, using these tools requires modifications to standard infrastructure: StackGuard is essentially a modified version of the GCC compiler suite; FormatGuard is a modified version of the standard GNU C libraries. Neither programmers nor system administrators like diversity in the underlying environment: it makes debugging much more difficult. There is also a macho streak in programmers: a tendency to believe that one's own code is well-written, and a corresponding belief that real coders, like fighter pilots, work as close as possible to the bare metal. Real programmers manipulate the system at the lowest possible level, for the maximum possible effect. The fallacy of the comparison should be obvious. Modern fighter jets are fly-by-wire, and, while still relying on the exquisite skill of the pilot, the jet's systems handle the bare-metal aeronautical tasks on their own. Furthermore, fighter pilots are highly trained, spending much of their time on simulators and analyzing their mistakes. (I think it's safe to say that programmers spend less time on self-criticism than pilots.) Finally, very few pilots are qualified to fly modern military aircraft. It would be nice if we could expect that our programmers would act more like airline pilots than fighter pilots: that they acknowledge, and accept, the responsibility that they take for the well-being of others. Until they take this step, I doubt that the quality and security of the code that we all rely on will improve. After programmers take responsibility, perhaps they can consider using the right tool for the job, rather than the right tool for the job of their dreams. ©SecurityFocus Online SecurityFocus columnist Jon Lasser is the author of Think Unix (2000, Que), an introduction to Linux and Unix for power users. 
Jon has been involved with Linux and Unix since 1993 and is project coordinator for Bastille Linux, a security hardening package for various Linux distributions. He is a computer security consultant in Baltimore, MD.
Never let it be said that notebooks aren't popular: some 30.5 million were sold last year, accounting for 23.5 per cent of all PCs sold, according to the latest IDC count. Gartner puts the figure at 21.8 per cent. In 2001, notebooks accounted for 22.5 per cent of PCs sold, says IDC - 20.2 per cent, says Gartner. At those rates of growth, it's not hard to imagine notebooks grabbing at least a quarter of the worldwide PC market this year. Whichever numbers you prefer, there's no doubt notebook shipments have been increasing each year for the last four years or so. However, 2002 seems to have been a particularly popular time for the portable, driven by increasing interest in computing on the move, lower prices and - frankly - because notebooks look a lot better than desktops. Wireless networking and broadband must have played a part too, particularly in the consumer space, by finally allowing users to move around their home untethered and surf the Net at high speed. Desktop-class mobile graphics chips from the likes of ATI and Nvidia have helped too. And it's a lot easier to justify an upgrade if you're changing form factor, now that, for many people, processors are as powerful as they need them to be. Driving sales in 2003 will be the efforts of PC manufacturers who've taken note of last year's notebook sales and will be promoting them hard this year. Apple says 2003 will be the year of the notebook; Intel is spending a fortune on Centrino marketing; AMD recently launched a staggering 12 mobile Athlon XPs. Notebooks are, of course, higher value items, so it's no wonder vendors are keen to push mobile computing. ®
When Opera unveiled its special Swedish Chef Edition, an unfortunate (for the press) side-effect was forcibly brought home to The Register. You see, just around that time we were running an ad that seemed to stop Opera displaying properly. And it was, oh dear, a Microsoft ad. You see the unfortunate nature of the side-effect. People write about Microsoft breaking Opera, then get pointed at because their own site's breaking Opera. So we fixed that one quicker than is usually the case. Then it broke again, not a Microsoft ad this time thank goodness, but we fixed that. And then it broke again, and again. And it's broken right now. In as far as you can say it's broken at all, of course. The effect of this breakage, which certain Opera 7 users can see by checking out one of our wireless sections, is to send the horizontal margin haywire, making the page tricky to read, as well as weird-looking. But this effect is only exhibited if Opera is set to identify itself as IE; if it claims to be Opera or Mozilla, then the page is fine. So there's your fix, and maybe it has a certain poetic justice to it - but why does it happen? Aha - in declaring itself as IE, Opera also retains the word "Opera" in its identification. What is therefore happening with these particular ads is that the sniffer used to detect browser type is finding two browsers and happily serving two ads at once, thus busting the margins. Whose fault is this? Opera's, because its software is claiming to be something it isn't? Yours, O Opera user, because if you had the courage of your convictions you'd stop pretending to sites that you were using IE? Opera's again, because the default setting is to pretend to be IE? Or the people whose sniffer code keeps checking for browser types when it's already found one? All of the above, no doubt. 
Our increasingly put-upon and truculent techies raise their heads from their Debian long enough to say (not very politely, either) that Opera should stop saying it's something it's not. We, moderate as always, favour Opera at least changing the default setting to the truth. But getting everybody to change their sniffer code, or to dump it entirely and use something more elegant instead, just because Opera's fibbing does not seem to us a particularly viable option. Whatever. Attention, Opera users. Ctrl+Alt+O. There, that didn't hurt, did it? Leave it like that, and on those odd occasions you find a site that seems not to like you, Ctrl+Alt+I. If that seems to work, moan to the site, not us. Register breaks Opera? Hah - Opera breaks Register that's our story and we're sticking to it. ®
Last week's very serious Windows 2000 vulnerability is far from limited to exploitation through IIS alone. This flaw, the root cause of which is a buffer overflow vulnerability in a core Microsoft Windows DLL (ntdll.dll), could allow attackers to gain complete control of a vulnerable system and execute arbitrary code. As we said in our article about a minor glitch with the patch last week, IIS WebDAV (World Wide Web Distributed Authoring and Versioning) is one of many Windows components which use the problematic ntdll.dll component. So Microsoft's patch needs to be applied to all potentially vulnerable Windows 2000 boxes. Microsoft's advice on the problem has been revised to take into account potential conflicts with hot fixes which gave rise to the minor glitch. This is just as well, because the problem gets worse the closer you look at it. David Litchfield, of NGSSoftware, the security firm which rose to prominence on the back of discovering the vulnerability exploited by the Slammer worm, has dissected the myriad risks arising from the flaw. You can read his paper (PDF) here. Along with malformed WebDAV requests to Microsoft's IIS 5 Web Server (which ships with Win2K), other attack vectors exist, including Java-based Web servers and non-WebDAV-related issues in IIS, NGSSoftware notes. And that's just the tip of a dangerous iceberg. "Security researchers at NGSSoftware have already discovered several new attack vectors and believe that there will be many that come to light over the next few weeks," Litchfield writes. "There are too many ways for an attacker to access the vulnerability. Likely areas will be non-MS Web and FTP servers, IMAP servers, anti-virus solutions and other MS Windows Services." In light of this, NGSSoftware strongly advises that every Windows 2000 workstation and server needs to be patched - and patched soon - regardless of whether it is running IIS or not. 
®

Related Stories
Win2K Web Server software brown alert goes out
Minor glitch in Win2K patch
Open and closed security are roughly equivalent
The MS 'friendly' security alert service - just say D'Oh
Dogs, too, can be liberated by weblogs. A Japanese toy manufacturer has developed a $120 pet collar which transmits dog barks into "human language". The barks are matched against a database developed by animal behaviorists. "The console classifies each woof, yip or whine into six emotional categories -- happiness, sadness, frustration, anger, assertion and desire," reports Reuters, "and displays common phrases, such as 'you're ticking me off,' that fit the dog's emotional state." 300,000 of these "Bowlingual" devices have been sold in Japan, and manufacturer Takara is looking forward to launching the device in the USA this summer. Naturally a pooch-to-computer interface is being developed, so man's best friend can join the eternal conversation of blogdom, which so far has been limited to humans, with computers, who have a lot of time on their hands. It's the talk of the Net. The first dogblog is only weeks away, and the first "Woofie" - a reputation system based on barking - must surely follow. Weblogs - self-published diaries - have given people with poor social skills a valuable social outlet. The weblog vanguard maintains that blogdom is as revolutionary as the invention of capitalism and the splitting of the atom, rolled into one. A very few curmudgeons dissent from this, insisting that "you keep collecting the good stuff" until it's "time to publish". But surely these reactionaries will be swept away - especially now blogdom has dogs on its side: a sturdy and invincible alliance. They got to the dogs first! ® Bootnote: In a charming report, The Japan Times lists Bowlingual alongside other interesting technology that's designed to relieve stress, including the 'People In The Sun' doll:- "Running on solar energy, the humanlike dolls, with smiling or meditative faces painted in pastel colors, slowly nod their heads." Apart from a White House Press Conference, what does this remind you of?
Competition Some time back, we ran a competition to find a Reg corporate anthem, and a great success it was too. The winner was one Peter Dykes, who penned a rousing Vulture Central lyric to the classic Red Flag. At the time, we thought it would be nice to get someone to knock up a backing track so we could offer it as a download. Then we forgot all about it - until now. Cue an email from Daniel Vincent, resident of sunny Croydon. Daniel offered the services of his band, Onionjack, as Register house outfit. Naturally, we were very flattered, but after listening to Onionjack's meritorious efforts on www.mp3.com, decided that what we needed was not merely a house band, but a house band singing the Reg anthem. So, we're inviting readers to record a version of our anthem which, as already noted, is to the tune of The Red Flag ("O Tannenbaum" - a quick Google should find you an mp3 version):

Don't be taken in by new IT
Without you've seen the full SP
There's only one place on the net
Where you can read the truth, you bet.

The URL, you know the one
Has all the news and lots of fun
The Register, The Register
Will tell you just what's going on.
If your OS is a bag of nails
And you're hacked off each time it fails
To find out why this thing should be
Just read The Reg and you will see

Monopolies are one big con
We will not rest till they are gone
The corporates and all their lies
Cannot escape the Vulture's eyes

The fingers lie in bloody pools
Of those who would the Vulture fool
Their hands are well and truly bit
And they feel like a sack of shit

If they had told the truth complete
They'd have the kit to beat their meat
But as it is they can't even stick
Two fingers up at The Register

So all of you who live in fear
Of bastards trying to bend your ear
Don't listen to a word they say
But read The Register every day

Buy the shirt and you'll look slick
Have a vulture tattooed on your dick
Go down the pub and have a beer
We'll keep the Reg flag flying here

Yup, we want the whole thing with instruments and vocals and everything. The style is up to you - surprise us. We need the finished article in mp3 format and are allowing entrants until the end of May to complete their efforts. We'll give you a nudge as the deadline draws close. And the prize? Well, apart from international fame and fortune - in the form of your pic and a suitably overblown puff piece on The Register - and the chance to have your anthem as a permanent downloadable fixture on the site, we'll throw in a raft of t-shirts and stuff. Who knows, if the standard is high enough, we may even give some kit to a few runners-up. So, dust down your midi keyboards and get to it. Here are a few rules:

The anthem must use the above lyrics and be to the tune of The Red Flag (See here for more on tune)
For reasons of length, entrants may choose to cut one or more verses, but not the first two or last two
It must be delivered in mp3 format to email@example.com
Send a lo-fi version first, and we'll get back to you
Closing date for entries is 5.00pm GMT on Saturday 31 May