24 October 2005 Archive


Flaw finders score loyalty rewards from iDefense

Security firm iDefense, a subsidiary of VeriSign, announced on Friday the recipients of two rounds of bonuses rewarding the most prolific researchers taking part in the firm's Vulnerability Contributor Program (VCP). The researchers split $40,000 in bonuses: Three people divvied up $10,000 awarded to the top flaw finders for the quarter, while five researchers received $30,000 split among the most prolific flaw finders for the year. One researcher, identified only by his handle "infamous41md," took home an award in each category for a total of $13,000 in bonuses. The awards come as security researchers are still debating whether such bug bounties help make software more secure. However, despite questionable benefits, the programs are becoming more popular. In July, TippingPoint, a subsidiary of 3Com, announced its own program, the Zero-Day Initiative. And the Mozilla Foundation pays researchers who find serious security holes in its Internet browser. This year, iDefense has published almost 120 vulnerabilities discovered by participants in the firm's flaw-finding program, according to the company's Web site. Copyright © 2005, SecurityFocus
Robert Lemos, 24 Oct 2005

Marconi sale nears

The race for Marconi is nearly over - bidders have until tomorrow morning to get their offers in. The telco equipment maker has been looking for a buyer since May when it failed to get on BT's supplier list for its 21CN next generation network. Alcatel and Ericsson are still interested in the once mighty British firm, according to the Sunday Telegraph. The paper said the outcome of the pensions issue will be as important as the attitude of shareholders. Marconi said it was in talks back in August but refused to name names. Chinese firm Huawei Technologies was mentioned at the time. One obstacle to the takeover is a hole in Marconi's pension fund. UK pension regulators want £400m put aside to cover the company's pension liabilities. More from the Telegraph here.®
John Oates, 24 Oct 2005

Office 12 includes Business Intelligence

Microsoft is adding business intelligence software in the next version of its Office software - Office 12 - due in the second half of next year. Office 12 will make better use of Excel spreadsheets to analyse company data. It will be easier to link Excel to SQL servers to get better access to enterprise information. Spreadsheet functions will also be improved. Excel Services will make it easier to share and secure spreadsheets. Microsoft Office Business Scorecard Manager 2005 will be available from November for $5,000 for the server plus $175 per user. It will allow users to track KPIs (Key Performance Indicators). More details on the Microsoft website here and here.
John Oates, 24 Oct 2005

BMC gets even more service oriented

Fred Johannessen, VP and program executive of capacity management and provisioning at BMC Software, describes SORM (Service Oriented Resource Management) as the "third leg on our BSM (Business Service Management) stool". He claims that BMC has strong Process Management capabilities with its Configuration Management Database-enabled identity and change management tools; strong Service Impact Management to relate business services to its Patrol infrastructure monitoring and management tools; and then, "we can manage the service," he says, "we can tell you what is causing a problem, and now, with SORM, actually solve the problem before it impacts the service".

Well, that is good, although SORM seems to be mostly a repackaging of what BMC already has. We appear to be looking at, initially, "just in time" (JIT) provisioning of additional, virtualised, resources (real-time capacity management) if monitors detect emerging issues – exploiting the reasonable assumption these days that you're over-provisioned by about 85 per cent for much of the time, if only you can find this spare capacity and share it effectively. BMC has tools to help you discover and manage your resources; SORM adds the capability of allocating them to business services ("provisioning") in near real time. Its tools are policy-based and virtualise the underlying hardware infrastructure. This is almost essential for effective BSM, as modern IT infrastructures are complex and constantly changing. Policy-based systems are more manageable because policies often needn't change when hardware does.

The BMC approach to SORM includes:

- A new BMC Virtualizer Suite for high availability (fail-over in a pool of shared servers) and capacity-on-demand management.
- A new version (7.3) of BMC Performance Assurance for Virtual Servers, for performance management and prediction.
- Cross-solution integration with existing BMC tools including Performance Manager, Performance Assurance, Marimba Software Configuration Management and Discovery and the BMC Atrium CMDB.

However, we think Managed Objects actually formulated the BSM concept SORM supports (that is, the idea of managing operational IT systems in the context of the business services being delivered, not the hardware or software packages that aren't of real interest to the business) in the first place. So we talked to Dr Jim White, Business Technologist at Managed Objects, about his views on SORM. He claims to identify a key differentiator between Managed Objects' approach and that of BMC (beyond the fact that BMC has an excellent HelpDesk and Managed Objects doesn't see that as being within its scope). BMC primarily relies on agents for infrastructure discovery – agents can be very intelligent and discover detailed in-depth information about your inventory, but they must be loaded (so they'll miss technology you've overlooked, and there's a deployment overhead) and are potentially invasive (performance impacting), although these issues can be mitigated in a well-designed system. Managed Objects, however, takes a lightweight, agent-less approach, so its discovery is usually both shallower and wider – and deployment costs are lower. Both approaches can work well if well implemented, and White notes that, in fact, a hybrid approach is emerging – in which agents are installed on your key business-critical systems and agent-less technologies used on the rest. Nowadays, BMC is as much an evangelist for BSM as Managed Objects, and who was first doesn't matter much.
What does matter is the completeness of the service management vision, that it is based on independent standards (in this context, ITIL is important) and that it is built on a strong technical foundation (in the present case, this means that a Configuration Management Database, or CMDB, should be available). An effective CMDB is an ITIL concept and there is more to it than just the name – Gartner, in its "Hype Cycle for IT Operations Management, 2005" report, puts CMDB at the "peak of inflated expectations" with the "plateau of productivity" some 10 years out. It should be logically centralised, physically distributed, model the structure of IT service delivery and be maintained by some form of automated discovery. Both BMC and Managed Objects appear to have a CMDB worthy of the name, but don't underestimate the effort and maturity needed to actually implement one in your company.

In the end, BSM is important because it helps to nail the gap between IT and the business shut – and its prospective implementers have to decide for themselves which BSM toolset (there are others too, from IBM Tivoli for example) suits them. Nevertheless, underlying all this BSM and SORM vision is an assumption that the parent organisation is reasonably mature – that it is committed to knowing or measuring what it has and how it is performing (for purposes of improvement, not blame), to documenting its goals, and to reducing the gap between reality and aspiration. This could be one of the reasons Gartner put CMDB maturity some 10 years out. If you work for the sort of company that values fire-fighting heroes and technical efficiency over effective business service delivery, attempts at implementing BSM or SORM may be doomed to failure.

And what does this have to do with developers? Job security, that's what. The business pays for IT resources in order to do business more effectively. It doesn't, in the end, care whether you can process 500 transactions/second 24x7; it cares about how many new customers it can process in a day and how quickly it can complete whole business transactions (and your 500 tps 24x7 system may still deliver a two-week overall turnaround for new contracts, say, if manual or batch bottlenecks aren't removed). Unless developers can deliver complete, working business systems that are resilient and easily managed at the business level, they may well fall foul of one of several nasty (from their point of view, and possibly only their point of view) kinds of outsourcing. BSM is part of ensuring that IT not only delivers manageable business systems but also is seen to deliver them, and SORM helps complete the BSM vision for BMC's customers. ®
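To make the policy-driven, "just in time" provisioning idea described above a little more concrete, here is a deliberately simplified Python sketch of a monitoring-triggered capacity loop. It is illustrative only: the class names, thresholds and shared-pool model are our own invention for this example, not BMC's SORM interfaces or anyone's real product API.

```python
# Illustrative sketch only: a toy policy-driven "just in time" provisioning loop.
# None of these names correspond to real BMC SORM interfaces.
from dataclasses import dataclass

@dataclass
class Policy:
    service: str
    max_utilisation: float   # e.g. 0.80 means act when the service is 80 per cent busy
    step: int                # virtual servers to add per policy breach

class SharedPool:
    """Spare, virtualised capacity discovered across the estate."""
    def __init__(self, spare_servers: int):
        self.spare_servers = spare_servers

    def allocate(self, count: int) -> int:
        granted = min(count, self.spare_servers)
        self.spare_servers -= granted
        return granted

def enforce(policy: Policy, utilisation: float, pool: SharedPool) -> int:
    """If a monitored service breaches its policy, draw capacity from the pool."""
    if utilisation <= policy.max_utilisation:
        return 0
    return pool.allocate(policy.step)

# Example: an order-entry service running hot gets two extra virtual servers.
pool = SharedPool(spare_servers=10)
added = enforce(Policy("order-entry", 0.80, 2), utilisation=0.93, pool=pool)
print(f"provisioned {added} extra server(s); {pool.spare_servers} left in pool")
```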
David Norfolk, 24 Oct 2005

Met Police hooks up with Commonwealth to fight cybercrime

London police and industry are teaming up to fight economic crime. The inaugural meeting of the New Scotland Yard Economic Crime Working Group took place at Lord's Cricket Ground in London on Thursday. The Economic Crime Working Group, announced at the Commonwealth Technology Forum in July, will address areas such as cyber crime, data and identity theft, counterfeiting and intellectual property rights as well as money laundering. The group has been put together with the assistance of the Commonwealth Business Council. In particular, it will focus on international crime and reputation risk. The initial meeting included British government representatives from the Foreign and Commonwealth Office and Department of Trade and Industry. India, Pakistan, Bangladesh and the Philippines sent diplomatic officials, along with the Indian National Association of Software and Service Companies (NASSCOM). Clearing organisation APACS, the National Hi-Tech Crime Unit and City of London Police were also represented. The high-powered meeting also featured representatives from industry (including McAfee, White Hat, KPMG and Capgemini) and banks including Deutsche Bank, Merrill Lynch, HSBC and Abbey. The working group is designed to increase awareness in the business community of cybercrime risks and provide advice to policy makers. It is also designed to provide a forum for the sharing of best practices in fighting cybercrime. Speaking at the inaugural meeting, Metropolitan Police Assistant Commissioner Tarique Ghaffur said: "Economic crime has to be treated as a global issue and so this unprecedented partnership with the CBC will allow us to pool intelligence from across many regions. It is important to note that this initiative will include those private sector organisations most at risk - particularly the SME sector - in addition to international law enforcement agencies and governments all working in partnership toward a common goal." Director General of the CBC, Dr Mohan Kaul added that cyber-criminals can only be beaten by working within a "multi-stakeholder and multi-jurisdictional framework, engaging the private sector across the 53-nation Commonwealth region, and beyond." Dr Linda Spedding, international lawyer and chair of the working group, added: "This is a very different initiative that is focused on marshalling a number of international resources into measures that should directly reduce operational and reputational risk for companies in the sectors that are subject to these forms of crime." ®
John Leyden, 24 Oct 2005

AMD slashes chip prices by up to 26%

AMD took the axe to its desktop and mobile processor families this morning, chopping up to 26 per cent off what it charges for its chips. The company reduced prices for its mobile Turion 64, dual-core desktop Athlon 64 X2 series, Mobile Athlon 64 products, and both mobile and desktop Sempron processors. Athlon 64 and Athlon 64 FX prices remain unchanged, though AMD took the opportunity to drop its two bottom-most Athlon 64 parts, the 3000+ and 2800+. The Mobile Athlon 64 2800+ is also no longer with us. Gone too are the Socket 754 desktop Semprons, the remaining Socket A desktop Semprons and all Socket 939 versions without 64-bit support. AMD's full price list, valid as of 24 October 2005, can be found here. ®
Tony Smith, 24 Oct 2005

Hynix slams Japan's 27.2% DRAM tax plan

Hynix has criticised plans announced by Japanese regulators to slap a 27.2 per cent punitive import duty on its DRAM products as "unfair" and "inappropriate". The intention to impose the levy emerged last week, and follows complaints made in June 2004 by Japanese memory makers. They alleged Hynix had received unlawful financial aid from the South Korean government which provided the memory maker with an unfair competitive advantage that was hurting the Japanese memory industry. The Japanese government announced an investigation into the claims the following August. More than a year later, its verdict is that Hynix did receive unwarranted state funding, and unless Hynix can show otherwise, the 27.2 per cent duty will be duly levied. The provisional ruling is likely to be made formally before February 2006. The levy and the reasons behind it mirror similar events in Europe and the US, where regulators imposed import duties of 34.8 per cent and 44.3 per cent, respectively. Hynix subsequently complained to the World Trade Organisation (WTO), which has since told European and US authorities to rethink the levies. That said, the WTO in June reversed an earlier ruling against the US levy on appeal, and it's possible the EU verdict may go the same way. For its part, Hynix said it would "actively counter" the Japanese tariff, suggesting it will similarly ask the South Korean government to raise the matter on its behalf with the WTO. In the mean time, it has until 21 November to lodge a formal complaint against the provisional ruling. Hynix said the ruling was unfair because "current market conditions" make it difficult to conclude that anything Hynix did damaged Japanese firms. What's more, it said, "the claims Hynix received [illegal] state subsidies are not backed by clear legal evidence", the Financial Times noted today. ®
Tony Smith, 24 Oct 2005

Contamination delays Venus Express launch

Mission managers at the European Space Agency have postponed the launch of Venus Express, Europe's first mission to our so-called twin planet. The new launch date has not been announced, but ESA says it will be "several days" late. The launch was due to take place atop a Soyuz-Fregat launcher this Wednesday (26 October), but during final preparations for the launch, insulation from the rocket launcher was found to have contaminated the satellite, ESA said. The contamination could have come either from the fairings, which protect the satellite during the launch, or from the upper booster stage of the rocket, the BBC reports. A spokesperson told the BBC: "The satellite is contaminated, so they will have to dismantle and re-mount it." If its launch is successful, and its journey to Venus uneventful, Venus Express will slot into a polar orbit around our sister planet some time next spring. The satellite is set to gather data on the Venusian atmosphere so scientists can work out how the conditions on the planet arose. Although Venus is very similar to Earth in size, its environment could hardly be more different. It has an incredibly thick and acidic atmosphere causing crushing pressures of around 100 Earth atmospheres at the surface, and an ambient temperature of nearly 500C. ®
Lucy Sherriff, 24 Oct 2005

Goodfellas voted top movie ever

Martin Scorsese's Goodfellas has been awarded the top movie ever crown by Total Film - relegating the critics' perennial fave Citizen Kane to sixth spot. Scorsese is also honoured for Taxi Driver at 14th, but Francis Ford Coppola emerges as director of choice with three titles in the top 25 (see below). Naturally, the list will provoke plenty of debate among film buffs - especially since the Texas Chain Saw Massacre (recently voted best horror flick of all time) is apparently a greater cinematic achievement than Sunrise or Apocalypse Now. And as for Goodfellas: well, it's good, but is it that good? Here's the Total Film top 25 for your consideration:

1. Goodfellas
2. Vertigo
3. Jaws
4. Fight Club
5. The Godfather: Part II
6. Citizen Kane
7. Tokyo Story
8. The Empire Strikes Back
9. The Lord of the Rings trilogy
10. His Girl Friday
11. Persona
12. Chinatown
13. Manhattan
14. Taxi Driver
15. It's a Wonderful Life
16. The Apartment
17. Once Upon A Time In The West
18. All About Eve
19. The Texas Chain Saw Massacre
20. Apocalypse Now
21. Crash
22. Sunrise
23. The Godfather
24. Rear Window
25. Sunset Boulevard
Lester Haines, 24 Oct 2005

Virus writers craft PnP botnet client

Updated Virus writers have created a botnet client that uses a recently discovered Microsoft vulnerability to spread. Mocbot uses the same MS05-039 vulnerability as the infamous Zotob worm in an attempt to create a botnet of compromised, "zombie" PCs under the control of hackers. Early indications are that the attack is not particularly successful. Mocbot tries to connect to two IRC servers in Russia, but the servers seem to be down (or overloaded), according to Finnish anti-virus firm F-Secure. "We received reports that the bot channel may instruct all joining bots to start automatically scanning for vulnerable computers, thus acting as automatic worms. But both channels used to control this spread are not working," said Mikko Hyppönen, chief research officer at F-Secure. Despite the relatively low risk from Mocbot, users are strongly urged to patch against the vulnerability (MS05-039) exploited by both it and the more prolific Zotob worm. Zotob crashed the networks of several high-profile media outlets (including CNN, ABC, The Financial Times, and the New York Times) shortly after its release in August. ® Update Early analysis suggested that Mocbot used a more recent Microsoft Plug and Play vulnerability, MS05-047. This was incorrect. The confusion was caused by the exploit code used by Mocbot, which resembles publicly available exploit code for MS05-047.
John Leyden, 24 Oct 2005

Virgin and Carphone Warehouse off to France

Virgin and Carphone Warehouse have confirmed to the Stock Exchange that they are talking about a possible joint venture in France. A statement was released this morning and more information is expected soon. The joint venture will be branded as Virgin. The two firms are setting up as a Mobile Virtual Network Operator - using someone else's network to offer branded services. The two are talking to Orange about using its network. The French government is keen to promote MVNOs to encourage competition. Carphone has 200 stores in France and an existing mobile offering called Omer. Virgin and Carphone are already virtual operators in the UK and both use T-Mobile's network.®
John Oates, 24 Oct 2005

Europe looks to adapt to climate change

European authorities are launching a new programme to investigate ways that the continent can reduce its greenhouse emissions, and adapt to the effects of climate change. The second European Climate Change Programme kicks off today at a conference in Brussels. One of the main aims will be to advocate for a "meaningful global climate change regime" post-2012, when the Kyoto Protocol expires. Environment Commissioner Stavros Dimas said that recent extreme weather events around the world "are consistent with scientific findings about the effects of our changing climate". He said it was high time new measures for combating climate change were developed, to bring emissions below the targets set in the Kyoto Protocol. The programme is set to focus on "new cost-effective measures and technologies that will allow the EU to reduce its greenhouse gas emissions", the Commission said in a press release. Areas of particular interest will include passenger road transport and aviation. The programme will also have working groups examining the field of geological carbon capture and storage, and on how Europe might adapt to the effects of climate change that cannot be avoided. The emphasis on developing new technologies to reduce emissions, rather than imposing restrictions on homes and businesses, is broadly in line with the US' publicly stated views on the subject. The UK's minister for climate change and the environment, Elliot Morley, will also be speaking at the conference. Morley said in late September that he favours investment in so-called clean fossil fuels over a return to nuclear power. ®
Lucy Sherriff, 24 Oct 2005

NTL unveils Sky+ rival

Tech Digest

Top stories on HDTV UK

NTL unveils Sky+ rival
Following on from its merger with Telewest a few weeks back, cable company NTL is finally starting to rev up for its upcoming battle with Sky, and anyone else for that matter, in the HDTV and personal video recorder markets. On Saturday the company announced details of its soon-to-be-launched personal video recorder, the Explorer 8450DVB. The model will apparently be HD and MPEG4 compatible and is being manufactured for NTL by Scientific Atlanta. It will sport a trio of TV tuners enabling users to record three programmes simultaneously. Other features include a 160GB hard drive (the same as Telewest's model and the current top-end Sky+ box), USB 2.0 and Ethernet connectivity, and a multi-room option which will enable users to stream programmes to other Scientific Atlanta boxes in their home. Telewest, NTL's one-time friendly rival and now partner in the merged company, also has a unit which includes a similar set of features but isn't MPEG4 compatible.
Panasonic adds low-end HD-ready LCDs

Top story on Tech Digest

Boffins deliver plasma screen fireplace
The Tech Digest team spent the weekend hanging out with the nation's boffins at the British Invention Show in Alexandra Palace, north London. We saw some pretty amazing things (more on those later), and some completely hat-stand ones too, and at the moment we are not entirely sure which category the Plasma Screen Fireplace fits into. As you'd imagine, the Plasma Screen Fireplace is a rather pleasant-looking fireplace and surround that houses a plasma screen. The idea is that when the screen is not in use it slides away inside the fireplace, so all you see is the fireplace and a genuine electric flame-style heater. Obviously we had concerns that the heat from the fire might lead to accidental incineration of our beloved plasma screen, but apparently there are no worries on that score, as heat from the fire pushes forward, which means the plasma screen would be unaffected. There's a range of fireplaces available, from the Beachcomber (£2140) to the top-end "Firelight Modern", which has a remote-controlled fire, built-in downlights and a space to conceal the DVD/VCR. Alas, that price doesn't include the screen. More here.
Who won what at The What Hi-Fi Awards
Sanyo slims the camcorder

Top story on Bayraider

Hyperdimensional oscillator
Forget Prof Hawking and his theories of space and time. If you want to take your brain to another dimension, just head to this auction and grab yourself a hyperdimensional oscillator. "The device incorporates several elements of futuristic Tesla technology, psychotronics and quantum physics to allow the user to access other dimensions of space time," says the listing, which sounds good to us. And for just $89.95, which is even better. The listing has a lot of information about cosmic energies and bioenergy fields to persuade you that it's a good investment. We're sold!
Superman's wig
Ridiculously OTT mobile phones

PC tip of the day from Propellerhead

Q. Is it possible to connect two PCs together using a USB cable? I want to copy files from my old Windows 98 laptop, which is now getting on a bit, to the hard drive on my new XP desktop computer.
Andrew Hayes

A. Whoa there, Andy! Whatever you do, don't try it, at least not with an ordinary USB cable: it's odds-on you will zap the USB ports on one or both computers. Standard USB cables carry a supply voltage to power peripherals like scanners etc, so it's really not a good idea to have two PCs trying to suck and blow the juice out of each other. There is a way to do it, but you must use a specially designed link cable and file transfer software. There's plenty to choose from, like the Expansys USB Data Transfer Cable, which sells for around £16. Better still, why not create a simple wireless network? It's really easy with Windows XP and the extra bits and pieces should only cost you around £50 to £80. You won't have to retire your laptop and you'll be able to use it around your home and in the garden, and share your broadband Internet connection.

Email request notification
Dealing with Spyware

Popjunkie (great lost pop albums)
Actor in quality pop album shocker

Hippy Shopper (ethical buying)
Recycled backpack
Winter shoes for cyclists
Mazda's greener people carrier

Games Digest
Games out this week
Games out next week

There's loads more of this stuff at Tech Digest, Shiny Shiny, Green consumer blog HippyShopper, and Bayraider, which highlights the best and worst of online auction sites.
Tech Digest, 24 Oct 2005

IBM seeks mid-market relevance

Quocirca's Changing Channels

As part of its drive to make itself more relevant in the mid-market, IBM has launched three new services under what it calls IBM Express Managed Services (IEMS). These are in addition to a couple of existing services, one of which IBM is keen to remind us of and another which it would prefer to sweep under the carpet. All these services are being sold via IBM's channel partners and are available in the UK and certain other European countries.

IBM has long faced a challenge in the mid-market. The infrastructure software that mid-sized businesses (defined by IBM as 100 to 1,000 employees) choose is predominantly that of its main rival Microsoft, and the hardware they run it on is most likely to come from two other rivals, HP and Dell. Whilst IBM is prepared to fight for mid-market share with these competitors, it is an uphill battle. However, with managed services things look rather different. The market is still in the early stages of development and pretty fragmented. IBM has a chance to take the lead.

Email Security is the existing managed service IBM wants to remind us of. IBM likes to describe it as a "washing machine" for email. There is nothing new about this; similar services are already available from companies like Postini and BlackSpider. Microsoft has also recently entered this market through the acquisition of FrontBridge. In fact, in this case, IBM is simply reselling the service from the dominant European supplier of email filtering services - MessageLabs.

One of the new managed services IBM has announced is Email Recovery Services, a stablemate to Email Security. Again there is nothing unique about this: Microsoft/FrontBridge can already offer it, Postini has announced plans to do so and there are similar services from specialist vendors like Mimecast. However, when viewed alongside the second of its new offerings - On Line Backup and Recovery - IBM's capabilities start to look more comprehensive than the others.

On Line Backup and Recovery does what it says on the box. This should prove attractive for mid-sized businesses because it kills two birds with one stone. First, data backup is automated and regular and, second, it goes straight off-site - an essential step for data protection that many organisations fail to take. Again IBM is not the first vendor to offer such a service - there are many others such as Netstore, Datafort and Iron Mountain - but if it can persuade businesses to switch to a managed service for this most fundamental of requirements, it has opened the door for its email security and recovery services and other Express offerings.

The third of the new offerings - RFID Solution for Mid-market - is more niche, but likely to be increasingly important. Many mid-sized businesses are being required by their larger customers to interface to radio frequency identification (RFID) systems. A good example is a small supplier to a large supermarket having to supply all their products with RFID tags. IBM's service will allow the small supplier to source the tags and interface with the supermarket's own RFID system without having to invest in any new technology on its own premises.

What of the existing service that IBM is less keen to talk about? In 2004 IBM signed an agreement with Siebel Systems to resell its Siebel OnDemand hosted CRM service. It is not clear how successful this has been for IBM, and since Oracle acquired Siebel last month it has turned into the hot potato at IBM that no-one wants to hold. Anyway, IBM says it is not an IEMS - although try telling salesforce.com that hosted CRM is not relevant to the mid-market. If IBM is to become a leading vendor of managed services to the mid-market it will need to be more committed to these new offerings than it has been with Siebel OnDemand. As most of the new offerings are infrastructure services rather than business applications, they are closer to IBM's core competencies - if it executes well it might be able to take a lead, but it will need the help of the channel to do so and will need to expand its partner base to bring in more resellers with a mid-market focus.

Copyright © 2005, Quocirca. Bob Tarzey is a service director at Quocirca, focussed on the route to market for IT products and services in Europe. Quocirca is a UK-based perceptional research and analysis firm with a focus on the European market.
Bob Tarzey, 24 Oct 2005

Original atomic keyring resurfaces

Cash'n'Carrion Last week, Cash'n'Carrion got a call from the purveyors of the legendary atomic keyrings to say that they had unearthed a stash of the original GlowRing X2 in their tritium-powered novelty warehouse at an undisclosed location in the south of England, and did we fancy taking them on? Well, it was an offer we couldn't refuse, since we are reliably informed that these are absolutely the last stocks of these vintage glow-in-the-dark keyrings to be had anywhere. Which would explain why they're available in just pink and green and at the knock-down price of £9.35 (£10.99 inc VAT) for two, and £42.55 (£49.99 inc VAT) for ten. That's to say - you can choose two green, or two pink, or ten pink or green or a mixed batch of ten green and pink. You get the idea. What's more, you pay just a quid P&P on this item, thereby saving vital cash which you can spend on lead underpants. The X2 is available in the Nite section of Cash'n'Carrion, as is the Nite Glowring Safety Marker and a raft of other illuminating products. Regular customers will recall that, due to international restrictions on the sale of atomic keyrings, we can only ship GlowRings to UK addresses, so please don't order from anywhere else. ®
Cash'n'Carrion, 24 Oct 2005

Dutch plan for prison call centres slammed

A plan by Dutch justice officials to establish commercial call centres in jails was greeted with disbelief last week. The Dutch Association of Call Centres deemed the plan "totally unacceptable", in particular if inmates are drafted in to sell insurance policies, such as theft coverage. A government spokesman dismissed the criticism and told news agency Reuters that in the Dutch prison system it is normal for prisoners to work part of the day. The idea of prison call centres is not new. In the US, the federal prison system started offering customer service call centres a couple of years ago. Inmates sit in guarded rooms taking orders for goods and products and handling shipping questions. Dutch justice officials have already started negotiations with some call centres. Prisons in Nieuwersluis (near Utrecht) and Zwolle have been selected to offer so-called dial-a-cell services first. What products and services will be offered is still a matter of debate. The Dutch Association of Call Centres fears the plan could damage the call centre community and has advised its members not to sign any contracts with prisons. ®
Jan Libbenga, 24 Oct 2005

Apple sued over Nano scratches

Apple knew its iPod Nano was susceptible to scratching but chose to ship the product anyway - so claims a lawsuit filed against the Mac maker last week. The complaint was filed in US District Court in San Jose California by Seattle-based lawyers Hagens Berman Sobol Shapiro (HBSS) on behalf of California resident Jason Tomczak. The complaint alleges the Nano "is defectively designed, allowing the screen to quickly become scratched with normal use". The suit also claims "the excessive, rapid wear renders the device unusable". Apple, the plaintiff alleges, was aware of the problem, but chose to launch the Nano anyway. The Mac maker's behaviour was "deceptive and unlawful", the complaint maintains. "Rather than admit the design flaw when consumers began to express widespread complaints... Apple concealed the defect and advised class members that they would need to purchase additional equipment to prevent the screen from scratching excessively," the complaint said. It seeks the return of money the plaintiff spent on his Nano and costs arising from its return, replacement and return of the replacement for the same reason. While it does not yet have class action status, HBSS is hoping the District Court judge assigned to the case will allow it to be so classified. In September, Apple admitted that a small batch of Nanos contained a manufacturing fault that could cause the screen to crack. However, it did not accept allegations that the player's plastic shell was susceptible to scratching. It maintains the Nano is made of the same polycarbonate as previous iPods, none of which have yielded quite so many online complaints of scratching. HBSS is currently representing one of the individual plaintiffs suing Intel on the back of AMD's anti-trust action against the chip giant. Like the Nano lawsuit, the Intel complaint alleges behaviour in contravention of California consumer law. ®
Tony Smith, 24 Oct 2005

Aussies fine illegally-parked corpse

A Melbourne council is unlikely to collect a parking fine imposed on a 71-year-old man for exceeding his allotted time in the car park of the Croydon Market shopping centre, since he had lain dead for "several days" in the vehicle when an enforcement officer moved in. The man had been reported missing nine days previously and was known to be "seriously ill", The Age reports. The Mayor of the eastern suburb of Maroondah, Paul Denham, explained that the "parking officer had not noticed the man when he attached the parking fine to the windscreen", offering: "The parking bays are 90-degree with cars parked nose in. A small garden bed is located immediately at the front of the parking bays. Our local laws officer checked and wrote out the ticket at the rear of the vehicle and placed the ticket from the passenger side on the windscreen. The local laws officer did not notice anything unusual regarding the vehicle, and is extremely distressed to have learned of the situation." A female friend of the man was less than impressed with the explanation. She said: "From what the police had told me, it would have been very obvious he was deceased so I'm really disgusted." Premier Steve Bracks last week expressed his condolences to the man's family and friends, noting: "I think the whole community would be concerned at that sort of development." Bracks added there would be a full coroner's enquiry into the matter and any recommendations "will be adopted by organisations such as councils". ®
Lester Haines, 24 Oct 2005

RIM faces US Blackberry injunction

NTP is to ask a US District Court to confirm an injunction banning the sale of Research in Motion's Blackberry devices in the States after the US Court of Appeals threw out a request to put the legal fight on hold. RIM has asked the Court of Appeals to suspend the case against it while the US Supreme Court ponders whether it will hear RIM's appeal against the Court of Appeals' refusal to reconsider its own decision in the matter. In August this year, the Court of Appeals scaled back a ruling made in NTP's favour. Previously, it had said RIM infringed all of NTP's patents. But while its RIM-requested reconsideration of the ruling saw it reject some Eastern Virginia District Court infringement judgements, it upheld others, so NTP still has a case RIM must answer. Hoping for another change of mind, RIM asked the Court of Appeals to think again, but the Court refused, it was announced earlier this month. RIM essentially wants the Supreme Court to force the Court of Appeals' hand. In the meantime, the latest judgement means the case can now return to the District Court, which is expected to re-issue an injunction made in 2003 banning sales of Blackberries in the US. Certainly, NTP will ask the Court to confirm the injunction, the company said late last week. NTP sued RIM in 2002. Its initial success was limited by RIM's move to appeal against Judge James Spencer's ruling in NTP's favour. The Court of Appeals' initial confirmation of Judge Spencer's decision persuaded the warring parties to negotiate a $450m settlement, though finalisation negotiations would eventually break down. ®
Tony Smith, 24 Oct 2005

Michigan PC glitch lets prisoners out early

A data processing glitch in Michigan has resulted in the early release of some inmates and the prolonged incarceration of others. The cock-up came to light after a state audit revealed errors in the release dates of 23 prisoners between October 2003 and March 2005, local broadcaster WLNS reports. Eight prisoners doing time for crimes including embezzlement, cheque fraud and drug offences were released from state prisons between 39 and 161 days early, according to the audit. A flaw in computer programming is being blamed for the release date errors. The number of people affected represents only a small percentage of the overall prison population in the state, but local politicians are still understandably concerned. "I'm glad it's not murderers. Obviously the category of crimes, the bad check [cheque] writers and the drug people, we still don't want those folks to get out early," said state representative Rick Jones of the Michigan State Legislature. "Eight people is too many. I understand the department found another 15, that's too many, even one is too many." The Department of Corrections told WLNS that it had taken action to fix the computer glitch. ®
John Leyden, 24 Oct 2005

Hubble seeks breath of fresh air in lunar soil

The Hubble Space Telescope has joined the hunt for sources of oxygen on the moon, and it is already proving its worth. NASA has been imaging the moon's surface using Hubble's ultra-violet imaging systems, and has already discovered deposits of ilmenite, a mineral containing titanium and iron oxide. Two of the deposits are at old landing sites, but a third is in a region never visited by humans. Scientists should be able to confirm their initial findings by comparing the data Hubble has collected from the three sites, and comparing them to rock samples brought back from the Apollo landing sites. The findings from Hubble's survey will help mission planners work out the best places to send future robotic prospecting missions, NASA says. The work is all part of establishing candidate sites for future lunar bases. The data will boost what little we already know about lunar geology. Hubble was not designed for taking pictures of anything as close up as the moon, and NASA astronomers said that making the observations had been both "interesting and challenging". Jim Garvin, chief scientist at NASA's Goddard Space Flight Center, noted: "Our initial findings support the potential existence of some unique varieties of oxygen-rich glassy soils in both the Aristarchus and Apollo 17 regions. They could be well-suited for visits by robots and human explorers in efforts to learn how to live off the land on the moon." He added that it would be some months before the data had been fully processed and analysed. ®
Lucy Sherriff, 24 Oct 2005

Novell job cuts by 31 October - reports

Novell is preparing to lay off as many as one in five of its staff, according to reports. About 20 per cent of Novell's 5,800 staff could lose their jobs. An announcement is expected before the end of its financial year - 31 October. Novell's non-core products are likely to be hit hardest. ZDNet predicts Novell's Extend product, based on SilverStream Software, is likely to suffer. The company is also predicted to pull out of regions where it is underperforming and leave sales to resellers. A highly-placed source at Novell told Ziff that as many as 1,160 could get the sack and would likely hear this week. Novell has struggled to reinvent itself after seeing market share for its network products disappear. Sales and profits for the last quarter were both down. In 2004 Novell paid $210m for SuSE Linux to help buy itself a slice of the corporate Linux market. More details here. ®
John Oates, 24 Oct 2005

Cisco targets emergency staff with IP radio

Cisco is offering emergency services and big enterprises which use push-to-talk radios a way to integrate the devices with the rest of their communication systems. The system will convert radio signals into IP packets and route them over the IP network. That means that radio communication, which usually uses a proprietary system, can be integrated with the rest of the network. So radio users will be able to talk to anyone else on the network whether they are using a radio, a phone or other device. Emergency services are one potential market because individual services often use their own systems which cannot be linked to each other or to phone networks. Big enterprises are also big users of radios - Amsterdam's Schiphol Airport has 14,000 radios in use. Cisco's IP Interoperability and Collaboration System (IPICS) technology includes server hardware and software and the Cisco IPICS Push-to-Talk Management Center. The company says it is working to build awareness because managers of radio networks do not necessarily know who Cisco is. It expects to bring on more channel partners as the technology matures. The technology is available to select customers now and will be available globally in the next six to 12 months. More in the press release here. ®
John Oates, 24 Oct 2005

Apple admits SuperDrive 8x media reliability glitch

Apple has finally coughed to the existence of a glitch numerous PowerBook G4 owners have been pointing out for months. Says a recently published Apple Knowledge Base entry: "Some earlier PowerBook G4 computers that have a SuperDrive with a 2x [sic] DVD-R write speed may not be able to burn 8x DVD-R media reliably. Because of this, you should only use 2x or 4x DVD-R media." We believe Apple means "SuperDrive with an 8x DVD-R write speed". Certainly, that's the component that's been causing a number of PowerBook users grief. Despite the 8x claim, they have been able to write DVDs at, at best, 2x or 4x performance. One '8x DVD-R' owner, Steve Haigh, went as far as petitioning Apple for a firmware fix. He's likely to remain disappointed - while acknowledging the problem, Apple's KB entry provides no solution. Interestingly, the KB entry - read it here - points to problematic "Matshita [sic] DVD-R UJ-816" drives on a range of PowerBook G4s, and offers a quick way for notebook owners to check whether they are affected by the problem: assuming, of course, they don't already know because they've tried to burn 8x discs. Presumably Apple means "Matsushita" drives. Curiously, the drive that appears to have caused users the most trouble is the Matsushita UJ-835E, which isn't mentioned in the KB article. ®
Tony Smith, 24 Oct 2005

Most DNS servers 'wide open' to attack

Four in five authoritative domain name system (DNS) servers across the world are vulnerable to types of attack that might be used by hackers to misdirect surfers to potentially fraudulent domains. A survey of 1.3m internet name servers, conducted by net performance firm the Measurement Factory and commissioned by net infrastructure outfit Infoblox, found that 84 per cent might be vulnerable to pharming attacks. Others exhibit separate security and deployment-related vulnerabilities.

Pharming attacks use DNS poisoning or domain hijacks to redirect users to dodgy URLs. For example, widespread attacks launched in April attempted to fool consumers into visiting potentially malicious web sites by changing the records used to convert domain names to IP addresses. These particular pharming attacks exploited name servers that allow recursive queries from any IP address. Recursive queries are a form of name resolution that may require a name server to relay requests to other name servers. Providing recursive queries to arbitrary IP addresses on the internet exposes a name server to both cache poisoning and denial of service attacks. Such requests should be restricted to trusted sources. But the study found that up to 84 per cent of the name servers investigated relayed requests from world + dog, violating best practices and opening the door to possible hacking attacks.

The survey also revealed that more than 40 per cent of the name servers investigated provide zone transfers in response to arbitrary queries. Like recursive name services, zone transfers, which copy an entire segment of an organisation's DNS data from one DNS server to another, should only be allowed for a designated list of trusted, authorised hosts. Network configuration errors in setting up redundant servers for extra availability were also uncovered during the study, which involved using a series of carefully designed queries to gauge the relative vulnerability of each name server to attacks or failures.

Cricket Liu, vice president of architecture at Infoblox and author of O'Reilly & Associates' DNS and BIND, DNS & BIND Cookbook, and DNS On Windows Server 2003, said: "Given what enterprises are risking - the availability of all of their network services - these results are frightening, especially since there are easy ways to address these issues."

Infoblox has come up with a list of 'top tips' designed to help enterprises guard against DNS vulnerabilities:

- If possible, split external name servers into authoritative name servers and forwarders.
- On external authoritative name servers, disable recursion. On forwarders, allow only queries from your internal address space.
- If you can't split your authoritative name servers and forwarders, restrict recursion as much as possible. Only allow recursive queries if they come from your internal address space.
- Use hardened, secure appliances instead of systems based on general-purpose servers and operating software applications (such as Infoblox's appliance for DNS, we guess the firm is saying here - well, it had to get a product pitch in there somewhere).
- Make sure you run the latest version of your domain name server software.
- Filter traffic to and from your external name servers. Using either firewall or router-based filters, ensure that only authorised traffic is allowed between your name servers and the Internet. ®
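The Measurement Factory hasn't published its full test harness, but the open-recursion condition at the heart of the survey is easy to illustrate. Below is a minimal, hypothetical sketch in Python, assuming the third-party dnspython library is installed: it sends a recursion-desired query for a name the target server isn't authoritative for and reports whether the server answers it anyway. The server address shown is a documentation placeholder - point it only at name servers you operate.

```python
# Illustrative probe for open recursion, roughly the condition the survey measured.
# Requires the third-party dnspython package; run only against servers you operate.
import dns.flags
import dns.message
import dns.query
import dns.rcode
import dns.rdatatype

def allows_open_recursion(server_ip, probe_name="example.com."):
    """Ask the server to resolve a name it isn't authoritative for and
    see whether it recurses on our behalf."""
    query = dns.message.make_query(probe_name, dns.rdatatype.A)
    query.flags |= dns.flags.RD                    # recursion desired
    try:
        response = dns.query.udp(query, server_ip, timeout=3)
    except Exception:
        return None                                # no usable answer at all
    offers_recursion = bool(response.flags & dns.flags.RA)
    answered = response.rcode() == dns.rcode.NOERROR and len(response.answer) > 0
    return offers_recursion and answered

if __name__ == "__main__":
    # 192.0.2.53 is a documentation address standing in for a server you manage.
    print(allows_open_recursion("192.0.2.53"))
```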
John Leyden, 24 Oct 2005

Guardian explains its vanishing ricin story

Things certainly move fast in the news business - six months after 'disappearing' a particularly valuable article on the ricin 'conspiracy' trial from its web site, the Guardian has put it back, and taken a stab at explaining itself. But it has not, we fear, been entirely successful. We covered the original disappearance here, while the article itself is now back in (almost) all its glory here. Back in April it swiftly became apparent that the Guardian had pulled the original article because it had inadvertently breached the terms of a Public Interest Immunity Certificate. The PII in question was intended to keep the names of the Porton Down scientists who gave evidence at the trial confidential. The Guardian (it now tells us) had been unaware of the existence of the PII, and had named them, following which the Ministry of Defence drew its attention to the PII, and it pulled the article. All of that makes sense, apart from the bit about pulling the article, and the six months the Graun has spent contemplating its navel. So, cue Guardian Readers' Editor Ian Mayes' explanation. "In the light of the letter from the Ministry of Defence," writes Mayes, "the Guardian immediately removed the article from its website. It did so on the advice of its lawyers, who then set out to clarify the situation and in particular to obtain a copy of the relevant order." Back in April removing the whole article rather than just chopping out the names seemed a serious overreaction, and six months on the Guardian's restoration of the article after, er, just chopping out the names does seem to confirm this. It was, back in April, perfectly possible to confirm the nature and existence of the PII without going through the loops Mayes' explanation describes, and it was probably just about possible to do this and edit the article accordingly without taking it down in the first place. Failing that, it was certainly feasible to have it back up in edited form within a couple of days. The reason this didn't happen, we would suggest, has nothing to do with the "conspiracy theories" Mayes mentions, and quite a lot to do with the Guardian not quite grasping that the standard defensive legal moves you make in paper publishing aren't always the ones you should make in web publishing. With paper publishing your concern is usually the damage you may have already done and you're generally not in any great hurry, while in the case of web publishing it's still up there until you take it away. So you need to make a rapid assessment of the damage, what's causing it and what you should do about it. Safety first says pull the lot, but that's not satisfactory in the longer term, because when word gets around that you'll give in the moment a lawyer says boo, well, they're all going to be at it, aren't they? The Guardian is widely regarded (not least, by itself) as a web publishing success, and as such it seriously needs to get its corporate head around this, particularly in the case of a story as important as the ricin one. Mayes asks the right questions in his final paragraph, but we're not sure whether or not the last sentence is supposed to count as an adequate answer to all of them. ®
John Lettice, 24 Oct 2005

Biometric ID - are they counting on your fingers?

Letters

Three ID card-related stories in the past week drew attention to the issues of accuracy and security in biometrics. Slightly unexpectedly, Jerry Fishenden added Microsoft to the ranks of those critical of the UK ID card scheme, worrying that it could actually increase ID fraud. Jerry, whose full article can be read here, is concerned that the existence of a centralised biometric database could have a honeypot effect, while the widespread use of biometrics could spawn "new hi-tech ways of perpetrating massive identity fraud". Nigel Sedgwick of Cambridge Algorithmica (who's been in touch a lot this week - more shortly) writes:

It's a struggle to understand the relevance of the comments by the MS man. Just about all of any individual's biometric samples are (as envisaged for the NIDS) on continuous display: face for anyone, iris for IR video capture, fingerprints left all over the place. The additional risk from the NIR is, as far as I can understand, "just" that a (stolen) copy of the whole biometric database could be used to find people with biometrics sufficiently like those of any particular bad guy/gal (though most likely only of a single biometric modality or instance), that those people could be particularly good targets for identity fraud. However, if the whole NIR database of biometrics is a closely guarded secret, it is not particularly likely to be compromised. Protection approaching that of the Trident launch codes might well be appropriate, or that of our military and diplomatic cipher keys (the old ones still being very useful even following change). Next, why would anyone be allowed access to more than 1 bit of information on each access to the NIR? That bit being whether the whole of any provided enquiry was actually an adequate match with the information held on the NIR for the person in question (with or without biometrics). With suitable protection against repeated enquiries, it would be difficult (from very to extremely) to build up knowledge on a registered person.

Which are perfectly valid points - you leak your biometrics all over the shop, so they're not exactly secret, and the NIR is intended to be such a critical piece of infrastructure that it at least ought to be highly secure. But we're not sure this really deals with Fishenden's worries. Critical systems secured by biometrics have not yet become a sufficiently big and attractive target for us to know how much of a threat the spoofing of biometrics might present. Nor do we know about the viability of biometrics as strong ID that could be applied on a widespread basis - for example, use of fingerprints in credit card transactions. Note that in this example fingerprint and PIN is used. In the world as it is now, PIN, personal details and personal plastic can all be obtained; so, will biometrics present a difficult extra barrier to surmount, or might it all go horribly wrong, and make the situation far worse? At this stage, Fishenden's worries seem justified to us. The security of the NIR isn't something there's a lot of point in arguing about before it's built. Critics point to the Government's poor record on IT projects, the Government insists it knows what it's doing and it will be secure, and then you move into the 'oh yes... oh...' panto argument. Nigel's argument, however, we submit, is based on what it would be sensible for the Government to do, rather than what it actually will do.
Documentation published by the Home Office for potential contractors suggests to us that what it does will not be wholly sensible. It appears (see Procurement Strategy Market Sounding) to envisage the NIR being hosted by the private sector in two secure locations. It envisages (see Spyblog for some commentary) 256 Government departments and 44,000 private sector organisations being accredited to use the NIR and estimates (on what basis we know not) an initial volume of verifications at 165 million a year. And it also intends to build a secure Internet login system that will allow all ID card holders to check all of the data held on them in the NIR, simply and easily. Which will mean no biometric check, probably just PIN security for accesses from consumer clients of dubious (we flatter Joe Public here) security. The political need to assuage public fears about Big Brother could conceivably end up building a handy ID theft tool here, and the verification system could turn out to be the most attractive target in the network. Jury still out here too, we feel, with new reasons to worry appearing regularly.

The accuracy of biometric checks has a clear relevance here. Home Office Minister Tony McNulty opened the bidding with confused and confusing claims about the system using 13 biometrics, and therefore being really secure. A helpful soul pointed us at some work done by John Daugman showing that this needn't always be the case. We stress the 'needn't always', because as our mathematician detectors told us before we even published the piece, this one has the potential to become very confusing. So first, a quick gloss. The Daugman piece we referred to describes how, if you're applying decision rules to two tests independently, then you may find yourself getting a worse aggregate result than if you'd just used the stronger of the two tests. There are, however, other mechanisms you can use to achieve greater levels of accuracy from multiple tests, and this is discussed by Nigel Sedgwick here. We'll get back to Nigel (again), but quite a lot of other people had something to say about the matter:

Hello, the article is not really accurate. It is true, that combining _2_ test will result in the average, and leading to boost the failure rates. The conclusion that using _multiple_ sensors with majority decisions will be even worse (or worse than a single test), is NOT true. (Ask Airbus, NASA ...) Though of course also this systems could fail because of the majority failing (see Minority Report ;) ), but the possibility would be less than a single single or double system. Of course this is because of the majority decision. Combining multiple tests with AND will lead to higher error rates.
Sven Deichmann

Mr Daugman is trying to force water on his watermill. An obvious solution for this problem is to apply weaker test result after a stronger one just in order to distinguish between possible multiple identifications. As a result of a weaker test is not any more an independent event and even better as a group of possible individuals is much smaller, probability of correct recognition will improve.
Sava Zxivanovich

Olin Sibert raises the issue of forged biometrics (also recently raised by Viisage, which announced a 'liveness detector' to stop people using pictures instead of real faces), and the dangers of unattended readers:

John Daugman's math is sound, but the article perpetuates the myth that biometric systems can be characterized in terms of just two numbers: False Accept Rate and False Reject Rate. This is true in an academic setting with no adversaries, but in the real world one also has to worry about the use of forged biometrics, which is often a much bigger exposure than random false acceptance. A lost or stolen ID card, which has presumably been handled by the very fingers it will allegedly be verifying, provides ample raw material for simple and inexpensive forgery attacks. Faces and irises are more resistant to this attack (you have to observe the person, not just find his card), but forgery is still quite practical, particularly in situations where the biometric reader is unattended. It's a good thing we don't leave faceprints on everything we glance at.

Olin Sibert

We don't? You clearly haven't been on the London Underground recently, Olin.

Your final paragraph hits the spot: what really matters is what you consider "a test" and how you combine tests. The basic principle of "more info is better" always holds as long as the info is combined properly - weighted with confidence limits, mixed in properly, and so on - rather than having each piece of information arbitrarily assigned "pass" or "fail". For example, imagine each whorl of a fingerprint as a separate chunk of information. It would be a nightmare to decide if each whorl "passes" or "fails" independently; it's clearly much better to put them all together and form a single fingerprint-level pass/fail from the ensemble. So I'm guessing that if the two biometric measures you speak of were combined properly BEFORE deciding pass/fail, the final answer would be better than either one alone. But giving a separate, equal veto to the weaker measure is clearly (ahem) "sub-optimal". It's a pleasure to see such important and subtle issues aired in public. Good job!

Bill Softky

Eddie Edwards launches a rather more complex explication of the application of three or more biometric tests:

In other words, if you use two biometrics, and users must pass on both, then you're going to fail more people (that's the whole point), so more people will be failed wrongly. It's not so much counter-intuitive as non-obvious. The idea that tightening up one aspect of a process means loosening up a complementary aspect is nothing new - and that this applies to false positive and false negative rates has been suspected by a few of us for some time. For instance, it is widely believed that if you make courts less likely to let guilty people go free, then they will be more likely to jail innocent people. (Note that courts operate on the AND policy - all jurors must agree.) What is also well understood is the power of the majority vote. With only two biometrics, the only sensible combining functions are AND and OR. AND gives the best false positive rate with the worst false negative rate, and OR does the opposite. But with three biometrics we can combine them using the "majority vote" rule, which has something of the best of both worlds. In the majority vote case with three biometrics (see end of mail for derivation), the false negative rate is about as good as the product of the two worst tests' rates. The false positive rate is derived similarly.
This means that combining a test with an FP rate of 1/1,000 with another two that have FP rates of 1/100 gives a combined rate of about 1/10,000. (But it also means that combining a 1/100,000 test with two 1/100 tests still gives an overall rate of 1/10,000, so adding biometrics can still make things worse.) As the number of biometrics goes up, the number of false positive rates that get multiplied together only goes up half as fast, so if you check 9 fingers, you have the worst false positive rate to the power 5 compared to checking only 1 finger, and the reliability becomes very good indeed (e.g. 1/100 goes to 1 in ten billion). (Replace "finger" with "small region of pixels from the iris/fingerprint" and this is more or less how a single biometric scan works in the first place.) So naively combining a pair of biometrics doesn't work. Naively combining three biometrics works better. And ten fingers does the trick. (It also seems that a jury of 25 people, each with an equal vote, would be as effective at keeping the innocent at liberty as the current system, but would acquit wrongdoers far less often.) Quite where this leaves the practicality of a national ID card scheme, I don't know!

* A false (positive/negative) occurs for three voters A, B, C under a majority system in the following four configurations:

p(Afalse)*p(Bfalse)*p(Cok) + p(Afalse)*p(Cfalse)*p(Bok) + p(Bfalse)*p(Cfalse)*p(Aok) + p(Afalse)*p(Bfalse)*p(Cfalse)

the total probability then being

(p(Aok)/p(Afalse) + p(Bok)/p(Bfalse) + p(Cok)/p(Cfalse) + 1) * p(Afalse)*p(Bfalse)*p(Cfalse)

The sum in brackets on the left is dominated by (the reciprocal of) the *smallest* error probability. This then cancels with the corresponding term in the product on the right, so the overall probability is approximately the product of the two *largest* error probabilities.

Which seems an appropriate point to get back to Nigel Sedgwick. Nigel points out that Daugman's paper itself "acknowledges that score level fusion is not subject to the limitations he describes for decision level fusion", and points to his web site (referenced above) for a detailed explanation. The bottom line of this seems to us to be that "from a fundamental theoretical viewpoint... Any number of suitably characterised biometric devices can have their decision scores combined in such a way that the multi-modal combination is guaranteed (on average) to be no worse than the best of the individual biometric devices. In practice, there will always be an improvement from multi-modal combination." Nigel accepts that there are practical difficulties, but argues that they should not be viewed as overwhelming, and this is where we possibly begin to part company with him. Nigel points out that the Government has experts on board who know perfectly well what needs to be done in order to deploy multiple biometric tests effectively, and we accept this. But we have problems convincing ourselves that what ought to be done will be done, or that the theoretical ideal can be deployed practically in the field. A standard test using all three biometrics is clearly achievable if it's a low-volume 'special purposes' test, but it can't rationally be deployed (or survive) in a system intended for everyday ID verification in everybody's life. For this, maybe you end up just using a PIN (i.e. no biometrics - and don't we have PIN security already?) and/or a single biometric, possibly fingerprint set to fairly relaxed tolerances.
In other circumstances you might use more than one, but then you're starting to get into circumstances in which you're having to vary the way you apply decision rules depending on location, required security level and the equipment available. If one component breaks, do you just go ahead with what's left (in which case you're maybe doing weird things to your rules), do you apply a different set of rules, or do you just count everything as broken? How do you cope with the impact of environmental variations (e.g. bad lighting or sweaty fingers) on accept/reject rates? Factors such as these probably can be planned for, but it seems to us that they will add complexity that might well make the practical difficulties overwhelming.

And the Home Office's Market Sounding doesn't inspire confidence here. The limited testing of the technology the Home Office has done so far didn't cover the combining of scores across the three biometric tests, and the Market Sounding roadmap anticipates a smallish test of biometrics lasting three months, commencing once the ID Cards Bill is passed by Parliament. One would hope these tests would include some evaluation of decision-making processes in multiple test scenarios, but that's not something we'd put money on. On the upside, it doesn't seem likely to us that the Government would do something as daft as running one general 'front line' test, and then falling back on a second test in the event of failure by the first. Not deliberately, anyway.

We'll leave you with a quote from last week's third reading of the ID Cards Bill, where Home Office Minister Andy Burnham responds to questions about accuracy in multiple tests, and demonstrates how fully on top of the facts he is: "We will proceed to a full technology trial if the Bill receives Royal Assent. In the interim, I refer the hon. Gentleman to the report by the National Physical Laboratory, which examined the matter in detail and concluded that biometric systems could be used in the way in which we propose [it didn't - Ed]. I also refer him to experiences in the United States, where such systems are already in widespread use [they're not - not on this scale, in this way - Ed]. The technology is not new and coming to us only now, but established and used today throughout the world to prove people's identities." ®
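To put rough numbers on the fusion argument running through these letters, here's a minimal Python sketch of decision-level combination. The error rates are illustrative figures of our own choosing, not measurements from any real biometric system; the point is only to show why AND/OR fusion of two unequal tests disappoints, while a three-way majority vote behaves as Eddie's derivation suggests:

```python
from itertools import product

def fused_error(rates, rule):
    """Probability that a decision-level fusion of independent tests errs.

    rates: per-test error probabilities (e.g. false accept rates)
    rule:  maps a tuple of per-test outcomes (True = that test erred)
           to True if the fused decision errs
    """
    total = 0.0
    for outcome in product([True, False], repeat=len(rates)):
        p = 1.0
        for erred, rate in zip(outcome, rates):
            p *= rate if erred else (1.0 - rate)
        if rule(outcome):
            total += p
    return total

strong, weak = 1e-5, 1e-2   # illustrative false-accept rates

# OR fusion (accept if either test accepts): a false accept by either test
# gets through, so the weak test drags the combined rate up towards 1e-2.
print(fused_error([strong, weak], rule=any))              # ~1.0e-2

# AND fusion: false accepts improve to roughly the product (~1e-7), but by
# the same logic the false reject rates add - the other half of the trade-off.
print(fused_error([strong, weak], rule=all))              # ~1.0e-7

# Majority vote over three tests errs only if at least two tests err,
# i.e. roughly the product of the two largest per-test error rates.
majority = lambda outcome: sum(outcome) >= 2
print(fused_error([1e-3, 1e-2, 1e-2], majority))          # ~1.2e-4
print(fused_error([1e-5, 1e-2, 1e-2], majority))          # still ~1.0e-4
```

Score-level fusion, of the kind Nigel describes, combines the underlying match scores before any pass/fail decision is taken, which is why it escapes the limits the sketch above illustrates.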
John Lettice, 24 Oct 2005

VMware sets partitioning software free . . . as in beer

VMware continues to step up pressure on rivals by maturing its partitioning play at a quick clip. The company this week hit out against Microsoft and Xen by releasing a free tool for running virtual machines on Windows and Linux PCs. Modeling itself on the likes of Adobe or Macromedia, VMware has put out the VMware Player as a kind of free taste of what its partitioning software can do. The package lets customers run virtual machines created by users of the higher-end Workstation, GSX Server and ESX Server products. As you would expect, VMware Player cannot be used to create a virtual machine and lacks a number of other features found in the for-profit products. While limited, VMware Player does make it possible for a company to pass virtual machines around for testing purposes or to run different OSes within an organization. "VMware Player is ideal for safely evaluating pre-built application environments, beta software or other software distributed in virtual machines," VMware said. "With VMware Player, anyone can quickly and easily experience the benefits of pre-configured products without any installation or configuration hassles." VMware, a subsidiary of EMC, enjoys a large lead in the server virtualization market over Microsoft and relative newcomer XenSource. VMware claims thousands of large customers, while its rivals remain tight-lipped about users and struggle to match ESX Server's features. Next year, VMware will release Version 3 of the high-end ESX product. VMware Player could help the company establish an edge against the competition by tempting new customers with a free tool. It's sort of a middle-of-the-road response to XenSource, which gives away its core partitioning product, hoping to sell add-ons around it. You can see a comparison of how VMware Player stacks up against the for-profit applications here. There's more information on how to download and use VMware Player here. ®
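For those who haven't poked at VMware's file formats: a virtual machine the Player can run is described by a plain-text .vmx configuration file alongside one or more virtual disk (.vmdk) files. The Python snippet below writes a rough, era-appropriate example from memory rather than from VMware's documentation - the display name and disk file name are invented, and exact keys and values vary between product versions:

```python
# Writes a minimal, illustrative .vmx definition of the sort VMware Player
# consumes. Keys are recalled from memory, not taken from VMware docs;
# the VM name and disk file name are hypothetical.
vmx_settings = {
    "config.version": "8",
    "virtualHW.version": "4",
    "displayName": "Eval appliance",        # hypothetical VM name
    "guestOS": "other26xlinux",             # a 2.6-kernel Linux guest
    "memsize": "256",                       # guest RAM in MB
    "ide0:0.present": "TRUE",
    "ide0:0.fileName": "appliance.vmdk",    # hypothetical virtual disk
    "ethernet0.present": "TRUE",
    "ethernet0.connectionType": "nat",
}

with open("appliance.vmx", "w") as f:
    for key, value in vmx_settings.items():
        f.write(f'{key} = "{value}"\n')
```

Hand someone a pair of files along these lines and, per VMware's pitch, the Player opens the environment with no further installation or configuration.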
Ashlee Vance, 24 Oct 2005

DEC veterans prepare chip challenge for Intel, AMD, IBM and Sun

If you were contemplating starting an IT company, deciding to go up against the likes of Intel, AMD, IBM, Sun Microsystems, Toshiba, Sony and TI with a new processor probably wouldn't seem like the smartest or most feasible idea. In fact, you'd likely characterize the idea as ludicrous with a dash of hopeless. That is unless you had assembled a ton of cash and an army of very talented and successful chip design mercenaries.
Ashlee Vance, 24 Oct 2005

Wikipedia: magic, monkeys and typewriters

Letters Special Smart mobs? Wise crowds? An open access internet encyclopedia that heals itself? File it all under 'flying saucers', say Register readers.
Andrew Orlowski, 24 Oct 2005

Intel delays and slows dual-core Itanium

Intel has ripped open another huge hole in the troubled ship Itanic, saying that the dual-core Montecito chip won't arrive until mid-2006 and will come in much slower than expected. Just two weeks ago, Intel told reporters that Montecito would begin shipping at the end of this year and ramp to volume in the first quarter of 2006. Not so. A quality issue of mysterious origins has pushed delivery back to the middle of next year and left the chip limping along at 1.6GHz instead of 2.0GHz - a result of the Foxton power management technology being nixed for the chip. In addition, the successor chip Montvale won't ship until 2007 now, and Tukwila won't arrive until 2008. These delays serve as huge blows to Itanium's two main customers - HP and SGI. Both companies had been depending on Intel to catch up finally with rivals IBM and Sun Microsystems. Customers will be most displeased to see Itanium fall close to five years behind IBM with dual-core technology. Intel also delivered some shockers on the Xeon front, although it pitched these changes as positives instead of negatives. It has cancelled the multicore Whitefield chip and replaced it with a product called Tigerton. Grrr. The Tigerton chip will now be part of the new Caneland platform for Xeon MP chips, as Intel has cancelled the Reidland platform associated with Whitefield. With Tigerton, Intel seems to have planned some kind of answer to AMD's direct connect architecture, called the "high-speed interconnect". This will give each processor its own connection to the chipset instead of running data through a shared front-side bus. Intel's dependency on the front-side bus has been one of its major handicaps in trying to compete with AMD's better performing Opteron chips. Intel said that this new technology will not replace the "next-generation interconnect" it has planned for future chips. Here, however, is the bad news. This shift in architectures will cause another delay in Intel's plans to develop a shared socket design between the Itanium and Xeon chips. This design promised to make the pricey Itanic systems more affordable. Intel once planned to introduce the common design in 2007. It won't ship the design now for Itanium systems until 2008 and refuses to release a date for a Xeon version of the design. This flood of announcements could well be remembered as the day Itanium died. ®
Ashlee Vance, 24 Oct 2005