Toshiba plans to roll out 10,000 802.11b hotspots across the US this year, the company's North American subsidiary announced yesterday. This is no big infrastructure project, of course, rather a commercial programme to appeal to stores, hotels and so on keen to implement public WLANs. So Toshiba will offer a "hotspot in a box" comprising all the kit an organisation will need to establish a public Wi-Fi zone. If that sounds like an off-the-shelf product, it isn't - Toshiba is selling the package through its reseller network, which will undoubtedly cash in on installation and set-up deals. Toshiba's announcement follows a similar plan put in place by Intel and Singapore's Infocomm Development Authority (IDA) to explore ways in which Wi-Fi hotspots could be installed across the region. The goal is the formation of what would be tantamount to blanket 802.11b coverage, enabling true data network roaming. Toshiba's initiative is less ambitious technologically, but the company still wants to achieve dominance in the emerging hotspot market, it said. It believes the growth of hotspots will be driven by companies keen to provide the public with Wi-Fi access either as a way of tempting them into their premises, as a revenue generator, or both. Toshiba has already announced a similar programme in Canada. It is also working with US-based public WLAN operator WorkingWild to install 15,000 hotspots in Circle K convenience stores. ®
Perhaps to the surprise of many, next-generation mobile data services appear to have received a generally warm welcome in Europe, with services such as Vodafone Live! reporting unexpectedly high take-up. But there is a looming threat from a lack of cross-device compatibility that could well damage the revenue-earning potential of such services. The problem with new mobile data services lies not with the services themselves but with the devices used to access them. Operating systems, application environments and hardware platforms for mobile devices are proliferating at an alarming rate. The unfortunate upshot of this proliferation is a lack of cross-device compatibility. Even where an application environment is ostensibly device-independent, there is no guarantee that the same application will run on every device so equipped. The best (or worst) example of this so far is mobile Java (J2ME). A lack of advanced features in the original MIDP specification for mobile phones has seen handset vendors add their own custom libraries and APIs to the software in an effort to increase functionality and make it compatible with their hardware. As a result, developers currently have to tweak their applications to each device's particular specifications - not a good use of valuable programming time. Meanwhile, content providers are required to maintain multiple versions of an application in order to service users of different devices. Looking beyond J2ME, base operating systems for mobile devices are also multiplying, creating an additional problem for content providers. Palm OS- and Pocket PC-powered devices are now increasingly being joined by Symbian OS and even Linux-based alternatives. And there are numerous others. The situation is further compounded by the multiple hardware reference platforms now available for mobile devices, and the numerous form factors devices can take even when built on the same hardware and software foundations.
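The cost of that fragmentation shows up in the build process. The sketch below (Python standing in for the build tooling, with entirely hypothetical device names and API labels) illustrates the kind of capability matrix a content provider ends up maintaining just to ship the right application variant to each handset:

```python
# Hypothetical capability matrix: which vendor-specific APIs each
# handset exposes on top of the base MIDP profile.
DEVICE_APIS = {
    "vendor_a_phone": {"midp-1.0", "vendor_a_ui", "vendor_a_sound"},
    "vendor_b_phone": {"midp-1.0", "vendor_b_gaming"},
    "plain_phone":    {"midp-1.0"},
}

# Application variants, each built against a different API set; the
# provider must build and maintain one package per combination.
# Ordered richest-first so the best supported variant wins.
VARIANTS = [
    ("deluxe",   {"midp-1.0", "vendor_a_ui", "vendor_a_sound"}),
    ("enhanced", {"midp-1.0", "vendor_b_gaming"}),
    ("baseline", {"midp-1.0"}),
]

def pick_variant(device: str) -> str:
    """Return the richest variant whose required APIs the device has."""
    available = DEVICE_APIS.get(device, set())
    for name, required in VARIANTS:
        if required <= available:  # all required APIs present
            return name
    raise ValueError(f"no variant runs on {device!r}")
```

Every new vendor library adds a row to the matrix and, potentially, another build to maintain - which is precisely the overhead the article complains about.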
While it is unreasonable to expect companies that have invested millions of dollars developing systems to back out of the race, something has to give. What is needed is a software platform that is genuinely independent of the underlying operating system and hardware, and also gives content providers the opportunity to differentiate their offerings in terms of branding. Such systems already exist. And for the sake of the industry they should be used. © Datamonitor is offering Reg readers some of its technology research FOC. Check it out here.
Internet filtering outfit Websense has set its sights on stamping out uncontrolled use of personal storage sites in the workplace, describing data backup sites as the latest security risk. So if Websense has its way, personal storage sites will join the list of proscribed workplace Internet activities alongside surfing for porn, swapping MP3 files and instant messaging as unacceptable security and liability risks. What is an employee supposed to do - work? In fairness, Websense is a reputable firm whose technology helps firms enforce security policies on what employees are allowed to do at work, in particular restricting visits to questionable (software piracy, porn, racist etc.) Web sites. Provided these policies are clearly communicated to workers and sites are properly categorised, then that's (in principle) fair enough. But Websense wants to expand its role to tackle "emerging Internet threats" such as personal storage sites, spyware and Web-based email. Extending its filtering technology to block the operation of spyware is sensible. But we think Websense is over-egging the pudding in suggesting that uncontrolled use of personal storage Web sites poses an industrial espionage risk. Following a recent survey in which Websense warned of spyware risks, the company has now taken aim at personal storage sites. The company this week warns that "companies are exposing themselves to potentially devastating security and liability risks by not managing employee use of personal storage Web sites". Websense's survey of 400 companies across Europe revealed 71 per cent had no policies on the use of personal storage sites, such as briefcase.yahoo.com and Hotdrive.com (its examples). "Within minutes and without trace, company confidential data - such as financial forecasts or customer databases - can be uploaded onto the web. However, most companies are still unaware of how employees access personal storage sites, which now number nearly 1,000 worldwide," the company warns.
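Category-based filtering of the sort Websense sells can be sketched in a few lines. The Python below is our own toy illustration, not Websense's actual technology: the category table is hand-written (real products maintain millions of categorised sites), and the exception list for a company-approved backup site is our invention.

```python
from urllib.parse import urlparse

# Toy category database; the first two domains are the article's
# examples, the third is invented for illustration.
CATEGORIES = {
    "briefcase.yahoo.com": "personal-storage",
    "hotdrive.com": "personal-storage",
    "mail.example.com": "web-email",
}

# Hypothetical company policy: block the personal-storage category
# outright, but allow one preferred backup site as an exception.
BLOCKED_CATEGORIES = {"personal-storage"}
ALLOWED_EXCEPTIONS = {"briefcase.yahoo.com"}

def is_allowed(url: str) -> bool:
    """Return True if policy permits a visit to this URL."""
    host = urlparse(url).hostname or ""
    if host in ALLOWED_EXCEPTIONS:
        return True
    # Uncategorised sites are allowed; blocked categories are not.
    return CATEGORIES.get(host) not in BLOCKED_CATEGORIES
```

The design point is that policy lives in the category and exception tables, not in the code - which is why communicating the policy to staff, and categorising sites correctly, matters as much as the filter itself.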
Scary stuff. But wait a minute: if someone is inclined towards a spot of industrial espionage, and if access to personal storage Web sites was blocked by Websense's filters, what's to stop him burning confidential data onto CDs? Or emailing confidential data home (perhaps as an encrypted zip file)? Geoff Haggart, Websense's vice president for EMEA, more or less conceded our point that preventing industrial espionage is a people management, rather than technology, issue. "There might be lots of different mechanisms for industrial espionage," he says. But he maintains that Websense is correct in highlighting threats which firms may have overlooked when establishing their security policies. Potential misuse of personal storage Web sites and Web email ranks "somewhere in the middle" of the risks which Websense customers face, Haggart told us. Websense's filtering technology is designed to help companies manage various personal productivity, bandwidth and security risks, to which we must now add personal storage. Websense is not against storing or backing up data as such; it simply wants to limit use of the technology so that it is under tighter control. Companies may want to establish a preferred backup site and block access to the rest using Websense's technology, Haggart suggests. ®
Related Stories
Spyware found on one in three corporate networks
Websense takes aim at Spyware
Websense hopes to filter more than just the Web
P45s for Porn Surfers
Net 'misuse' costs corporate America $2,000 per second
'Cyberslackers' are curse of workplace
External Links
Web-Based Storage Sites Create Security Holes and Liability Risks For Businesses, Reports Websense
A list of more on-line storage providers
Most mammals hibernate through January and February, but Apple lawyers clearly belong to the reptilian class, for winter finds them at their most productive. They certainly haven't been idle in recent weeks. We must thank attorney and Apple user John Kheit for uncovering a blizzard of patent activity since the new year that must be quite unprecedented in Apple's history. The computer pioneer has decided, belatedly, to lay claim to lots of stuff that it, er… invented twenty years ago. The desktop Trash Can, window buttons, dialog boxes and a task scheduler have all been claimed as Apple Intellectual Property in a snowstorm of greedy patent filings. And the paper trail uncovers some delicious ironies. Apple has finally decided to patent the "Waste Basket" [if you're in the European Basket], also known as the "Trash Can" [if you're in the United States of Trash]. Now, correct us if we're wrong, but the desktop trash icon first appeared in Apple's own Lisa computer in 1983. But this is now, so Apple is duly required to cite prior art. And this particular filing thanks Corel, the Microsoft Windows 3.1 Users Guide and a tome called "OS/2 Unleashed" as prior art. Have a look: the claim was filed on February 13, 2003 - almost twenty years after the Lisa appeared. Microsoft has justifiably been blamed for stealing many aspects of the Windows UI from Apple, so it's no small irony to see Apple thanking Microsoft for creating something it invented itself, but never got round to protecting under US patent law. This supposes some amnesia on the part of the reader: in the look and feel litigation that Apple pursued in the late 1980s, the trash can, along with overlapping windows, played a star role. Vanity plays a large part in these Apple filings. They name the incumbent CEO as "inventor" on items such as the iPod. (Yes, the design of a hard-disk based MP3 player is now an Apple invention, so don't you dare make one.)
Although the incumbent CEO has done much to distract us from the boredom of desktop computing by promoting ergonomics and design sense (very French characteristics, if you ask us), this isn't enough. Credit for inventions as "iconic", if you'll pardon the pun, as the desktop trash will forever belong to Apple, and the filings hint that the CEO is even more insecure than we suspected. How insecure? It seems that Apple's unfathomably insecure CEO wishes that he wasn't a member of California's arriviste nouveau riche but that, instead, he had been born French. There's no other explanation. ®
Related Link
Kheit's summary can be found here, at Mac Observer
Tom Bradicich, CTO of IBM's xSeries line, confirmed that Big Blue will ship a Power-based blade by the end of the year. IBM will also launch a four-way x86 blade, also by the end of the year. IBM pipped Sun to the blade business, delivering its first boxes in December. The current HS20 blade uses up to two 2GHz or 2.2GHz Xeons, starting at $1,879. IBM eschewed using mobile or embedded low-power processors to achieve high densities. "Density is only one of the reasons for buying a blade," Bradicich told us. "Customers don't want last year's PIII." The HS20 supports up to 8GB of memory, Fibre Channel I/O and dual Gigabit Ethernet adapters. Competitors deem IBM's current POWER4 processor too large to fit on a blade, but Bradicich says IBM has the packaging skills to produce some surprises. The RISC blade is expected to use the POWER4-derived 970 part for workstations and low-end servers. IBM announced that chip at Microprocessor Forum last year (see IBM confirms Altivecked POWER4-lite). ®
ATI has won rival graphics chip maker 3Dlabs' backing for its RenderMonkey (RM) high-level shading language. While ATI will continue to develop the core RM framework, both partners will co-operate on the creation of plug-ins to wire the technology into the standard DirectX and OpenGL high-level shader APIs to ensure compatibility. ATI and 3Dlabs will also work together to develop RM, and will evangelise the technology among the developers of digital content creation applications in the hope that the latter will integrate RM functionality into their software. These efforts, they hope, will increase the use of RM, primarily over Nvidia's alternative high-level shading language, Cg. Both languages are hardware-independent and technically different - Cg is a true language; RM is more of a development environment - but there's no doubt that there's a battle for mindshare, not only among software developers but among the punters who will see one logo or t'other (and those of the chip makers behind them) splashed over game packaging. High-level shading languages are seen as the best way to program increasingly complex and feature-rich graphics chips. HLSLs provide software developers with a way not only to code more sophisticated visual effects but to do so more easily than is possible with current vertex and pixel shader APIs. ®
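To see what a shading language actually expresses, consider the simplest per-pixel effect. The toy Python function below is a software stand-in - not RenderMonkey, Cg or HLSL code - for Lambertian diffuse lighting, the kind of arithmetic a pixel shader evaluates for every pixel on screen:

```python
def diffuse_pixel(normal, light_dir, colour):
    """Lambertian diffuse shading for a single pixel.

    normal and light_dir are unit 3-vectors (tuples); colour is an
    RGB tuple in [0, 1]. A GPU pixel shader runs expressions like
    this millions of times per frame, in parallel.
    """
    # Dot product: cosine of the angle between surface and light.
    intensity = sum(n * l for n, l in zip(normal, light_dir))
    intensity = max(0.0, intensity)  # surfaces facing away stay dark
    return tuple(c * intensity for c in colour)
```

A surface facing the light head-on keeps its full colour; one facing away goes black. A high-level language lets developers write effects at roughly this level of abstraction, rather than in the assembly-like vertex and pixel shader instruction sets.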
Sony has greeted Microsoft's announcement this week that Xbox Live now has 350,000 users with scorn. It has issued a counter-statement claiming sales of over 600,000 PS2 network adapters. Sony may have remained aloof throughout Microsoft and Nintendo's ridiculous spat over second place in the console race earlier this year, but it seems that online gaming is a more sensitive point with the market leader, and Microsoft's triumphant announcement has raised hackles in Tokyo and kicked off a round of "mine's bigger than yours" press releases. Microsoft claims that it has sold 350,000 units of the Xbox Live Starter Kit in the USA and Japan (it's not clear whether the figure includes European Xbox Live Test Drive users; the service officially launches here on Friday week), representing a significant success for the service. Sony's counter-claim is that it has rolled out 0.6 million PS2 Network Adapters in Japan and the USA, despite the fact that the Japanese market is only due to see retail units of the hardware later this spring (it was previously available only through rental from an ISP). Sony launched the PS2 online service in the USA last August, and no concrete plans for a European launch have yet emerged - although it's known that beta testing is scheduled to start later this year. One big question, however, is what exactly those 600,000 lucky owners of PS2 network adapters are doing with them. 200,000-odd of them are accounted for by the subscribers to Final Fantasy XI in Japan, but did 0.4 million people in North America really run out and buy the sparsely supported and under-promoted network adapters - and if so, what on earth are they doing with them? ®
Internet battle lines were drawn at an extraordinary meeting in Geneva this week. The nondescript "ccTLD workshop" hosted by the International Telecommunication Union on 3-4 March attracted a stellar cast including ICANN president Stuart Lynn, ITU secretary general Yoshio Utsumi and leading representatives of just about every major organisation dealing with the Internet today. Why the huge fuss? Because the meeting threatened to turn into a caucus where rising resentment against ICANN and its attempt to stamp ultimate authority over the Internet could have escalated into international agreement and action. Many country domain managers are furious at ICANN's constant efforts to get them to sign up to a new set of ICANN terms and conditions - often under threat of withholding vital services - that would effectively hand over control of their domain to the organisation. Many do not trust ICANN to use such power correctly, and have good reason to be concerned considering its previous behaviour with regard to the wider Internet community. An incredible 63 papers were introduced to the meeting, of which 17 were either implicitly or openly hostile to ICANN. ICANN retaliated with nine papers, most of them written by members of its organisation, that supported its position. The remainder were either academic studies or simply papers that more accurately fitted the full title of the meeting, "Workshop on Member States' Experiences with ccTLDs". In the end, however, hostilities were tempered by the absence of many authors of ICANN-critical papers and the clear lack of consensus among those representing country-code domains, often from developing countries, who admitted to not understanding the structure behind the Internet's overseeing organisation.
As such, controversial subjects such as ICANN control of the IANA function and its assumed authority over domain redelegation (essentially its control of the Internet's switchboard) were given short shrift and effectively dampened by ICANN representatives. They are unlikely to go away, however, and may yet come back to bite the organisation. Interestingly, for such a significant meeting, it appears there is to be no official record of discussions (we explain why at the end of this feature), although all papers contributed are available on the Web, as is (at the moment) full audio coverage of the event. In order to plug this gap, we felt we should provide our own report of the event from the perspective of the criticism and defence of the current system, alongside an explanation of the history that led us to this point. It can all be found below.
So what the hell was going on, and why?
To understand the significance of the meeting, you need to understand some of the history behind it - which is, in itself, a history of the Internet. The ITU is the international body set up over 140 years ago by countries across the world in order to standardise telegram communication. The telegram was a great advance in communication, but everyone soon realised its use was hugely curtailed if every country ran a different, incompatible system. The ITU successfully introduced common systems and approaches, and fast, worldwide communication was born. It then did the same job with telephones, and has in fact done the same with just about every bit of modern telecommunication, with TV the exception that proves the rule. How then is it ICANN, and not the ITU, that is in charge of running the standardising side of the Internet? Well, the ITU did attempt to set itself up as the main Internet body in the early days of the network. It offered a complex and fairly expensive solution that would be provided by telecoms companies.
But at the same time, the US government was funding early Internet networks with simpler and cheaper equipment and, crucially, insisting that the networks were freely available to everyone. The early pioneers of the Internet - universities and research labs - seized on this simpler, no-strings-attached technology and soon they had easily outstripped ITU's plans. To this day then, the Internet community remains suspicious of ITU, even though it has since produced many of the subsequent standards that have contributed hugely to the Internet's success. It was this victory that left the US government in effective control of the early Internet. Recognising that it could not be expected to run a worldwide network, the Department of Commerce set up ICANN, an organisation initially tied to it but which it hoped would become an international entity to look after the unique needs of the Internet. ICANN has achieved the goal of expanding and encouraging the Internet but at huge cost to its reputation. Despite its stated aims of open, transparent and bottom-up consensus-making in all decisions, few independent observers would conclude that it has succeeded in fulfilling this brief. After huge criticism, ICANN's unsuitability was implicitly recognised by its new president, Dr Stuart Lynn, who set about restructuring the organisation to keep the ICANN dream alive. Dr Lynn will soon leave as head of ICANN and his restructure will come into effect shortly after. At the heart of ICANN's restructure however is an effort to extend its authority over the Internet in all countries of the world. Only in this way, it argues, can the Internet be held together and helped to prosper in the future. The problem is that a large segment of the Internet community does not trust ICANN or its staff, who have often been accused of bullying, obfuscation and obstruction in order to further their own aims. 
ICANN has proven itself unwilling to listen to opinions outside its own sphere and, despite promises that this is all due to change, those running Internet domains currently outside ICANN's control are wary of giving it ultimate control before they see more open democracy in action. In fact, it is safe to say that ICANN is now viewed with the same suspicion that the ITU faced when its attempts to gain worldwide control of the Internet faltered all those years ago. To make matters worse, ICANN continually says that creating "global policy" is a vital element of its work. This policy element concerns itself mostly with the social, economic and cultural aspects of the Internet. As such, many would prefer the controlling, technical aspects of the Internet be given to a less political body, one that has experience in developing international agreement. You won't be surprised to hear that the ITU sees itself filling this role very easily.
Right. So what does this have to do with Geneva?
So, back to this week's meeting, held at ITU headquarters in Geneva. The first speaker was the secretary general of the ITU, Mr Yoshio Utsumi. "Welcome to Geneva, welcome to the birthplace of the World Wide Web," he began. Mr Utsumi gave a brief history of the ITU. Ten years after the birth of the telegram, the ITU stepped in and standardised it, made it work worldwide, he explained. And 10 years after the birth of the telephone, the ITU stepped in and standardised it, made it work worldwide, he continued. "Now we have a so-called Internet, which has become, in my opinion, a public utility." You do not, by the way, get any points for realising that the Web is approximately 10 years old. Mr Utsumi complained about the lack of consensus regarding the Internet, and that the ITU was being asked to act but not told what it should do. He explained that much of what the ITU does is now Internet-related. He then finished, got up and left - but not before inviting everyone to cocktails that evening.
Next up was the director of the Telecommunication Standardisation Bureau (TSB) - the part of the ITU that could, for example, easily take over all the Internet's underlying architecture - Mr Houlin Zhao. Mr Zhao welcomed the president of ICANN and the chairman and secretariat of possibly the most important advisory body to ICANN, the Governmental Advisory Committee (GAC). He said how much the ITU supports ICANN, and how he personally submitted his own paper on ICANN reform when it was under consideration two years ago. He then addressed what appeared to be a misconception. "We have heard that people believe if ITU intervenes, it will be the same as governments stepping in. But this will not be." If you were pitching to the owners of country domains fearful that governmental control lay behind your offer, your pitch would probably start much like this. Mr Zhao also helped clear up another false belief: "ITU does not want to take over ICANN... we would like to assist ICANN." And then: "In my opinion, ICANN alone cannot solve all these problems. If it works with ITU we can find a new way to support each other." Next up was co-chairman of the meeting Dr Willie Black. Dr Black also happens to be the head of Nominet (the people behind the .uk register), the outgoing head of CENTR (a body of 27 ccTLDs) and a well-known critic of ICANN policies. He outlined how the two-day meeting would pan out. The second day would be dedicated to those papers with "problems", "criticisms" or "issues" with the current system. It is perhaps hardly surprising, then, that when president and CEO of ICANN Dr Stuart Lynn took the stage, he sounded nervous, lacking his usual bluster and often stumbling over words. His performance was mirrored shortly after by the secretariat of GAC, Christopher Wilkinson, a man with endless experience of addressing crowds.
The ICANN view
Mr Lynn began by welcoming the "opportunity for frank and open dialogue".
He stressed that the most important aspect for the Internet now was "stability and security". He then stuck a pin in the ITU's claim to historical problem-solving. "Unlike the history of telegraphy, from the very beginning the Internet community has self-organised and self-standardised on its own fundamental protocol, TCP/IP, that is used today. This defined and still defines the very meaning of the Internet." TCP/IP, incidentally, is the system the US government built the Internet from; the ITU tried to introduce the more complex OSI stack. He then introduced the arguments over why ccTLDs should sign up to ICANN's new ultimate control contracts. "No country is, or can be, an island of this globally interdependent [system]. A pre-requisite to global stability is interoperability, assured through the establishment of a framework of mutual accountability." Mr Lynn then outlined the ICANN mantra that it is "open, transparent and fully accountable". The enormous criticism that ICANN has come under (of which this author is proudly guilty) is dismissed: "Anyone can, and indeed does, use the press to amplify their views. That is the price we pay for furthering the reality and not just the illusion of openness, transparency and full accountability." ICANN acts as an "informed and neutral gatekeeper"; it "cannot and does not act arbitrarily, although we are often accused of doing so". He then addressed the concerns that ccTLDs have traditionally been under-represented (often to the extent of being ignored entirely) in ICANN, and that it would abuse its new-found power were it to be given it. "ICANN recognises the distinct differences between global policies and local policies. In my view, in the past ICANN lacked a sufficient mechanism for evolving global policy in as far as it affected ccTLDs." This will all be water under the bridge with the Lynn-designed reorganised ICANN. He finished with a biting comment, and a soppy analogy (for which he apologised), aimed at Willie Black.
"The Internet is a very special flower in the garden of the world's communities. One that needs continuous and careful watering. ICANN and all of us only play a very small part but an important part. We can play our parts wisely in the better interests of the global community, handing on a trust to future generations. Or further, at the other extreme, we can just further our own self-interests, in which case the flower will wither and fade."
And without the guff?
Put simply, ICANN is appealing to people's better nature, saying it has changed and improved, and will point to criticism and any attempted intervention in its big plan as going against the deeply ingrained Internet culture of consensus. On the other hand, those who stand to lose their independence point to ICANN's appalling record and wish to pull in bodies with a history of trustworthy behaviour before handing over control to ICANN. They do not wish (yet) to break up or remove ICANN, because of that same deeply ingrained Internet culture of consensus.
So on with the show
The rest of the day was uneventful with regard to this article. Things became interesting the next day, when the more controversial papers were introduced. One of the strongest and earliest was provocatively titled "Sovereign Domains - A Declaration of Independence of ccTLDs from Foreign Control", written by Kim G. von Arx of the Canadian Internet Registration Authority and Gregory R. Hagen of the University of Ottawa. The paper made extensive use of quotes from the past few years to argue its case that the US government exerts too much control over an international Internet, and highlighted the subtle and not-so-subtle threats that ICANN has used to pressure countries to sign up to ICANN's authority. It concluded that countries should announce their principled opposition to ICANN control of any form and insist upon technical independence.
The paper's forceful tone was lost, however, since the authors weren't there and so only a cursory, almost apologetic, summary was given by the meeting's chairmen (a trend that was to continue throughout the meeting and was hugely significant in ICANN's favour). A representative of the Canadian government stood up to say the paper did not represent his country's official views. The same situation was repeated when a paper from ICANN director Karl Auerbach stated quite clearly that he believed ICANN was acting improperly by using the IANA function (the top level of the technical side of the Internet) to induce ccTLDs to sign up to ICANN terms and conditions. By doing so, ICANN was creating "a direct and significant danger to the operational integrity of the Internet". Mr Auerbach was not present, his statement was not read out, and an ICANN representative stood up to say that Mr Auerbach's view was not shared by the ICANN board. Next up was Dr Willie Black, who took off his chairman's cap to don his Nominet cap and deliver three papers from Nominet, all of which argued that ICANN was not suited to running the IANA function, should not have the authority to decide who does or does not run a country-code domain, and should not be entitled to decide whether to create any new global top-level domains such as .com or .org. Instead, a "treaty based international body" would be preferable to take over these roles. This was supported. However, an increasingly confident Stuart Lynn put ICANN's view across. He argued that people misunderstood what IANA is and what it does. He claimed it performs well and that many of the complained-of delays were due to local problems and not ICANN or IANA. It was "nonsense" that there were performance problems, and all companies that had gone through redelegation were very satisfied (presumably he meant the ones that had won control of the domain). Besides, GAC and the new ccTLD ICANN recommendation body - the ccNSO - would smooth any future problems.
What could have become a heated debate was tempered, however, by a Kenyan representative who explained that such technical issues were of little import to him since his country was concentrating on trying to build its basic Internet infrastructure. "IANA will become an issue for you in due course," commented Dr Black. A paper asking ICANN to work with the ITU was briefly introduced with no discussion. A paper from the International Chamber of Commerce supported ICANN. Again, no real discussion. Then came a paper from Syria that strongly argued the ITU should take over the technical aspects of the Internet. The reason was simple: the US Department of Commerce has final say on any matters concerning the Internet, with ICANN acting as its advisor. It was therefore "not acceptable politically" for Syria to sign up with ICANN. More forcefully, the representative said that for Syria to manage its own Internet domain it "does not need the permission of an organisation managed by the government of the United States". Instead, he argued, the GAC should be turned into the policy-making side of ICANN. The chairman of GAC said he "fully supported" the statement, but pointed out that GAC was only an advisory committee, more concerned with bridging the "digital divide" than deciding on policy. "At present it is not an ideal situation but it is the only one that seems to be working." The head of CENTR - an organisation representing 27 ccTLDs - Paul Kane said that ICANN was largely independent of the US government, and that he looked forward to a transition from ICANN answering to the US government to its being truly independent - something in which GAC had a key role to play. A representative from Sudan reiterated concerns over US government control and said he hoped the transition would be short. This feeling was repeated several times. Stuart Lynn again gave a lengthy explanation of ICANN's position in which he explained that ICANN had no intention of telling countries what to do. This was met with some disagreement.
It was argued that countries were independent and should not be told how to treat their domains. An increased role for both GAC and the ITU was strongly argued for. Mr Lynn argued that while the ICANN board has ultimate decision-making power, it would seek to resolve any disagreements it may have with bodies such as GAC before making a decision.
And the rest of the meeting
The meeting continued in the afternoon. Not much of interest happened until the meeting was finally coming to a close, at which point a summary of all the papers contributed to the meeting became an issue of hot debate. Some forceful attempts at re-editing were made by Stuart Lynn, including, incredibly, the exchange of the word "many" for the word "few", to which the author of the paper it related to - Professor Geist - took exception. Eventually it was decided to drop an executive summary and simply run with a simple summary of all papers, approved by their authors. Tellingly, requests from the floor for the production of a paper that summarised the meeting and the discussions were carefully bypassed by both the meeting's chairmen and the ICANN contingent. It is fair to assume that such a summary would have worked against both sides' interests. The organisers' efforts to build a consensus against ICANN's continued authority over the technical side of the Internet were stymied by the absence of key speakers and indifference from smaller countries not literate enough in Internet politics to make a judgement. Equally, ICANN was unlikely to want an official document carefully outlining people's opposition to its policies. "Written documents tend to have a life of their own," argued Stuart Lynn. With what many may have mistaken for glee, he also noted: "There was no subject that I have heard that I could go away from this room and say there was consensus." The suggestion that a similar meeting take place in a year's time appeared popular.
And finally

And so ended the potentially explosive ITU ccTLD workshop meeting. ICANN, highly trained as it is in controlling and leading meetings, was prepared and got off fairly easily, but its policy of promising a paradise at the end of its seemingly endless path will not have convinced the more experienced hands. They will be disappointed with the lacklustre nature of the more combative elements of the meeting, especially considering the depth of feeling and the well-argued criticism of some of ICANN's positions. It would be safe to assume that next time they will be better prepared and more willing to argue their point. ® Related Links All presentations at the meeting Audio of the event
Radeon 9800

The market is led, and fed, by the 'bleeding edge', the pinnacle of 3D gaming performance, a slot which was until recently occupied by ATI's 9700 Pro. In response to the GeForce FX, which reclaimed the position of technology leader for Nvidia, not only news but also review samples of the Radeon 9800 Pro [Click for Pic] have trickled out at the time of announcement. Designed around an optimised Radeon 9700 die [Click for Pic], the 9800 Pro improves on ATI's previous 256-bit flagship in a number of ways. In come features such as true 128-bit floating point accuracy and OpenGL 2.0 support (care of the F-buffer), along with the alterations necessary to get the 0.15 micron chip operating at even higher frequencies (up to 380MHz on the core and 340MHz on the memory of the standard 128MB Pro board). Bearing in mind the early arrival of the 9700 Pro, it's no surprise that boards are already adopting OpenGL 2.0. But it's questionable what value this will add to the effective life of the part - there is uncertainty as to how the 2.0 specification will firm up for the consumer space, and there has been very little noise on the matter over the last few months (checking opengl.org, the current public specification still sits comfortably at 1.4). Visually the card is very similar to the last generation, with an AGP 8x connector and a power connector at the rear (molex this time, like the FX [Click for Pic]), and is only marginally larger than the 9700 Pro PCB. One tiny addition is a DIP switch, added to allow users and SIs to select PAL or NTSC for the TV-out socket. The three key features of the previous chip have all seen an overhaul, and are now dubbed Smartshader 2.1, SmoothVision 2.1 and Hyper Z-III+. The changes come in terms of speed and flexibility of the shaders, as well as efficiency, and as a result the chip is now labelled by ATI as DX9++ compliant.
Of course, with so many different facets already appearing in the latest of Microsoft's gaming interfaces, it's certainly going to be interesting to see how developers react to yet another bespoke iteration. One of the main areas of criticism of the Radeon 9700 Pro was the limited length of fragment shader programs it could handle, meeting only the base requirements of 'full' DX9. As if in direct response, the 9800 features an F-buffer in its pixel shaders which allows programs of near unlimited length to be executed multi-pass, without having to be written in and out of the frame buffer. The chip will initially be available in a single 128MB Pro flavour, priced similarly to the current top-end FX card from Nvidia, and ATI claims its partners should have cards for sale in Europe by mid-April. A little later, hopefully by late May, two further boards will appear: one standard model, also with 128MB but running at slightly reduced frequencies, and a second Pro card, this time with a 256MB frame buffer of DDR 2. ATI has at present claimed only DDR 2 memories, rather than actual DDR 2 support on its GPU, so it's unclear whether the chip will see the benefits of the new memory technology, or whether the move is merely to get the modules onto the card (albeit running in DDR-I mode). Next Page The Radeon 9600 Contents 1. Intro 2. Radeon 9800 3. Radeon 9600 4. Benchmarks
Radeon 9600

While Nvidia's entire current DX9 range is based on a 0.13 micron process out of TSMC, ATI initially stuck to the tried and tested path of 0.15 micron, and so far it has done quite well from that choice. Of course that move had to be made eventually, and with its competitor having finally proven the transition viable for such complex silicon, ATI is now trying its hand with the 9600. Effectively a 0.13 micron version of the four-pipeline 9500 Pro, the new chip will run both faster and cooler than its predecessor and will also feature the SmartShader 2.1 and SmoothVision 2.1 enhancements. Two flavours of board, a standard and a Pro edition, both with 128MB, are expected in late May; pricing is expected to be similar to that of the Radeon 9500 on its entry to the market, thanks to the reduced cost of a smaller, simpler die.
Radeon 9800 Benchmarks

Radeon and GeForce FX in hand, we've leapt into the lab for a spot of benchmarking to see how the new ATI GPU sizes up against Nvidia's current top chip.

Test System
Intel Pentium 4 3.06GHz with HyperThreading disabled
ASUSTeK P4G8X Deluxe - Granite Bay (E7205) mobo (AGP 8X)
1GB Corsair XMS3200 DDR - 2x512MB DIMMs for dual-channel operation
Western Digital WD1200JB - 120.0GB, 8MB cache, 7200RPM

Platform
Windows XP Professional Service Pack 1
DirectX 9

Drivers
ATI Catalyst 3.1
Nvidia Detonator 42.72

Boards
ATI Radeon 9800 Pro Reference (128MB)
ATI Radeon 9700 Pro Reference (128MB)
Nvidia GeForce FX 5800 Ultra Reference (128MB)
Nvidia GeForce 4 Ti 4800 Reference (128MB)

Sound disabled for all tests

Note: While such a specification might seem slightly OTT for testing products which could ultimately be installed in much slower systems, the purpose of our selected platform is to reduce as far as possible the constraints put on the GPU by core logic, CPU and memory bandwidth across the system. At 1024x768, many of today's benchmarks have become CPU-limited even at 3GHz.

Game Benchmarks
Serious Sam, Elephants Atrium [Click for chart]
Unreal Tournament 2003 Flyby [Click for chart 1 and Click for chart 2]
Aquanox [Click for chart 1 and Click for chart 2]
CodeCult Codecreatures [Click for chart]
Benchmarked at 1024x768, 1280x1024 and 1600x1200 with two detail settings, 2xAA 0xAF (chart 1) and 2xAA 8xAF (chart 2)

In general terms, the game benchmarks are quite clear in their message: the 9800 is undoubtedly an improvement over the previous chip, but not by a great enough margin to catch the FX in current DX8 tests. That could of course all change with the arrival of true DX9-class titles; only the likes of Doom III will be able to shed any true light on the matter. Unless ATI can manage some major optimisations for the 9800 over the coming weeks, before boards become available, we're unlikely to be sold on its latest must-have technology.
®
Review

In the months leading up to the arrival of DirectX 9, the topology of the 3D graphics market has changed considerably. Nvidia, which had led pretty much since the original T'n'L GeForce, dominated the high end with its GeForce 4 Ti range and was finally ousted only by the almost premature announcement of the Radeon 9700 Pro - arriving almost six months before the API for which it was intended. Since that time, the 9700 Pro has not only become the board of choice for most top-end gaming systems, but has also spawned three spin-off products based on the same silicon: the Radeons 9500, 9500 Pro and 9700. With reviews of Nvidia's latest product and first DX9 part - the GeForce FX - arriving less than a fortnight ago, and the first retail boards this week, ATI is hot on its heels with both a refresh and an expansion of the 9000 series in the form of two new parts, the Radeon 9800 and Radeon 9600. While its first four reference designs were based on a single chip (the only difference between the standard 9500 and 9700 Pro being simple resistor placement - or an even simpler software patch), the new products are discrete improvements on, or deviations from, the original die.
Letters

Do you BoFHs smell? Is the cost of a bar of soap and some natty threads too much? Such weighty questions were provoked by our recent letter "OK - I smell a bit - but leave us BoFHs alone!" from a sysadmin who resented being relegated to the smelly underclass by know-nothing IT types. Here are your views. I just read the reply you got from a Mr. Angry, and I do need to voice a bit of support. However, I'd like to voice a more balanced opinion about the issue of IS/IT staff members and their personal appearance, as well. As a member of this group for over 7 years, I feel I've got a fairly good handle on this. While it's true that the general impression of us lowly techs is less than fully desirable in the typically formal office environment, it's also generally accepted that the people who do the "dirty" work of crawling underneath/behind desks and into the guts of a typical office building can deviate from the traditional formal or semi-formal (read: business casual, whatever that's supposed to mean) office dress-code. I'll use myself as an example. In my early days of office life, I was required to be dressed in a button-down shirt, khaki or black pants, and dress shoes. Obviously, working underneath desks is more difficult in that type of attire than it would be in a t-shirt and jeans. I wore out 2 pairs of khaki pants within one week of working like this and approached management for a compromise. Jeans and t-shirts were then permitted specifically for me (the only IS/IT staff member at the time) as long as it wasn't disruptive to the other office staff. That meant no holes and no obscene words or images. I even found that a good, relatively heavy, button-down shirt worked well with jeans and made a good impression on management.
Since that first experience, I've made sure to set expectations with a potential future employer during the interview that formal office dress will not work for the type of work that I would be hired for, and most have been accommodating. It's more important to be able to do one's job than to look good doing it, which seems to be the point Mr. Angry makes. However, there's a happy medium that should be reached with the policymakers if the existing policy hinders one's ability to do one's job. I've often said that to change the mind(s) of management, you'll need to make a solid business case for them to do so. Explaining why a formal dress-code hinders your ability to do your job should be easy enough if you've worn a hole through the knee of a pair of pants. If your manager is unwilling to grant you an exception, simply request that the company reimburse you for clothing that is consumed while on company time. Shoes can be a safety issue, especially if they're slippery on carpet. Is the company willing to accept the risk that you'll slip and fall while lifting "up to 50 pounds" as most companies seem to require of us techs? It's less expensive, and makes more business sense, to allow you to deviate from the dress code than to force you to follow it to the letter. Instead of looking the way they want you to look, you'd be a safer, more productive employee who costs the company less money. Most managers are more likely to see it your way if you were to explain it to them in terms they understand: the bottom line of their budget and productivity charts. Todd Lynch So you must dress to get well. Or is it that simple? To The Smelly Techies: Listen up guys, wake up and smell... well, your own armpits. I too am a techie, and I too was once the mug crawling around vents trying to figure out what the hell the obviously drunk cable contractors did when wiring up the building.
I completely agree that the ideal clothing, as far as practicality goes, is loose-fitting casual clothes. However, have you thought about how long you really want to be in your current position? Do you really want to be shouting down aggressive members of the sales team who cannot work out how to switch their projectors on? Do you think your patience may one day run out when yet another director drives off with his iPaq left on top of his car? F**K THAT! Get yourself into a suit, shirt and tie. Keep a spare shirt and trousers in the office for when you get grubby. So your stuff is going to get dirty, so what? It almost always washes out. Hell, you can even get machine washable suits from M&S these days, which will cut down on your dry cleaning bill. Get a hair cut, and buy some shower gel. Keep deodorant in your drawer at work (not in the server room, it's flammable). The only way management is ever going to start making good decisions, with sound technical knowledge, is when the grubby techies clean themselves up and get promoted. You'll also find that talking to your users becomes easier. Let's face it, half of what we do is basically telling people that they're being stupid. They take it a lot better from someone who has made some sort of effort to look professional rather than like one of their teenage kids. Mr Ex Smelly Techie, now IT Manager. Doesn't the last paragraph invoke Godwin's Law? Phil Payne It probably does, Phil. But what threads? The consensus overwhelmingly favours having a wash... Aah, Angry must be one of the ones who gives the rest of us a bad name. Have you ever heard of comfortable dockers and decent polos? Loose fitting, and still able to crawl on the floor (which I just finished doing with a printer) and in the rafters. Only 8-12 hours a day, and you can't take a shower, or keep a toothbrush at your desk, or at least a box of mints? Get a clue! Ed Roden Senior Network Engineer Thanks Ed.
The most damning critique as ever comes from the Nordic region: That people can't make 15-20 minutes a day to s**t, shower, shave (goes for women too) and brush their teeth (let's all be honest, no-one flosses) is beyond me. I work from home and I still choose to wear casual slacks and good dress shirts probably half the time (why? because they are comfortable, and a dress shirt breathes better than most T-shirts). Buy a suit and shirt that actually fit and chances are you'll stop complaining about how uncomfortable they are. Same goes for fabrics: real fabric (i.e. high quality cotton or wool with a tight weave) does not itch. Or maybe you should see a doctor about it. Kurt Seifried. OK, that's decided. Wear loose clothes. It works for me. Now can anyone find us a Bluetooth headset that doesn't make you look like you've just been Borged? Bluetooth headsets are very handy. They're the future; I just don't want to look... like... a twat. ® Related Story I use a Bluetooth headset and I'm no Nathan Barley
Lawmakers down under are considering making it compulsory for ISPs to filter out unwanted XXX content. The measure is just one proposal currently being tossed around following the publication of a report by the Australia Institute research group, which claimed that Australia's anti-porn legislation simply wasn't working. Aussie Communications Minister, Senator Richard Alston, confirmed that the Government was considering introducing measures that would force ISPs to fit filters that would automatically weed out hardcore content. A report by The Age quotes Senator Alston as saying: "There is a lot to be said for taking another look at whether we should expect more from ISPs." But the Internet industry has reacted angrily to the threat of further Government intervention, claiming that there are already measures in place to deal with such material. Australia's Internet industry insists it has taken its obligations seriously and is ready and willing to provide the necessary protection families need from this kind of unwanted content. Peter Coroneos, chief exec of Australia's Internet Industry Association (IIA), said: "The fact of the matter is, that today in Australia, there is absolutely no reason why children need be exposed to the types of content sensationalised by the Australia Institute in its recent report." Separately, Australian IT reports that federal police have raided a number of ISPs in connection with alleged online music piracy. According to the report, it seems that police are looking for information on particular ISP subscribers and any music files stored by them. Telstra, which has 1.4m punters, and Eftel (around 50,000 users) were both named as outfits raided by police. Other so-far unnamed ISPs have also been targeted in the sweep. ®
PayPal scam artists are getting more ambitious, and less subtle, in their attempts to hoodwink gullible punters. A bogus email doing the rounds this week asks punters not only for their PayPal login but also for their bank account and credit card details. The email, which might appear authentic at first sight, tries to convince punters that they need to send this data as part of a supposed security check. Yeah, right. Users of the online payment service are invited by the email to click a button and submit their confidential financial details to a Russian Web site. The site is currently unavailable, hopefully never to surface again. Of course, scams involving bogus emails posing as official security checks from either PayPal or auction site eBay (examples here and here) are nothing new. However the widespread nature of the latest scam, which was forwarded to us by five Reg readers yesterday alone, and its pernicious nature merit a mention. If you get an email along the lines of the one below, then on no account take it seriously. ® X-Persona:
Received: from 22.214.171.124 (12-227-172-29.client.attbi.com [126.96.36.199])
by backup.presbury.com (8.12.8/8.12.5) with SMTP id h25GhVpI007763
for ; Wed, 5 Mar 2003 16:43:33 GMT
Date: Wed, 05 Mar 2003 12:53:09 -0600
Subject: Your PayPal account is Limited.
Content-Type: text/html; charset=us-ascii
X-Mailer: Mozilla 4.08 [en] (X11; U; UnixWare 5 i386)
Dear PayPal Customer
PayPal is currently performing regular maintenance of our security measures. Your account has been randomly selected for this maintenance, and placed on Limited Access status. Protecting the security of your PayPal account is our primary concern, and we apologize for any inconvenience this may cause.
To restore your account to its regular status, you must confirm your email address by logging in to your PayPal account using the form below:
Enter Bank Account #:
Enter Credit Card #:
Exp. date :
[LOG IN BUTTON - DELETED]
This notification expires March 31, 2003
Thanks for using PayPal!
This PayPal notification was sent to your mailbox. Your PayPal account is set up to receive the PayPal Periodical newsletter and product updates when you create your account. To modify your notification preferences and unsubscribe, go to https://www.paypal.com/PREFS-NOTI and log in to your account. Changes to your preferences may take several days to be reflected in our mailings. Replies to this email will not be processed.
If you previously asked to be excluded from Providian product offerings and solicitations, they apologize for this e-mail. Every effort was made to ensure that you were excluded from this e-mail. If you do not wish to receive promotional e-mail from Providian, go to http://removeme.providian.com/.
Copyright(c) 2002 PayPal Inc. All rights reserved. Designated trademarks and brands are the property of their respective owners.
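Scam mails like the one above lean on a simple trick: the visible text of a link names paypal.com while the underlying href points somewhere else entirely. A minimal sketch of that mismatch check in Python - the function name and the sample links are our own illustration, not anything PayPal (or the scammers) actually uses:

```python
from urllib.parse import urlparse

def looks_spoofed(link_text: str, href: str) -> bool:
    """Flag a link whose visible text names one host but whose
    href actually points at a different one."""
    shown = link_text.strip().lower()
    # Only meaningful when the anchor text itself looks like a host/URL
    if "." not in shown:
        return False
    actual = urlparse(href).hostname or ""
    shown_host = urlparse(shown if "//" in shown else "//" + shown).hostname or shown
    # Allow exact matches and subdomains of the shown host
    return not (actual == shown_host or actual.endswith("." + shown_host))

# The genuine PayPal link in the mail's boilerplate passes...
print(looks_spoofed("www.paypal.com", "https://www.paypal.com/PREFS-NOTI"))  # False
# ...while a disguised submit target does not.
print(looks_spoofed("www.paypal.com", "http://203.0.113.7/paypal/login"))    # True
```

Real mail filters apply far more heuristics than this, of course, but a mismatch between the host a link claims to be and the host it actually resolves to remains one of the clearest red flags.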
Good news for couch potatoes who'd like to support this year's Comic Relief Red Nose Day on 14 March, but would rather not skydive from 15,000ft dressed as a baboon to do it, comes in the form of the charity drive's e-commerce enabled website. For the first time, online donations can be processed in sterling, euro, Australian or American dollars, so there's very little excuse for not chipping in a few quid to support this most worthy of causes. Alternatively, charitable Reg readers might like to help by logging onto Kevin Watkins' Red Nose Day site where the reckless youth has invited online bidders to choose him a middle name. The highest bid by 14 March gets to pick a suitably silly moniker, and we have a feeling he'll live to regret this moment of madness. No matter, it's all in aid of those less fortunate than ourselves, and we hope that Reg readers will dig deep to swell Comic Relief's coffers. ®
A bar and restaurant owner from Lafayette, Colorado was jailed earlier this week after pumping four rounds into his Dell laptop, the Knoxville News Sentinel reports. George Doughty, 48, announced to patrons of his Sportsman's Inn Bar and Restaurant that he intended to execute the portable. He then disappeared into his office, returning after thirty minutes with said machine and a revolver. Warning punters in the bar to cover their ears, he let the Dell have it with four rounds at close range. Doughty allegedly celebrated this orgy of violence by hanging the grisly remains over the bar like a hunting trophy. What provoked the gangland-style execution remains unclear, although it does give new meaning to the phrase "percussive maintenance". Doughty faces charges of felony menacing, reckless endangerment and the prohibited use of weapons. Whether his actions constitute a breach of MS Windows XP licensing terms remains to be seen. ®
Russian copy protection specialist StarForce Technology has stepped into the gap left by the DoJ's repurposing of ISONews. Not, we presume, deliberately, but it's a funny coincidence all the same. Prior to becoming an antipiracy propaganda site, and indeed prior to getting involved in Xbox mod chips, ISONews produced lists of software that had been cracked, and names of the teams that had produced the cracks. We have no idea why anyone would find such information useful, nor why ISONews did this, but you can always take your pick from one of these. Granted, however, that someone somewhere found this sort of stuff useful, they'll no doubt be interested in StarForce's new newsletter, Copy Protection Review, which you can sign up for here. Aside from providing news about what's what in the copy protection business, CPR proposes to publish a regular list of cracked copy protection, and how soon after launch it was cracked. It makes interesting reading. Says StarForce on SafeDisc 2: "Out of six games listed in the table one... stayed uncracked during two days after the official release. The other five were cracked before they even came out." Which is not very good, really. For four games using C-Dilla 2.x the record was three days, and StarForce claims the same for SecuROM, although the way we read the table it looks like one title achieved a staggering six days uncracked. But overall, it's still not very good, is it? You might say that stats like these provide an argument against bothering with copy protection in the first place. Given StarForce's line of business, that would be a very silly thing for it to argue. Instead, StarForce is recommending its own systems: StarForce Professional 3.0 for "the ultimate protection of popular games" (which may in itself be a risky thing to say), StarForce Basic Edition (which we think is largely intended to discourage people), and StarForce CD-R 3.0, which is aimed at professional pirates. Er, at stopping professional pirates.
As yet StarForce-protected software does not appear in the company's hall of shame, but as it's now boosting its European presence that will no doubt change. Nevertheless, we wish them the very best, and hope they have a happy CeBIT. And to be fair, the aim does not appear to be to do the impossible and produce an uncrackable system. Says the company: "Usually games publishers use copy protection to keep the maximum sales during the first two-three weeks after release." Which seems about achievable... ®
The Business Software Alliance has pulled off an astonishing anti-piracy coup, identifying a major European university as a distribution hub for... OpenOffice.org. Oops. The University of Münster last week received a "Notice of Claimed Infringement" concerning the unauthorised distribution of Microsoft Office from one Corinna Beck, of the BSA in Washington. Ms Beck must be a very busy person, because she appears not to read her own emails before sending them. The email naturally provides details of the claimed infringement - hilarious details. Infringement first found 24 November 2002, last found 24 February 2003. So they'd been watching for three whole months, but somehow failed to identify one key fragment of information. This is also listed in the email, under "infringing content":

Filename: /mandrake_current/SRPMS/OpenOffice.org-1.0.1-9mdk.src.rpm (199,643kb)
Filename: /mandrake_current/i586/Mandrake/RPMS/OpenOffice.org-libs-1.0.1-9mdk.i586.rpm (35,444kb)

"The above computer program(s) is/are being made available for copying, through downloading, at the above location without authorization from the copyright owner(s)." In what we feel sure must be a desperate stab at wrapping their heads around alien licensing models, the BSA continues: "Based upon BSA's representation of the copyright owners in anti-piracy matters, we have a good faith belief that none of the materials or activities listed above have been authorized by the rightholders, their agents, or the law. BSA represents that the information in this notification is accurate and states, under penalty of perjury, that it is authorized to act in this matter on behalf of the copyright owners listed above." On receiving the email, Christian Schild of Münster Uni got in touch with OpenOffice, and Louis Suarez-Potts contacted the BSA. Corinna responded: Dear Mrs. Suarez-Potts, I apologize for the obvious mistake I made.
Apparently our system detects the OpenOffice files as MS Office programs and alarms me, which in turn sends the notices. I failed my part by not reassuring clearly enough which property was infringed and now that I am aware of that fact we will try and fix the search terms of our system and of course be more aware of the possible mistake. Note that she has not lost her trademark eye for detail. Regrettably, we feel that by "alarms me" she means 'tells me to send out a menacing letter without any further checking,' rather than that it alarms her that the BSA is launching anti-piracy offensives on the say-so of deranged bots that go into condition red at the sight of anything that looks even slightly like Microsoft Office. Münster Uni might care to engage in its own bit of harassing (which we feel sure would be a new experience for the BSA) over the BSA's false and slanderous allegations. Suarez-Potts meanwhile has asked if anybody else has experienced similar straight shooting from the outfit, "so we can more aggressively act on this--eg, publicize the misunderstandings." But of course, Louis. At your service. ® Related link: The mail and related messages
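If the BSA's bot really was keying on filenames, it's easy to see how "OpenOffice.org-1.0.1-9mdk.src.rpm" tripped a rule hunting for Microsoft Office. A toy reconstruction in Python of the failure mode - the rule list, function names and whitelist are entirely our invention, not the BSA's actual system:

```python
# Hypothetical reconstruction of the kind of naive filename matching
# that could flag OpenOffice packages as Microsoft Office.
NAIVE_RULES = ["office"]

def naive_hit(filename: str) -> bool:
    # Any substring match counts as an "infringement"
    return any(rule in filename.lower() for rule in NAIVE_RULES)

def stricter_hit(filename: str) -> bool:
    # Same rules, but whitelist known free-software packages first
    name = filename.lower()
    if "openoffice" in name:
        return False
    return naive_hit(name)

f = "/mandrake_current/SRPMS/OpenOffice.org-1.0.1-9mdk.src.rpm"
print(naive_hit(f))     # True: the false positive Münster received
print(stricter_hit(f))  # False
```

The point being that a substring rule will happily match "OpenOffice" against "office" for three months straight unless somebody teaches it the difference.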
BT has said it will "vigorously" defend any legal action brought against it by ISP FreeDial.biz. The telco's resolute stand follows reports that FreeDial has been "forced to suspend" all of its broadband operations following an alleged dispute with BT Wholesale. In an email to customers published on ADSLGuide, FreeDial reports that it suspended the service "due to BT reneging on our agreement". The ISP said that the current situation has arisen "through absolutely no fault of FreeDial" and claims to be preparing a dossier on the matter "pending legal action". But BT has denied that it is responsible for the problems experienced by FreeDial customers. A spokesman for the telco told The Register: "BT has no involvement in any cancellation arrangements between FreeDial.biz and its customers. "In view of FreeDial.biz's indication that it intends legal action we cannot comment further. Any legal action will be vigorously defended by BT," he said. FreeDial caused a flurry of attention last September when it announced that it was offering a 512k ADSL service from just £12.99 a month. At the time the ISP said it could offer the service at well below cost because its strings-attached "budget" service would be subsidised by its "premium" services. In a statement on its site it denied that it was a "fly-by-night" operation, insisting that it was in it "for the long haul" with the "finances, resources and background to be able to carry out our commitments to the full". ® Related Story ADSL for £12.99 - How do they do that?
NTL has described as "speculation" rumours that it is to increase the price of its 128k service from May 1. Postings on its nthellworld Web site point to a £3 a month increase - from £14.99 to £17.99 - from the beginning of May. There are even suggestions that prices for its digital TV packages could also rise. A spokeswoman for the cableco said she had "no information on a price rise" and that there was currently "nothing to announce". She also said that "all companies constantly look at their products and prices". However, there could be something in the rumours, since NTL has introduced price rises at this time of year in the past. NTL has around 500,000 broadband customers. ®
Internet registry RIPE (Réseaux IP Européens) yesterday reported its services were back to normal, after it became the victim of a serious DDoS attack at the end of last month. All but a tenth of traffic sent to RIPE failed to reach the registry during the two-and-a-half-hour attack on February 27. The distributed ICMP (Internet Control Message Protocol) echo attack left RIPE's DNS, Whois and FTP services unavailable between 14:00 and 16:30 GMT; RIPE's Web site was also affected. All these services are now back to normal. In a statement, RIPE's Network Coordination Centre (NCC) explains that "the attack caused various congestion related problems for the RIPE NCC's network to the extent that our BGP [Border Gateway Protocol, an important routing protocol] sessions were affected, and non-ICMP traffic was being randomly dropped." "The attack was successfully mitigated with cooperation of our peer networks at AMS-IX," it adds. RIPE reports that packet loss during the peak of the attack was 90 per cent or more. The motive and perpetrator(s) of the attack are currently unknown. The RIPE NCC is one of four regional Internet Registries in the world, providing allocation and registration services that support the operation of the Internet globally. ® External Links Effects of the DDoS Attack on the RIPE NCC (statement) Description of ICMP Echo (AKA Smurf) attacks
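A distributed ICMP echo flood of the kind RIPE describes is crude but easy to spot in traffic statistics: echo packets suddenly dwarf everything else on the wire. A back-of-envelope sketch in Python of the 90 per cent figure RIPE quotes - the protocol labels, sample mix and threshold here are our illustration, not the NCC's actual monitoring setup:

```python
from collections import Counter

def icmp_echo_share(packets) -> float:
    """Fraction of observed packets that are ICMP echo traffic.

    `packets` is any iterable of protocol labels, e.g. as produced
    by a capture tool's per-packet classification.
    """
    counts = Counter(packets)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return counts["icmp-echo"] / total  # Counter returns 0 for missing keys

# Illustrative mix roughly matching RIPE's reported peak: 90% echo traffic
sample = ["icmp-echo"] * 90 + ["tcp"] * 6 + ["udp"] * 4
share = icmp_echo_share(sample)
print(f"{share:.0%} ICMP echo")        # 90% ICMP echo
print("possible flood:", share > 0.5)  # possible flood: True
```

With the link that saturated, legitimate non-ICMP packets get squeezed out at random, which is exactly the BGP-session and DNS/Whois/FTP breakage the NCC describes.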
Silicon Valley's economy may be in the tank, so kudos to the Oracle Corporation for doing what it can. An opening advertised on Craigslist calls for a "very special, highly professional manicurist" to work on site at the Redwood Shores complex. The candidate "will also prescribe appropriate nail products and services, educate clients in home care nail maintenance, and must be familiar with common nail disorders". Candidates must have "excellent sanitation habits, the ability to produce a precise, lasting polish application, and the ability to communicate effectively". You must also have "proficient skills in hand and foot treatments including: spa manicures & pedicures, sports manicures & pedicures, and nail enhancements including acrylics, gel, fiberglass and silk". An Oracle spokesperson confirmed that the job would be in the company's Health Club. One requirement in particular intrigued us. The manicurist must be prepared to learn "techniques specific to Oracle employees". What does that mean? "I couldn't comment on that," the spokesperson told us, declining our invitation to elaborate on what these Oracle-specific techniques may be. The mind boggles. ®