FTP celebrates ruby anniversary

Many happy RETRs

File Transfer Protocol (FTP) marks its 40th anniversary on Saturday (April 16).

The venerable network protocol was first proposed by Abhay Bhushan of MIT in April 1971 as a means to transfer large files between disparate systems that made up ARPANet, the celebrated forerunner to the modern interweb.

The protocol required a minimum of handshaking and, even more crucially, was tolerant of temporary interruptions during long file transfer sessions, making it far better suited to the job than anything else available at the time, and indeed than HTTP, which came years later.
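Part of what kept that handshaking minimal is FTP's three-digit reply codes, where the first digit alone tells a client whether a command succeeded, needs more input, or failed transiently or permanently. A quick illustrative sketch in Python (the helper below is hypothetical, not part of any FTP library):

```python
def classify_reply(line: str) -> str:
    """Map an FTP control-channel reply line to its broad outcome,
    using only the first digit of the three-digit code."""
    return {
        "1": "preliminary",        # action begun, another reply will follow
        "2": "success",            # command completed
        "3": "intermediate",       # more input needed (e.g. PASS after USER)
        "4": "transient failure",  # safe to retry later
        "5": "permanent failure",  # do not retry the same command
    }.get(line[:1], "invalid")

print(classify_reply("230 User logged in, proceed."))              # success
print(classify_reply("426 Connection closed; transfer aborted."))  # transient failure
```

That "transient failure" class is the hook for the interruption tolerance mentioned above: a client seeing a 4xx reply knows it can simply try again.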

No traffic encryption - not even for user names and passwords - was included at that point, so anyone on a network could capture and read the traffic. Once the internet moved out of academia, more security was needed.

Initially, random port numbers were assigned for every FTP session, on the assumption that eavesdroppers would be highly unlikely to be listening on the selected port at the time of a data exchange. It was soon realised that this was not enough, and industry-standard encryption mechanisms such as SSL, and later TLS, were bundled with the protocol to wrap it in a secure overcoat from about 1996 onwards.
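Python's standard library exposes that wrapped variant of the protocol as ftplib.FTP_TLS (explicit FTPS). A minimal sketch of such a session, with placeholder host and credentials, not a working server:

```python
from ftplib import FTP_TLS

def secure_listing(host: str, user: str, password: str) -> list[str]:
    """List a directory over FTP wrapped in TLS: the control channel is
    upgraded via AUTH TLS before credentials are sent, then the data
    channel is encrypted too."""
    ftps = FTP_TLS(host)
    ftps.login(user, password)  # FTP_TLS negotiates AUTH TLS before USER/PASS
    ftps.prot_p()               # switch the data channel to TLS as well
    names = ftps.nlst()         # directory listing over the secured channel
    ftps.quit()
    return names

# Usage (requires a reachable FTPS server):
# secure_listing("ftp.example.com", "user", "password")
```

Note that prot_p() is a separate step: securing the login does not by itself secure the file data, which travels on its own connection.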

Frank Kenney, a former Gartner analyst covering managed file transfer who is now VP of global strategy at network security tools firm Ipswitch, said the lack of built-in security was more of an issue for FTP in the US than in Europe in the 1990s. That's because European firms at the time used ISDN, which had security of its own built in. It was only when everyone moved to high-speed lines that both sides were in the same boat.

Security problems were addressed by the use of either SSL or SSH (Secure Shell), but that still left the problems of scalability and manageability. These, along with guaranteed delivery, were addressed by the introduction of managed file transfer, supported by FTP server packages from the likes of Ipswitch.

Early versions of FTP were command-line based, which restricted mainstream use of the technology until the advent of the first browsers around 1992. Prior to that, users fiddled around with Gopher clients and needed to know IP addresses and directory information in order to initiate file transfers.
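Those hand-typed sessions - open a connection, log in, issue RETR - map almost directly onto today's scripting APIs. A minimal sketch using Python's standard-library ftplib, with placeholder host and path:

```python
from ftplib import FTP

def fetch_file(host: str, remote_path: str, local_path: str) -> None:
    """Download one file over anonymous FTP, mirroring the OPEN, USER
    and RETR commands early command-line clients required by hand."""
    ftp = FTP(host)     # connect to the control port (21)
    ftp.login()         # anonymous login by default
    with open(local_path, "wb") as f:
        ftp.retrbinary(f"RETR {remote_path}", f.write)  # RETR does the transfer
    ftp.quit()

# Usage (requires a reachable anonymous FTP server):
# fetch_file("ftp.example.com", "/pub/README", "README")
```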

Forty years on, and the protocol's next big step will be the delivery of managed file transfer via the cloud.

"The fundamentals of FTP are not going to change," Kenney said. "It will be enhanced using third party tools but there will be no re-write."

These days talk of file transfers, legal or otherwise, often leads on to a discussion about torrents. Kenney is confident FTP will not be rendered obsolete by them. "Even movie pirates use FTP when they want to transfer files between systems," he said. ®
