Google to double encryption key lengths for SSL certs by year's end

2048-bit keys will be the norm

Google is about to begin the first upgrade to its SSL certificate system in recent memory, moving to 2048-bit encryption keys by the end of 2013. The first tranche of changes is planned for August 1.

The new requirements are laid out in a blog post and a FAQ on the topic. The upgrade, based on guidelines from the National Institute of Standards and Technology (NIST), will also see Google's root certificate, which signs all of its SSL certificates, upgraded from a 1024-bit key.
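For admins wondering what the new minimum looks like in practice, a 2048-bit RSA key can be generated and inspected with the standard openssl command-line tool (assuming OpenSSL is installed; the filename here is just an example):

```shell
# Generate a 2048-bit RSA private key, the size Google is moving to
openssl genrsa -out example-key.pem 2048

# Print the key's details; the first line reports the modulus length
openssl rsa -in example-key.pem -noout -text | head -n 1
```

The first line of the second command's output states the key size, confirming the modulus is 2048 bits rather than the older 1024-bit default.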

"There aren't immediate concerns about these certificates being cracked," a Google spokesman told El Reg, "but updating them now provides much better defense against any future risks."

The upgrade is required because NIST thinks it's technically possible that 1024-bit keys could be broken pretty soon. The first reported factorization of a 768-bit RSA modulus came in December 2009, when an international team of computer scientists and cryptographers spent two and a half years dedicated to the task.

"A 1024-bit RSA modulus is still about one thousand times harder to factor than a 768-bit one," the researchers reported. "If we are optimistic, it may be possible to factor a 1024-bit RSA modulus within the next decade.

"We can confidently say that if we restrict ourselves to an open community academic effort such as ours and unless something dramatic happens in factoring, we will not be able to factor a 1024-bit RSA modulus within the next five years. After that, all bets are off."

NIST estimates it would take six or seven years for any attempt to have a realistic chance of success at breaking 1024-bit keys, based on the speed of processor development and improvements in factoring computation.
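The researchers' thousand-fold figure can be sanity-checked against the standard heuristic running time of the general number field sieve, the algorithm used in the 2009 factorization. A minimal back-of-the-envelope sketch in Python (the L-notation cost formula and the 2^bits approximation of an n-bit modulus are textbook material; this is an illustration, not the researchers' own calculation):

```python
import math

def gnfs_work(bits):
    """Heuristic general number field sieve cost for a `bits`-bit modulus:
    L[1/3, (64/9)^(1/3)] = exp(c * (ln n)^(1/3) * (ln ln n)^(2/3))."""
    ln_n = bits * math.log(2)        # natural log of a modulus near 2^bits
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# Estimated work ratio: 1024-bit key vs the 768-bit modulus cracked in 2009
print(gnfs_work(1024) / gnfs_work(768))    # on the order of a thousand

# ...and 2048-bit vs 1024-bit: roughly a billion times harder again
print(gnfs_work(2048) / gnfs_work(1024))
```

The first ratio lands close to the "about one thousand times harder" the researchers reported, and the second shows why 2048-bit keys should stay out of reach for many years even as factoring effort scales up.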

That said, it's still an estimate, and NIST had wanted to get the changeover done faster, with 2010 picked as the original transition date. But because the 1024-bit standard was so ubiquitous, the schedule was pushed back until the end of this year.

It's the first time anyone can remember the SSL encryption keys getting changed at Google, and it's a measure of the power and sophistication of computer processors that the update is needed. Barring some breakthrough in quantum computing or coding practice, it should be some years before another upgrade is required. ®
