Fourth echelon

Web's random numbers are too weak, researchers warn


The data scrambling systems used by millions of web servers could be much weaker than they ought to be, say researchers.
A study found shortcomings in the generation of the random numbers used to scramble or encrypt data.
The hard-to-guess numbers are vital to many security measures that prevent data theft.
But the pools of unpredictable data that some computers draw on to generate these numbers often run dry.
This, the researchers warned, could make the resulting random numbers susceptible to well-known attacks that leave personal data vulnerable.
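On Linux, the kernel's estimate of how much entropy it has gathered can be inspected directly, which is a rough way to see the "running dry" problem the researchers describe. A minimal sketch (the `/proc` path is Linux-specific, and on recent kernels the value is pinned at 256 bits by design):

```python
# Read the Linux kernel's current entropy estimate.
# A persistently low value on older kernels suggests the pool
# that seeds /dev/random is short of unpredictable input.
with open("/proc/sys/kernel/random/entropy_avail") as f:
    entropy_bits = int(f.read().strip())

print(f"Kernel entropy estimate: {entropy_bits} bits")
```

Headless servers and freshly booted virtual machines are the classic cases where this estimate stays low, since they lack the keyboard, mouse and disk-timing noise a desktop machine collects.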
"This seemed like just an interesting problem when we got started but as we went on it got scary," said security analyst Bruce Potter who, along with researcher Sasha Moore, carried out the study that was presented at the Black Hat security event in Las Vegas.
It looked at the ways that widely used Linux-based web server software generated strings of data that were used as a "seed" for random numbers.
Large, hard-to-guess numbers are vital for encrypting data. They are also used by servers in more mundane security tasks such as randomising where data is stored in memory to thwart attempts by hackers to predict what a machine is doing.
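The difference between a guessable seed and a properly sourced one can be sketched briefly. This example is illustrative, not from the study: it contrasts seeding a generator with a predictable timestamp (which an attacker could enumerate) against using the operating system's cryptographic random source.

```python
import random
import secrets

# Weak: seeding with a coarse, guessable value (here, a plausible epoch
# second) means an attacker who can enumerate candidate seeds can
# reproduce every "random" output.
weak = random.Random(1439164800)
predictable_value = weak.randrange(2**32)

# Stronger: for security-sensitive values, draw directly from the OS
# CSPRNG via the secrets module rather than seeding a PRNG yourself.
token = secrets.token_hex(16)  # 16 random bytes as 32 hex characters

print(f"Reproducible-from-seed value: {predictable_value}")
print(f"OS-sourced token: {token}")
```

The first generator is fully determined by its seed, so any two parties using the same seed get identical output; that property is exactly what makes guessable seeds dangerous for encryption keys.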

http://www.bbc.com/news/technology-33839925
