Web's random numbers are too weak, researchers warn

The data scrambling systems used by millions of web servers could be much weaker than they ought to be, say researchers.
A study found shortcomings in the generation of the random numbers used to scramble or encrypt data.
The hard-to-guess numbers are vital to many security measures that prevent data theft.
But the sources of data that some computers call on to generate these numbers often run dry.
This, they warned, could mean random numbers are more susceptible to well-known attacks that leave personal data vulnerable.
"This seemed like just an interesting problem when we got started but as we went on it got scary," said security analyst Bruce Potter who, along with researcher Sasha Moore, carried out the study that was presented at the Black Hat security event in Las Vegas.
It looked at the ways that widely used Linux-based web server software generated strings of data that were used as a "seed" for random numbers.
Large, hard-to-guess numbers are vital for encrypting data. They are also used by servers in more mundane security tasks such as randomising where data is stored in memory to thwart attempts by hackers to predict what a machine is doing.
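For anyone curious what "running dry" looks like in practice, here is a minimal, illustrative sketch (not from the article or the researchers' study) that reads the Linux kernel's entropy estimate and draws a seed from os.urandom. The /proc path is Linux-specific and the 32-byte seed size is an arbitrary example.

# Illustrative sketch only (not from the article): inspect the Linux kernel's
# entropy estimate and draw random bytes via os.urandom. The /proc path is
# Linux-specific; the 32-byte seed size is an arbitrary example.
import os

ENTROPY_AVAIL = "/proc/sys/kernel/random/entropy_avail"

def kernel_entropy_estimate():
    """Return the kernel's current entropy estimate in bits (Linux only)."""
    with open(ENTROPY_AVAIL) as f:
        return int(f.read().strip())

if __name__ == "__main__":
    print("Kernel entropy estimate (bits):", kernel_entropy_estimate())
    # os.urandom does not block once the pool is initialised, which is why it
    # is the usual recommendation for seeding application-level randomness.
    seed = os.urandom(32)
    print("32-byte seed:", seed.hex())

A persistently low figure from entropy_avail is roughly what the researchers mean by the pool of random data "running dry".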

http://www.bbc.com/news/technology-33839925
