Microsoft accidentally offers an early view of its September security patches
Sep. 12, 2011
Microsoft said earlier this morning that it accidentally published some details of the security patches it plans to release tomorrow, following an error by its security team late last week.
Patch Tuesday pre-alerts normally reveal little more than the products Microsoft intends to patch and the overall severity of the security vulnerabilities to be addressed.
But this month, Microsoft leaked some details of the security flaws it plans to fix. Five of them are fairly routine security updates that affect Office and the Windows operating system itself and carry a maximum severity rating of important.
Security vulnerability management experts and Microsoft are downplaying the significance of the leak, however. That is understandable, since drawing attention to the details could hand potential attackers inside information on the flaws and what they could do to the average computer.
Wolfgang Kandek, CTO of security outfit Qualys, says: "While the information is interesting and certainly helpful for us, I still don't believe there is any heightened security risk with the early exposure."
"If the security patches (the binaries) themselves had been revealed then indeed it would give attackers a 4-day head start," he added.
Microsoft's Security Response Team acknowledged the slip on its Twitter feed on Saturday, adding that it had since removed the text.
"Some of you may have seen an early peek at Tuesday’s draft bulletin text. We’ve since removed the content," it said. "Stay tuned for Patch Tuesday on Sep. 13."
In other internet security news
Just two weeks after the kernel.org Linux archive site suffered a critical hacker attack, the Linux Foundation has gone public about an attack of its own and has pulled its websites down to clean up the security breach.
A notice posted on the Linux Foundation site said its entire infrastructure, including LinuxFoundation.org, Linux.com, and their many subdomains, is down for security work due to a breach discovered on September 8.
“The Linux Foundation made this important decision in the interest of extreme caution and security best practices. We believe this security breach was connected to the intrusion on kernel.org,” the group said.
"We are currently in the process of restoring services in a secure manner as quickly as possible. As with any security intrusion, and as a matter of caution, you should consider the passwords and SSH keys that you have used on these sites compromised. If you have reused these passwords on other sites, please change them immediately. We are currently auditing all our systems and will update this statement when we have more information," a posting on its homepage said.
"We greatly apologize for this inconvenience. We are taking this matter very seriously and appreciate your patience," said the Foundation.
The Linux Foundation infrastructure houses a variety of files, programs, scripts and services including Linux.com, Open Printing, Linux Mark, Linux Foundation events and others, but does not include the Linux kernel or its code repositories. Those are hosted on numerous mirror sites across the globe.
The kernel.org website is still offline as of today after a security compromise was discovered on August 28th.
In other Linux news
2011 started with some radical and even controversial changes to prepare Ubuntu for touch-based consumer computing, and now Canonical is getting ready for a transition to the cloud on servers. The change should be a smooth one, as it has been for most companies involved in the cloud segment of the IT industry.
The Linux community has released the first beta of Ubuntu 11.10, codenamed Oneiric Ocelot and expected as finished code in October.
The deployment and management of clouds and cloud-based workloads running Ubuntu Linux and CentOS Linux on server hardware isn't exactly new. Sun Hosting fully deployed its Cloud-Based Hosting Solutions in March 2010, and now the company is busy deploying version 2.0 of its Cloud Enterprise Solutions that will be ready in November.
The beta of Ubuntu Server includes Orchestra, which allows you to provision, deploy, host, manage and orchestrate enterprise data center infrastructure services.
According to one Linux application developer, OpenStack is Orchestra's foremost workload. Orchestra features separate servers for provisioning, management, monitoring and logging of applications, servers and workloads.
Orchestra, meanwhile, is tightly integrated with the Ocelot beta's other big push towards clustered servers running Ubuntu Linux as a cloud platform-- Ubuntu Ensemble, which is designed to handle service deployment and orchestration in the cloud and on bare metal.
Orchestra is billed as something that will bring 'DevOps' to clouds and data centers running Ubuntu. DevOps is the voguish term for bridging the gap between the development and operation of applications, a discipline once called application lifecycle management by marketing people.
Ensemble provides a set of best practices and formulas to help ensure that Linux apps running on a server will operate in the same way once in the cloud. In that case, the cloud in question would be an Amazon or Sun Hosting-compatible service.
Ubuntu 11.10 is due to hit a second beta on September 22nd and its final release on October 13, 2011.
But whatever happens in this edition, it's a relative sideshow compared to April's interface overhaul in Ubuntu 11.04, which demoted Gnome, and to next year's main event: Ubuntu 12.04, due on April 26, 2012.
Ubuntu 12.04 will be a Long-Term-Support (LTS) edition, meaning it sets the look, feel and technical direction of successive versions for the next two-year period. The last LTS was 10.04 in April 2010, which saw Canonical chief Mark Shuttleworth's Apple affinities manifest themselves in Ubuntu's current OS-X-like interface and in the integration of the PC distro with online music and backup services provided for Ubuntu users by Canonical.
Another company that's been very busy in the Cloud segment over the past year is Avantex. The company last month announced the launch of its Enterprise Cloud Solutions. And the company will soon make another announcement on the progress of two more Cloud Solutions.
As always, Linux News Today will keep you posted on these and other developments in the Linux community.
In other technology news
ISPs and various networking companies have been working hard lately to speed up the movement of data across the Internet.
Three such companies are Google, VeriSign and OpenDNS, a provider of outsourced DNS services for businesses and public organizations.
The technology helps speed up the DNS (Domain Name System), which provides the numeric Internet Protocol (IP) address needed to get data to an Internet domain such as hightechnewstoday.com.
Called edns-client-subnet in technical circles, or more ambitiously the "Global Internet Speedup," it uses geographic information associated with IP addresses to help direct clients fetching information to the closest, and therefore fastest, server.
"Anybody using OpenDNS or Google Public DNS will immediately get the benefits of this technology," said OpenDNS CEO David Ulevitch in an interview. Using it, "the worst-case scenario is that things remain they way they are today," and the best-case scenario is that network delays are as low as they can be, he said.
Google proposed the technology in 2010, though Ulevitch said it's been under discussion for longer than that.
And make no mistake-- the search giant has a strong, vested interest in making the Internet faster, not least through its own Google Public DNS service, and its Internet operations are big enough that it can use the technology both when requesting data from other servers and when others request data from its own servers.
Google has publicly endorsed the work as well. "Google is committed to making the Internet quicker, not just for our users, but for everyone else," said Google engineer Dave Presotto. "We will do that any way we can, by improving protocols, browsers, client software, and networks."
The tried and true analogy for DNS is that it acts like an old fashioned phone book-- you look up a person's name and the book provides the phone number.
In the case of DNS, when an Internet user types in a web address, DNS servers such as Google's provide the browser with the matching IP address.
Ulevitch likens the new technique to a phone book that gives a bit more information based on part of your own phone number.
Specifically, it uses roughly the first three quarters of an IP address-- the first three octets of an IPv4 address. That's enough to narrow down your location in general terms but not pinpoint it.
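The truncation described above can be sketched in a few lines of Python. This is a minimal illustration only; the function name and the /24 prefix length are assumptions drawn from the description here, not part of any official API:

```python
import ipaddress

def truncate_to_subnet(ip: str, prefix_len: int = 24) -> str:
    """Zero out the host portion of an IPv4 address, keeping only the
    first three octets (a /24 network prefix)-- roughly the granularity
    edns-client-subnet passes along: enough to place a client
    geographically, not enough to identify a specific machine."""
    network = ipaddress.ip_network(f"{ip}/{prefix_len}", strict=False)
    return str(network.network_address)

# Only the first three octets survive; the last is zeroed out.
print(truncate_to_subnet("203.0.113.57"))  # 203.0.113.0
```

A resolver supporting the extension would forward this truncated prefix, rather than the full client address, to the authoritative DNS server.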
A server called a DNS resolver, typically operated by an Internet service provider, has the job of finding the IP address of the server you're trying to reach, then providing your computer with the answer.
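As a rough illustration of that lookup step, Python's standard library can ask the system's configured resolver (your ISP's, or a public service such as OpenDNS or Google Public DNS) for a hostname's addresses. A sketch only-- `resolve` is a hypothetical helper, and the addresses returned depend on whichever resolver your machine is configured to use:

```python
import socket

def resolve(hostname: str) -> list[str]:
    """Query the system's configured DNS resolver for the IP
    addresses behind a hostname."""
    infos = socket.getaddrinfo(hostname, None, proto=socket.IPPROTO_TCP)
    # Each entry is (family, type, proto, canonname, sockaddr);
    # the IP address is the first element of sockaddr.
    return sorted({info[4][0] for info in infos})

# "localhost" resolves locally, with no network round-trip needed.
print(resolve("localhost"))
```

With edns-client-subnet, the answer that resolver hands back can vary with the client's rough location, steering each user toward a nearby copy of the content.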