
Monday, May 12, 2008

New Intel Processor Fights Rootkits, Virtualization Threats

But experts say new features still aren't true anti-rootkit technologies

By Kelly Jackson Higgins
Senior Editor, Dark Reading

Intel today rolled out a new desktop processor for business machines with hardware-based security features that it says can help prevent stealth malware attacks and better secure virtual machines.

The new vPro 2007 Platform, code-named Weybridge by Intel, also includes an upgraded feature that tracks and logs network traffic for malicious patterns, as well as support for 802.1x and Cisco NAC platforms. Because network security credentials are stored in hardware, endpoints can still be managed even when the operating system is down. The new vPro platform also comes with new built-in management and energy-efficiency features.

Mike Ferrin-Jones, Intel's director of digital office platform marketing, says attackers increasingly are writing stealthier malware that evades detection by software-based tools, and some variants even disable those tools: "That gives them free rein over the system." That concern has held some enterprises back from adopting virtualization technology, he says.

Intel's new processor -- via its so-called Trusted Execution Technology (TXT) and Intel Virtualization Technology for Directed I/O features -- can better protect virtualized software from these kinds of attacks by detecting any changes to the virtual machine monitor; restricting memory access by unauthorized software or hardware; and protecting virtual machines from memory-snooping software, according to the company.

Stealth malware expert Joanna Rutkowska, founder of Invisible Things Lab, says Intel's TXT and Virtualization Technology for Directed I/O features sound like a step in the right direction for protecting against stealth malware attacks, as are AMD's SKINIT and External Access Protection features, which were released last year.

"I don't believe we can address some problems like kernel rootkits and especially virtualization-based rootkits, without help from the hardware vendors," she says.

Based on what she could surmise from the press materials provided to her, Rutkowska says, Intel's Virtualization Technology for Directed I/O appears "to let you create more secure hypervisors and deploy secure microkernel-based OSes."

Still, these technologies aren't true anti-rootkit technologies, she says. "They are, rather, technologies that [for example] would allow [you] to build better OSes, not prone that much to rootkit infections as the OSes we have today [are]."

The key, Rutkowska says, is for OS and software vendors to use Intel's new hardware-based security, as well as AMD's in its new Barcelona processors. "It's all in the hands of software and OS vendors now," she says. "If they don't redesign their products to make use of those new technologies in a proper way, those new technologies will be pretty useless."

Intel's Ferrin-Jones says the hardware-based security in the new platform, based on Intel's Core 2 Duo processor and Q35 Express chipset, helps where software-based security cannot. "Most security applications run inside the OS," he says. "For the systems to be protected and secured, those apps have to be up and running, as does the OS." Features such as remote-wakeup capabilities aren't secure or available if the OS goes down.

Meanwhile, major computer makers and resellers are now selling desktops with the new vPro processor, according to Intel, including Dell, HP, and Lenovo, and the company says 350 organizations have already deployed it.

Intel is also currently working with virtual machine monitor and security software vendors to enable their products to work with the new platform, Ferrin-Jones says.

New technologies help autistic children communicate

New technologies are helping autistic children communicate like never before.

At the Pacific Autism Center for Education in Santa Clara, California, each morning begins with a PowerPoint presentation, launching a day filled with technology.

Two out of every three students at the center are non-verbal, but thanks to a voice output device, 12-year-old Alex is able to get the snacks he craves.

Malique also uses the device to "talk" for him.

"The largest benefit is the ability to give them a voice, a voice that offers a breadth of options, and the third benefit is the social interactions that come from having the ability to speak," explained the center's Kurt Ohlfs.

Technology also makes communicating less cumbersome.

Imagine trying to carry around a book with pictures of everything you wanted to convey in a day.

Now the students have all that information at their fingertips.

Twenty-one-year-old Daniel uses a more advanced, handheld device that offers him a menu with hundreds of icon options.

He selects the ones he wants and the computer talks in sentences, conveying his thoughts.

"It's amazing when we've given some of this technology to our students and it's opened up that door, and now the students are surprisingly prolific when it comes to expressing their thoughts," said Ohlfs.

Tuesday, April 8, 2008

Fujitsu Develops Technology Enabling Real-time Multiple-Point Temperature Measurement

Kawasaki, Japan, Apr 4, 2008 - (JCN Newswire) - Fujitsu Laboratories Ltd. announced today the development of a new technology, based on optical fibers, that enables accurate and real-time temperature distribution measurement in large datacenters which have multiple heat sources.

Through a single optical fiber, this technology makes it possible to simultaneously measure the temperature of over 10,000 areas in a facility, thereby providing visibility into the temperature distribution of large data centers. Combining this technology with an air conditioning control system will enable fine-tuned air conditioning, allowing for more energy-efficient large-scale data centers.

With mounting concern over global warming resulting from greenhouse-gas emissions, there is growing awareness of the importance of reducing the energy consumed by air conditioning, one of the drivers of rising energy consumption in recent years. In the IT industry, with increasingly powerful servers, larger system installations, and expanded operational times, the waste heat generated by data centers has been on a continuous upward slope.

Thus, in order to lower both energy costs and CO2 emissions, there is a need for more efficient air-conditioning systems in large data centers. As part of Fujitsu's Green Policy Innovation[1] initiatives, which seek to help customers reduce their environmental footprint, the group is researching ways to help lower the energy demands of IT systems.

In the past, air-conditioning systems would monitor temperatures at a few specific points and cool the building to a uniform temperature, which could lead to temporary cold spots. Solving this problem requires not only designing more energy-efficient servers and other heat sources, but also fine-grained temperature control based on multiple-point temperature readings across a large facility.

Technological Challenges

Conventional building temperature measurement uses semiconductor-based sensors and thermocouple temperature sensors[2], which can only measure the temperature at specific points. Because all of these sensors require power sources and signal lines, multiple-point temperature measurement would entail a proliferation of cables that would be difficult to manage and expensive.

Particularly for mid-sized or large data centers, which have numerous server racks, collecting data from multiple points in real time, and flexibly adjusting to equipment changes or the addition or removal of server racks as operational requirements change, has been extremely difficult.

One currently available method for multiple-point temperature measurement uses optical fibers as sensors to measure the strength of Raman scattering light[3].

However, because this method lacks "position resolution"[2], it is not suited to measuring the temperature of multiple heat sources, and it is thus difficult to accurately measure temperature distribution in large data centers.

Newly Developed Technology

Because the intensity of the faint Raman scattering produced by an infrared laser shining through an optical fiber changes with temperature, the fiber itself can be used as a temperature sensor.
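The physics behind this kind of fiber-based distributed temperature sensing can be sketched numerically: the ratio of anti-Stokes to Stokes Raman backscatter depends only on temperature, so inverting that ratio recovers the temperature at each point along the fiber. The sketch below is illustrative, not Fujitsu's implementation; the ~440 cm⁻¹ Raman shift and the Stokes/anti-Stokes wavelengths are assumed typical values for silica fiber probed near 1550 nm.

```python
import math

# Physical constants
H = 6.62607e-34   # Planck constant, J*s
C = 2.99792e10    # speed of light, cm/s (so the Raman shift can stay in cm^-1)
K = 1.38065e-23   # Boltzmann constant, J/K

DELTA_NU = 440.0  # Raman shift of silica fiber, cm^-1 (assumed typical value)

def temperature_from_raman_ratio(ratio_as_over_s, lambda_s_nm, lambda_as_nm):
    """Invert the anti-Stokes/Stokes intensity ratio to a temperature in kelvin.

    Uses the standard relation
        R(T) = (lambda_s / lambda_as)**4 * exp(-h*c*dnu / (k*T)),
    solved for T.
    """
    prefactor = (lambda_s_nm / lambda_as_nm) ** 4
    return H * C * DELTA_NU / (K * math.log(prefactor / ratio_as_over_s))
```

In a real system this inversion would be applied to the backscatter measured at each time slot (i.e., each position along the fiber), producing one temperature reading per resolvable segment.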

By improving the position resolution, Fujitsu Laboratories developed a practical method for measuring, with high accuracy, the temperature distribution in large data centers. Key features of this technology are as follows:

1. By combining the following two technologies, the new method enables accurate temperature measurement (position resolution of less than 1 meter) in large data centers with multiple heat sources, such as servers, within the facility:

- A technology that corrects for measurement variations that occur depending on the width of the pulse inserted into the optical fiber, or when the pulse widens during transmission
- A technology, based on thermo-fluid simulation, that optimizes how the optical fiber is laid

2. Because the new technology makes it easy to determine positional information about the point being monitored, it can accommodate changes and updates to server racks.

Results

Because a single strand of optical fiber can measure up to 10,000 locations in real time (assuming an optical fiber length of 10 km), large data centers can measure their temperature distribution precisely and inexpensively.
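The link between laser pulse width, position resolution, and the number of measurable points follows from the round-trip timing used in optical time-domain reflectometry: a pulse of width W resolves positions to roughly (group velocity × W) / 2. The figures below (group index of silica, 10 ns pulse) are assumed typical values, not numbers from Fujitsu's announcement.

```python
# Sketch of the pulse-width / resolution tradeoff in fiber sensing.
# Round-trip OTDR relation: delta_z = (v_group * pulse_width) / 2

C_VACUUM = 2.998e8   # speed of light in vacuum, m/s
GROUP_INDEX = 1.47   # assumed typical group index of silica fiber

def spatial_resolution_m(pulse_width_s):
    """Smallest resolvable segment length for a given laser pulse width."""
    v_group = C_VACUUM / GROUP_INDEX
    return v_group * pulse_width_s / 2.0

def measurable_points(fiber_length_m, pulse_width_s):
    """Number of independent temperature readings along the fiber."""
    return int(fiber_length_m / spatial_resolution_m(pulse_width_s))
```

With a ~10 ns pulse the resolution works out to about 1 m, which is consistent with the article's figure of roughly 10,000 measurement points over a 10 km fiber.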

As this system uses only light to take measurements, it poses no risk of disrupting electrical signals in nearby servers and networking equipment. Because the system can monitor temperature irregularities in servers and air-conditioning systems in real time, it can also help reduce the risk of fires caused by high temperatures.

Future Developments

Fujitsu plans to develop this technology for use in air-conditioning control systems, as well as complementary technologies to make effective use of waste heat, thereby enabling large data centers to consume less energy.

Monday, March 17, 2008

Automakers look for new technology to reinvent the car

Fiona Anderson, Vancouver Sun
Published: Thursday, March 13, 2008

Representatives from the big automakers may disagree about which new type of car is best, but they do agree that advances in technology are needed before consumers can afford to switch from petroleum-fuelled transportation.

While hybrids can improve fuel efficiency in city driving by as much as 89 per cent, the cars still rely on petroleum, and companies are looking for alternatives.

Next in line may be plug-in hybrids which, as the name implies, plug into regular power sources to recharge. The problem is developing an affordable battery that can be continually recharged.

"The key challenge we yet have to overcome is still technology," said Nancy Gioia, the director of sustainable mobility technologies and hybrid vehicle programs at Ford Motor Co., during a panel Wednesday at Globe 2008, a business and environment conference taking place in Vancouver.

Andrew Frank, a professor at the University of California, Davis, said a plug-in hybrid should run 90 per cent of the time on electricity and 10 per cent on gas.

"We have plugs in the wall and we have gas stations," he said. "And we have nothing else. So the plug-in hybrid uses the two infrastructures that we have."
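Frank's 90/10 split can be illustrated with a back-of-the-envelope calculation; the baseline consumption figure below is a made-up example, not a number from the article.

```python
def petroleum_use_l_per_100km(baseline_l_per_100km, electric_fraction):
    """Gasoline consumed by a plug-in hybrid that covers `electric_fraction`
    of its distance on grid electricity, assuming the remaining distance is
    driven at the baseline gasoline rate (a simplifying assumption)."""
    return baseline_l_per_100km * (1.0 - electric_fraction)

# Example: a car that would burn 8.0 L/100 km on gasoline alone, running
# 90% of its distance on electricity, burns only 0.8 L of gasoline per 100 km.
```

The sketch ignores the cost and emissions of generating the electricity, which is part of why panelists cautioned that no single technology is a silver bullet.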

But John German, manager of environmental and energy analyses at American Honda Motor Co., believes the increased cost of plug-in cars may not be worth the benefit. And before plug-in hybrids can be affordable, there has to be a breakthrough in battery technology, he said.

But German cautioned against looking for one solution. "There's no silver bullet," he said. "The problem is so immense we need everything and we need to avoid the trap of single solutions."

General Motors was showcasing its hydrogen fuel-cell car at the Globe 2008 trade show. GM's vice-president of environment, energy and safety policy, Beth Lowery, said the company is currently rolling out cars for customers to test drive. The cars aren't ready for market yet, though GM's target is to have a financially competitive model by 2010, Lowery said. But hydrogen fueling stations will have to become more widespread before the cars can become mainstream.