
Friday, October 12, 2007

Technology Formerly Known as Bluetooth

You've got ZigBee and Z-Wave for sensor and control networks, Near Field Communications (NFC) for proximity payments, RFID for tracking assets, Ultra Wideband (UWB) for high-speed connections between devices. You might throw Wi-Fi into that mix for networking. And, oh, don't forget about Bluetooth.

Bluetooth has become fairly ubiquitous in today's society. You can't turn around without seeing someone sporting a Bluetooth earpiece and its little blue flashing lights. It's not just geeks and business users. I've seen soccer moms with them. And there are more and more stereo headsets using Bluetooth.

Headsets were the first place Bluetooth found a home, but you're seeing it used in a lot more devices and applications now. There are watches with Bluetooth, even car radios with Bluetooth. The new ways Bluetooth is being used were driven home to me last week when I attended Motorola's annual technology forum for analysts and the press.

Several short-range technologies were part of some demonstrations - NFC for paying with your phone at the check-out register, and ZigBee both to create an ad-hoc mesh network for firemen and to monitor equipment in a hospital. But Bluetooth seemed to be used in more demonstrations as a way to connect not only mobile phones but other equipment like TV monitors, computers and storage devices.

Analysts are forecasting a slowing in the spread of Bluetooth. In-Stat says the number of Bluetooth devices will increase 34% this year, a healthy number but still down substantially from recent years. I think that shows a maturation of the technology. You can't keep doubling every year.

I'm skeptical there will be much slowing in Bluetooth's growth, though. There are some new technologies joining Bluetooth that could push it into more devices and new areas. One is the integration of the Wibree low-power technology, which is under way now with the merger of the two groups. Another advance is Bluetooth 2.1+EDR (Enhanced Data Rate), which could be in devices by Christmas. Still another is the integration of the UWB broadband technology from the WiMedia Alliance. You haven't heard a lot about the latter, so I have to wonder if that integration is proving to be more difficult than originally thought.

My only concern about the future of Bluetooth is that the technology and the organizations backing it - specifically the Bluetooth Special Interest Group - are becoming unwieldy and cumbersome. Bluetooth is becoming an umbrella of technologies that don't necessarily work together.

A recent report from the analyst group IMS Research says all this expansion of Bluetooth technology may hurt it in certain areas, especially among car manufacturers. Bluetooth was in 4 million vehicles last year, IMS Research says, and it still expects it to grow in the auto segment more than 300% in the next five years. Most of the interest from car makers is for hands-free uses and for audio streaming, IMS says.

But, says the report, some in the automotive industry don't think they can keep up with all the new capabilities on the horizon for Bluetooth, since the design cycle for cars is 4 to 5 years. Incidentally, IMS Research is forecasting 800 million Bluetooth devices will be sold this year, up 40% from 2006.

Sunday, October 7, 2007

Detecting Liquid Explosives On A Plane

Since the plot to blow up trans-Atlantic airliners with liquid explosives was uncovered in London in August 2006, there has been pressure on the airline industry, and Homeland Security, to find new ways not only to detect liquids in baggage and on airline passengers, but also to figure out what they are. Now, the DHS Science & Technology Directorate (S&T) is teaming with scientists at the Los Alamos National Laboratory to find a possible solution.

"Having to place your consumable liquids through the baggy routine when going through airport security may one day be history," says S&T Program Manager on the project, Mr. Brian Tait, "and that's going to make a lot of people very happy. This is a new screening prototype that definitely shows promise."

In late June, the Los Alamos National Laboratory team successfully completed a proof of concept of an extremely sensitive future screening technology. The new technology scans the magnetic changes of individual materials at the molecular level and stores them in a database, which then allows many materials to be differentiated and identified, whether they are packaged together or separately, as they go through the screening process.

It is based on ultra-low-field magnetic resonance imaging (MRI), the same technology already used in the medical field for advanced brain imaging. The end goal is to eventually place it next to the current X-ray screener.

The SENSIT technology has already demonstrated the ability to differentiate more than four dozen materials considered "safe" to carry onto aircraft -- everyday personal items from toothpaste to mouthwash -- from those that are considered hazardous.

"With the MRI signal, we want to distinguish between harmful items, and many common carry-on liquid consumables," says Tait. "The goal is reliable detection of liquids, with high throughput, that is non-contact, non-invasive, requires no radiation, produces no residue and uses the existing airport security portal."

SENSIT is one of S&T's Homeland Innovation Prototypes (HIPS) projects -- high-impact innovative technologies that have shown great promise and are on their way to being transitioned to industry for manufacturing and distribution.

"We're working hard on getting the SENSIT technology to an airport near you very soon," says S&T's Innovation Director, Roger McGinnis.

Tuesday, September 25, 2007

IBM Launches Innovation Center to Fuel Entrepreneurship

IBM (NYSE: IBM) today announced it will open a new IBM Innovation Center to help start-up companies, software developers, and independent software vendors (ISVs) create new software and hardware applications and services.

Located in Dallas, Texas, and serving local companies, the new IBM Innovation Center will provide technical support and expertise to help businesses test, build and optimize applications based on open platforms. The new center is designed to meet the growing demand from Business Partners and venture funded organizations looking to harness IBM expertise around innovative technologies such as System z, Cell Broadband Engine and Service Oriented Architecture (SOA).

This effort is in response to the tremendous growth that the Dallas technology community is achieving. According to the Greater Dallas Chamber Index, Dallas Fort Worth commands the highest concentration of technology activity in Texas with one-third of all tech establishments and 40 percent of the state's technology jobs.

With the introduction of this new center, IBM will now host a network of more than 35 IBM Innovation Centers around the world. Since their inception over ten years ago, IBM Innovation Centers have become a key component of IBM's strategy to help companies build innovative solutions before they expand into new markets. In 2006 alone, more than 6,000 business partners leveraged IBM Innovation Centers worldwide to build and test new solutions.

Growing Network of Global IBM Innovation Centers

The new IBM Innovation Center in Dallas will offer R&D expertise to enable start-ups, software developers, and other small to mid-sized companies to create new software and hardware applications and services. These efforts will provide start-ups an entry-point to IBM's Research network, accelerating innovation by providing insights into emerging technologies.

To meet the demands from companies that are looking to stay up to speed on new technology, the center will also sponsor workshops and best-practices sessions to help local businesses take advantage of emerging technologies such as SOA and the Cell Broadband Engine architecture. IBM will also host connection events offering companies unique networking opportunities for business growth.

Building on Global Successes

In addition to local companies, businesses throughout the world will also be able to take advantage of the resources available at the new IBM Innovation Center in Dallas. For example, through IBM's Virtual Loaner Program, Business Partners around the globe gain access to System i, p and x machines, giving them the same benefits as a loaner machine at their own site, but with the added bonus of IBM managing and hosting the hardware.

IBM has also recently added a new component to the IBM Innovation Centers effort that provides well-connected global support for Business Partners, independent of their location. IBM is enhancing the global integration network that seamlessly connects its worldwide IBM Innovation Centers, giving all companies instant access to top researchers and technical experts, regardless of their proximity to a center.

For example, a Business Partner who requests an engagement with their local IBM Innovation Center in Dublin, Ireland may learn that the IBM Innovation Center in Dallas, Texas is better equipped to meet their technical needs. With access to an Internet connection and just a few easy clicks of the mouse, a company will be able to receive hands-on support from their local IBM team while leveraging the technical resources of an IBM Innovation Center halfway around the world.

"With more than 150 IBM Business Partners in the Dallas area, and that number growing each day, this new investment is being led by the demand from these partners for increased expertise in their local region," said Jim Corgel, general manager, ISV & Developer Relations. "As companies continue to look for ways to grow and expand into new markets, IBM is looking to our partners to set the agenda for ensuring that our resources are meeting their specific needs."

Building on Local Successes
Since 1974, ARX Technologies, based just outside of Dallas, Texas, has worked hand-in-hand with utilities and public sector organizations to create and develop an array of software solutions. An IBM business partner for over 30 years, ARX has learned to take full advantage of the many resources IBM offers to its partners. For example, ARX Technologies has worked extensively with the remote assistance provided by the global network of IBM Innovation Centers to test, validate and modernize its solutions.

"As a rapidly growing company, ARX Technologies is always looking for new ways to leverage the wide range of resources IBM offers to its partners, especially on a local level," said Charlotte Foulkes, CTO, ARX Technologies. "By launching a new IBM Innovation Center in Dallas, IBM is making it possible for us to not only use its facilities for application testing and validation, but having a physical location in the Dallas region also provides us with the opportunity for our staff to attend training sessions and maintain their skills and certifications. More importantly, it provides us with a local facility that we can bring our customers to for hands-on support from IBM."

Today's announcement further demonstrates IBM's commitment to the success of its Business Partners.

Tuesday, September 18, 2007

Asymtek introduces new technologies

Asymtek has introduced two coating/dispensing technologies for applying catalyst inks onto membranes in electrode assemblies of proton exchange and direct methanol fuel cells. Pulse spray coating technology provides quick large-area coverage, allowing for uniform film thicknesses. DispenseJet technology deposits the ink with a uniform thickness and density so lines and patterns are smooth, even, and the same consistency throughout. The consistent and precise application of the ink increases the porosity of the film so fuel and byproducts can pass through the membrane efficiently. Asymtek offers fuel cell manufacturers two types of application technologies for increased dispensing flexibility: spray coating and discrete dot dispensing.

Spray coating is used for large areas where controlled, even film thicknesses are needed. Discrete dot dispensing is used for specific patterns requiring high edge definition and exact film build-up of highly volatile inks. Asymtek offers a broad array of coating and dispensing technologies for catalyst inks, polymer electrolytes, gasketing, and sealing, depending on the application.

Catalyst inks are very expensive because of the platinum content. With the Asymtek jet, the ink is deposited precisely where it is programmed to be placed, so there is limited waste. The jet deposits discrete dots in whatever pattern is desired without masking. Asymtek's closed-loop system measures the exact amount of fluid that's deposited.

Process controls and Asymtek's Easy Coat(R) for Windows XP(R) (ECXP) software regulate the volume, size, shape, and viscosity of the dot. "The success of the catalyst ink depends 50% on its chemistry and 50% on its application," said Jim Klocke, Asymtek senior business development manager. "While most application methods use spray technology, the catalyst inks may vary in thickness and density. With Asymtek's jet dispensing, all those problems are eliminated and the ink is deposited precisely, with uniform thickness and density."

Germany: Siemens VDO announces new technologies at Frankfurt motor show

Siemens VDO is displaying a number of technologies at the IAA in Frankfurt. The PN 4000 and PN 6000 portable navigation devices come equipped with a DVB-T receiver as a standard feature: the units offer high-resolution map graphics displayed in 2D or ...

Sunday, September 16, 2007

CNSE and the birth of NanoEconomics

The history of mankind is one of innovation and change. However, change does not happen at a constant rate. Historically, each new technology introduced over time has penetrated established markets at a faster rate than its predecessor. The time that it has taken successive technologies to penetrate one quarter of the overall U.S. consumer market has declined from 55 years (for the automobile) to 7 years (for the Internet). For us, this means that today we can witness and study the birth, introduction and impact of many new technologies.

The purpose of new technologies is to solve problems. When we solve a problem using a new technology, it is either to improve our quality of life or to help us create more value in our everyday labor. Whatever the case, new technologies drive us to do things in a better manner and, consequently, make us in general more productive.

Productivity is the effectiveness of productive effort. With passing centuries our conception of productivity has changed dramatically. Moreover, with the advent and growth of the computer and information technology industries in the last 40 years we have witnessed a tremendous technological change. The convergence of computing, information and communications is transforming every aspect of society and creating new possibilities for improved productivity. The productivity in the semiconductor industry is and has been, for several decades, a driving force for the economic growth of the world. Productivity growth in the semiconductor industry has averaged 20 percent for the last 10 years, compared to less than 5 percent for all manufacturing industries.

Productivity in this industry is inevitably related to Moore's Law, meaning the regular increase in performance and data processing capabilities in computer chips. Moore's Law is the engine for continued growth, an engine not to be found in any other industry on the planet. Furthermore, this increase in productivity has occurred hand in hand with a decline in consumer prices. The price of a single transistor was $1.00 in 1968. Today you can buy millions of transistors for $1.00. What other item has declined in price like the transistor?
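
To put those two data points together, here is a back-of-envelope sketch in Python (assuming roughly $1.00 per transistor in 1968 and $1.00 per million transistors today; the figures are illustrative, not audited):

    # Back-of-envelope check of the transistor price decline cited above.
    # Assumption: ~$1.00 per transistor in 1968 and ~$1.00 per million
    # transistors in 2007 -- illustrative figures only.
    start_price = 1.0              # dollars per transistor, 1968
    end_price = 1.0 / 1_000_000    # dollars per transistor, 2007
    years = 2007 - 1968

    cagr = (end_price / start_price) ** (1 / years) - 1
    print(f"Implied average price change: {cagr:.1%} per year over {years} years")
    # Prints roughly -30% per year, i.e. the price per transistor has been
    # falling by nearly a third every year, on average, for four decades.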

Sunday, September 9, 2007

IBM Instantly Connects People to Free Technology Service

IBM (NYSE: IBM) today announced a free service that will allow users of any computer or mobile device such as the iPhone to gain real-time access to thousands of technical resources and emerging technologies with a simple click of a button.

This extension of syndication that IBM is making available brings the widest range of technical know-how to the fingertips of millions of Yahoo!, Google and NetVibes users and owners of mobile devices such as the iPhone.

Developer gizmos from IBM aim to reach a broader set of users and enable them to more easily interface with technology content, streamline their daily tasks, and keep up to date on new technologies, arming them with the right resources to build innovative applications. Now, with a click of a button, any computer or mobile device user can easily gain the power of syndication on a technology topic of their choice with no programming, training or technical expertise required.

The unique syndication tool gives users centralized and mobile access to the widest range of open source technologies across a variety of topics including Web 2.0 technologies, game development, Web innovation and more through a website called IBM developerWorks, the central hub for thousands of technical resources. Developer gizmos are widgets, gadgets, modules, or other technological items customized to display IBM technology content within a browser, operating system, user's desktop or syndication mashup page.

Developer gizmos are an extension of the services offered by Yahoo!, Google, and NetVibes, customizing the publicly available code or processes to allow users to export content from IBM through podcasts, forums, blogs, educational briefings and "how-to" tutorials, instantly, all at the push of a button, with no extra development necessary.

This move is another step by IBM to ensure that early technology adopters, developers and small to mid-sized businesses are able to capitalize on modern technologies such as Web 2.0, and integrate social networking applications into their businesses in a simple, cost-efficient manner. No other vendor is taking such a simplified and open approach to helping users leverage technology and resources to simplify their personal and business requirements.

This functionality further simplifies the user's experience and emphasizes IBM's commitment to giving the community the ability to decide what content is relevant to their needs. It also makes it easy for developers to obtain relevant content without having to conduct several searches. On average, developers spend 50 percent of their time searching for relevant content to support their efforts to build applications. This new form of syndication spares developers from having to spend hours on web searches to obtain the right content in real time.

A small business owner can also benefit from this by creating a customized gizmo on their MyYahoo! web page to receive free updates and information on Web 2.0 technologies.

For example, if a small video game company that specializes in Playstation 3 console development wants to increase revenue by making its games more advanced, the game developer could create developerWorks Gizmos to syndicate the latest technical information directly from the game development space on developerWorks right to his iGoogle, Netvibes or MyYahoo! personal page. Having realtime access to up-to-the-minute information on Cell broadband technology enables him to tap into the power of the Playstation 3, so he can build beautiful, more robust visual effects, not to mention smarter and tougher opponents into the game. This in turn gives the small company a competitive advantage in the marketplace, which can translate into higher profits.

"The IBM developerWorks site is a comprehensive resource that covers the technologies most important to developers today, such as the recent tutorial on Yahoo! Pipes," said Chad Dickerson, senior director, Yahoo! Developer Network. "The addition of the Add to MyYahoo! Button will make it even more convenient for developers and users to tap into the vast array of new technologies made available free of charge from IBM."

How it Works

These gizmos provide a simple interface that enables any user to select content from the IBM developerWorks website, the central hub for thousands of technical resources, and export it to customizable web pages, such as MyYahoo! and iGoogle that are used by millions of users on a daily basis. With a simple click of a button, users can now create custom gizmos that allow them to take advantage of IBM technical content without the need for any additional programming skills or registration.

They can also tailor this information to their areas of interest and receive up-to-the minute updates directly to their home page, eliminating the tedious task of searching for information that is critical for their personal interests or organizational needs. IBM is now bringing the information directly to users through their gizmos on a daily basis.

Free for the Taking

With today's news, IBM is bringing the power of thousands of new technology resources, free code to create applications on open standards and how-to tutorials from IBM's developerWorks to millions of users worldwide based on their topic of choice.

For example:
-- Gaming enthusiasts can get up-to-date information on how to improve
their online gaming experience. By using a gizmo, the users gain access to
forums and information on the latest technological advances in gaming while
learning how to create better gaming applications. For example, as a result
of using this gizmo, users who enjoy playing Sudoku would receive a
tutorial that shows them how they can create their own version of the game
that automatically generates new puzzles to solve each time you play.

Developer gizmos from IBM will also give early adopters access to emerging software services from IBM research and development labs. For instance, an IT professional working for a small business that is an early adopter of Web 2.0 technologies visits IBM's website for developers on a regular basis to learn about the latest technologies that can provide viable solutions for their business.

With this new functionality, the user can select blogs, forums, wikis, and more and with the push of a button, create a custom Yahoo! Widget, Google Gadget, and NetVibes Module displaying all new content from IBM on their personal page of choice. Users receive updates right to this page in real-time. This allows them to aggregate the topics that are relevant to their business right within their workspace without having to do several searches or visit multiple pages.

IBM is providing four new avenues for custom syndicated content:

-- IBM developerWorks community spaces -- This is a unique channel that
provides an open platform for developers to build communities on a broad
range of topics and business trends, such as Web 2.0 and SaaS. There are
nearly 20 community topics available. Each topic space includes portlets
displaying blogs, forums, wikis, technical resources, feature stories,
tutorials, and podcasts that are related to the topic. Now users can tap
into these resources by exporting any portlet within the space as a Google
Gadget, NetVibes Module or Yahoo! Widget.

-- IBM developerWorks "build your own feeds" application -- Users have
the ability to select any combination of developerWorks open source topics,
technologies or IBM product brands and export the corresponding article or
tutorial feeds as RSS, ATOM, HTML and now, also as a Yahoo! Widget, Google
Gadget, and NetVibes Module.

-- IBM developerWorks Google Desktop Gadget -- By downloading the IBM
developerWorks Google Desktop Gadget, users can view content and search the
developerWorks website right from their desktop.

-- For all Apple iPhone users -- Users can view the top stories from IBM
developerWorks, sorted by topic, through an interface optimized for the
iPhone. http://www.ibm.com/developerworks/iphone.
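
To give a concrete sense of how the exported feeds described above might be consumed, here is a minimal Python sketch. The feed URL is a placeholder rather than a documented developerWorks endpoint; any RSS 2.0 feed generated by the "build your own feeds" application should work the same way:

    import urllib.request
    import xml.etree.ElementTree as ET

    # Placeholder address -- substitute the RSS URL generated by the
    # developerWorks "build your own feeds" application.
    FEED_URL = "https://example.com/developerworks/custom-feed.rss"

    def latest_items(url, limit=5):
        """Fetch an RSS 2.0 feed and return up to `limit` (title, link) pairs."""
        with urllib.request.urlopen(url) as response:
            tree = ET.parse(response)
        items = []
        for item in tree.findall("./channel/item")[:limit]:
            title = item.findtext("title", default="(untitled)")
            link = item.findtext("link", default="")
            items.append((title, link))
        return items

    if __name__ == "__main__":
        for title, link in latest_items(FEED_URL):
            print(f"{title}\n  {link}")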

"With this effort IBM is expanding its ecosystem of IT professionals by bringing new technology resources to the user directly rather than the user having to search for this information," said Kathy Mandelstein, Director of Worldwide Developer Programs for IBM. "This new feature will also allow small businesses and early adopters with no programming skills to take advantage of all the technology resources IBM has to offer to help businesses build innovative solutions."

Monday, September 3, 2007

Car Navigator features 4.3 in. widescreen 16:9 format LCD.

With hands-free calling option, TomTom GO 720 offers voice recording capability, audio options, FM transmitter, dual-audio volume control, and real-time traffic and weather forecasts. It comes with Map Share(TM) technology, safety features, and maps of US and Canada preloaded on 2 GB internal flash memory. Measuring 4.7 x 3.2 x 0.9 in. and weighing 7.7 oz, device features 400 MHz CPU, 64 MB RAM, SD card slot, GPS receiver, RDS-TMC Traffic receiver, and Bluetooth(TM).

The New TomTom GO With Unique New Features and Map Share(TM) Technology is the Ultimate Car Navigator

o Elegant and slim design with high-quality finish

o Latest version of TomTom's award-winning navigation software

o Unique new TomTom Map Share(TM) technology allows users to improve maps instantly and to benefit from all improvements made by other users

o Extra large 4.3 inch touchscreen with high-quality 3D graphics

o Extensive range of safety features

o Enhanced hands-free calling

o Built-in FM Transmitter to transmit sound over car stereo

o New smart & fun extras such as the ability to record driving instructions

CONCORD, Mass., June 5 -- TomTom, the world's largest portable navigation solutions provider, today reveals the TomTom GO 720 US & Canada edition. Ground-breaking new technology allowing for daily map improvements and a host of new features, such as safety extras and the ability to record driving instructions, have been combined into one compact and stylish design, providing users with the ultimate driving experience.

"The new TomTom GO 720 sets the standard in smart car navigation," says Jocelyn Vigreux, president of TomTom Inc. "With this introduction, we are making unique new features and technologies available to our users to deliver the highest performing, most cutting-edge satellite navigation solutions available on the market today. Our unique TomTom Map Share(TM) technology harnesses the power of our large installed user base for the benefit of all our users."

"Next generation navigation users are looking for navigation offerings that provide more than simple directions," said Thilo Koslowski, vice president and Lead Automotive Analyst, Gartner Inc. "These consumers want solutions that improve their individual driving experience and provide daily relevance via innovative personalization options, seamless communication functionality, intelligent safety features and dynamically updated content including map data."

Sleek Design

The TomTom GO 720 features a new elegant slim design with a high-quality finish that allows it to fit perfectly in any car or to be carried in a shirt pocket. The large 4.3 inch touchscreen has TomTom's renowned easy-to-use and intuitive user interface with 3D graphics, including building footprints, that ensures drivers have an even better overview of their surroundings.

Unique TomTom Map Share(TM) Technology

The TomTom GO 720 also introduces TomTom's unique new map improvement technology, TomTom Map Share(TM). This technology enables drivers to easily and instantly improve their own maps. TomTom Map Share(TM) users will always have the most accurate and up-to-date information available. The technology also enables users to share improvements and benefit from all other users' improvements daily, automatically and easily via TomTom HOME - TomTom's free desktop application. TomTom has the world's largest satellite navigation community with over 10 million users. TomTom Map Share(TM) users can contribute and exchange all their improvements amongst each other, making the best maps available for all of them. TomTom Map Share(TM) means TomTom drivers can always have the most up-to-date maps and inside local knowledge at their fingertips (see separate press release for more information on TomTom Map Share(TM)).

Smart and Fun Extras

With the new TomTom GO, drivers can choose from a host of smart extras, ensuring that their destination is reached in a relaxed and enjoyable way. A number of personalization options have been added for the first time, including:

o Voice recording capability, so users can be guided to their destination
with the voices of their children, family or friends
o Millions of points of interest, including over 100 different
recognizable brand icons to help spot favorite coffee shops,
restaurants and more
o Extensive audio options for integration with the car stereo. Dual-
audio volume control allows the user to set navigation instruction
volume level separately from music volume level. Drivers can play their
favorite songs through the integrated MP3 player or by connecting their
iPod® to the TomTom GO. The built-in FM Transmitter enables
navigation instructions and music to be heard through the car stereo
o Real-time traffic, five-day weather forecasts and celebrity voice
downloads*

New Safety Features

Safety is a key priority in the development of all TomTom products. The new TomTom GO comes with uniquely designed safety features so drivers always have direct access to safety and roadside assistance information wherever they go. The extensive 'Help Me!' menu includes information such as the nearest police station, hospital or car repair service center. It also allows the user to quickly identify their location for emergency assistance providers. Another new safety feature is the shortcut menu, so drivers can jump straight to their favorite, most accessed information with just one touch on the screen for even easier and safer navigation (see separate press release for more information on the extensive safety features).

Enhanced Hands-Free Kit

The enhanced hands-free kit features a new high-quality sound system via Bluetooth®. With optimized acoustic design, enhanced noise reduction, echo cancellation and an automatic answer function, the TomTom GO 720 ensures drivers can make and take calls easier while keeping their eyes on the road.

TomTom HOME and TomTom PLUS

As with all TomTom products, the new TomTom GO 720 comes with TomTom HOME, a free desktop application. TomTom HOME is the essential tool to keep devices up-to-date at all times. Customers can easily download new software versions as well as the latest maps. TomTom's large satellite navigation community can contribute and exchange their map corrections with each other through TomTom HOME, making the most accurate maps available to all users. In addition, the new TomTom GO 720 comes pre-installed with a host of other TomTom PLUS services including TomTom Weather and TomTom QuickGPSfix (for extra fast connection between the devices and GPS satellites). Users can also shop for more services and prepare for trips on their computer.

Traffic Ready

The new TomTom GO 720 offers the latest and most up-to-date traffic information through an RDS-TMC Traffic receiver* and TomTom Traffic subscription*, making driving more efficient and stress-free. The enhanced traffic interface monitors current traffic information and gives precise details about traffic incidents.

TomTom GO 720

The TomTom GO 720 comes pre-installed with the most up-to-date maps of the US and Canada stored on the internal memory.

Availability

The new TomTom GO will be available across the US and Canada starting the end of July.

Product technical specifications
o 4.3" widescreen 16:9 format LCD (WQVGA: 480*272 pixels)
o CPU 400 MHz, 64MB RAM
o Maps of US and Canada preloaded on 2GB internal flash memory
o SD card slot
o High-sensitivity GPS receiver
o RDS-TMC traffic information receiver (optional accessory, not included)
o Integrated FM transmitter
o Bluetooth(TM)
o Lithium-polymer battery (5 hours operation)
o Optimized integrated microphone and speaker for high quality hands-free
functionality
o Dimensions: 4.7" x 3.2" x 0.9"
o Weight: 7.7 ounces

* Not included with the TomTom GO 720. Available for separate purchase from www.tomtom.com.

For more information please contact:
Karen CK Drake,
TomTom Inc.
Telephone: 978-405-1688
Email: us.publicrelations@tomtom.com

About TomTom

TomTom NV is the world's largest navigation solution provider. TomTom's products are developed with an emphasis on innovation, quality, ease of use and value. TomTom's products include all-in-one navigation devices which enable customers to navigate right out of the box; these are the award-winning TomTom GO family, the TomTom ONE range and the TomTom RIDER. TomTom PLUS is the location-based content and services offering for TomTom's navigation products, easily available through TomTom HOME. TomTom also provides navigation software products which integrate with third-party devices, such as the TomTom NAVIGATOR software for PDAs and smartphones. TomTom WORK combines industry-leading communication and smart navigation technology with leading-edge tracking and tracing expertise. TomTom's products are sold through a network of leading retailers in 25 countries and online. TomTom was founded in 1991 in Amsterdam and has offices in Europe, North America and Asia Pacific. TomTom is listed on Euronext Amsterdam in The Netherlands. For more information, go to http://www.tomtom.com/.

Source: TomTom

Web site: http://www.tomtom.com

Tuesday, August 28, 2007

New Intel Processor Fights Rootkits, Virtualization Threats

But experts say new features still aren't true anti-rootkit technologies

AUGUST 27, 2007 | 10:34 AM

By Kelly Jackson Higgins
Senior Editor, Dark Reading

Intel today rolled out a new desktop processor for business machines with hardware-based security features that it says can help prevent stealth malware attacks and better secure virtual machines.

The new vPro 2007 Platform, which was code-named Weybridge by Intel, also comes with an upgraded feature that better tracks and logs network traffic for malicious patterns, as well as support for 802.1x and Cisco NAC platforms so that if the operating system is down, you can still manage the endpoints because network security credentials are stored in hardware. Intel's new vPro platform also comes with new built-in management and energy-efficiency features.

Mike Ferrin-Jones, Intel's director of digital office platform marketing, says attackers increasingly are writing stealthier malware that evades detection by software-based tools, and some that even disable them: "That gives them free rein over the system." That has held some enterprises back from going with virtualization technology, he says.

Intel's new processor -- via its so-called Trusted Execution Technology (TXT) and Intel Virtualization Technology for Directed I/O features -- can better protect virtualized software from these kinds of attacks by detecting any changes to the virtual machine monitor; restricting memory access by unauthorized software or hardware; and protecting virtual machines from memory-snooping software, according to the company.
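
For readers who want to check whether their own machines even advertise the CPU-side pieces of this, here is a minimal sketch (a Linux-only illustration and my own assumption, not something from Intel's materials): the "vmx" processor flag indicates Intel VT-x support and the "smx" flag indicates the Safer Mode Extensions that Trusted Execution Technology builds on. VT-d, the directed I/O piece, is reported through firmware/ACPI tables rather than these flags, so it is not covered here.

    # Minimal sketch: report whether the CPU advertises VT-x ("vmx") and the
    # Safer Mode Extensions used by Trusted Execution Technology ("smx").
    # Linux-only; reads the flags line from /proc/cpuinfo.

    def cpu_flags(path="/proc/cpuinfo"):
        """Return the set of feature flags reported for the first CPU."""
        with open(path) as cpuinfo:
            for line in cpuinfo:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
        return set()

    if __name__ == "__main__":
        flags = cpu_flags()
        print("VT-x (vmx) :", "present" if "vmx" in flags else "not reported")
        print("TXT (smx)  :", "present" if "smx" in flags else "not reported")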

Stealth malware expert Joanna Rutkowska, founder of Invisible Things Lab, says Intel's new Trusted Execution Technology (TXT) and Intel Virtualization Technology for Directed I/O features sound like a step in the right direction for protecting against stealth malware attacks, as are AMD's SKINIT and External Access Protection features, which were released last year.

"I don't believe we can address some problems like kernel rootkits and especially virtualization-based rootkits, without help from the hardware vendors," she says.

Rutkowska says that, based on what she could surmise from the press materials provided to her, Intel's Virtualization Technology for Directed I/O appears "to let you create more secure hypervisors and deploy secure micro kernel-based OSes."

Still, these technologies aren't true anti-rootkit technologies, she says. "They are, rather, technologies that [for example] would allow [you] to build better OSes, not prone that much to rootkit infections as the OSes we have today [are]."

The key, Rutkowska says, is for OS and software vendors to use Intel's new hardware-based security, as well as AMD's in its new Barcelona processors. "It's all in the hands of software and OS vendors now," she says. "If they don't redesign their products to make use of those new technologies in a proper way, those new technologies will be pretty useless."

Intel's Ferrin-Jones says hardware-based security in the new platform, based on Intel's Core 2 Duo processor and Q35 Express chipset, helps where software-based security cannot. "Most security applications run inside the OS," he says. "For the systems to be protected and secured, those apps have to be up and running, as does the OS." Features such as "remote wakeup" capabilities aren't secure or available if the OS goes down.

Meanwhile, major computer makers and resellers are now selling desktops with the new vPro processor, according to Intel, including Dell, HP, and Lenovo, and the company says 350 organizations have already deployed it.

Intel is also currently working with virtual machine monitor and security software vendors to enable their products to work with the new platform, Ferrin-Jones says.

Monday, July 30, 2007

Europe Unlocks Wireless Potential

LONDON - Telecommunications companies and wireless operators in Europe such as Nokia and Ericsson look set to get a healthy chunk of their 3G license costs reduced, after the European Commission proposed further liberalization of the wireless communication spectrum Wednesday.

The Commission aims to repeal a 20-year-old directive that reserved the use of certain radio frequencies for GSM-standard services, which initially helped make the European standard a success but ended up making spectrum space scarce for new technologies like 3G--"third-generation"--broadband and wireless data services.

"This proposal is a concrete step towards a more flexible, market-driven approach to spectrum management," said Telecommunications Commissioner Viviane Reding. "It will increase competition in the use of spectrum bands and enhance accessibility of European citizens to multimedia services."

Until now, national regulators across Europe have allocated bands in the electromagnetic spectrum according to specific technologies, a process that tends to be excessively bureaucratic and slow to respond to new services and technologies.

The old system also pushed up license fees for operators, whose fierce competition over a limited slice of spectrum led to high prices and equally high losses for 3G investments. Companies including Vodafone (NYSE: VOD), Deutsche Telekom (NYSE: DT), France Telecom (NYSE: FTE) and Hutchison Whampoa (NYSE: HTX) have desperately battled to reclaim tax compensation on $100 billion worth of European 3G licenses bought in 2000, but last month their appeals were rejected.

If the new proposals are adopted by the European Parliament, the 900 MHz and 1800 MHz bands previously reserved for GSM usage will be opened up to other technologies. 3G services currently use the 2100 MHz frequency, allocated in the 1990s.

"With this decision it's going to be a lot cheaper to deploy 3G," said Matthew Howett, analyst with Ovum Research. "It's cheaper to use the lower frequency and allows you to cover a much wider area," he added.

The European Commission said that the wireless communications sector could see reductions in network costs of up to 40% over the next five years. The trade association GSMA said last month that a 3G network in the 900 MHz band would achieve up to 40% greater coverage for the same costs as within the 2100 MHz band.
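
To see why the lower band stretches coverage, a simplified free-space path-loss comparison helps. The sketch below is a deliberately idealized illustration in Python; it ignores terrain, building penetration and deployment economics, so it is not the basis of the GSMA's 40% figure, just the underlying physics.

    import math

    def fspl_db(distance_km: float, freq_mhz: float) -> float:
        """Free-space path loss in dB for distance in km and frequency in MHz."""
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    # Compare the same 10 km link at the two bands discussed above.
    loss_900 = fspl_db(10, 900)
    loss_2100 = fspl_db(10, 2100)
    print(f"900 MHz: {loss_900:.1f} dB, 2100 MHz: {loss_2100:.1f} dB")
    print(f"Advantage at 900 MHz: {loss_2100 - loss_900:.1f} dB")
    # The roughly 7 dB advantage at 900 MHz means a larger usable cell radius
    # for the same transmit power, and therefore fewer base stations per area.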

Sunday, July 29, 2007

Don't Blink: Interview with Jeff Harrow

Sam Vaknin, Ph.D. - 7/28/2007
Jeff Harrow is the author and editor of the Web-based multimedia "Harrow Technology Report" journal and Webcast, available at www.TheHarrowGroup.com. He also co-authored the book "The Disappearance of Telecommunications". For more than seventeen years, beginning with "The Rapidly Changing Face of Computing", the Web's first and longest-running weekly multimedia technology journal, he has shared with people across the globe his fascination with technology and his sense of wonder at the innovations and trends of contemporary computing and the growing number of technologies that drive them. Jeff Harrow has been the senior technologist for the Corporate Strategy Groups of both Compaq and Digital Equipment Corporation. He invented and implemented the first iconic network management prototype for DECnet networks.

He now works with businesses and industry groups to help them better understand the strategic implications of our contemporary and future computing environments.

Question: You introduce people to innovation and technological trends - but do you have any hands-on experience as an innovator or a trendsetter?

Answer: I have many patents issued and on file in the areas of network management and user interface technology, I am a commercial pilot, and technology is both my vocation and my passion. I bring these and other technological interests together to help people "look beyond the comfortable and obvious", so that they don't become road-kill by the side of the Information Highway.

Question: If you had to identify the five technologies with the maximal economic impact in the next two decades - what would they be?

Answer:

A) The continuation and expansion of "Moore's Law" as it relates to our ability to create ever-smaller, faster, more-capable semiconductors and nano-scale "machines." The exponential growth of our capabilities in these areas will drive many of the other high-impact technologies mentioned below.

B) "Nanotechnology." As we increasingly learn to "build things 'upwards" from individual molecules and atoms, rather than by "etching things down" as we do today when building our semiconductors, we're learning how to create things on the same scale and in the same manner as Nature has done for billions of years. As we perfect these techniques, entire industries, such as pharmaceuticals and even manufacturing will be radically changed.

C) "Bandwidth." For most of the hundred years of the age of electronics, individuals and businesses were able to 'reach out and touch' each other at a distance via the telephone, which extended their voice. This dramatically changed how business was conducted, but was limited to those areas where voice could make a difference.

Similarly, now that most business operations and knowledge work are conducted in the digital domain via computers, and because we now have a global data communications network (the Internet) which does not restrict the type of data shared (voice, documents, real-time collaboration, videoconferencing, video-on-demand, print-on-demand, and even the creation of physical 3D prototype elements at a distance from insubstantial CAD files), business is changing yet again.

Knowledge workers can now work where they wish to, rather than be subject to the old restrictions of physical proximity, which can change the concept of cities and suburbs. Virtual teams can spring up and dissipate as needed without regard to geography or time zones. Indeed, as bandwidth continues to increase in availability and plummet in cost, entire industries, such as the "call center," are finding a global marketplace that could not have existed before.

Example: U.S. firms whose "800 numbers" are actually answered by American-sounding representatives who are working in India, and U.S. firms who are outsourcing "back office" operations to other countries with well-educated but lower-paid workforces.

Individuals can now afford Internet data connections that just a few years ago were the expensive province of large corporations (e.g., cable modem and DSL service). As these technologies improve, and as fiber is eventually extended "to the curb," many industries, some not yet invented, will find ways to profitably consume this new resource. We always find innovative ways to consume available resources.

D) "Combinational Sciences." More than any one or two individual technologies, I believe that the combination and resulting synergy of multiple technologies will have the most dramatic and far-reaching effects on our societies. For example, completing the human genome could not have taken place at all, much less years earlier than expected, without Moore's Law of computing.

And now the second stage of what will be a biological and medical revolution, "Proteomics", will be further driven by advances in computing. But in a synergistic way, computing may actually be driven by advances in biology which are making it possible, as scientists learn more about DNA and other organic molecules, to use them as the basis for certain types of computing!

Other examples of "combination sciences" that synergistically build on one another include:

- Materials science and computing. For instance: carbon nanotubes, in some ways the results of our abilities to work at the molecular level due to computing research, are far stronger than steel and may lead to new materials with exceptional qualities.

- Medicine, biology, and materials science. For example, the use of transgenic goats to produce specialized "building materials" such as large quantities of spider silk in their milk, as is being done by Nexia Biotechnologies.

- "Molecular Manufacturing." As offshoots of much of the above research, scientists are learning how to coerce molecules to automatically form the structures they need, rather than by having to painstakingly push or prod these tiny building blocks into the correct places.

The bottom line is that the real power of the next decades will be in the combination and synergy of previously separate fields. And this will impact not only industries, but the education process as well, as it becomes apparent that people with broad, "cross-field" knowledge will be the ones to recognize the new synergistic opportunities and benefit from them.

Question: Users and the public at large are apprehensive about the all-pervasiveness of modern applications of science and engineering. People cite security and privacy concerns with regards to the Internet, for example. Do you believe a Luddite backlash is in the cards?

Answer: There are some very good reasons to be concerned and cautious about the implementation of the various technologies that are changing our world. Just as with most technologies in the past (arrows, gunpowder, dynamite, the telephone, and more), they can be used for both good and ill. And with today's pell-mell rush to make all of our business and personal data "digital," it's no wonder that issues related to privacy, security and more weigh on peoples' minds.

As in the past, some people will choose to wall themselves off from these technological changes (invasions?). Yet, in the context of our evolving societies, the benefits of these technologies, as with electricity and the telephone before them, will outweigh the dangers for many if not most people.

That said, however, it behooves us all to watch and participate in how these technologies are applied, and in what laws and safeguards are put in place, so that the end result is, quite literally, something that we can live with.

Question: Previous predictions of convergence have flunked. The fabled Home Entertainment Center has yet to materialize, for instance. What types of convergence do you deem practical and what will be their impact - social and economic?

Answer: Much of the most important and far-reaching "convergences" will be at the scientific and industrial levels, although these will trickle down to consumers and businesses in a myriad ways. "The fabled Home Entertainment Center" has indeed not yet arrived, but not because it's technologically impossible - more because consumers have not been shown compelling reasons and results. However, we have seen a vast amount of this "convergence" in different ways. Consider the extent of entertainment now provided through PCs and video game consoles, or the relatively new class of PDA+cell phone, or the pocket MP3 player, or the in-car DVD...

Question: Dot.coms have bombed. Now nano-technology is touted as the basis for a "New Economy". Are we in for the bursting of yet another bubble?

Answer: Unrealistic expectations are rarely met over the long term. Many people felt that the dot.com era was unrealistic, yet the allure of the magically rising stock prices fueled the eventual conflagration. The same could happen with nanotechnology, but perhaps we have learned to combine our excitement of "the next big thing" with reasonable and rational expectations and business practices. The "science" will come at its own pace - how we finance that, and profit from it, could well benefit from the dot.bomb lessons of the past. Just as with science, there's no pot of gold at the end of the economic rainbow.

Question: Moore's Law and Metcalfe's Law delineate an exponential growth in memory, processing speed, storage, and other computer capacities. Where is it all going? What is the end point? Why do we need so much computing power on our desktops? What drives what - does technology drive the cycle-consuming applications, or vice versa?

Answer: There are always "bottlenecks." Taking computers as an example, at any point in time we may have been stymied by not having enough processing power, or memory, or disk space, or bandwidth, or even ideas of how to consume all of the resources that happened to exist at a given moment.

But because each of these (and many more) technologies advance along their individual curves, the mix of our overall technological capabilities keeps expanding, and this continues to open incredible new opportunities for those who are willing to color outside the lines.

For example, at a particular moment in time, a college student wrote a program and distributed it over the Internet, and changed the economics and business model for the entire music distribution industry (Napster). This could not have happened without the computing power, storage, and bandwidth that happened to come together at that time.

Similarly, as these basic computing and communications capabilities have continued to grow in capacity, other brilliant minds used the new capabilities to create the DivX compression algorithm (which allows "good enough" movies to be stored and distributed online) and file-format-independent peer-to-peer networks (such as Kazaa), which are beginning to change the video industry in the same manner!

The point is that in a circular fashion, technology drives innovation, while innovation also enables and drives technology, but it's all sparked and fueled by the innovative minds of individuals. Technology remains open-ended. For example, as we have approached certain "limits" in how we build semiconductors, or in how we store magnetic information, we have ALWAYS found ways "through" or "around" them. And I see no indication that this will slow down.

Question: The battle rages between commercial interests and champions of the ethos of free content and open source software. How do you envisage the field ten years from now?

Answer: The free content of the Internet, financed in part by the dot.com era of easy money, was probably necessary to bootstrap the early Internet into demonstrating its new potential and value to people and businesses. But while it's tempting to subscribe to slogans such as "information wants to be free," the longer-term reality is that if individuals and businesses are not compensated for the information that they present, there will eventually be little information available.

This is not to say that advertising or traditional "subscriptions," or even the still struggling system of "micropayments" for each tidbit, are the roads to success. Innovation will also play a dramatic role as numerous techniques are tried and refined. But overall, people are willing to pay for value, and the next decade will find a continuing series of experiments in how the information marketplace and its consumers come together.

Question: Adapting to rapid technological change is disorientating. Toffler called it a "future shock". Can you compare people's reactions to new technologies today - to their reactions, say, 20 years ago?

Answer: It's all a matter of 'rate of change.' At the beginning of the industrial revolution, the parents in the farms could not understand the changes that their children brought home with them from the cities, where the pace of innovation far exceeded the generations-long rural change process.

Twenty years ago, at the time of the birth of the PC, most people in industrialized nations accommodated dramatically more change each year than an early industrial-age farmer would have seen in his or her lifetime. Yet both probably felt about the same amount of "future shock," because it's relative. The "twenty years ago" person had become accustomed to that year's results of the exponential growth of technology, and so was "prepared" for that then-current rate of change.

Similarly, today, school children happily take the most sophisticated of computing technologies in-stride, while many of their parents still flounder at setting the clock on the VCR - because the kids simply know no other rate of change. It's in the perception.

That said, given that so many technological changes are exponential in nature, it's increasingly difficult for people to be comfortable with the amount of change that will occur in their own lifetime. Today's schoolchildren will see more technological change in the next twenty years than I have seen in my lifetime to date; it will be fascinating to see how they (and I) cope.

Question: What's your take on e-books? Why didn't they take off? Is there a more general lesson here?

Answer: The E-books of the past few years have been an imperfect solution looking for a problem.

There's certainly value in the concept of an E-book, a self-contained electronic "document" whose content can change at a whim either from internal information or from the world at large. Travelers could carry an entire library with them and never run out of reading material. Textbooks could reside in the E-book and save the backs of backpack-touting students. Industrial manuals could always be on-hand (in-hand!) and up to date. And more.

Indeed, for certain categories, such as for industrial manuals, the E-book has already proven valuable. But when it comes to the general case, consumers found that the restrictions of the first E-books outweighed their benefits. They were expensive. They were fragile. Their battery life was very limited. They were not as comfortable to hold or to read from as a traditional book. There were several incompatible standards and formats, meaning that content was available only from limited outlets, and only a fraction of the content that was available in traditional books was available in E-book form. Very restrictive.

The lesson is that (most) people won't usually buy technology for technology's sake. On the other hand, use a technology to significantly improve the right elements of a product or service, or its price, and stand back.

Question: What are the engines of innovation? What drives people to innovate, to invent, to think outside the box and to lead others to adopt their vision?

Answer: "People" are the engines of innovation. The desire to look over the horizon, to connect the dots in new ways, and to color outside the lines is what drives human progress in its myriad dimensions. People want to do things more easily, become more profitable, or simply 'do something new,' and these are the seeds of innovation.

Today, the building blocks that people innovate with can be far more complex than those in the past. You can create a more interesting innovation out of an integrated circuit that contains 42-million transistors today - a Pentium 4 - than you could out of a few single discrete transistors 30 years ago.

Or today's building blocks can be far more basic (such as using Atomic Force Microscopes to push individual atoms around into just the right structure.) These differences in scale determine, in part, why today's innovations seem more dramatic.

But at its heart, innovation is a human concept, and it takes good ideas and persuasion to convince people to adopt the resulting changes. Machines don't (yet) innovate. And they may never do so, unless they develop that spark of self-awareness that (so far) uniquely characterizes living things.

Even if we get to the point where we convince our computers to write their own programs, at this point it does not seem that they will go beyond the goals that we set for them. They may be able to try superhuman numbers of combinations before arriving at just the right one to address a defined problem, but they won't go beyond the problem. Not the machines we know today, at any rate.

On the other hand, some people, such as National Medal of Technology recipient Ray Kurzweil, believe that the exponential increase in the capabilities of our machines - which some estimate will reach the complexity of the human brain within a few decades - may result in those machines becoming self-aware.

Don't Blink!

Saturday, July 28, 2007

FERC announces pilot project licensing process for new hydropower technologies

Washington, D.C., July 27, 2007 -- The Federal Energy Regulatory Commission (FERC) announced it will convene a technical conference on licensing pilot projects for ocean energy hydro technologies to discuss a staff proposal for a process that could complete licensing in as few as six months.

Commissioner Philip Moeller will lead the conference, to be conducted Oct. 2, 2007, in Portland, Oregon. This is the latest in a series of measures the commission has undertaken since 2006 to address intensifying interest in the development of ocean, wave and tidal, or hydrokinetic, technologies.

"Perhaps the greatest barrier to realizing the potential of new hydrokinetic technologies is that they are unproven," chairman Joseph T. Kelliher said. "These technologies must be demonstrated before large scale commercial deployment can occur. Today we take a major step to reduce the barriers to the success of these new hydro technologies, by proposing a simplified licensing process suitable for licensing pilot projects."

"This new generation of hydrokinetic technologies will bring hydropower to the forefront of the renewable energy debate," Moeller said. "It is generating a lot of enthusiasm throughout the country, particularly in coastal states like my home state of Washington. FERC wants to harness this enthusiasm by exploring ways to reduce the regulatory barriers to realize the amazing potential of this domestic renewable power source?one that can help meet renewable portfolio standards established by states."

The goal of the commission staff proposal is to complete the full project licensing process in as few as six months, provide for commission oversight and input from affected states and other federal agencies, and allow developers to generate electricity while conducting the requisite testing.

The process would be available for projects that are 5 megawatts or smaller, removable or able to shut down on relatively short notice, located in waters that have no sensitive designations, and for the purpose of testing new hydro technologies or determining appropriate sites for ocean, wave and tidal energy projects.

At its December 2006 conference on hydrokinetic energy, FERC learned that these technologies are in a developmental phase, which presents significant risks for developers due to a lack of information about engineering performance and environmental effects, and limited access to financing.

In response to FERC's February 2007 notice of inquiry on preliminary permits for the new technologies, at least 14 entities addressed the need for a pilot program licensing process.

Comments included recommendations that FERC address the unique characteristics of pilot projects by: permitting connection to the national grid both for study purposes and to generate revenue; implementing a simpler, faster review process; requiring site restoration following experimental deployments; and requiring a license period of five years rather than 30-50 years.

Friday, July 27, 2007

Comment: Managing technological risks


27-July-2007:
Recent comments by former United Nations Secretary-General Kofi Annan over the role of genetically modified seed in African agriculture have reopened debate over the risks posed by new technologies.

And this is the point I want to make.

Technological risks to society and the environment have become an integral part of global public policy. Managing these risks and responding to public perceptions of them are necessary if engineering is to be effectively deployed to meet development needs.

New technologies have been credited with creating new industrial opportunities but also with destroying the status quo. In fact, maintaining the status quo comes with its risks as well. In some cases the risks of doing nothing may outweigh the challenges associated with investing in new responses to social challenges.

In the past, technological risks were confined to countries in which new technologies emerged or even to the sectors in which they were applied. Concerns two decades ago over the use of microprocessors in industry were restricted to their possible impact on employment displacement in the manufacturing sector.

Workers and labor organizations around the world protested the use of this emerging technology. Today echoes of these debates are still heard in discussions of the “deskilling” of workers.

Although interest in the impact of microelectronics on employment was expressed in many parts of the world, it did not become a mass movement involving a wide range of social groups, for at least two reasons.

First, the possibility of job displacement was weighed against the benefits of raising industrial productivity. Second, the global economy was not as integrated as it is today, and many debates were localized. Globalization gives technological risk a wider meaning and turns local debates over certain products into mass movements.

This is itself a source of new risks.

Risk perceptions vary considerably across technologies, and this defines the scale of social mobilization. Attempts in the 1980s to promote the adoption of renewable energy technologies were bedeviled by social opposition.

Sporadic opposition was recorded in many parts of the world, but it did not translate into mass movements. Risk perceptions of pharmaceutical products are not a major challenge to the use of new medicines, partly because of the limited range of options available to consumers in life-threatening situations.

Fear of genetic engineering, for example, stems from scientific, technical, economic, cultural, and ethical concerns. Opposition is differentiated along product lines (transgenic crops, fish, trees, cattle). Many people oppose genetic engineering because of corporate control of the industry, which they perceive as a social risk.

A focus on technological risks can overshadow the possible benefits of an emerging technology, which are often difficult to predict. At the time of their invention, computers were envisioned as able to perform nothing more than rapid calculation in scientific research and data processing. The rise of the Internet as a global phenomenon could not have been predicted based on the early uses of information technology.

Technological risks have to be weighed against the risks of existing technologies as well as the risks of not having access to new technologies. The risks of not having access to the Internet may outweigh the risks of employment displacement that shaped many of the attitudes toward information technology in its earlier years.

The concerns over employment displacement were genuine, but the risks were often projected onto whole industries or sectors. The debate was dominated by concerns over job displacement rather than the potential contributions of the new technologies to economic productivity.

There are also numerous cases in which society has underestimated the risks posed by new technologies or adopted them without adequate knowledge about their dangers. Managing technological uncertainty will require greater investment in innovative activities at the scientific and institutional levels.

At the technical level, technological diversity is essential to ensuring that society is able to respond to emerging challenges with the knowledge at its disposal. Technological diversity demands greater investment in scientific enterprises as well as the creation of measures to facilitate access to available technical options.

It also requires flexibility in institutional arrangements, to enable society to respond swiftly to technological failure. Such flexibility can be built into the design of technological systems themselves. Diversification of energy sources, for example, is an institutional response to the risks posed by dependence on centralized fossil fuel power plants. Similar diversification principles apply to information storage, especially as the world community moves into the information age.

Trends in industry suggest that a combination of incentives and regulations can shift technological change to meet environmental goals. Already enterprises are responding to the growing environmental consciousness among consumers and starting to adopt new environmental standards in fields such as emissions and energy use.

Efforts to focus on risk in the absence of actual technology will remain hollow exhortations and a distraction from genuine development efforts. Using emerging technologies without considering their risks is likely to elicit social pressures that can undermine technological development. The challenge is to maximize the benefits of new technologies while reducing their risks.

Calestous Juma teaches at Harvard University’s Kennedy School of Government where he directs the Science, Technology and Globalization Project.

Thursday, July 26, 2007

Two HDTV Technologies Worth Waiting For

LED backlighting and 120-Hz refresh rates are coming to mainstream HDTVs like the ones Samsung showed off this week.

If you're planning an HDTV purchase this fall (or looking ahead to one this winter), keep an eye out for two emerging technologies. 1080p is now everywhere, LCD HDTVs are taking over, and 40-inch displays are evolving into the new sweet spot. But new sets slated for this fall and winter will be among the first mainstream displays to incorporate new technologies that can significantly improve picture quality.

Samsung, for example, recently showed off its latest lines of LCD HDTVs due out in August. One line sports a 120-Hz refresh rate--double the 60 Hz of standard LCD TVs--which makes for sharper fast-moving images. Another line uses LED backlights, which dramatically boost contrast and allow for a wider range of colors.

Both technologies should be available from a wide variety of vendors this fall, including LG Electronics, Philips, and Sharp. And as these enhancements make their way into more and more TVs, the price difference between standard LCD TVs and these newer models should shrink rapidly. Here's a look at Samsung's plans for 120-Hz HDTVs and LED backlighting, and why you might want to wait for a television that makes use of either technology.

120-Hz Displays

Momentum behind 120 Hz has been building since early this year. JVC was among the first vendors to ship a 120-Hz display, and Sharp's Aquos D82U and D92U series televisions began shipping back in February. This summer, Philips, LG, and Samsung all announced their respective 120-Hz technologies, with products coming by this fall.

At 120 Hz, a television refreshes the picture at double the previous standard rate for video content. The extra refreshes let the set smooth out residual motion blur from fast-moving action in content such as sports or a scrolling news ticker at the bottom of the screen. Video content is shot at 30 frames per second, so it displays most cleanly at refresh rates that are whole multiples of that figure, such as 60 Hz or 120 Hz; 120 Hz has the added advantage of also being a whole multiple of film's 24 frames per second.

Samsung showed a split-screen demonstration of its 120-Hz technology at an event here in San Francisco, with one side showing the 120-Hz technology, and the other side showing 60 Hz. The difference between the two was noticeable: At 120 Hz, the ticker moved more smoothly and fast-moving video appeared sharper.

The 71 series displays that Samsung is launching in August use a technology called MCFI--short for Motion Compensated Frame Interpolation--which generates new, interpolated video frames and inserts them between the original frames to smooth out fast motion. Samsung's approach detects movement between successive frames and creates an averaged in-between frame from that motion. Other HDTV makers instead insert a black frame between frames, an approach Samsung claims addresses motion blur but degrades the panel's brightness.
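
To make the idea concrete, here is a minimal frame-interpolation sketch in Python. It is purely illustrative and is not Samsung's MCFI algorithm -- real motion-compensated interpolation estimates where objects move between frames rather than simply blending them -- and the frames-as-NumPy-arrays setup is an assumption made for the example.

    import numpy as np

    def double_frame_rate(frames):
        # Insert one blended frame between each pair of 60-fps frames,
        # roughly doubling the stream to feed a 120-Hz panel.
        # frames: list of H x W x 3 uint8 arrays.
        output = []
        for a, b in zip(frames, frames[1:]):
            output.append(a)
            # Average neighboring frames to synthesize the in-between frame.
            blended = ((a.astype(np.uint16) + b.astype(np.uint16)) // 2).astype(np.uint8)
            output.append(blended)
        output.append(frames[-1])
        return output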

LED Backlighting

If you're less concerned about fast-moving images, a display with an LED backlight may be more to your liking. The big advantage of LED-backlit TVs is improved contrast ratio. Samsung says its 81 series of displays can automatically adjust the backlight for specific parts of the picture, depending on the source content. This allows the display to achieve deeper blacks and crisper whites than the cold cathode fluorescent lamp (CCFL) technology traditionally used to backlight LCD HDTVs.

With CCFL, fluorescent tubes light the entire back of the display; the tubes are either all on or all off, and they allow some degree of light leakage. LED backlighting permits much finer control, which enables Samsung to claim a dynamic contrast ratio of 100,000:1, a fourfold improvement over its CCFL displays.
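
The zone-dimming idea can be sketched in a few lines of Python. This is only a rough illustration of the principle -- Samsung has not published its algorithm -- and the 8x8 zone grid and luminance-in-[0,1] input are assumptions made for the example.

    import numpy as np

    def zone_backlight_levels(luma, zones=(8, 8)):
        # luma: 2-D array of pixel luminance in [0.0, 1.0] for one frame.
        # Returns one backlight level per LED zone; dark zones get dim LEDs,
        # which is what produces deeper blacks and a higher dynamic contrast.
        h, w = luma.shape
        zh, zw = h // zones[0], w // zones[1]
        levels = np.zeros(zones)
        for i in range(zones[0]):
            for j in range(zones[1]):
                block = luma[i * zh:(i + 1) * zh, j * zw:(j + 1) * zw]
                levels[i, j] = block.max()  # just bright enough for the zone's brightest pixel
        return levels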

Price Premiums

While 120-Hz displays won't be that much more expensive than standard 1080p displays are today, you will pay a premium for an LED-backlit display. The 40-inch model in Samsung's 120-Hz 71 series line should retail for $2699 when it ships in August, for example, while Samsung's 40-inch LED-backlit model from the 81 series will go for $2999.

At least with TVs in that price range, other key HDTV technologies have become standard. A year ago, 1080p resolutions were still a rarity--and available only in higher-end models. As we head into the fall, 1080p is de rigueur on HDTVs at sizes of 40 inches and up. Samsung, for example, will have only three non-1080p models going forward in that size range. HDMI 1.3 is also getting more pervasive across a wide spectrum of LCD (and, for that matter, plasma) displays.

Wednesday, July 25, 2007

Power & Cooling Meet The New Tech Wave

How Innovative Technologies Are Changing Data Center Infrastructures

Innovations in data center technologies continue to transform productivity throughout the enterprise, bringing increased flexibility and less downtime. Yet as a parallel consequence, many of these same innovations are forcing data center and facilities managers to rethink their overall approach to power and cooling needs.

In the long run, changes forced by newer technologies can be positive because the increases in productivity can eventually outpace the increased costs to cool the devices. However, older data center rooms aren’t necessarily equipped to adequately handle new equipment, such as blade servers, and in turn can encounter far more heat issues than what existed with older devices.

“Most advances in technology require additional horsepower to take advantage of new product features and benefits,” says Kevin Houston, virtualization and consolidation practice leader at Optimus Solutions (www.optimussolutions.com), which helps firms plan, build, and maintain their IT infrastructure. “Data center consolidation projects can be complex and can easily overwhelm an IT organization struggling to maintain current operations.”

Blades Burrow In

Ask any data center manager to select a technology that’s sparking the most change in power and cooling requirements, and you’re likely to hear “blade servers” as the answer. Although these devices save power on one hand, they often require overhauled cooling infrastructures on the other.

“Blade servers are challenging the existing data center design and requiring audits to identify and propose the additional power and cooling resources to maintain an environment that meets the server manufacturer’s optimum operation requirements,” says Bill S. Annino Jr., director of the converged network solutions group at Carousel Industries (www.carouselindustries.com), a communications systems integrator and reseller.

Unlike traditional servers, blade servers forgo redundant components in their chassis, instead using single components and sharing them among the computers in the chassis. Blade enclosures also use a single power source for all the blades within them, which helps to boost the overall efficiency of the server environment.

That single blade rack can replace several traditional racks, which will help enterprises save power. “We’ve seen a tremendous increase in interest for blade technologies, as customers recognize the benefits of physical consolidation through a centralized ecosystem of power and cooling,” Houston says. “Both HP and IBM have made significant investments in their blade chassis to reduce the power and cooling needs for a server environment by as much as 50% over an equivalent physical rackmount environment.”

On the downside, many older data centers in small and midsized enterprises are designed to deliver a consistent level of cooling throughout the entire environment. The high-density form factors of blades require more concentrated cooling in specific areas, which means that a transition to blades could require facilities to be revamped to accommodate hot spots. After all, the cold air requirements for blade racks can be quadrupled--or more--when compared to the requirements for traditional racks.
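
The airflow arithmetic behind that statement can be sketched as follows. The 5 kW and 20 kW rack figures below are hypothetical round numbers, not vendor specifications; the conversion relies on the fact that every watt of IT load becomes about 3.412 BTU/hr of heat, plus the common sensible-heat rule of thumb Q = 1.08 x CFM x dT.

    WATTS_TO_BTU_PER_HR = 3.412  # every watt consumed ends up as heat

    def required_airflow_cfm(rack_watts, delta_t_f=20.0):
        # Cold air (in CFM) needed to carry away a rack's heat at a given
        # supply/return temperature difference, per Q = 1.08 * CFM * dT.
        btu_per_hr = rack_watts * WATTS_TO_BTU_PER_HR
        return btu_per_hr / (1.08 * delta_t_f)

    print(required_airflow_cfm(5_000))   # ~790 CFM for a hypothetical 5 kW traditional rack
    print(required_airflow_cfm(20_000))  # ~3,160 CFM for a hypothetical 20 kW blade rack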

This challenge becomes more difficult when SMEs mix blades with traditional servers because they still need traditional cooling methods but also need to remove the heat generated in specific areas by the blades. In these mixed environments, some experts recommend removing the traditional racks near the blade to allow for greater heat dispersion.

Call For A Change

Other increasingly popular technologies are also making their mark on power and cooling demands, albeit in different ways. In particular, VoIP is enjoying increased popularity in SMEs, and managers are witnessing a trade-off between power consumption and overall costs.

“Although VoIP phones require additional power consumption by drawing greater Power over Ethernet, or PoE [power], it is important to always evaluate the total cost of ownership,” Houston says. “Although power costs may increase, the boost in productivity from operational efficiencies and application integration--not to mention long-distance savings--outweighs any increase to the energy bill in most scenarios.”

Elizabeth King, general manager of Hitachi’s Servers System Group (www.hitachi.us), notes that VoIP creates stress on networks, storage, and computing resources. Further, while the technology hasn’t reached mainstream status, “there is enough experience to indicate that it will require corporations to expand their computing and network capabilities,” she says.

Of course, more capabilities often lead to more equipment, and more equipment means increased cooling and power demands. “The impact of the VoIP revolution has definitely added to the computer room,” explains Gregory T. Royal, chief technology officer and executive vice president of Cistera Networks (web.cistera.com), a company providing enterprise application platforms and engines for IP communications. “The vendors in this space have yet to face up to the heat and power requirements of this growth. Over time I expect that we will move to more efficient servers and predominantly blade servers [to handle the requirements].”

According to Royal, in theory, the endpoints accomplish 90% of the media (or heavy) work involved with new VoIP solutions. For example, he says, a SIP (Session Initiation Protocol) server provides “traffic cop” capabilities, rather than actively participating in the calls. Again in theory, this means that a given CPU can support far more endpoints than traditional PBXes, he says.

“There has, however, been a trend for a long time for PBX to shrink into what are now essentially soft switches,” Royal says. “However, there is a great increase in media gateways, media conferencing, video, applications, and other CPU-intensive applications. Unified voicemail, conferencing, notification, and quality assurance systems are now all mainstream in IP PBX environments.”

While VoIP technologies continue to push more equipment into data centers, their presence has removed the need for larger, older PBX installations that consumed plenty of floor space, similar to mainframes of the past, Royal says.

Virtual Help

Additional technologies, such as load balancers, application accelerators, caching servers, and middleware are also being increasingly deployed in today’s data centers. But Royal notes that they also bring benefits. “A lot of these new technologies now use standard, off-the-shelf technology such as Intel CPUs, which means there are better economies of scale in dealing with [power and cooling] issues,” he says.

Power and cooling concerns are also being mitigated by virtualization technologies, which allow SMEs to boost production without increasing the number of devices in their data centers. And thanks to advances in virtualization management tools, managers can now more easily integrate the technology into their data centers.

“VMware Infrastructure 3, for example, offers a complete solution for management and optimization of virtual servers, storage, and networking,” Houston says. “We’ve seen consolidation of infrastructure through virtualization deliver transformative cost savings by greatly reducing power and cooling requirements.”
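
As a back-of-the-envelope illustration of where those savings come from, the sketch below compares the annual electricity bill of a physical server estate with a consolidated set of virtualization hosts. Every figure in it -- server counts, wattages, the $0.10/kWh rate -- is hypothetical, and it ignores the comparable amount typically spent cooling that load.

    def annual_power_cost(num_servers, watts_each, dollars_per_kwh=0.10):
        # Electricity cost of running the servers around the clock for a year.
        kwh_per_year = num_servers * watts_each / 1000.0 * 24 * 365
        return kwh_per_year * dollars_per_kwh

    before = annual_power_cost(num_servers=20, watts_each=400)  # 20 lightly used physical boxes
    after = annual_power_cost(num_servers=4, watts_each=600)    # 4 busier virtualization hosts
    print(f"before: ${before:,.0f}/yr  after: ${after:,.0f}/yr")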

Cool It Down

Regardless of whether new technologies can run more efficiently or help to save space, there's no denying that the number of overall devices in data centers is ramping upward as enterprises converge their data centers with business practices. But while this alignment can serve to help the business in general, it can cramp the style of managers looking to push down power and cooling costs.

BT Group (www.bt.com), which is building an all-IP network known as the 21st Century Network, requires plenty of power to run its data centers, but it is deploying innovative methods to handle the needs of today’s newer technologies. One of these involves the use of DC power, which BT has chosen over AC to handle its power-hungry data centers.

“BT estimates that this change alone reduces power consumption by 30%,” says Steve O’Donnell, global head of data center and customer experience management at BT. “While the acquisition of switches and servers that run on DC power is more expensive, the savings in power consumption offsets the cost.”

The company also has worked with suppliers to adapt equipment from recirculated-air cooling to fresh-air cooling, which O’Donnell says allows BT’s equipment to run within a range of 5 to 50 degrees Celsius (41 to 122 degrees Fahrenheit), compared with traditional higher ranges. BT has also revamped the physical structure of its data centers to reduce contamination from fresh air.

Tuesday, July 24, 2007

German workers fear Asian competition, bemoan slow move to new technologies

BY KIRSTEN GRIESHABER ASSOCIATED PRESS WRITER

BERLIN--When Thomas Haebich started working on the assembly line at Daimler-Benz AG two decades ago, he thought he had a job for life. But he no longer feels he can count on it.

On the contrary, today the 40-year-old auto worker often fears that he might lose his job at the car plant in the southern German town of Sindelfingen. He is convinced that in the long run German companies will not be able to compete with the emerging automobile industry in Asia.

"Only last year, Daimler paid compensation for 3,000 workers because they wanted them to leave the company," Haebich said. "If you look ahead another 10 years, China will push forward on the automobile market in such an aggressive way that we will no longer be able to beat their cheap products."

Haebich works weekly rotating shifts at the assembly line that makes the Mercedes E-class model, installing pedals and brake systems on up to 260 vehicles during a regular eight-hour workday.

Technically he has a 35-hour week, but in practice he works 40 hours a week and then gets to take compensatory time off when the company has fewer orders and not enough work to keep all 22,000 employees at the Sindelfingen plant busy.

Haebich became a member of the IG Metall union when he joined Daimler-Benz--now DaimlerChrysler AG--at age 20. He earns $4,682 a month pretax--after all deductions he has $2,118 left to cover his living expenses. By German law, Haebich has to have health insurance, pays into a pension fund and also deposits money into a life insurance fund offered by Daimler.

He worries that he will not get by on his compulsory pension fund once he retires.

"It would make sense to start an additional private pension but I can't really afford that," said Haebich, who is married and has an 11-year-old daughter.

He blames management for his worries. "They knew for a while that resources were getting scarce and more expensive--why didn't they invest much more in new technologies for cars that use less fuel?"

DaimlerChrysler agreed in May to sell 80.1 percent of its money-losing U.S. unit Chrysler to the private equity firm Cerberus Capital Management LP in a $7.4 billion transaction, paving the way for a streamlined Daimler to concentrate on its luxury Mercedes brand and its truck business.

Haebich also said he can see the impact of globalization at work every day.

"The pressure to perform for new markets around the globe is constantly getting more intense," he said. "And while they used to hire workers on fixed-term contracts before, today they try to get by with cheaper temp workers."

Haebich is glad that he has a full-time contract but keeps telling his daughter that she will not be as lucky as he was.

"I always tell her: By the time you're grown up you must be flexible," he said, "even if that means taking a job in China."

Monday, July 23, 2007

New media: Good for democracy?

By Dusty Horwitt

Earlier this year, pundits and politicians were again buzzing about the apparent democratic power of the Internet when a supporter of Democratic presidential candidate Sen. Barack Obama of Illinois posted a video on YouTube portraying rival Sen. Hillary Rodham Clinton of New York as an Orwellian dictator.

Take a close look at the Internet, however, and its democratic luster disappears like Howard Dean's 2004 presidential campaign.

Let's begin with the ad about Senator Clinton. It was reported that the video had spread to television, where network and cable news shows had aired portions of it. The Washington Post reported that within days, it had been "viewed" online more than 2 million times. Two million views, however, does not mean 2 million people. "Views," "visits" and "visitors" are typically registered each time a person visits a Web site.

In a 2005 analysis of the top blogs among U.S. Internet users, comScore Media Metrix, a company that measures Internet traffic, found a large disparity between the number of "visits" the blogs received and the number of people, or "unique visitors," who accounted for those visits.

Over the first three months of 2005, for example, the liberal DailyKos.com attracted almost 3 million visits but only about 350,000 people - a little more than 0.1 percent of the U.S. population. ComScore noted that many of these unique visitors read the blogs infrequently, indicating that a large number of those views are likely by a fairly small number of people who visit the site often.
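
The gap between those two measures is easy to see with the article's own numbers; the U.S. population figure below is only a rough mid-2005 approximation.

    visits = 3_000_000           # DailyKos "visits" over the first quarter of 2005
    unique_visitors = 350_000    # distinct people behind those visits
    us_population = 295_000_000  # approximate mid-2005 figure (assumption)

    print(f"average visits per reader: {visits / unique_visitors:.1f}")        # ~8.6
    print(f"share of U.S. population:  {unique_visitors / us_population:.2%}")  # ~0.12%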

In June 2006, comScore data showed that of an estimated 30 million or more blogs, only about 70 reached an audience of 100,000 or more U.S. Internet users during that month. Of these 70, about 10 focused on noncelebrity news or politics. Their audience size ranged from 117,000 for LittleGreenFootballs.com to 1.4 million for Breitbart.com.

Why do these numbers matter? Because in a democracy, citizens who want to make a difference must be able to reach a broad audience. Only by winning a coalition of support can most people exercise political power. For example, Gene Roberts and Hank Klibanoff write in their 2006 Pulitzer-winning book, The Race Beat, that broad media coverage was essential to the civil rights movement's success.

The problem with the Internet is that it buries citizens' voices under an avalanche of information.

"When you think about the amount of information that is published to the Web, it is a physical impossibility for the vast majority of that stuff to spread virally," said Derek Gordon, marketing director for Technorati, a firm that measures the popularity of blogs.

The larger problem is that the Internet siphons audiences and revenue from the media outlets that can give citizens a voice, causing them to shrink and further impairing the media's democratic power.

Of course, the Internet is hardly alone in this. An array of technologies fueled by inexpensive energy, including cable, satellites and DVDs, has in recent years helped to consolidate the media by overproducing information.

Much like farmers hurt by overproduction of food, media companies have been forced to give away their data for less while becoming more dependent on advertisers and expensive new technologies.

Ben H. Bagdikian has identified similar trends in his book The New Media Monopoly. Mr. Bagdikian notes that as late as 1970, when the U.S. had about 100 million fewer people than it does today, there were three newspapers in my hometown of Washington, D.C., with daily circulations of 200,000 to 500,000. The papers likely had even wider reach because of multiple readers per copy. Today, only one local paper has a circulation of 200,000 or more, and its readership and staff are dwindling.

Similarly, as late as 1980, the three network newscasts reached 25 percent to 50 percent of the population each night. In 2006, nightly newscasts on ABC, NBC, CBS, CNN, FOX and MSNBC combined reached less than 10 percent. According to a report by the Project for Excellence in Journalism, some newspapers have attracted additional readers online, but visits to newspaper Web sites tend to be very brief.

The conventional wisdom is that the media and the rest of us can make the transition to the digital world. But the evidence strongly suggests that the blizzard of online data makes such a transition - at least in a way that promotes democracy - impossible.

Sunday, July 22, 2007

IBM Rides the Portal Wave

IBM is looking to ride its leadership position in the enterprise portal space into a similar position in the Enterprise 2.0 arena, where Web 2.0 technologies intersect with the enterprise.

IBM on July 20 announced that research firm IDC had named IBM the leader in the enterprise portal market for the fifth consecutive year. IBM, based in Armonk, N.Y., is moving to parlay that leadership into new areas with additional features in its WebSphere Portal offering. According to IDC, IBM had a 31.5 percent market share in 2006, followed by BEA Systems with 19.6 percent and Oracle with 10.5 percent.


"I see their leadership stemming from a few things," IDC analyst Kathy Quirk said. In particular, IBM is "successfully selling portal solutions to businesses of all sizes in conjunction with other complementary products, especially those used for collaboration," she said.

The company is aggressively adding features and capabilities to its portal software and improving the usability of the software for business users, Quirk said. IBM also is "working to make the portal environment easier to get up and running and administer on a day-to-day basis," she said.

Larry Bowden, vice president of portals and Web interaction services at IBM, said the company is integrating its WebSphere Portal software with advanced Web 2.0 expertise to create a more dynamic, personalized version of a portal. In addition, IBM's commitment to Web 2.0 and SOA (service-oriented architecture)-based systems will continue to make its collaboration software more flexible and more useful, he said.

"Portals are a natural home for a lot of the new technologies coming forward," Bowden said. "We are seeing the new social networking technologies coming into the portal."

Another thing driving the merging of portals and Web 2.0 is the collaborative element of the technologies, Bowden said.

"Portals have been primarily used to distribute information, but now things are getting more interactive," he said. "You can have instant messaging, real-time chats with customers and constituents, and even video."
Bowden said some of the new directions for WebSphere Portal will include "simplifying mashup creation so that end users have the power, so that every consumer can take feed No. 1 and mash it up with feed No. 2. We have customers doing it today, but it's not as easy as we want it to be."
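
What "mashing up feed No. 1 with feed No. 2" might look like in code: the sketch below merges two RSS feeds into a single, date-sorted stream using the feedparser library. It is a generic illustration, not IBM's WebSphere Portal implementation, and the feed URLs are placeholders.

    import feedparser

    def mash_up(feed_url_1, feed_url_2, limit=10):
        # Pull both feeds and interleave their entries, newest first.
        entries = feedparser.parse(feed_url_1).entries + feedparser.parse(feed_url_2).entries
        entries.sort(key=lambda e: tuple(e.get("published_parsed") or ()), reverse=True)
        return [(e.get("title", ""), e.get("link", "")) for e in entries[:limit]]

    # Example with hypothetical feeds:
    # for title, link in mash_up("http://example.com/a.rss", "http://example.com/b.rss"):
    #     print(title, link)
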
IBM also is improving the performance of the portal. "We're investing in the response experience of the portal," Bowden said. "We're going to be able to refresh just those fields that are new and not the entire page."

Another area of focus is semantic tagging, a feature that enables the portal to deliver a more personalized experience to users based on the way they navigate through the portal. "We don't ask you questions, we build a profile based on the way you navigate the portal," he said.
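
A toy version of that idea -- weighting a user's interests by the tags on the portal pages they visit, then ranking content against that profile -- might look like the following. It is an illustrative sketch, not IBM's semantic-tagging engine, and in practice the page IDs and tags would come from the portal itself.

    from collections import Counter

    def build_profile(visited_pages, page_tags):
        # Count how often each tag appears on the pages the user actually opened.
        profile = Counter()
        for page in visited_pages:
            profile.update(page_tags.get(page, []))
        return profile

    def rank_pages(candidates, profile, page_tags):
        # Order candidate pages by how strongly their tags overlap the profile.
        score = lambda page: sum(profile[tag] for tag in page_tags.get(page, []))
        return sorted(candidates, key=score, reverse=True)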

Additional features include enhanced integration with forms processing and with social networking technology, as well as the ability to take portal capabilities offline using IBM's Lotus Expeditor technology.

IDC projects that the enterprise portal software market will expand more than 50 percent by 2011, to $1.4 billion. The market grew nearly 11 percent in 2006, with license and maintenance revenue of $901 million, IDC reported.

IDC also anticipates solid market growth fueled by mashing up portals with new Web 2.0 collaboration and development technologies. A recent IDC report on the subject said: "Web 2.0 collaboration features are finding a welcome home within the portal, as business users want to take advantage of these new egalitarian methods that offer easy ways for end users to customize content, while IT can take comfort in the portal's ability to deliver them within a secure deployment environment."