
Monday, July 30, 2007

Europe Unlocks Wireless Potential

LONDON - Telecommunications companies and wireless operators in Europe such as Nokia and Ericsson look set to see a healthy chunk of their 3G license costs reduced, after the European Commission proposed further liberalization of the wireless communication spectrum on Wednesday.

The Commission aims to repeal a 20-year-old directive that reserved the use of certain radio frequencies for GSM-standard services, which initially helped make the European standard a success but ended up making spectrum space scarce for new technologies like 3G--"third-generation"--broadband and wireless data services.

"This proposal is a concrete step towards a more flexible, market-driven approach to spectrum management," said Telecommunications Commissioner Viviane Reding. "It will increase competition in the use of spectrum bands and enhance accessibility of European citizens to multimedia services."

Until now, national regulators across Europe have allocated bands in the electromagnetic spectrum according to specific technologies, a process that tends to be excessively bureaucratic and slow to respond to new services and technologies.

The old system also pushed up license fees for operators, whose fierce competition over a limited slice of spectrum led to high prices and equally high losses for 3G investments. Companies including Vodafone (NYSE: VOD), Deutsche Telekom (NYSE: DT), France Telecom (NYSE: FTE) and Hutchison Whampoa have desperately battled to reclaim tax compensation on $100 billion worth of European 3G licenses bought in 2000, but last month their appeals were rejected.

If the new proposals are adopted by the European Parliament, the 900 MHz and 1800 MHz bands previously reserved for GSM usage will be opened up to other technologies. 3G services currently use the 2100 MHz frequency, allocated in the 1990s.

"With this decision it's going to be a lot cheaper to deploy 3G," said Matthew Howett, analyst with Ovum Research. "It's cheaper to use the lower frequency and allows you to cover a much wider area," he added.

The European Commission said that the wireless communications sector could see reductions in network costs of up to 40% over the next five years. The trade association GSMA said last month that a 3G network in the 900 MHz band would achieve up to 40% greater coverage for the same costs as within the 2100 MHz band.
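
To see why a lower band stretches coverage, here is a minimal Python sketch using the standard free-space path loss formula. It is only an illustration: the 130 dB link budget is an assumed figure, and free space ignores terrain, buildings and real antenna behavior, so practical gains are far closer to the GSMA's 40% estimate than to the idealized ratio the script prints.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for a distance in km and a frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def max_radius_km(freq_mhz: float, budget_db: float = 130.0) -> float:
    """Largest cell radius whose free-space loss still fits the assumed link budget."""
    return 10 ** ((budget_db - 32.44 - 20 * math.log10(freq_mhz)) / 20)

print(f"Loss at 5 km: {fspl_db(5, 900):.1f} dB at 900 MHz vs {fspl_db(5, 2100):.1f} dB at 2100 MHz")

r900, r2100 = max_radius_km(900), max_radius_km(2100)
print(f"900 MHz radius:  {r900:.1f} km")
print(f"2100 MHz radius: {r2100:.1f} km")
print(f"coverage area ratio: {(r900 / r2100) ** 2:.1f}x")  # ~5.4x in pure free space
```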

Sunday, July 29, 2007

Don't Blink: Interview with Jeff Harrow

Sam Vaknin, Ph.D. - 7/28/2007
Jeff Harrow is the author and editor of the Web-based multimedia "Harrow Technology Report" journal and Webcast, available at www.TheHarrowGroup.com. He also co-authored the book "The Disappearance of Telecommunications". For more than seventeen years, beginning with "The Rapidly Changing Face of Computing", the Web's first and longest-running weekly multimedia technology journal, he has shared with people across the globe his fascination with technology and his sense of wonder at the innovations and trends of contemporary computing and the growing number of technologies that drive them. Jeff Harrow has been the senior technologist for the Corporate Strategy Groups of both Compaq and Digital Equipment Corporation. He invented and implemented the first iconic network management prototype for DECnet networks.

He now works with businesses and industry groups to help them better understand the strategic implications of our contemporary and future computing environments.

Question: You introduce people to innovation and technological trends - but do you have any hands on experience as an innovator or a trendsetter?

Answer: I have many patents issued and on file in the areas of network management and user interface technology, I am a commercial pilot, and technology is both my vocation and my passion. I bring these and other technological interests together to help people "look beyond the comfortable and obvious", so that they don't become road-kill by the side of the Information Highway.

Question: If you had to identify the five technologies with the maximal economic impact in the next two decades - what would they be?

Answer:

A) The continuation and expansion of "Moore's Law" as it relates to our ability to create ever-smaller, faster, more-capable semiconductors and nano-scale "machines." The exponential growth of our capabilities in these areas will drive many of the other high-impact technologies mentioned below.

B) "Nanotechnology." As we increasingly learn to "build things 'upwards" from individual molecules and atoms, rather than by "etching things down" as we do today when building our semiconductors, we're learning how to create things on the same scale and in the same manner as Nature has done for billions of years. As we perfect these techniques, entire industries, such as pharmaceuticals and even manufacturing will be radically changed.

C) "Bandwidth." For most of the hundred years of the age of electronics, individuals and businesses were able to 'reach out and touch' each other at a distance via the telephone, which extended their voice. This dramatically changed how business was conducted, but was limited to those areas where voice could make a difference.

Similarly, now that most business operations and knowledge work are conducted in the digital domain via computers, and because we now have a global data communications network (the Internet) which does not restrict the type of data shared (voice, documents, real-time collaboration, videoconferencing, video-on-demand, print-on-demand, and even the creation of physical 3D prototype elements at a distance from insubstantial CAD files), business is changing yet again.

Knowledge workers can now work where they wish to, rather than be subject to the old restrictions of physical proximity, which can change the concept of cities and suburbs. Virtual teams can spring up and dissipate as needed without regard to geography or time zones. Indeed, as bandwidth continues to increase in availability and plummet in cost, entire industries, such as the "call center," are finding a global marketplace that could not have existed before.

Example: U.S. firms whose "800 numbers" are actually answered by American-sounding representatives who are working in India, and U.S. firms who are outsourcing "back office" operations to other countries with well-educated but lower-paid workforces.

Individuals can now afford Internet data connections that just a few years ago were the expensive province of large corporations (e.g., cable modem and DSL service). As these technologies improve, and as fiber is eventually extended "to the curb," many industries, some not yet invented, will find ways to profitably consume this new resource. We always find innovative ways to consume available resources.

D) "Combinational Sciences." More than any one or two individual technologies, I believe that the combination and resulting synergy of multiple technologies will have the most dramatic and far-reaching effects on our societies. For example, completing the human genome could not have taken place at all, much less years earlier than expected, without Moore's Law of computing.

And now the second stage of what will be a biological and medical revolution, "Proteomics", will be further driven by advances in computing. But in a synergistic way, computing may actually be driven by advances in biology which are making it possible, as scientists learn more about DNA and other organic molecules, to use them as the basis for certain types of computing!

Other examples of "combination sciences" that synergistically build on one another include:

- Materials science and computing. For instance: carbon nanotubes, in some ways the results of our abilities to work at the molecular level due to computing research, are far stronger than steel and may lead to new materials with exceptional qualities.

- Medicine, biology, and materials science. For example, the use of transgenic goats to produce specialized "building materials" such as large quantities of spider silk in their milk, as is being done by Nexia Biotechnologies.

- "Molecular Manufacturing." As offshoots of much of the above research, scientists are learning how to coerce molecules to automatically form the structures they need, rather than by having to painstakingly push or prod these tiny building blocks into the correct places.

The bottom line is that the real power of the next decades will be in the combination and synergy of previously separate fields. And this will impact not only industries, but the education process as well, as it becomes apparent that people with broad, "cross-field" knowledge will be the ones to recognize the new synergistic opportunities and benefit from them.

Question: Users and the public at large are apprehensive about the all-pervasiveness of modern applications of science and engineering. People cite security and privacy concerns with regards to the Internet, for example. Do you believe a Luddite backlash is in the cards?

Answer: There are some very good reasons to be concerned and cautious about the implementation of the various technologies that are changing our world. Just as with most technologies in the past (arrows, gunpowder, dynamite, the telephone, and more), they can be used for both good and ill. And with today's pell-mell rush to make all of our business and personal data "digital," it's no wonder that issues related to privacy, security and more weigh on peoples' minds.

As in the past, some people will choose to wall themselves off from these technological changes (invasions?). Yet, in the context of our evolving societies, the benefits of these technologies, as with electricity and the telephone before them, will outweigh the dangers for many if not most people.

That said, however, it behooves us all to watch and participate in how these technologies are applied, and in what laws and safeguards are put in place, so that the end result is, quite literally, something that we can live with.

Question: Previous predictions of convergence have flunked. The fabled Home Entertainment Center has yet to materialize, for instance. What types of convergence do you deem practical and what will be their impact - social and economic?

Answer: Much of the most important and far-reaching "convergences" will be at the scientific and industrial levels, although these will trickle down to consumers and businesses in a myriad ways. "The fabled Home Entertainment Center" has indeed not yet arrived, but not because it's technologically impossible - more because consumers have not been shown compelling reasons and results. However, we have seen a vast amount of this "convergence" in different ways. Consider the extent of entertainment now provided through PCs and video game consoles, or the relatively new class of PDA+cell phone, or the pocket MP3 player, or the in-car DVD...

Question: Dot.coms have bombed. Now nano-technology is touted as the basis for a "New Economy". Are we in for the bursting of yet another bubble?

Answer: Unrealistic expectations are rarely met over the long term. Many people felt that the dot.com era was unrealistic, yet the allure of the magically rising stock prices fueled the eventual conflagration. The same could happen with nanotechnology, but perhaps we have learned to combine our excitement of "the next big thing" with reasonable and rational expectations and business practices. The "science" will come at its own pace - how we finance that, and profit from it, could well benefit from the dot.bomb lessons of the past. Just as with science, there's no pot of gold at the end of the economic rainbow.

Question: Moore's Law and Metcalfe's Law delineate an exponential growth in memory, processing speed, storage, and other computer capacities. Where is it all going? What is the end point? Why do we need so much computing power on our desktops? What drives what - does technology drive the cycle-consuming applications, or vice versa?

Answer: There are always "bottlenecks." Taking computers as an example, at any point in time we may have been stymied by not having enough processing power, or memory, or disk space, or bandwidth, or even ideas of how to consume all of the resources that happened to exist at a given moment.

But because each of these (and many more) technologies advance along their individual curves, the mix of our overall technological capabilities keeps expanding, and this continues to open incredible new opportunities for those who are willing to color outside the lines.

For example, at a particular moment in time, a college student wrote a program and distributed it over the Internet, and changed the economics and business model for the entire music distribution industry (Napster). This could not have happened without the computing power, storage, and bandwidth that happened to come together at that time.

Similarly, as these basic computing and communications capabilities have continued to grow in capacity, other brilliant minds used the new capabilities to create the DivX compression algorithm (which allows "good enough" movies to be stored and distributed online) and file-format-independent peer-to-peer networks (such as Kazaa), which are beginning to change the video industry in the same manner!

The point is that in a circular fashion, technology drives innovation, while innovation also enables and drives technology, but it's all sparked and fueled by the innovative minds of individuals. Technology remains open-ended. For example, as we have approached certain "limits" in how we build semiconductors, or in how we store magnetic information, we have ALWAYS found ways "through" or "around" them. And I see no indication that this will slow down.

Question: The battle rages between commercial interests and champions of the ethos of free content and open source software. How do you envisage the field ten years from now?

Answer: The free content of the Internet, financed in part by the dot.com era of easy money, was probably necessary to bootstrap the early Internet into demonstrating its new potential and value to people and businesses. But while it's tempting to subscribe to slogans such as "information wants to be free," the longer-term reality is that if individuals and businesses are not compensated for the information that they present, there will eventually be little information available.

This is not to say that advertising or traditional "subscriptions," or even the still struggling system of "micropayments" for each tidbit, are the roads to success. Innovation will also play a dramatic role as numerous techniques are tried and refined. But overall, people are willing to pay for value, and the next decade will find a continuing series of experiments in how the information marketplace and its consumers come together.

Question: Adapting to rapid technological change is disorientating. Toffler called it a "future shock". Can you compare people's reactions to new technologies today - to their reactions, say, 20 years ago?

Answer: It's all a matter of 'rate of change.' At the beginning of the industrial revolution, the parents in the farms could not understand the changes that their children brought home with them from the cities, where the pace of innovation far exceeded the generations-long rural change process.

Twenty years ago, at the time of the birth of the PC, most people in industrialized nations accommodated dramatically more change each year than an early industrial-age farmer would have seen in his or her lifetime. Yet both probably felt about the same amount of "future shock," because it's relative. The "twenty years ago" person had become accustomed to that year's results of the exponential growth of technology, and so was "prepared" for that then-current rate of change.

Similarly, today, school children happily take the most sophisticated of computing technologies in-stride, while many of their parents still flounder at setting the clock on the VCR - because the kids simply know no other rate of change. It's in the perception.

That said, given that so many technological changes are exponential in nature, it's increasingly difficult for people to be comfortable with the amount of change that will occur in their own lifetime. Today's schoolchildren will see more technological change in the next twenty years than I have seen in my lifetime to date; it will be fascinating to see how they (and I) cope.

Question: What's your take on e-books? Why didn't they take off? Is there a more general lesson here?

Answer: The E-books of the past few years have been an imperfect solution looking for a problem.

There's certainly value in the concept of an E-book, a self-contained electronic "document" whose content can change at a whim either from internal information or from the world at large. Travelers could carry an entire library with them and never run out of reading material. Textbooks could reside in the E-book and save the backs of backpack-touting students. Industrial manuals could always be on-hand (in-hand!) and up to date. And more.

Indeed, for certain categories, such as for industrial manuals, the E-book has already proven valuable. But when it comes to the general case, consumers found that the restrictions of the first E-books outweighed their benefits. They were expensive. They were fragile. Their battery life was very limited. They were not as comfortable to hold or to read from as a traditional book. There were several incompatible standards and formats, meaning that content was available only from limited outlets, and only a fraction of the content that was available in traditional books was available in E-book form. Very restrictive.

The lesson is that (most) people won't usually buy technology for technology's sake. On the other hand, use a technology to significantly improve the right elements of a product or service, or its price, and stand back.

Question: What are the engines of innovation? What drives people to innovate, to invent, to think outside the box and to lead others to adopt their vision?

Answer: "People" are the engines of innovation. The desire to look over the horizon, to connect the dots in new ways, and to color outside the lines is what drives human progress in its myriad dimensions. People want to do things more easily, become more profitable, or simply 'do something new,' and these are the seeds of innovation.

Today, the building blocks that people innovate with can be far more complex than those of the past. You can create a more interesting innovation out of an integrated circuit that contains 42 million transistors today - a Pentium 4 - than you could out of a few discrete transistors 30 years ago.

Or today's building blocks can be far more basic (such as using Atomic Force Microscopes to push individual atoms around into just the right structure.) These differences in scale determine, in part, why today's innovations seem more dramatic.

But at its heart, innovation is a human concept, and it takes good ideas and persuasion to convince people to adopt the resulting changes. Machines don't (yet) innovate. And they may never do so, unless they develop that spark of self-awareness that (so far) uniquely characterizes living things.

Even if we get to the point where we convince our computers to write their own programs, at this point it does not seem that they will go beyond the goals that we set for them. They may be able to try superhuman numbers of combinations before arriving at just the right one to address a defined problem, but they won't go beyond the problem. Not the machines we know today, at any rate.

On the other hand, some people, such as National Medal of Technology recipient Ray Kurzweil, believe that the exponential increase in the capabilities of our machines - which some estimate will reach the complexity of the human brain within a few decades - may result in those machines becoming self-aware.

Don't Blink!

Saturday, July 28, 2007

FERC announces pilot project licensing process for new hydropower technologies

Washington, D.C., July 27, 2007 -- The Federal Energy Regulatory Commission (FERC) announced it will convene a technical conference on licensing pilot projects for ocean energy hydro technologies to discuss a staff proposal for a process that could complete licensing in as few as six months.

Commissioner Philip Moeller will lead the conference, to be conducted Oct. 2, 2007, in Portland, Oregon. This is the latest in a series of measures the commission has undertaken since 2006 to address intensifying interest in the development of ocean, wave and tidal, or hydrokinetic, technologies.

"Perhaps the greatest barrier to realizing the potential of new hydrokinetic technologies is that they are unproven," chairman Joseph T. Kelliher said. "These technologies must be demonstrated before large scale commercial deployment can occur. Today we take a major step to reduce the barriers to the success of these new hydro technologies, by proposing a simplified licensing process suitable for licensing pilot projects."

"This new generation of hydrokinetic technologies will bring hydropower to the forefront of the renewable energy debate," Moeller said. "It is generating a lot of enthusiasm throughout the country, particularly in coastal states like my home state of Washington. FERC wants to harness this enthusiasm by exploring ways to reduce the regulatory barriers to realize the amazing potential of this domestic renewable power source?one that can help meet renewable portfolio standards established by states."

The goal of the commission staff proposal is to complete the full project licensing process in as few as six months, provide for commission oversight and input from affected states and other federal agencies, and allow developers to generate electricity while conducting the requisite testing.

The process would be available for projects that are 5 megawatts or smaller, removable or able to shut down on relatively short notice, located in waters that have no sensitive designations, and for the purpose of testing new hydro technologies or determining appropriate sites for ocean, wave and tidal energy projects.

At its December 2006 conference on hydrokinetic energy, FERC learned that these technologies are in a developmental phase, which presents significant risks for developers due to a lack of information about engineering performance and environmental effects, and limited access to financing.

In response to FERC's February 2007 notice of inquiry on preliminary permits for the new technologies, at least 14 entities addressed the need for a pilot program licensing process.

Comments included recommendations that FERC address the unique characteristics of pilot projects by: permitting connection to the national grid both for study purposes and to generate revenue; implementing a simpler, faster review process; requiring site restoration following experimental deployments; and requiring a license period of five years rather than 30-50 years.

Friday, July 27, 2007

Comment: Managing technological risks


27-July-2007:
Recent comments by former United Nations Secretary-General Kofi Annan over the role of genetically modified seed in African agriculture have reopened debate over the risks posed by new technologies.

And this is the point I want to make.

Technological risks to society and the environment have become an integral part of global public policy. Managing these risks and responding to public perceptions of them are necessary if new technologies are to be effectively deployed to meet development needs.

New technologies have been credited with creating new industrial opportunities but also with destroying the status quo. In fact, maintaining the status quo comes with its risks as well. In some cases the risks of doing nothing may outweigh the challenges associated with investing in new responses to social challenges.

In the past, technological risks were confined to countries in which new technologies emerged or even to the sectors in which they were applied. Concerns two decades ago over the use of microprocessors in industry were restricted to their possible impact on employment displacement in the manufacturing sector.

Workers and labor organizations around the world protested the use of this emerging technology. Today echoes of these debates are still heard in discussions of the “deskilling” of workers.

Although interest in the impact of microelectronics on employment was expressed in many parts of the world, it did not become a mass movement involving a wide range of social groups, for at least two reasons.

First, the possibility of job displacement was weighed against the benefits of raising industrial productivity. Second, the global economy was not as integrated as it is today, and many debates were localized. Globalization gives technological risk a wider meaning and turns local debates over certain products into mass movements.

This is itself a source of new risks.

Risk perceptions vary considerably across technologies, and this defines the scale of social mobilization. Attempts in the 1980s to promote the adoption of renewable energy technologies were bedeviled with social opposition.

Sporadic opposition was recorded in many parts of the world, but it did not translate into mass movements. Risk perceptions of pharmaceutical products are not a major challenge to the use of new medicines, partly because of the limited range of options available to consumers in life-threatening situations.

Fear of genetic engineering, for example, stems from scientific, technical, economic, cultural, and ethical concerns. Opposition is differentiated along product lines (transgenic crops, fish, trees, cattle). Many people oppose genetic engineering because of corporate control of the industry, which they perceive as a social risk.

A focus on technological risks can overshadow the possible benefits of an emerging technology, which are often difficult to predict. At the time of their invention, computers were envisioned as able to perform nothing more than rapid calculation in scientific research and data processing. The rise of the Internet as a global phenomenon could not have been predicted based on the early uses of information technology.

Technological risks have to be weighed against the risks of existing technologies as well as the risks of not having access to new technologies. The risks of not having access to the Internet may outweigh the risks of employment displacement that shaped many of the attitudes toward information technology in its earlier years.

The concerns over employment displacement were genuine, but the risks were often projected to whole industries or sectors. The debate is dominated by concerns over job displacement rather than the potential contributions of the new technologies to economic productivity.

There are also numerous cases in which society has underestimated the risks posed by new technologies or adopted them without adequate knowledge about their dangers. Managing technological uncertainty will require greater investment in innovative activities at the scientific and institutional levels.

At the technical level, technological diversity is essential to ensuring that society is able to respond to emerging challenges with the knowledge at its disposal. Technological diversity demands greater investment in scientific enterprises as well as the creation of measures to facilitate access to available technical options.

It also requires flexibility in institutional arrangements, to enable society to respond swiftly to technological failure. Such flexibility can be built into the design of technological systems themselves. Diversification of energy sources, for example, is an institutional response to the risks posed by dependence on centralized fossil fuel power plants. Similar diversification principles apply to information storage, especially as the world community moves into the information age.

Trends in industry suggest that a combination of incentives and regulations can shift technological change to meet environmental goals. Already enterprises are responding to the growing environmental consciousness among consumers and starting to adopt new environmental standards in fields such as emissions and energy use.

Efforts to focus on risk in the absence of actual technology will remain hollow exhortations and a distraction from genuine development efforts. Using emerging technologies without considering their risks is likely to elicit social pressures that can undermine technological development. The challenge is to maximize the benefits of new technologies while reducing their risks.

Calestous Juma teaches at Harvard University’s Kennedy School of Government where he directs the Science, Technology and Globalization Project.

Thursday, July 26, 2007

Two HDTV Technologies Worth Waiting For

LED backlighting and 120-Hz refresh rates are coming to mainstream HDTVs like the ones Samsung showed off this week.

If you're planning an HDTV purchase this fall (or looking ahead to one this winter), keep an eye out for two emerging technologies. 1080p is now everywhere, LCD HDTVs are taking over, and 40-inch displays are evolving into the new sweet spot. But new sets slated for this fall and winter will be among the first mainstream displays to incorporate several new technologies that can significantly improve picture quality.

Samsung, for example, recently showed off its latest lines of LCD HDTVs due out in August. One line sports a 120-Hz refresh rate--double the 60 Hz of standard LCD TVs--which makes for sharper fast-moving images. Another line uses LED backlights, which dramatically boost contrast and allow for a wider range of colors.

Both technologies should be available from a wide variety of vendors this fall, including LG Electronics, Philips, and Sharp. And as these enhancements make their way into more and more TVs, the price difference between standard LCD TVs and these newer models should shrink rapidly. Here's a look at Samsung's plans for 120-Hz HDTVs and LED backlighting, and why you might want to wait for a television that makes use of either technology.

120-Hz Displays

Momentum behind 120 Hz has been building since early this year. JVC was among the first vendors to ship a 120-Hz display, and Sharp's Aquos D82U and D92U series televisions began shipping back in February. This summer, Philips, LG, and Samsung all announced their respective 120-Hz technologies, with products coming by this fall.

At 120 Hz, a television refreshes the screen at double the previous standard rate for displaying video content. The extra refreshes help smooth out residual motion blur from fast-moving action, such as sports footage or a scrolling news ticker at the bottom of the screen. Video content is filmed at 30 frames per second, so it divides evenly into 60-Hz or 120-Hz display rates.

Samsung showed a split-screen demonstration of its 120-Hz technology at an event here in San Francisco, with one side showing the 120-Hz technology, and the other side showing 60 Hz. The difference between the two was noticeable: At 120 Hz, the ticker moved more smoothly and fast-moving video appeared sharper.

The 71 series displays that Samsung is launching in August use a technology called McFi--short for Motion Compensated Frame Interpolation--to create new interpolated video frames and insert them between each frame of video to smooth out fast motion. Samsung's technology looks for movement, then creates an average of those movements to insert a frame in between. Other HDTV makers insert a black frame between frames, an approach that Samsung says addresses the motion-blur issue but degrades the panel's brightness.
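
Samsung has not published McFi's internals, so the Python sketch below is only a structural illustration, assuming 8-bit frames held as NumPy arrays: it doubles a 60 Hz sequence to 120 Hz by slotting a simple blended frame between each pair, where a real motion-compensated interpolator would first estimate motion vectors and shift pixels along them before blending.

```python
import numpy as np

def interpolate_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Naive mid-frame: a plain average of two consecutive 8-bit frames.

    Real motion-compensated interpolation estimates per-block motion first;
    this average only marks where the extra frame goes in the sequence.
    """
    return ((frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2).astype(np.uint8)

def to_120hz(frames_60hz):
    """Double the frame rate by inserting one interpolated frame between each pair."""
    doubled = []
    for a, b in zip(frames_60hz, frames_60hz[1:]):
        doubled.append(a)
        doubled.append(interpolate_midframe(a, b))
    doubled.append(frames_60hz[-1])
    return doubled
```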

LED Backlighting

If you're less concerned about fast-moving images, a display with an LED backlight may be more to your liking. The big advantage to LED-backlit TVs is improved contrast ratio. Samsung says its 81 series of displays can automatically adjust the backlight for specific parts of the picture, depending upon the source content. This allows the display to achieve deeper blacks and crisper whites than can be achieved with the Cold Cathode Fluorescent Lamp technology (CCFL) traditionally used by LCD HDTVs.

In a CCFL set, fluorescent tubes light up the back of the display; those tubes are either all on or all off, and they allow some degree of light leakage. LED backlighting allows a finer degree of control, which enables Samsung to claim a dynamic contrast ratio of 100,000:1, a fourfold improvement over its CCFL displays.
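
The contrast gain comes from driving the backlight zone by zone instead of all at once. Below is a minimal sketch of that local-dimming idea, assuming a grayscale frame with luma values in [0, 1] and a one-dimensional stack of zones; real panels use a two-dimensional LED grid plus temporal filtering, which this deliberately omits.

```python
import numpy as np

def local_dimming(frame, zones=8):
    """Dim each backlight zone to its brightest pixel and compensate the LCD.

    Dark zones get a low LED level, so blacks stay dark instead of leaking
    light the way a single always-on CCFL tube does.
    """
    bands = np.array_split(frame, zones, axis=0)                   # one horizontal zone per band
    levels = np.array([max(float(b.max()), 1e-3) for b in bands])  # LED level per zone
    lcd = np.vstack([b / lvl for b, lvl in zip(bands, levels)])    # LCD opens wider to compensate
    return levels, lcd

levels, lcd = local_dimming(np.random.rand(480, 640))
print(levels.round(2))  # mostly-dark zones show low backlight levels
```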

Price Premiums

While 120-Hz displays won't be that much more expensive than standard 1080p displays are today, you will pay a premium for an LED backlit display. The 40-inch model in Samsung's 120-Hz 71 series line should retail for $2699 when it ships in August, for example, while Samsung's 40-inch LED backlit model from the 81 series will go for $2999.

At least with TVs in that price range, other key HDTV technologies have become standard. A year ago, 1080p resolutions were still a rarity--and available in higher-end models only. As we head into the fall, 1080p is de rigueur on HDTVs at sizes of 40 inches and up. Samsung, for example, will have only three non-1080p models going forward in that size range. HDMI 1.3 is also getting more pervasive across a wide spectrum of LCD (and for that matter, plasma) displays.

Wednesday, July 25, 2007

Power & Cooling Meet The New Tech Wave

How Innovative Technologies Are Changing Data Center Infrastructures

Innovations in data center technologies continue to transform productivity throughout the enterprise, bringing increased flexibility and less downtime. Yet as a parallel consequence, many of these same innovations are forcing data center and facilities managers to alter their overall outlook toward power and cooling needs.

In the long run, changes forced by newer technologies can be positive because the increases in productivity can eventually outpace the increased costs to cool the devices. However, older data center rooms aren’t necessarily equipped to adequately handle new equipment, such as blade servers, and in turn can encounter far more heat issues than what existed with older devices.

“Most advances in technology require additional horsepower to take advantage of new product features and benefits,” says Kevin Houston, virtualization and consolidation practice leader at Optimus Solutions (www.optimussolutions.com), which helps firms plan, build, and maintain their IT infrastructure. “Data center consolidation projects can be complex and can easily overwhelm an IT organization struggling to maintain current operations.”

Blades Burrow In

Ask any data center manager to select a technology that’s sparking the most change in power and cooling requirements, and you’re likely to hear “blade servers” as the answer. Although these devices save power on one hand, they often require overhauled cooling infrastructures on the other.

“Blade servers are challenging the existing data center design and requiring audits to identify and propose the additional power and cooling resources to maintain an environment that meets the server manufacturer’s optimum operation requirements,” says Bill S. Annino Jr., director of the converged network solutions group at Carousel Industries (www.carouselindustries.com), a communications systems integrator and reseller.

Unlike traditional servers, blade servers forgo redundant components in their chassis, instead using single components and sharing them among the computers in the chassis. Blade enclosures also use a single power source for all the blades within them, which helps to boost the overall efficiency of the server environment.

That single blade rack can replace several traditional racks, which will help enterprises save power. "We've seen a tremendous increase in interest for blade technologies, as customers recognize the benefits of physical consolidation through a centralized ecosystem of power and cooling," Houston says. "Both HP and IBM have made significant investments in their blade chassis to reduce the power and cooling needs for a server environment by as much as 50% over an equivalent physical rackmount environment."

On the downside, many older data centers in small and midsized enterprises are designed to deliver a consistent level of cooling throughout the entire environment. The high-density form factors of blades require more concentrated cooling in specific areas, which means that a transition to blades could require facilities to be revamped to accommodate hot spots. After all, the cold air requirements for blade racks can be quadrupled, or more, when compared to the requirements for traditional racks.

This challenge becomes more difficult when SMEs mix blades with traditional servers because they still need traditional cooling methods but also need to remove the heat generated in specific areas by the blades. In these mixed environments, some experts recommend removing the traditional racks near the blade to allow for greater heat dispersion.

Call For A Change

Other increasingly popular technologies are also making their mark on power and cooling demands, albeit in different ways. In particular, VoIP is enjoying increased popularity in SMEs, and managers are witnessing a trade-off between power consumption and overall costs.

“Although VoIP phones require additional power consumption by drawing greater Power over Ethernet, or PoE [power], it is important to always evaluate the total cost of ownership,” Houston says. “Although power costs may increase, the boost in productivity from operational efficiencies and application integrationnot to mention long-distance savingsoutweighs any increase to the energy bill in most scenarios.”

Elizabeth King, general manager of Hitachi’s Servers System Group (www.hitachi.us), notes that VoIP creates stress on networks, storage, and computing resources. Further, while the technology hasn’t reached mainstream status, “there is enough experience to indicate that it will require corporations to expand their computing and network capabilities,” she says.

Of course, more capabilities often lead to more equipment, and more equipment means increased cooling and power demands. “The impact of the VoIP revolution has definitely added to the computer room,” explains Gregory T. Royal, chief technology officer and executive vice president of Cistera Networks (web.cistera.com), a company providing enterprise application platforms and engines for IP communications. “The vendors in this space have yet to face up to the heat and power requirements of this growth. Over time I expect that we will move to more efficient servers and predominantly blade servers [to handle the requirements].”

According to Royal, in theory, the endpoints accomplish 90% of the media (or heavy) work involved with new VoIP solutions. For example, he says, a SIP (Session Initiation Protocol) server provides “traffic cop” capabilities, rather than actively participating in the calls. Again in theory, this means that a given CPU can support far more endpoints than traditional PBXes, he says.

“There has, however, been a trend for a long time for PBX to shrink into what are now essentially soft switches,” Royal says. “However, there is a great increase in media gateways, media conferencing, video, applications, and other CPU-intensive applications. Unified voicemail, conferencing, notification, and quality assurance systems are now all mainstream in IP PBX environments.”

While VoIP technologies continue to push more equipment into data centers, their presence has removed the need for larger, older PBX installations that consumed plenty of floor space, similar to mainframes of the past, Royal says.

Virtual Help

Additional technologies, such as load balancers, application accelerators, caching servers, and middleware are also being increasingly deployed in today’s data centers. But Royal notes that they also bring benefits. “A lot of these new technologies now use standard, off-the-shelf technology such as Intel CPUs, which means there are better economies of scale in dealing with [power and cooling] issues,” he says.

Power and cooling concerns are also being mitigated by virtualization technologies, which allow SMEs to boost production without increasing the number of devices in their data centers. And thanks to advances in virtualization management tools, managers can now more easily integrate the technology into their data centers.

“VMware Infrastructure 3, for example, offers a complete solution for management and optimization of virtual servers, storage, and networking,” Houston says. “We’ve seen consolidation of infrastructure through virtualization deliver transformative cost savings by greatly reducing power and cooling requirements.”

Cool It Down

Regardless of whether new technologies can run more efficiently or help to save space, there’s no denying that the number of overall devices in data centers is ramping upward as enterprises converge their data centers with business practices. But while this alignment can serve to help the business in general, it can literally cramp the style of managers looking to push down power and cooling costs.

BT Group (www.bt.com), which is building an all-IP network known as the 21st Century Network, requires plenty of power to run its data centers, but it is deploying innovative methods to handle the needs of today’s newer technologies. One of these involves the use of DC power, which BT has chosen over AC to handle its power-hungry data centers.

“BT estimates that this change alone reduces power consumption by 30%,” says Steve O’Donnell, global head of data center and customer experience management at BT. “While the acquisition of switches and servers that run on DC power is more expensive, the savings in power consumption offsets the cost.”

The company also has worked with suppliers to adapt equipment from recirculated-air cooling to fresh-air cooling, which O’Donnell says allows BT’s equipment to run within a range of 5 to 50 degrees Celsius (41 to 122 degrees Fahrenheit), compared with traditional higher ranges. BT has also revamped the physical structure of its data centers to reduce contamination from fresh air.

Tuesday, July 24, 2007

German workers fear Asian competition, bemoan slow move to new technologies

BY KIRSTEN GRIESHABER ASSOCIATED PRESS WRITER

BERLIN--When Thomas Haebich started working on the assembly line at Daimler-Benz AG two decades ago, he thought he had a job for life. But he no longer feels he can count on it.

On the contrary, today the 40-year-old auto worker often fears that he might lose his job at the car plant in the southern German town of Sindelfingen. He is convinced that in the long run German companies will not be able to compete with the emerging automobile industry in Asia.

"Only last year, Daimler paid compensation for 3,000 workers because they wanted them to leave the company," Haebich said. "If you look ahead another 10 years, China will push forward on the automobile market in such an aggressive way that we will no longer be able to beat their cheap products."

Haebich works weekly rotating shifts at the assembly line that makes the Mercedes E-class model, installing pedals and brake systems on up to 260 vehicles during a regular eight-hour workday.

Technically he has a 35-hour week, but in practice he works 40 hours a week and then gets to take compensatory time off when the company has fewer orders and not enough work to keep all 22,000 employees at the Sindelfingen plant busy.

Haebich became a member of the IG Metall union when he joined Daimler-Benz--now DaimlerChrysler AG--at age 20. He earns $4,682 a month pretax--after all deductions he has $2,118 left to cover his living expenses. By German law, Haebich has to have health insurance, pays into a pension fund and also deposits money into a life insurance fund offered by Daimler.

He worries that he will not get by on his compulsory pension fund once he retires.

"It would make sense to start an additional private pension but I can't really afford that," said Haebich, who is married and has an 11-year-old daughter.

He blames management for his worries. "They knew for a while that resources were getting scarce and more expensive--why didn't they invest much more in new technologies for cars that use less fuel?"

DaimlerChrysler agreed in May to sell 80.1 percent of its money-losing U.S. unit Chrysler to the private equity firm Cerberus Capital Management LP in a $7.4 billion transaction, paving the way for a streamlined Daimler to concentrate on its luxury Mercedes brand and its truck business.

Haebich also said he can see the impact of globalization at work every day.

"The pressure to perform for new markets around the globe is constantly getting more intense," he said. "And while they used to hire workers on fixed-term contracts before, today they try to get by with cheaper temp workers."

Haebich is glad that he has a full-time contract but keeps telling his daughter that she will not be as lucky as he was.

"I always tell her: By the time you're grown up you must be flexible," he said, "even if that means taking a job in China."





Monday, July 23, 2007

New media: Good for democracy?

By Dusty Horwitt

Earlier this year, pundits and politicians were again buzzing about the apparent democratic power of the Internet when a supporter of Democratic presidential candidate Sen. Barack Obama of Illinois posted a video on YouTube portraying rival Sen. Hillary Rodham Clinton of New York as an Orwellian dictator.

Take a close look at the Internet, however, and its democratic luster disappears like Howard Dean's 2004 presidential campaign.

Let's begin with the ad about Senator Clinton. It was reported that the video had spread to television, where network and cable news shows had aired portions of it. The Washington Post reported that within days, it had been "viewed" online more than 2 million times. Two million views, however, does not mean 2 million people. "Views," "visits" and "visitors" are typically registered each time a person visits a Web site.

In a 2005 analysis of the top blogs among U.S. Internet users, comScore Media Metrix, a company that measures Internet traffic, found a large disparity between the number of "visits" the blogs received and the number of people, or "unique visitors," who accounted for those visits.

Over the first three months of 2005, for example, the liberal DailyKos.com attracted almost 3 million visits but only about 350,000 people - a little more than 0.1 percent of the U.S. population. ComScore noted that many of these unique visitors read the blogs infrequently, indicating that a large share of the views likely came from a fairly small number of people who visit the site often.

In June 2006, comScore data showed that of an estimated 30 million or more blogs, only about 70 reached an audience of 100,000 or more U.S. Internet users during that month. Of these 70, about 10 focused on noncelebrity news or politics. Their audience size ranged from 117,000 for LittleGreenFootballs.com to 1.4 million for Breitbart.com.
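
The gap between "views" and people is easy to see with a toy example. The Python sketch below uses made-up visitor IDs (purely illustrative, not comScore's data or methodology) to show how a few frequent readers inflate visit counts:

```python
from collections import Counter

# Toy access log: one entry per page view, keyed by a hypothetical visitor ID.
log = ["v1", "v2", "v1", "v1", "v3", "v2", "v1"]

visits = len(log)                 # every page view counts as a visit
unique_visitors = len(set(log))   # each person counts only once
per_visitor = Counter(log)        # shows the heavy-reader skew

print(f"{visits} visits, {unique_visitors} unique visitors")
print(per_visitor.most_common())  # [('v1', 4), ('v2', 2), ('v3', 1)]
```

Seven visits here come from only three people, with one reader accounting for four of them, which is the same skew comScore describes at a much larger scale.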

Why do these numbers matter? Because in a democracy, citizens who want to make a difference must be able to reach a broad audience. Only by winning a coalition of support can most people exercise political power. For example, Gene Roberts and Hank Klibanoff write in their 2006 Pulitzer-winning book, The Race Beat, that broad media coverage was essential to the civil rights movement's success.

The problem with the Internet is that it buries citizens' voices under an avalanche of information.

"When you think about the amount of information that is published to the Web, it is a physical impossibility for the vast majority of that stuff to spread virally," said Derek Gordon, marketing director for Technorati, a firm that measures the popularity of blogs.

The larger problem is that the Internet siphons audiences and revenue from the media outlets that can give citizens a voice, causing them to shrink and further impairing the media's democratic power.

Of course, the Internet is hardly alone in this. An array of technologies fueled by inexpensive energy, including cable, satellites and DVDs, has in recent years helped to consolidate the media by overproducing information.

Much like farmers hurt by overproduction of food, media companies have been forced to give away their data for less while becoming more dependent on advertisers and expensive new technologies.

Ben H. Bagdikian has identified similar trends in his book The New Media Monopoly. Mr. Bagdikian notes that as late as 1970, when the U.S. had about 100 million fewer people than it does today, there were three newspapers in my hometown of Washington, D.C., with daily circulations of 200,000 to 500,000. The papers likely had even wider reach because of multiple readers per copy. Today, only one local paper has a circulation of 200,000 or more, and its readership and staff are dwindling.

Similarly, as late as 1980, the three network newscasts reached 25 percent to 50 percent of the population each night. In 2006, nightly newscasts on ABC, NBC, CBS, CNN, FOX and MSNBC combined reached less than 10 percent. According to a report by the Project for Excellence in Journalism, some newspapers have attracted additional readers online, but visits to newspaper Web sites tend to be very brief.

The conventional wisdom is that the media and the rest of us can make the transition to the digital world. But the evidence strongly suggests that the blizzard of online data makes such a transition - at least in a way that promotes democracy - impossible.

Sunday, July 22, 2007

IBM Rides the Portal Wave

IBM is looking to ride its leadership position in the enterprise portal space into a similar position in the Enterprise 2.0 arena, where Web 2.0 technologies intersect with the enterprise.

IBM on July 20 announced that research firm IDC had named IBM the leader in the enterprise portal software market for the fifth consecutive year. IBM, based in Armonk, N.Y., is moving to parlay that leadership into new areas with additional features in its WebSphere Portal offering. According to IDC, IBM had a 31.5 percent market share in 2006, followed by BEA Systems with 19.6 percent and Oracle with 10.5 percent.

"I see their leadership stemming from a few things," IDC analyst Kathy Quirk said. In particular, IBM is "successfully selling portal solutions to businesses of all sizes in conjunction with other complementary products, especially those used for collaboration," she said.

The company is aggressively adding features and capabilities to its portal software and improving the usability of the software for business users, Quirk said. IBM also is "working to make the portal environment easier to get up and running and administer on a day-to-day basis," she said.

Larry Bowden, vice president of portals and Web interaction services at IBM, said the company is integrating its WebSphere Portal software with advanced Web 2.0 expertise to create a more dynamic, personalized version of a portal. In addition, IBM's commitment to Web 2.0 and SOA (service-oriented architecture)-based systems will continue to make its collaboration software more flexible and more useful, he said.

"Portals are a natural home for a lot of the new technologies coming forward," Bowden said. "We are seeing the new social networking technologies coming into the portal."

Another thing driving the merging of portals and Web 2.0 is the collaborative element of the technologies, Bowden said.

"Portals have been primarily used to distribute information, but now things are getting more interactive," he said. "You can have instant messaging, real-time chats with customers and constituents, and even video."

Bowden said some of the new directions for WebSphere Portal will include "simplifying mashup creation so that end users have the power, so that every consumer can take feed No. 1 and mash it up with feed No. 2. We have customers doing it today, but it's not as easy as we want it to be."
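
For a sense of what "taking feed No. 1 and mashing it up with feed No. 2" can look like in practice, here is a minimal sketch using the third-party feedparser library and placeholder URLs. It illustrates the general idea only and is not IBM's WebSphere Portal mashup tooling.

```python
import feedparser  # third-party: pip install feedparser

def mashup(url_a, url_b, limit=10):
    """Merge entries from two RSS/Atom feeds, newest first."""
    entries = feedparser.parse(url_a).entries + feedparser.parse(url_b).entries
    entries.sort(key=lambda e: tuple(e.get("published_parsed") or ()), reverse=True)
    return [(e.get("title", ""), e.get("link", "")) for e in entries[:limit]]

# Placeholder feed URLs, for illustration only.
for title, link in mashup("https://example.com/feed1.xml", "https://example.com/feed2.xml"):
    print(title, "->", link)
```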

IBM also is improving the performance of the portal. "We're investing in the response experience of the portal," Bowden said. "We're going to be able to refresh just those fields that are new and not the entire page."

Another area of focus is semantic tagging, a feature that enables the portal to deliver a more personalized experience to users based on the way they navigate through the portal. "We don't ask you questions, we build a profile based on the way you navigate the portal," he said.

Additional features include enhanced integration with forms processing and with social networking technology, as well as the ability to take portal capabilities offline using IBM's Lotus Expeditor technology.

IDC projects that the enterprise portal software market will expand more than 50 percent by 2011, to $1.4 billion. The market grew nearly 11 percent in 2006, with license and maintenance revenue of $901 million, IDC reported.

IDC also anticipates solid market growth fueled by mashing up portals with new Web 2.0 collaboration and development technologies. A recent IDC report on the subject said: "Web 2.0 collaboration features are finding a welcome home within the portal, as business users want to take advantage of these new egalitarian methods that offer easy ways for end users to customize content, while IT can take comfort in the portal's ability to deliver them within a secure deployment environment."

Thursday, July 19, 2007

ALONG Mobile Displayed Its New Cell Phone Network Game at the 5th China Digital Entertainment Expo & Conference (ChinaJoy)

In this exhibition, ALONG introduced one of its innovative core games, "Code of Life" (or "COL"). A multiplayer online casual game in China, COL contains several innovative features. It is designed to be a massive-scale, friend-making WAP game with powerful community functions that enable users to form personal relationships through role-playing. The game also facilitates making friends via cell phone by integrating the powerful functions of the mobile phone terminal, with satellite positioning based on GPRS and individualized photo uploads.

Mr. Li Jianwei, CEO of ALONG, commented, "The vast market of over 400 million mobile phone users in China is creating the opportunity to develop the next wave of mobile phone network games. 'Code of Life' is designed to take advantage of easy operations and community while meeting the taste of modern consumers pursuing a simple, fast-paced and interactive life style. An exciting game will accelerate the bloom of the mobile phone network game market."

As one of the top three digital interactive entertainment conferences in the world, ChinaJoy attracts top-tier game developers, operators and hardware providers from all over the world.

Mobile phone games, a new force in the ChinaJoy Show, have developed at an extremely rapid pace in the last two years. ALONG's presence at ChinaJoy 2007 should further accelerate the development of the mobile phone game industry.

Tuesday, July 17, 2007

United Technologies' Net Income Rises on Aerospace

By Rachel Layne

July 18 (Bloomberg) -- United Technologies Corp., the maker of Otis elevators, said second-quarter profit rose 4.1 percent on higher overseas sales to airlines, governments and building developers. The company boosted its 2007 sales and profit forecasts.

Net income increased to $1.15 billion, or $1.16 a share, from $1.1 billion, or $1.09, a year earlier, exceeding analysts' estimates. Revenue climbed 13 percent to $13.9 billion on gains in all six business segments, the Hartford, Connecticut-based company said today in a statement.

United Technologies, which gets more than 60 percent of sales overseas, benefited from orders for electrical systems at Hamilton Sundstrand and aircraft at Sikorsky. Pratt & Whitney spare jet engine parts and service contracts increased. Otis and Carrier air conditioning sold more equipment for high-rise buildings in fast-growing Asian economies including China.

``These were solid results and we are pleased with the improved performance at Carrier despite the weaker North American residential market,'' wrote Nicole Parent, a New York- based analyst at Credit Suisse. She has an ``outperform'' rating on the stock.

Shares of United Technologies fell $1.17 to $75.67 at 10:13 a.m. in New York Stock Exchange composite trading. Before today, they had climbed 33 percent in the past year, while the Standard & Poor's 500 Index gained 26 percent. The stock had added 6.6 percent from July 10 through yesterday.

No Surprise

``We wouldn't be surprised to see the stock take a breather today despite the good results,'' wrote Myles Walton, an analyst at CIBC World Markets in Boston, in a note to clients today.

Chief Executive Officer George David raised the annual per-share profit forecast to $4.15 to $4.25 a share, up from $4.05 to $4.20, the statement said. The company also increased its revenue forecast to $53 billion from $51 billion. Analysts, on average, estimated $4.17 in earnings and $52.4 billion in sales.

Sales from businesses the company has owned more than a year, or ``organic'' revenue growth, should continue to rise, adding to a four-year pattern and prompting the forecast increase, David said. Such sales increased 10 percent in the quarter.

``Solid markets worldwide in commercial aviation and commercial construction, coupled with the successes of a wide range of new UTC products, are doing this,'' David said in the statement. ``We see these conditions continuing over the balance of the year and into 2008.''

Fire and Safety

United Technologies, which also owns UTC Fire and Security, used a 7-cent tax-related gain in the year-earlier second quarter to help pay for cost reductions. Excluding that gain, profit would have climbed 12 percent in the just-ended quarter. The current quarter includes 2 cents a share in restructuring costs, the company said.

Otis, the world's biggest elevator and escalator maker, increased profit 13 percent to $532 million on a 13 percent sales gain to $2.86 billion. Profit at Carrier, the world's biggest air-conditioner maker, rose 19 percent to $489 million as revenue grew 8.1 percent to $4.06 billion.

Income at Pratt & Whitney declined 2.4 percent to $522 million as revenue rose 14 percent to $3.11 billion. Excluding a gain in the year-earlier period and restructuring costs, profit would have climbed 15 percent, United Technologies spokesman John Moran said.

Sikorsky sales climbed 56 percent, driving a more than doubling of profit to $87 million. Hamilton Sundstrand profit gained 16 percent to $246 million on a 9.6 percent sales rise.

The average estimate was for profit of $1.15 a share from 15 analysts surveyed by Bloomberg News.

In 2006, the company cut its workforce by 3,800 positions and closed 500,000 square feet of factory and office space, according to its annual report filed with U.S. regulators.

Sunday, July 15, 2007

New technologies for food industries

Rob Cockerill, Tuesday July 10th 2007 (Source: gasworld.com)

Consolidated Industrial Gases Inc. (CIGI) and Southern Industrial Gas (SIG) will introduce new technologies aimed at enhancing the capabilities of the food processing and packaging industry.

After gaining new ground with the government's focus on agribusiness, the industry will set in motion initiatives to expand the agricultural and fishery product mix through high value crops, while also adding value through innovative packaging and agro-processing.


CIGI and SIG have gained access to global technologies in food freezing and cooling since becoming part of The Linde Group in 2006. These technologies include the design and supply of tunnel and spiral freezers, refrigerants like liquid nitrogen and carbon dioxide, food grade gases, pipeline systems, gas mixing panels and other such equipment.


The newest addition to the range is the Cryoline tunnel freezer, a multi-purpose cryogenic freezer that combines state-of-the-art technology with a high-quality hygienic design. It is flexible and easy to use, and with optimized application of cryogenic gases it can be used for meat patties and pieces, whole fish and fillets, various kinds of seafood, and many other food products.


The two companies are bringing these technologies closer to the end-users by participating in International Food Exhibitions like the recently held Food Processing and Packaging Technology Exhibit in Davao and the upcoming World Food Exhibit in Manila.

Sunday, July 8, 2007

NEW DISPLAY TECHNOLOGIES TO MAKE LCDS OBSOLETE

LCDs were hailed as the best display technology, but they have their own inherent problems: they are bulky, they are expensive, and after long use the displays can become permanently marked. A new generation of super-thin, power-saving displays is now making its debut in the market, promising not only to extend battery life but also to make images sharper and crisper.

LCDs are commonly used in mobile phones, handheld games, and MP3 players, but they require backlighting that consumes a lot of battery power. New technologies such as OLED (Organic Light-Emitting Diode) displays, whose pixels emit their own light, and bi-stable displays, which hold an image without continuous power, are now being developed; both consume less power and allow thinner displays.

However, displays based on OLED and bi-stable technologies are still years away.

Tuesday, July 3, 2007

NTC plans to auction new 3G wireless licences in October

KOMSAN TORTERMVASANA

New third-generation wireless licences will be auctioned starting in October, according to Gen Choochart Promprasit, the chairman of the National Telecommunications Commission.

A final framework on the new licences, which include frequency allocations in the five-gigahertz band, would be completed by September, he said.

Telecom and IT network providers have been pressing the regulatory body to clarify its position on third-generation mobile and wireless technologies such as WiMax. Analysts said Thailand has been slow to implement new technologies offering broadband transmission speeds direct to mobile handsets.

Gen Choochart said four 3G licences could be offered, but declined to comment on the method of allocation.

Existing mobile operators, who use 2.5G technologies such as Edge, have argued that they should be given priority in bidding for new licences. Authorities could choose to hold a straight auction, with the winning licences offered to the highest bidder, or hold a ''beauty contest'' in which the bidders' qualifications, experience and technological platforms are also used to evaluate winners.

Gen Choochart said the Council of State has already ruled the NTC could award new telecommunications licences without waiting for the creation of the National Broadcasting Commission.

The process of awarding new 3G licences has been stalled for years over legal questions on whether the NBC, which has been delayed due to political infighting, was required before new frequency spectrum was awarded for use. ''We will try to have all the issues finalised by September, to allow the industry to move forward starting this October. We see four 3G licences being awarded altogether,'' Gen Choochart said.

Monday, July 2, 2007

TotalView Technologies Announces TotalView 8.2

DRESDEN, Germany, June 2007 -- TotalView Technologies, the world's leading provider of scalable debugging and analysis software solutions for the multi-core age, today announced the availability of TotalView Debugger 8.2. This updated version of TotalView provides new features and support for important new platforms, allowing a greater range of users to benefit from the product's capabilities.

TotalView is a comprehensive source code and optional memory debugging solution that dramatically enhances developer productivity by simplifying the process of debugging data-intensive, multi-process, multi-threaded, or network-distributed applications.

Built to handle the complexities of the world's most demanding applications, TotalView is capable of scaling from one to thousands of processes or threads with applications distributed over multiple machines or processors.

TotalView Debugger is the only product on the market that supports mixed environments in one debugging session, including mixed parallel paradigms (OpenMP and MPI), mixed languages (Fortran 90 and C++, for example), and mixed versions of compilers.

TotalView supports over 2,500 compiler variations and countless combinations of parallel programming paradigms and compilers. TotalView is robust and easy to use, with an intuitive GUI that enables users to quickly isolate and identify the root cause of problems.
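To illustrate the kind of mixed-paradigm program such a debugger is aimed at, below is a minimal sketch of a hybrid MPI/OpenMP "hello world" in C. The file name, build command and process counts are assumptions made for illustration; they are not taken from the TotalView announcement.

/* hello_hybrid.c -- minimal hybrid MPI + OpenMP example (illustrative only).
   Assumed build command: mpicc -fopenmp hello_hybrid.c -o hello_hybrid */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank = 0, size = 0;

    MPI_Init(&argc, &argv);               /* start the MPI processes */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's rank     */
    MPI_Comm_size(MPI_COMM_WORLD, &size); /* total number of ranks   */

    /* Each MPI process spawns a team of OpenMP threads -- the mixed
       "OpenMP and MPI in one session" case described above. */
    #pragma omp parallel
    {
        printf("rank %d of %d, thread %d of %d\n",
               rank, size, omp_get_thread_num(), omp_get_num_threads());
    }

    MPI_Finalize();
    return 0;
}

A debugger such as TotalView would attach to every process and thread of a run like this within a single session; the exact launch syntax (typically prefixing the MPI launch command with the debugger) varies by MPI implementation and TotalView version, so any specific command line here should be treated as an assumption rather than vendor guidance.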

Sunday, July 1, 2007

SAMSUNG Showcases The Latest Mobile Technologies At CommunicAsia 2007

Samsung Electronics Co., Ltd., a leading provider of mobile phones and telecom systems, today showcased a total of 53 mobile phones and its latest mobile technologies at CommunicAsia 2007.

Key highlights were the Ultra Edition II 12.1 (U700), the slimmest and most elegantly designed HSDPA slider; the Ultra Edition II 10.9 (U600) 2.5G slider; and an exciting fashion phone lineup including the SGH-E950, E840, and J600. Samsung also introduced the Mobile WiMAX U-RAS Hub System (HS) and Convergence System (CS) for the first time in the world today.

Samsung’s diverse, innovative and market-leading handsets and technologies, including its brand-new Ultra Edition II range and Mobile WiMAX technology, reinforce Samsung’s position as a design-led, forward-thinking company that invests heavily in research and development to advance the market and fuel consumer demand for innovative new technologies.

Mr. Geesung Choi, President of Samsung’s Telecommunications Network Business, said:

“The introduction of Samsung’s new Ultra Edition II range is testament to our strategic focus on the premium segment. In addition, Samsung’s vision for the future of mobile technology is increasingly becoming a reality with our successful Mobile WiMAX demonstration, underlining the converged mobile device as the future hub of all communications. With our increased focus on the Southeast Asian market, we will continue to provide a wide variety of mobile phones customized for local needs.”

Key highlights from Samsung at CommunicAsia 2007 include:



ULTRA EDITION II



Building on the runaway success of Samsung’s Ultra Edition range, launched in July 2006, the Samsung Ultra Edition II embodies the best of advanced technology, revolutionary mobile designs and must-have business and consumer multimedia functions, with four of the slimmest, most powerful and best-looking handsets in each form factor. The Ultra Edition II range consists of two stylish and functional sliders, the Ultra Edition 10.9 (U600) and the Ultra Edition 12.1 (U700), a metallic clamshell, the Ultra Edition 9.6 (U300), and a candy bar handset, the Ultra Edition 5.9 (U100). Each of these designs offers advanced features such as a 3-megapixel camera and camcorder, high-speed internet connectivity, and extensive multimedia and audio capabilities.

Proving that Samsung truly is at the heart of the consumer’s needs, each of these handsets incorporates innovative consumer-driven design elements to improve the user experience, including external multimedia navigation keys for ease of use and a jewel-inspired casing.



FASHION PHONE LINE UP



For design- and style-focused customers, Samsung unveiled three fashion phones reflecting the latest design trends and cutting-edge multimedia features. The SGH-E950 is a powerful 3-megapixel camera phone equipped with an improved, unique touch navigation user interface, while the SGH-E840, with its minimalist and futuristic design, boasts a super slim frame at just 10.6mm and a glossy, mirror-like surface. The SGH-J600 comes with an affordable 1.3-megapixel camera in a variety of vivid, lively colours.