Monday, 24 November 2008

Apple Releases iPhone 2.2 Update

Apple has rolled out a software upgrade for the iPhone 3G and iPod Touch that adds new features and fixes some stability bugs.

The Google (NSDQ: GOOG) Maps application has received the most improvement, as iPhone users can now get a closer view of locations with Street View. To access this, users choose a location on the map, and a red icon appears next to it. In Street View, users can zoom and rotate using the multitouch controls, but the iPhone lacks the Compass Mode that T-Mobile's G1 sports.


The update also improves Google Maps by integrating walking and public transportation directions, adjusting the user interface, and adding the option to share a location via e-mail with a Google Maps link included. The Google Maps updates appear to affect only the iPhone 3G, not the original handset or iPod Touch, presumably because they involve constant network access.

The firmware update also adds the ability to download audio or video podcasts directly to the device on the go via Wi-Fi or cellular data networks. In September, Apple pulled the iPhone Podcaster application from the App Store because the company said it duplicated functionality of the desktop version of iTunes. The company faced criticism because at the time over-the-air downloads of podcasts to iPhones weren't possible.

The iPhone's Safari browser has also received some tweaks, including a search bar next to the address bar and performance enhancements. The App Store application's user experience has been polished, and customers are now encouraged to rate applications before deleting them.

Other improvements include better formatting of HTML e-mail, the ability to turn auto-correction off, and improved sound quality for Visual Voicemail. To get the update, connect your iPhone to iTunes and click the "Check for Update" button.

Google To Seek Pre-Installation Of Chrome On PCs


Google (NSDQ: GOOG) reportedly plans to convince computer makers to pre-install its Chrome Web browser on PCs, a move that would escalate the battle against market leader Internet Explorer from Microsoft (NSDQ: MSFT).

In addition, Sundar Pichai, Google VP of product management, told The Times that the company planned to launch a major marketing push behind Chrome once Google finished with beta testing and launched the final version in January.


"We will probably do distribution deals," Pichai told the London newspaper. "We could work with an OEM (original equipment manufacturer) and have them ship computers with Chrome pre-installed."

If successful, such a strategy could help the company boost use of the browser, which currently stands at less than 1% of Web users. Microsoft has more than 70% of the market, while Mozilla's Firefox has almost 20%. Because browsers are people's windows to the Web, the software is an important avenue for steering users to online services.

Pichai said Google would launch a major push to get its browser before the public, once the software is out of testing mode. "We will throw our weight behind it," he told the newspaper.

Pichai also said that Google planned to launch versions of Chrome for Linux and Apple's Mac OS X in the first half of next year. The current version of Chrome only runs on Windows.

Microsoft plans to ship the final version of Internet Explorer 8, which is currently undergoing beta testing, next year. The company said this week the final version could come as early as the first quarter. IE8 includes a number of new features, particularly in the area of privacy.

InformationWeek has done its own breakdown of Google Chrome. Download the report here (registration required).

If you haven't seen Chrome in action yet, take a spin through our Google Chrome image gallery and have a look at the browser that's being touted as a game-changer.

Sunday, 23 November 2008

Going Green? Don't Forget Software

CIOs shouldn't limit their focus to server and storage hardware when launching energy-efficiency programs.

By Bob Violino
August 15, 2008



When people think of “green” IT efforts, chances are they consider areas such as server consolidation, energy-efficient storage and other hardware-related initiatives. After all, servers and storage systems are consuming much of the power in data centers.

But software can also play a significant role in running more environmentally friendly technology infrastructures. CIOs who fail to look at how software can help reduce energy consumption are missing out on good opportunities to make their organizations greener.

There are three distinct areas to consider when it comes to green software efforts, says Richard Hodges, founder and CEO of GreenIT, a consulting firm that specializes in the emerging field of environmentally sustainable IT and communications systems.

First, software “needs to drive hardware decisions [that] create the eco-footprint of IT,” Hodges says. “The more efficient your applications and system software is, the less hardware is needed to run it. Less hardware means less power, less cooling, less material used and less electronic waste.”

The second area encompasses software tools that can be used for measuring and managing the eco-footprint of IT. For example, Hodges says, software is available to help organizations figure out what hardware and software they actually have, and what they can do with it. Specialized applications can be used for automatically managing desktop power consumption. “There are numerous software tools available for data center power monitoring and management,” he says.

The third area comprises software tools that support information and communication technology-driven innovation. For example, Hodges says, corporate social responsibility software automates reporting and supplants basic record-keeping tools such as spreadsheets. Other software, such as dashboards and enterprise sustainability reporting, will become an important new product area and will help drive the realization of sustainability goals, he says.

“Greening, [also called] sustainability or eco-responsibility, is not a fad,” Hodges says. “It is a major, long-term trend,” and any enterprise that wants to be successful must have a sustainability plan and partner with the right people to execute it. “That plan should explicitly address the role of IT, which is too often overlooked,” Hodges says. “CIOs are well-positioned to take a leadership role for the sustainability program of the entire enterprise and demonstrate their credentials as strategic thinkers.”

When implementing greener IT operations, Hodges says CIOs should adopt what GreenIT refers to as the ER3 Principle: Eliminate, then reduce, reuse and recycle. “This is particularly true for software,” he says. “The results are better eco-performance, reduced IT costs and more-efficient overall operations.”

Hodges says the biggest mistake organizations can make with a green IT effort is failing to make a long-term commitment and to develop a systematic process that assesses opportunities, quantifies benefits and lays out a practical plan for efficiency and innovation.

What’s at stake if organizations fail to implement strategic plans for green IT? For one thing, they could miss out on immediate opportunities for a competitive advantage, Hodges says. For another, there’s “the long-term inevitability that if you don’t develop an ongoing process to make your operation more efficient and eco-responsible, somebody will come in to make you do it.”

Monday, 17 November 2008

HP, Sun Applying IT to Environmental Goals

OAKLAND, Calif. -- The world of IT has a long way to go to improve its environmental footprint: one need only look at the often-cited statistic that the industry contributes the same amount of emissions as the global air travel industry.

But in addition to the many ways that IT manufacturers and tech professionals are improving the performance of their computers, servers and data centers, high-tech solutions to more bricks-and-mortar corporate environmental issues are increasingly coming to the fore.
The new HP Handheld sp400 All-in-One scanner and printer.

Two cases in point: yesterday both Hewlett-Packard and Sun Microsystems unveiled new and upgraded solutions to problems that companies are facing every day.

HP teamed with UPS to develop a new, paperless printer and wireless device that can save time and significant amounts of paper. Sun Microsystems gave its OpenEco.org climate-management website a significant upgrade. Both technologies bring the companies' IT expertise to decidedly non-IT arenas.

The new HP-UPS printing system is a wearable, hands-free mobile scanning and printing device that the companies say will save over 1,300 tons of paper each year by printing shipping information directly onto boxes. UPS has already brought the scanners to 41 of its package centers in the United States, and plans to expand that number to 55 centers in the next two months.

From the physical to the virtual, Sun Microsystems yesterday unveiled an overhauled version of its OpenEco.org emissions-mapping website to incorporate emissions data from many different types of sources. Based on a social network model like Facebook or LinkedIn, OpenEco.org encourages business leaders to measure and share the emissions associated with their facilities, whether they're office buildings, data centers or hospitals. The only requirement for joining OpenEco.org is sharing emissions data, which companies can do transparently or anonymously.

Launched just over a year ago, the site is still on the fledgling side, but yesterday's updates to the site's functionality bring more flexibility and the ability to compile more accurate data for businesses.

HP, Sun and Toshiba Put ‘IT’ in Green

The IT industry has had its knuckles rapped on environmental issues this week. A few vendors are taking matters into their own hands to better the environment and their everyday operations, reports GreenerComputing.

Hewlett-Packard jointly developed a wearable printer-scanner with UPS to decrease the shipping company’s paper use and make its sorting and shipping more efficient. According to InformationWeek, HP is now free to sell the sp400 All-in-One mobile printer to anyone, including UPS’ rivals. HP is hopeful the device will be the first of a growing market segment.

Sun Microsystems revamped its OpenEco.org Web site. The social site equips its members with tools to calculate, compare and reduce carbon emissions. The enhancement to the site enables users to create a graphic of their carbon footprint from a number of sources such as business travel, electricity usage and mobile sources, says this InfoWorld article.

Toshiba is making plans for the long term. The company released its Environmental Report 2008 and outlined details of its strategy for its Environmental Vision for 2050 initiative. Toshiba has invested in renewable energy and plans to use clean nuclear power and geothermal power in its global operations. In addition, its long-term goal is to reduce the environmental impact of its own products and be a major contributor in increasing environmental efficiency by 10 percent over the next 40 years.

Skype Adds Calling to Mobile Phones


Skype yesterday announced software that can be used with 50 Java-enabled mobile phones made by Motorola, Nokia, Samsung, and Sony Ericsson to receive calls from the Skype network and to SkypeIn phone numbers on the public switched telephone network (PSTN). Outgoing calls placed to Skype users and via SkypeOut, its gateway for outbound PSTN calling, will work in just six countries (Denmark, Estonia, Finland, Poland, Sweden, and the UK) as well as Rio de Janeiro in Brazil.

This extension of Skype's offering to mobile phones, with calls placed over data networks operated by cellular carriers, is an interesting mix of models. Skype is, on the one hand, challenging operators' high margins in handling voice calls, but, on the other, will require that users of this service have typically expensive data plans over which the calls are placed.

It's possible that the margins on the data side are so good that cell companies can come out ahead. In the U.S., customers typically need at least a $40-per-month voice plan to add a $20-to-$60-per-month data service; the average is closer to $40 per month for unlimited data on the phone. In Europe, data rates tend to be higher and are rarely unlimited (which in the U.S. means 5 GB per month before carriers start to get agitated). Asian nations vary as to the amount and affordability of data plans.

U.S. carriers typically don't allow the kinds of phones that would run Skype's software, as most phones sold here have a mechanism through which authorized software passes before it's allowed to run on a phone and a network. It's unclear to me whether any Java-enabled phone in the U.S. could simply use this software. I expect we'll hear about that soon.

But if you read the fine print on U.S. carriers' descriptions of what they allow over their data networks, you may be in for a surprise, something that signals there may be money to be made even with Skype-over-cellular. Verizon some months ago added VoIP to its list of approved mobile broadband services; it was formerly the most restrictive carrier as to what it allowed on its network.

Intel Launches Core i7 as PC Demand Softens

Intel began sales of its high-end Core i7 desktop chips in Tokyo late Saturday night, bringing to market a series of processors that are significantly more powerful than any of the company's current desktop products.

In a move intended to stoke demand among Japanese PC enthusiasts, shops in Akihabara, Tokyo's main electronics district, stayed open past midnight to put the first Core i7 chips on sale. The launch preempted a San Francisco news conference planned for Monday, as signs increasingly point to softening global demand for computers.

"This is a major new architecture for Intel and to be able to launch it here first to the user-community that Akihabara supports is a really exciting thing for us to do," said Steve Dallman, vice president of sales and marketing and general manager of Intel's worldwide reseller channel organization, shortly after the midnight launch. He was referring to the PC hobbyists and gamers who crowd the areas electronics stores in search of components to build their own computers.

"One of the features in the new processor I think they are going to be very excited about is Turbo-mode," he said. "There's also Turbo-tuning, which allows them to go in for the first time and tune 20 different parameters to optimize the performance of the processor."

The 3.2GHz Core i7 965 Extreme Edition is priced at US$999, while the 2.93GHz Core i7 940 and 2.66GHz Core i7 920 are priced at $562 and $284, respectively. Additional versions of Nehalem targeted at other market segments, including laptops, are expected to be released next year.

Several hundred people crowded stores that were open from around 10pm until 1am Sunday morning to check out the new chip and buy it. It was offered alongside compatible motherboards and other components.

"We ran-out of the high-end ones, the 965 processors, and the motherboards above ¥40,000 (US$410)," said Keisuke Kurashi, manager of the Faith store in the electronics district.

Core i7 is the first chip series based on Intel's Nehalem architecture to hit the market. Manufactured using a 45-nanometer process, these chips differ from Intel's existing products in several ways, most notably with the inclusion of an on-chip memory controller and faster links that connect the processor with main memory.

The chips that went on sale late Saturday aren't for the average user.

The first Core i7 processors were designed for systems aimed at gamers and other high-end users, and not the mass market, said Bryan Ma, director of personal systems research at IDC Asia-Pacific.

Despite the challenging economic environment, the release of Core i7 gives Intel a boost by strengthening its desktop product line and will keep the company one step ahead of rival AMD in the high-end desktop space. "They need to stay competitive," Ma said.

The Core i7 launch comes as overall PC demand is weakening in markets around the world. To what extent the new chips will convince buyers to upgrade their systems remains to be seen, and industry observers will be watching closely.

On Wednesday, Intel sent stock markets diving with a warning that its fourth-quarter revenue will be sharply lower than the company's earlier estimates, signaling that demand for PCs was falling short of expectations. The chip maker also warned that gross margins, a broad measure of the company's profitability, will be lower than expected at 55 percent instead of the previous estimate of 59 percent.

"Revenue is being affected by significantly weaker than expected demand in all geographies and market segments," Intel said in a statement.

Intel said the revised gross margin estimate was primarily caused by lower revenue projections, but also blamed "other charges associated with the weaker-than-expected demand environment."

Those other charges include the cost of excess capacity and inventory write-offs, according to a research note put out by Credit Suisse analyst John Pitzer, who said the slowdown in PC demand will persist beyond December.

"We expect the weaker demand environment to persist into at least 1H09," Pitzer wrote, referring to the first half of next year.

As a result, Pitzer lowered his 2009 revenue forecast for Intel to US$33.8 billion, a decline of 12 percent compared to his 2008 forecast. He also said Intel's gross margin could fall to 50 percent during the first quarter of 2009 due to lower revenue, the cost of carrying excess production capacity, inventory write-offs, and startup costs for Intel's upcoming 32-nanometer process technology.


Technology reporter, BBC News, Silicon Valley

Web 2.0 is viewed as "me-too" social networks or widget sites, said Mr O'Reilly

The economic downturn will not sound the death knell for Web 2.0 firms, say analysts and experts.

But, they warn, tough times are ahead, and to weather the downturn Web 2.0 must grow up and focus on real problems.

"You have to conclude, if you look at the focus of a lot of what you call 'Web 2.0', the relentless focus on advertising-based consumer models, lightweight applications, we may be living in somewhat of a bubble, and I'm not talking about an investment bubble," said Tim O'Reilly, who coined the phrase "Web 2.0".

"It's a reality bubble," he said.

Mr O'Reilly, widely regarded as an industry visionary, bemoaned the frivolous applications on Web 2.0 sites that, for instance, let people throw sheep, poke friends or send virtual drinks.

"For me, Web 2.0 is about the internet as platform and its power to harness collective intelligence," Mr O'Reilly told the BBC.

"Areas like the smart power grid, collective action on early disease detection or disaster response, or personalised medicine are all examples of how the principles that drove the consumer internet can be applied in other areas," he said.

It was a message he drove home at the Web 2.0 summit in San Francisco.

Capital crunch

Mr O'Reilly told attendees: "If there is a silver lining in the downturn, it's that we're going to clear a lot of the clutter. We're going to remind people of what matters."

Companies that cannot bring value would not survive, he suggested.


Said seasoned venture capitalist John Doerr: "The good ideas will get funded. Most of the profits made in the venture capital business, if not all of them, are made by 5% of those firms.

"They are going to keep funding the good ideas but they won't be on as attractive terms and it's now a buyers' market instead of a sellers' market," said Mr Doerr

"But liquidity? How is it that we get those companies to produce a return for their employees and investors? That is another question. We might not see liquidity for the next three or four years," he said.

For angel investor Ron Conway the Web 2.0 world has a lot of life left in it.

"I don't think the hammer is going to come down on all these Web 2.0 companies," he said. "I think companies in the video space, in cloud computing, in social networking and what we call social communities will all continue to thrive.

"I see innovation continuing to happen and the mobile market playing an important role. It is still in its infancy and will see massive growth."

Consolidation

For those at the consumer end of Web 2.0 the upside of the downturn is its potential for injecting some rigour into the business.

"What we have seen in the last three or four months is that funding has pretty much dried up for internet Web 2.0 companies," said Chris de Wolfe, co-founder of MySpace.
Social sites are being urged to tackle serious issues

"Thus you have seen VC companies come out and tell their companies that they are not going to get any more money," he said. "Consequently many of the valuations for those companies have also gone way down."



But, said Mr de Wolfe, that could be a good thing.

"I see a lot of consolidation over the next couple of years. Those companies that don't make sense in this space will go out of business or get acquired by larger companies if they are lucky," he said.

Analysts watching this sector view it with optimism mixed with a heavy dose of reality.

"The good news is there is a huge amount of growth in social networking, online video, voice over IP," said Mary Meeker of Morgan Stanley. The bad news is they carry much lower cpm's (cost per thousand impressions) for advertising."

She said the websites exciting her were those dominating the Web 2.0 consciousness.

She cited YouTube's 52% year-to-year growth and 329 million unique visitors; Facebook's 119% year-to-year growth giving it an audience of 161 million people and Skype's 370 million users representing growth of 51%.

"A better place"

But, said Mr O'Reilly, the focus on costs should not divert founders from attacking big issues.

"The next great companies don't come from jumping on the bandwagon," he said. "They come from finding something really meaty and tough and hard and putting your best minds to work on it and making the world a better place as a result.

"I think we have that opportunity brought home to us by this downturn," he said.
Al Gore said Web 2.0 was in its infancy but in need of a purpose

Lending weight to this view was former US Vice President Al Gore - now an environmental activist and Apple board member.

At the Web 2.0 conference he issued a plea for Web 2.0 to get serious.

He said: "The purpose I would urge all of you - as many are as willing to take it up - is to bring about a higher level of consciousness about our planet and the intermittent danger and opportunity we face because of the radical transformation in the relationship between human beings and the earth."

Entrepreneurs should not be leery of taking the high ground, said Mr O'Reilly.

"Back when Google first came on the scene everyone dismissed search as 'Yeah, not much of a business there'. And these guys said no we are going to organise all the world's information.

"Then there was Microsoft wanting to put a computer on everyone's desktop. The titans of industry said 'No, the PC is just a toy'. So I feel we are at one of those inflection points where there are enormous problems to be solved and enormous opportunities." Selengkapnya...

Sunday, 16 November 2008

visit this site

people,

please visit this link, I'll appreciate it:

http://Kumpulblogger.com/signup.php?refid=22837

thank you

LED - lighting the way

LED - green lighting

LED (Light Emitting Diode) technology has come a long way in recent years, and it may soon challenge CFLs (compact fluorescent lamps) as the green lighting choice.

CFL technology has certainly had a huge impact, allowing millions of us to save substantial cash and carbon dioxide emissions through electricity savings compared to standard incandescent globes. In fact, in some countries the sale of incandescent bulbs will be officially phased out within the next few years.

LED-based lighting for domestic applications has recently been getting increasing attention due to advances in technology and reductions in price. LED lighting has been around for years; it's extensively used in applications such as those little red lights on your hi-fi, standby lights on TVs, operation indicators on adaptors and other electronic equipment. For room lighting, it's often used in off-grid setups, RVs and other 12 volt lighting applications.

LEDs have no filament to burn out, and they generate little heat; heat is where much of the energy is wasted in an incandescent globe. LEDs are illuminated by the movement of electrons in a semiconductor material.

Safe, energy-efficient and long-lived

While CFLs only sip electricity and have a very long life compared to incandescent bulbs, LEDs consume less than half the electricity of compact fluorescent lamps and last about ten times as long. And while concerns have been raised about the improper disposal and recycling of CFLs, as they contain small amounts of mercury, LEDs don't have any mercury content at all.

Individual LEDs are quite small, so it takes a lot of them to produce an array suitable for lighting an entire room. CFL light output is omnidirectional, whereas light from an LED is directional, i.e. more focused, so the application needs to be taken into consideration. In most domestic applications, the lighting appliance will be a long strip, or a cluster of LEDs in a lamp fitting set at different angles, or with an array of lenses on the light cover to diffuse the light.

Between the number of LEDs needed in a single light, the type of light they generate and the expensive materials LEDs are constructed from, uptake has been relatively slow, but prices are rapidly dropping.

I had messed around with LED lighting previously, but wasn't really impressed with the light it produced - fine for torches etc., but for use in the home, without spending hundreds of dollars on high-end lamps, I found the light to be ... odd. I really can't describe it any other way; it was very cold, unnatural and much harsher than traditional fluorescent lighting.

More recently, I bought one of these lights to use in a small shed I have on a patch o' dirt in the outback:

LED cabin light


This 12 volt lamp contains 18 LEDs but is very compact, has a total current draw of only 100mA and consumes only 1.26 watts, so it's perfect for my solar power setup. It draws only a quarter of the power of the energy-efficient CFL I was using previously - and as anyone running on solar power can tell you, every watt counts.
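As a quick sanity check, those figures hang together via the basic power relation, if we assume the "12 volt" battery actually sits near 12.6 V when charged (a typical figure for a lead-acid battery, and my assumption rather than a measured one):

P = V × I ≈ 12.6 V × 0.1 A ≈ 1.26 W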

The LED lamp lights the shed up well and I can read comfortably. While it's still not a "warm" light, I've grown accustomed to it.

Given the lamp should last 100,000 hours, for around 30 bucks it was a good investment. Actually, given the amount I use it, this LED lamp will likely outlast me! I couldn't give you an accurate direct comparison to an incandescent light in terms of intensity, but a guesstimate would be it's as "bright" as a 25 watt incandescent bulb. If you're interested in buying one, try running a search on eBay in your country (that's where I picked mine up from).

OLED - Organic LED

The next big thing in LED lighting is the OLED, which stands for Organic Light Emitting Diode. These comprise extremely thin organic materials layered between two electrodes that produce light when an electrical charge is applied.

One of the main features of Organic LED technology is flexibility - OLEDs could be worn on clothing or embedded in curtains. With further development, it's believed Organic LEDs could be cheaper than regular LED technology and also clear the "cold" light hurdle, offering a warmer quality of light comparable to fluorescents and CFLs.

Texting bug hits the Google phone





The bug was discovered accidentally by a G1 owner

A text conversation has revealed a big problem with the G1 mobile phone - powered by Google's Android software.

The newly discovered bug causes the phone to restart when owners type in the word "reboot" soon after starting up the device.

Google hurried to repair the problem, which caused the phone to interpret any text entered just after the phone was turned on as a command.

Google has rushed out a fix for the bug which will soon be available in the UK.

The bug was discovered when an owner of the phone typed the word "reboot" into a text message after restarting the phone.

"I was in the middle of a text conversation with my girl when she asked why I hadn't responded," said a user called jdhorvat in the description of his discovery that was posted to Google's problem reporting website.

"I had just rebooted my phone and the first thing I typed was a response to her text which simply stated "Reboot" - which, to my surprise, rebooted my phone."

The phone, which uses the Google-developed Android operating system, is sold by T-Mobile in the UK under the name "G1".

Google has fixed the problem in an update to the phone software that will be automatically installed on users' phones.

"We've been notified of this issue and have developed a fix," said a Google spokesperson in a statement. "We're currently working with our partners to push the fix out."

Users in the US already report receiving the update, and Google told the BBC that users in the UK should receive it by 12 November.

Saturday, 15 November 2008

AMD announces fast, energy-saving chip


SAN FRANCISCO (Reuters) - Advanced Micro Devices has started selling the new generation of its Opteron quad core processors for servers, AMD announced on Wednesday, nearly one year after Intel launched its own 45 nanometer chip.

After a troubled launch of its earlier Barcelona chip, AMD waited until the new chip, called "Shanghai," was actually in distribution to make its formal announcement.

The chip is called quad core because it contains four cores, each operating as an independent computing device on the same chip. The 45-nanometer size historically refers to the smallest feature on a chip.

The Shanghai chip has high energy efficiency, AMD says. For example, it automatically turns off some parts of the chip when they are not in use, even for short periods of time, the company said.

The Shanghai is designed to work well with so-called virtualization, in which server resources are saved by running separate "virtual" machines on shared hardware, instead of dedicating a separate physical machine to each.

AMD said that two versions of the chip are available now, with enhanced versions available in early January.

IBM, Hewlett-Packard, Sun Microsystems and Dell are among 25 systems vendors who will be shipping machines this quarter with the new chip on board, AMD said.

AMD quoted a Hewlett-Packard executive in praise of the new chip. Paul Gottsegen, vice president of marketing for Industry Standard Servers, said HP had "experienced unparalleled success over the past four years working with AMD in bringing AMD Opteron processor-based platforms to customers."

The company said that the processors are easier to replace than Intel's. Intel says that is not true.

AMD said it is looking to increase its server market share with the new chip. The market intelligence firm iSuppli said AMD held only 12 percent of that market in the third quarter.

(Reporting by David Lawksy, additional reporting by Georgina Prodhan in London; Editing by Gary Hill)


Google Offers Search By Voice, and iPhone Gets It First

Google is pushing its voice-recognition technology to Apple's iPhone first, before devices running its own Android mobile platform.

The New York Times offered photographs of Google employees Vic Gundotra and Gummi Hafsteinsson using an iPhone for a voice search. The free application was expected to be available on Apple's App Store on Friday. Google reportedly will soon offer the technology for other devices, presumably including the T-Mobile G1, which uses Android.

"This is an expansion of types of applications Google has already been developing," said Greg Sterling, principal analyst at Sterling Market Intelligence. "Google has GOOG411, which is the underlying technical engine. They also have a voice-search client for the BlackBerry which is limited to maps. So this is an evolutionary step."

Inside Google Voice Search

Here's how it works: The iPhone user asks a question, such as "Where's the closest Burger King?" or "How wide is the Grand Canyon?" The user's voice is converted to a digital file and transmitted to Google's servers.

Google Search then serves up the results -- in a matter of seconds if the user has a fast wireless network, the Times reports. The search results always include any local information.
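To make that round trip concrete, here is a minimal sketch of what such a client could look like. Everything specific in it - the endpoint URL, the audio format and the response shape - is a made-up assumption for illustration; Google has not published this interface.

import requests  # third-party HTTP library

def voice_search(audio_path):
    # Read the recorded question (already captured from the microphone)
    with open(audio_path, "rb") as f:
        audio = f.read()
    # Upload the audio; the server transcribes it and runs the search
    resp = requests.post(
        "https://speech.example.com/search",    # hypothetical endpoint
        data=audio,
        headers={"Content-Type": "audio/wav"},  # assumed audio format
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response shape: {"query": "...", "results": [...]}
    return resp.json()

results = voice_search("where_is_burger_king.wav")
print(results["query"], len(results["results"]))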

"The question with these types of technologies is how good is the speech recognition? It's getting much better, and that's why Google feels this is the right time to introduce this," Sterling said. "Google has confidence now that voice recognition is good enough to open it up to the full Web search as opposed to the much more structured search on GOOG411."

Google is playing catch-up, in a sense. Yahoo and Microsoft already offer a voice-recognition option for mobile phones. Microsoft's Tellme service offers users information in specific categories, such as movies, maps or directions. Yahoo offers voice services through its oneSearch platform.

"In one sense this is new, but it's not new, because Yahoo and Microsoft have been doing versions of voice recognition -- and so has Google -- for some time," Sterling said. "A company called Dial Directions was the first to formally introduce voice search for the iPhone, but it was limited to selected local sites through the Safari browser."

Building a Killer App

Could voice recognition be the next killer app for mobile? The market is growing at breakneck speed. Voice-recognition technology sales topped $1 billion in 2006 for the first time. Datamonitor expects that number to swell to $2.6 billion by 2009.

The market is heating up -- and going global. Voice-recognition software maker Nuance Communications earlier this month acquired Austria-based Philips Speech Recognition Systems for $96.1 million. Philips develops speech-recognition solutions in 25 languages.

Voice recognition on the mobile phone is still not completely accurate, and may not see mainstream use until it improves. But Sterling said it is steadily improving, and he thinks Google's voice search will be a popular mobile-phone feature.

Specifically, he sees the new Google application for the iPhone as most useful when a user might need to call directory assistance or do a simple search, but can't do it safely on a keyboard while driving. Another benefit is the ability to enter potentially long search queries that would be difficult to type. But accuracy is still a factor.

"This is an evolutionary step in the whole realm of voice search," Sterling said. "So far it has not proven to be the killer app for mobile, but it's getting there and it's very useful in selective situations."


Green Your Computer Use

Lightening your computing environmental footprint

Computing has changed the world - a great example is the Internet. It's hard to imagine either not existing.

While computer usage can actually lessen our environmental footprint, for example by letting us work from home or control farm irrigation and many other tasks remotely, the energy consumption involved with casual computing and gaming is generally massive.

There's not just the electricity consumed by actual computer usage; there's also the millions of tons of plastic and metal used to create the billions of computers, in their various forms, now on this planet.

We can all do our bit to lessen our impact, and the following are tips for more earth-friendly computing, some of which will also save you cash!

- When not in use for extended periods, switch your computer off at the wall to avoid phantom power load consumption.

- Have your power saving/management options enabled and properly configured for periods when your computer is temporarily not in use. In Windows, this can be found in Settings/Control Panel/Power Options

- During usage, only have your screen as bright as you need it - unnecessarily bright screens really chew the juice.

- If you're going to use a screen saver, use a blank (black) screen - animated screen savers just consume electricity unnecessarily.

- When buying components and peripheral items, try to choose those that come in as little plastic packaging as possible.

- For your next computer, consider a notebook instead of a desktop - these use less than 50% of the electricity of a desktop machine.

- If you can afford it, buy an extended warranty with your new system so there's less likelihood of needing to junk the computer within the first few years if an expensive repair is needed.

- Do you really need a 22 inch screen? When considering your next screen purchase, balance your wants with your actual needs.

- Before purchasing a new computer, consider upgrading the hardware in your current machine. Some extra RAM (memory) or a new hard drive may be all you need to restore life to your current system. According to this site, the energy needed to churn out a new computer is enough to power a system for a decade!

- Following on from the above point, it's not uncommon for Windows to get slower as time goes on. This isn't necessarily your machine, but software bloat. All the updates, the installing and uninstalling of software, and applications running in the background that you don't really need take their toll and basically clog your machine up, seriously impacting performance - which in turn means more wear and tear on hardware and increased electricity consumption. Consider doing a reinstallation of Windows and your software. A lean machine will sip less electricity and perform much better.

- When you do replace your current computer system, donate it rather than bin it if possible. According to the Environmental Protection Agency, e-waste is now the fastest growing aspect of the municipal waste stream. You can find places to donate your computer to on Earth911.org

- Consider a refurbished computer for your next purchase. These aren't dusty old machines that have just been wiped over; often they are display models or recent purchase returns with very little wear and tear that are thoroughly checked before sale, and they often have the same guarantee as new units. You can save a ton of cash this way!

The computer recycling problem

While putting a computer in for recycling isn't the worst step you could take, it's important to remember that e-cycling (recycling of electrical components) is a bit of a minefield.

Sometimes they aren't recycled at all, and in some instances your computer could be shipped to China (more emissions in transportation), where poorly equipped and impoverished people are set the gruelling task of stripping down the systems and reclaiming some of the precious metals. It's nasty, highly toxic work and just another instance of outsourcing our pollution.

If you are going to recycle your machine, check the recycler out - ask about their practices; for example, whether the system will be stripped down locally and in safe, environmentally responsible conditions. Also try to keep as many components as you can as backups - for example, the mouse and keyboard.

I learned this the hard way recently when disposing of some equipment, only to find out it wound up in landfill due to the recycler having too many computers.

Cancer scan uses radar technology

The new system uses radio waves, unlike conventional mammograms

The first breast screening system to use safe radio waves rather than radiation-producing X-rays is being successfully trialled.

The new scan, which took three years to develop, is being tested at Frenchay Hospital near Bristol.

The new system carries the same minor radiation risk as "speaking into a mobile phone at arm's length".

The scans also take less time than the conventional X-rays but produce an image which is just as clear.

Doctors say the machine does not expose patients to the risk of cancer, and a scan takes only six minutes.

Professor Alan Preece and Dr Ian Craddock began developing a breast-imaging device that, unlike conventional mammograms, uses radio waves, in 2003.

Dr Craddock, from the university's electrical and electronic engineering department, said: "This new imaging technique works by transmitting radio waves of a very low energy and detecting reflected signals; it then uses these signals to make a 3D image of the breast.

"This is basically the same as any radar system, such as the radars used for air traffic control at our airports."


Mike Shere, associate specialist breast clinician at North Bristol NHS Trust (NBT), said: "Currently women are diagnosed in three ways: firstly by a clinician, then by using imaging such as mammography and ultrasound, and lastly by a needle biopsy.

"The radar breast imaging system came to Frenchay Hospital in September this year and so far around 60 women have been examined using it.

"It takes less time to operate than a mammogram - approximately six minutes for both breasts compared with 30-45 minutes for an MRI, and like an MRI it provides a very detailed 3D digital image.

"Women love it as they compare it to a mammogram and find the whole experience much more comfortable."

The radar breast imaging system is built using transmitters and receivers arranged around a ceramic cup, which the breast sits in.

Theresa Thornton, one of 60 women examined using the new technology, was referred for a mammogram after finding lumps in her breast - subsequently found to be benign.

She told BBC News: "With the new technique it's just a cup, so you don't have to position yourself into it at certain angles; literally the cup just comes straight up to you, covers your breast and that's it.

"You don't have to get to the right angle , there is no squeezing on your breast at all."

The development team hope that, if the positive results continue, further trials will be scheduled for the next 12 months.


Study shows how spammers cash in

A tiny response means spammers still cash in (PA)

Spammers are turning a profit despite getting only one response for every 12.5m e-mails they send, a study finds.

By hijacking a working spam network, US researchers have uncovered some of the economics of being a junk mailer.

The analysis suggests that such a tiny response rate means a big spam operation can turn over millions of pounds in profit every year.

It also suggests that spammers may be susceptible to attacks that make it more costly to send junk mail.

Slim pickings

The spam study was carried out in early 2008 by computer scientists from the University of California, Berkeley, and UC San Diego (UCSD).

For their month-long study the seven-strong team of computer scientists infiltrated the Storm network that uses hijacked home computers as relays for junk mail.

At its height Storm was believed to have more than one million machines under its control.

The team, led by Assistant Professor Stefan Savage from UCSD, took over a chunk of the Storm network to make it easier to run their study.

"The best way to measure spam is to be a spammer," wrote the researchers in a paper describing their work.

They created several so-called "proxy bots" that acted as conduits of information between the command and control system for Storm and the hijacked home PCs that actually send out junk mail.

The team used these machines to control a total of 75,869 hijacked machines and routed their own fake spam campaigns through them.

The research team created a legitimate-looking pharmacy site.

Two types of fake spam campaign were run through these machines. One mimicked the way Storm spreads using viruses and the other tried to tempt people to visit a fake pharmacy site and buy a herbal remedy to boost their libido.

The fake pharmacy site was made to resemble those run by Storm's real owners but always returned an error message when potential buyers clicked a button to submit their credit card details.

While running their spam campaigns the researchers sent about 469 million junk e-mail messages. The vast majority of these were for the fake pharmacy campaign.

"After 26 days, and almost 350 million e-mail messages, only 28 sales resulted," wrote the researchers.

The response rate for this campaign was less than 0.00001%. This is far below the average of 2.15% reported by legitimate direct mail organisations.

"Taken together, these conversions would have resulted in revenues of $2,731.88—a bit over $100 a day for the measurement period," said the researchers.

Scaling this up to the full Storm network the researchers estimate that the controllers of the vast system are netting about $7,000 (£4,430) a day or more than $2m (£1.28m) per year.
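The article's figures are easy to reproduce with a few lines of arithmetic. The sketch below simply recomputes them from the numbers quoted above; note that the roughly 67x scale-up factor is what the researchers' $7,000-a-day estimate implies relative to their measured slice of Storm, not something they state directly.

# Back-of-envelope check of the study's reported numbers
emails = 350_000_000        # pharmacy-campaign messages sent
sales = 28                  # resulting purchases
revenue = 2731.88           # dollars, as reported
days = 26

response_rate = sales / emails        # 8.0e-08, i.e. below 0.00001%
revenue_per_day = revenue / days      # about $105/day

full_storm_daily = 7000               # researchers' whole-network estimate
implied_scale = full_storm_daily / revenue_per_day   # roughly 67x
annualized = full_storm_daily * 365                  # $2,555,000, i.e. over $2m

print(f"response rate: {response_rate:.1e}")
print(f"measured revenue/day: ${revenue_per_day:.2f}")
print(f"implied scale-up: {implied_scale:.0f}x; annualized: ${annualized:,}")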

While this was a good return, said the researchers, it did suggest that spammers were not making the vast sums of money that some people have predicted in the past.

They suggest that the tight costs might also open up new avenues of attack on spammers.

The researchers concluded: "The profit margin for spam may be meager enough that spammers must be sensitive to the details of how their campaigns are run and are economically susceptible to new defenses."

Friday, 14 November 2008

LTE vs WiMAX


WiMAX and 3GPP LTE are the two wireless technologies that will eventually be used to deliver data at very high speeds (up to 100 Mbit/s for WiMAX and up to 300 Mbit/s for LTE), beyond the 3G technologies. These speeds are fast enough to potentially replace cable broadband connections with wireless and to enable services currently deemed too bandwidth-hungry to be delivered over existing mobile technologies.

Unlike LTE, which is still under standardization, WiMAX is already on the market: the first national fixed-WiMAX rollout, in the 3.5GHz range, was carried out by Wateen Telecom in Pakistan. However, the world's first large-scale mobile WiMAX deployment is due in the US. This is the joint venture between Sprint Nextel Corp. and Clearwire Corp., and it is expected to reach 120 million to 140 million people in the U.S. by the end of 2010. On the other hand, LTE is expected to dominate the world's mobile infrastructure market after 2011. As such, some wireless operators, such as AT&T Inc. and Verizon, have already stated plans to adopt LTE, with major rollouts planned for 2011 or 2012.

As discussed herein, LTE is the natural upgrade path for the GSM/EDGE and UMTS/HSxPA network technologies that now account for over 85% of all mobile subscribers in the world. On upgrading to LTE, existing GSM/EDGE and UMTS/HSxPA operators can reuse their current infrastructure (base station towers), integrating it with new equipment and making the whole process cost-effective. By comparison, an operator has to start from scratch to set up a WiMAX network. Therefore LTE will have a significant global advantage over WiMAX in the long term.[14]

Operators & Vendors

Given the core network architectures of WiMAX and LTE, the two technologies will clearly both be adopted, with LTE being the natural upgrade option for GSM/EDGE and UMTS/HSxPA operators and WiMAX mostly appealing to cable operators. This can be seen in the US, where Sprint and Clearwire are aligned behind WiMAX, while Verizon Wireless and AT&T are behind LTE.

WiMAX is governed by the WiMAX Forum, which comprises more than 500 vendors and mobile operators, including Vodafone. The forum was established to promote solutions based on the IEEE 802.16 standards. As for equipment manufacturers, Intel has invested billions of dollars in WiMAX research and chipsets and has shown off conceptual mobile Internet devices at the Consumer Electronics Show. Vodafone, as a mobile operator, and Motorola both support WiMAX and LTE. However, LTE enjoys strong support from Qualcomm and Ericsson, who decided not to support WiMAX.

From Technical Point of View

Both LTE and WiMAX use OFDMA in the downlink and deploy MIMO technology to improve reception within a single cell site. However, a WiMAX network processes all the information in one wide channel so as to optimize channel usage to the maximum, whereas LTE organizes the available spectrum into smaller chunks.[8]

While WiMAX sticks with OFDMA in the uplink as well as the downlink, LTE uses SC-FDMA in the uplink. A major drawback of OFDMA-based systems is their high Peak-to-Average Power Ratio (PAPR): a high PAPR requires expensive and inefficient power amplifiers, which increases the cost of the user equipment and drains the battery faster. SC-FDMA is therefore designed to work more efficiently with lower-power end-user devices than OFDM, by grouping together the resource blocks and thereby reducing the demands on the power amplifier.
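The PAPR gap is easy to demonstrate numerically. The following toy numpy experiment (a simplified sketch, not the real LTE numerology: the 256-point IFFT, 64 occupied subcarriers and QPSK are arbitrary choices of mine) sends the same random data through a plain OFDMA mapping and through a DFT-precoded SC-FDMA mapping; the SC-FDMA waveform typically comes out several dB lower in PAPR, which is precisely what relaxes the handset's power-amplifier requirements:

import numpy as np

rng = np.random.default_rng(0)
trials, N, M = 2000, 256, 64      # trials, IFFT size, occupied subcarriers

def papr_db(x):
    # Peak-to-average power ratio of each time-domain waveform, in dB
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max(axis=1) / p.mean(axis=1))

# Random QPSK data symbols for every trial
bits = rng.integers(0, 2, (2, trials, M)) * 2 - 1
data = (bits[0] + 1j * bits[1]) / np.sqrt(2)

# OFDMA: data symbols go straight onto subcarriers, then IFFT
grid = np.zeros((trials, N), complex)
grid[:, :M] = data
ofdma = np.fft.ifft(grid, axis=1)

# SC-FDMA: DFT-precode the data first (localized mapping), then IFFT
grid2 = np.zeros((trials, N), complex)
grid2[:, :M] = np.fft.fft(data, axis=1) / np.sqrt(M)
scfdma = np.fft.ifft(grid2, axis=1)

print("median PAPR, OFDMA  : %.1f dB" % np.median(papr_db(ofdma)))
print("median PAPR, SC-FDMA: %.1f dB" % np.median(papr_db(scfdma)))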

Technology Demos

  • In February 2007, Ericsson demonstrated for the first time in the world LTE with bit rates up to 144 Mbit/s[15]
  • In September 2006, Siemens Networks (today Nokia Siemens Networks) showed, in collaboration with Nomor Research, the first live emulation of an LTE network to the media and investors. As live applications, two users were shown streaming an HD-TV video in the downlink and playing an interactive game in the uplink.[16]
  • The first presentation of an LTE demonstrator with HDTV streaming (>30 Mbit/s), video supervision and Mobile IP-based handover between the LTE radio demonstrator and the commercially available HSDPA radio system was shown during the ITU trade fair in Hong Kong in December 2006 by Siemens Communication Department.
  • In September 2007, NTT DoCoMo demonstrated LTE data rates of 200 Mbit/s with power consumption below 100 mW during the test.[17]
  • Motorola demonstrated how LTE can accelerate the delivery of personal media experiences, with HD video demo streaming, HD video blogging, online gaming and VoIP over LTE running on a RAN-standard-compliant LTE network and LTE chipset.[5]
  • Ericsson demonstrated the world's first end-to-end LTE call on a handheld, and LTE FDD and TDD mode on the same base station platform.[18]
  • Freescale Semiconductor demonstrated streaming HD video with peak data rates of 96 Mbit/s downlink and 86 Mbit/s uplink.[19]
  • NXP Semiconductors demonstrated a multi-mode LTE modem as the basis for a software-defined radio system for use in cellphones.[20]
  • picoChip and mimoOn demonstrated an LTE base station reference design. This runs on a common hardware platform (multi-mode / software-defined radio) together with their WiMAX architecture.[21]
  • Alcatel-Lucent demonstrated a live high-speed video connection over LTE supporting dozens of DVD-quality and high-definition video streams simultaneously, using Alcatel-Lucent infrastructure and LGE devices.[22]
  • In March 2008, NTT DoCoMo demonstrated LTE data rates of 250 Mbit/s in an outdoor test.[23]
  • In April 2008, Motorola demonstrated the first EV-DO to LTE hand-off - handing over a streaming video from LTE to a commercial EV-DO network and back to LTE. [6]
  • In April 2008, LG and Nortel demonstrated LTE data rates of 50 Mbit/s while travelling at 110 km/h. [24]
  • On September 18, 2008, Mobile operator T-Mobile and Nortel Networks achieved data rates of up to 170 Mbit/s for downloads and up to 50 Mbit/s for uploads. T-Mobile, the wireless business of Deutsche Telekom achieved these speeds in a car in range of three cell sites on a highway in Bonn, Germany at an average speed of 67 km/h.[25]

3GPP LTE

Long Term Evolution (LTE) describes the latest standardization work by the 3rd Generation Partnership Project (3GPP) in the mobile network technology tree that previously produced the GSM/EDGE and UMTS/HSxPA network technologies, which now account for over 85% of all mobile subscribers.[1] In this latest standardization work, which started in late 2004, the 3GPP (established in December 1998) defines a set of high-level requirements (a new high-speed radio access method) for mobile communications systems to compete with other emerging cellular broadband technologies, particularly WiMAX.

In preparation for further increases in user demand and tougher competition from new radio access technologies, LTE is built around a new radio access technique called the Evolved UMTS Terrestrial Radio Access Network (E-UTRAN).[1] Via this technology, LTE is expected to improve end-user throughput, increase sector capacity, reduce user-plane latency, and consequently offer a superior user experience with full mobility.

Unlike other recently deployed technologies such as HSPA, LTE is accommodated within a new packet core architecture called the Evolved Packet Core (EPC). Technically, 3GPP specifies the EPC to support the E-UTRAN. The EPC is designed around TCP/IP protocols, enabling LTE to support all IP-based services, including voice, video, rich media and messaging, with end-to-end Quality of Service (QoS). The EPC network architecture also enables improved connections and hand-over to other fixed-line and wireless access technologies, while giving an operator the ability to deliver a seamless mobility experience.[2]

To achieve the targets mentioned herein, the LTE Physical Layer (PHY) employs advanced technologies that are new to cellular applications. These include Orthogonal Frequency Division Multiple Access (OFDMA) and multiple-input, multiple-output (MIMO) data transmission; smart antennas are also deployed. Specifically, the LTE PHY uses OFDMA for the downlink (DL) - that is, from the base station (BS) to the user equipment (UE) - and Single Carrier Frequency Division Multiple Access (SC-FDMA) for the uplink (UL). These technologies minimize system and UE complexity while allowing flexible deployment in existing or new frequency spectrum.[2]

LTE enjoys the support of a collaborative group of international standards organizations and mobile-technology companies that form the 3GPP. Particularly strong support comes from Ericsson and Qualcomm, who decided not to support WiMAX, the competitor of LTE.[3] Alcatel-Lucent is also a key contributor to LTE standards, having held the rapporteur position on the MIMO working group since 2002. Motorola, which also supports WiMAX, claims to be the leading contributor to LTE standards such as Radio Access Network (RAN) 1 & 2 and a top-three contributor to EPC 1 & 2 standards.[1] The standardization work on LTE is continuing, and Motorola claims it will introduce LTE in Q4 2009 [4]. However, LTE is expected to dominate the world's mobile infrastructure market after 2011.[5]


Standardization Path

The standardization of LTE started in November 2004, when the RAN Evolution Workshop in Toronto, Canada, accepted contributions from more than 40 operators, vendors and research institutes, including both 3GPP member and nonmember organizations. The contributions comprised a range of views and proposals on the evolution of the UTRAN. Following those contributions, 3GPP started a feasibility study in December 2004 to develop a new framework for the evolution of the 3GPP radio access technology towards:

  • Increased data rates
  • Reduced cost per bit
  • Increased service provisioning, that is, more services with a better user experience
  • Flexibility in the use of both new and existing frequency bands
  • High-data-rate, low-latency and packet-optimized RAN technology
  • Simplified architecture with open interfaces

Put simply, the study maps out specifications for a RAN capable of supporting the wireless broadband Internet experience already enjoyed on today's cable networks, while adding full mobility to enable new service possibilities.[5]

The LTE specifications are currently described in 3GPP Release 8, the latest set of standards describing the technical evolution of 3GPP mobile network systems. It succeeds 3GPP Release 7, which includes the specifications for HSPA+, the 'missing bridge' between HSPA and LTE. HSPA+ is in fact described in both Release 7 and Release 8; it allows a simpler, 'flat', all-IP network architecture that bypasses much of the legacy equipment required for UMTS/HSPA.[5]

The 3GPP Release 8 specifications are expected to be completed at the end of 2008. Their finalization should further increase market interest in commercial deployment of LTE. Release 8 will compile the completion of the Release 7 HSPA+ features, Voice over HSPA, the EPC specification and the Common IP Multimedia Subsystem (IMS).[6]

LTE Key Features

As discussed earlier, a fundamental objective of the 3GPP LTE project is to offer higher data speeds for both DL and UL transmissions. In addition, LTE is characterized by reduced packet latency, promising a superior experience in online gaming, Voice over IP (VoIP), videoconferencing and other real-time services. Based on the 3GPP feasibility study, the key features of LTE are the following:

OFDMA on the DL and SC-FDMA on the UL

3GPP Release 8 specifies an all-new RAN that uses OFDMA-based modulation and multiple access on the downlink and SC-FDMA on the uplink. These OFDM schemes split the available spectrum into hundreds or thousands of extremely narrowband carriers, each carrying a part of the signal; this is known as multi-carrier transmission.[2]
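
A quick back-of-envelope calculation shows how many such carriers an LTE channel actually holds. The 15 kHz subcarrier spacing and 12-subcarrier resource blocks below are Release 8 numerology; the roughly 90% channel occupancy is an assumption standing in for guard bands.

```python
# Back-of-envelope subcarrier count for an LTE carrier. The 15 kHz
# spacing and 12-subcarrier resource blocks are Release 8 numerology;
# the ~90% occupancy factor is a rough guard-band assumption.
SUBCARRIER_SPACING_HZ = 15_000
RB_SUBCARRIERS = 12

def usable_subcarriers(bandwidth_hz, occupancy=0.9):
    """Estimate occupied subcarriers, rounded down to whole resource blocks."""
    raw = int(bandwidth_hz * occupancy / SUBCARRIER_SPACING_HZ)
    return (raw // RB_SUBCARRIERS) * RB_SUBCARRIERS

for mhz in (5, 10, 20):
    print(f"{mhz} MHz -> {usable_subcarriers(mhz * 1_000_000)} subcarriers")
```

The 20 MHz result, 1200 subcarriers, corresponds to the 100-resource-block configuration behind the peak-rate figures quoted below.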

To enhance the OFDM schemes, LTE also employs higher-order modulation such as 64QAM and sophisticated Forward Error Correction (FEC) schemes such as tail-biting convolutional coding and turbo coding. Complementary radio techniques such as MIMO and beamforming, with up to four antennas per station, further enhance the innate spectral efficiency of the OFDM schemes.[5]
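
As an illustration of what 64QAM buys, the sketch below maps 6 bits to each symbol of an 8x8 constellation with per-axis Gray coding. This is a generic mapper for illustration only; the exact bit-to-symbol mapping LTE uses is fixed in the 3GPP specifications, and this sketch does not claim to reproduce it.

```python
# Toy 64QAM mapper: 6 bits per symbol on an 8x8 grid, Gray-coded per
# axis. The actual LTE mapping is defined by 3GPP; this is illustrative.
import numpy as np

def gray_to_level(bits3):
    """3 Gray-coded bits -> one of 8 amplitude levels {-7, -5, ..., 7}."""
    g = bits3[0] * 4 + bits3[1] * 2 + bits3[2]
    b = g ^ (g >> 1) ^ (g >> 2)  # Gray -> binary
    return 2 * b - 7

def map_64qam(bits):
    """Map a bit array (length a multiple of 6) to complex 64QAM symbols."""
    rows = np.asarray(bits).reshape(-1, 6)
    i = np.array([gray_to_level(r[:3]) for r in rows])
    q = np.array([gray_to_level(r[3:]) for r in rows])
    return (i + 1j * q) / np.sqrt(42)  # normalize to unit average power

syms = map_64qam(np.random.randint(0, 2, 600))
print(f"{len(syms)} symbols, mean power {np.mean(np.abs(syms)**2):.3f}")
```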

Together, these radio interface features yield markedly improved radio performance: spectral efficiency up to 3 to 4 times that of HSDPA Release 6 on the DL and 2 to 3 times that of HSUPA Release 6 on the UL.[2][7] In theory, DL peak data rates extend up to 300 Mbit/s per 20 MHz of spectrum, UL peak rates can reach 75 Mbit/s per 20 MHz, and a cell can support at least 200 active users in 5 MHz.[5]
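
Those headline figures can be sanity-checked with simple arithmetic. The subcarrier and symbol counts below are Release 8 numerology; the 25% overhead allowance for reference signals and control channels is a rough assumption for illustration.

```python
# Sanity-check of the headline peak rates. Subcarrier and symbol counts
# are Release 8 numerology; the ~25% overhead figure is an assumption.
SUBCARRIERS = 1200        # 20 MHz carrier, 100 resource blocks
SYMBOLS_PER_SEC = 14_000  # 14 OFDM symbols per 1 ms subframe
BITS_64QAM = 6
OVERHEAD = 0.25

def peak_mbps(layers):
    raw = SUBCARRIERS * SYMBOLS_PER_SEC * BITS_64QAM * layers
    return raw * (1 - OVERHEAD) / 1e6

print(f"DL, 4x4 MIMO : ~{peak_mbps(4):.0f} Mbit/s")  # ~302
print(f"UL, 1 layer  : ~{peak_mbps(1):.0f} Mbit/s")  # ~76
```

With four spatial layers the estimate lands near the quoted 300 Mbit/s, and a single-layer uplink lands near 75 Mbit/s.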

All-IP Packet Optimized Network Architecture

LTE has a 'flat', all-IP core network with a simplified architecture, open interfaces and fewer system nodes. This all-IP architecture, together with the new RAN, reduces network latency, improves system performance and provides interoperability with existing 3GPP and non-3GPP technologies. Within 3GPP, this all-IP core network architecture is known as the Evolved Packet Core (EPC); it is the result of the System Architecture Evolution (SAE) standardization work, which targeted an all-IP system.[5]

Advanced Antenna Techniques

LTE is enhanced with MIMO, Spatial-Division Multiple Access (SDMA) and beamforming.[8] These complementary antenna techniques improve the air interface by enhancing the innate spectral efficiency of the OFDM schemes. They can also be traded off against one another for higher sector capacity, higher user data rates or higher cell-edge rates, giving mobile operators finer control over the end-user experience.[9]
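
The value of extra antennas can be seen from Shannon capacity alone. The following sketch estimates the ergodic capacity of an n-by-n Rayleigh-fading MIMO channel, C = log2 det(I + (SNR/Nt) HH^H); it is a generic information-theory illustration, not an LTE-specific simulation.

```python
# Why extra antennas help: ergodic MIMO capacity over random Rayleigh
# channels. Generic illustration, not an LTE simulation.
import numpy as np

rng = np.random.default_rng(0)

def mimo_capacity(nt, nr, snr_db, trials=2000):
    snr = 10 ** (snr_db / 10)
    total = 0.0
    for _ in range(trials):
        # i.i.d. complex Gaussian channel matrix (Rayleigh fading)
        h = (rng.standard_normal((nr, nt)) +
             1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        m = np.eye(nr) + (snr / nt) * h @ h.conj().T
        total += np.log2(np.linalg.det(m).real)
    return total / trials

for n in (1, 2, 4):
    print(f"{n}x{n} @ 10 dB: {mimo_capacity(n, n, 10):.1f} bit/s/Hz")
```

Capacity grows roughly linearly with the number of antenna pairs, which is why 2x2 and 4x4 configurations feature so prominently in the LTE peak-rate targets.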

System Architecture

Evolved Radio Access Network (RAN)

The evolved RAN consists of LTE base stations (eNode Bs) that interface with the UE. The eNode B contains the PHY, Medium Access Control (MAC), Radio Link Control (RLC) and Packet Data Convergence Protocol (PDCP) layers, and it performs tasks such as radio resource management, admission control, scheduling and enforcement of negotiated UL QoS.
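
The layering can be pictured as successive encapsulation. The toy sketch below is only a mnemonic for the order in which downlink user data traverses the eNode B stack; the header names and sizes are invented for illustration, not the real 3GPP formats.

```python
# Toy model of the eNode B user-plane stack: each layer wraps a header
# around the payload from the layer above. Headers here are invented.
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    header: bytes

    def encapsulate(self, payload: bytes) -> bytes:
        return self.header + payload

# Downlink user data passes PDCP -> RLC -> MAC -> PHY on its way out.
stack = [Layer("PDCP", b"P"), Layer("RLC", b"R"),
         Layer("MAC", b"M"), Layer("PHY", b"Y")]

pdu = b"ip-packet"
for layer in stack:
    pdu = layer.encapsulate(pdu)
    print(f"after {layer.name}: {pdu}")
```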

Serving Gateway (SGW)

The SGW routes and forwards user data packets. During inter-eNode B handover, the SGW acts as the mobility anchor for the user plane; it can also anchor mobility between LTE and other 3GPP technologies. When the UE is in idle state, the SGW terminates the UE's DL data path and triggers paging when DL data arrives for the UE.

Mobility Management Entity (MME)

The MME handles control signaling for mobility. When the UE is in idle mode, the MME is responsible for UE tracking and the paging procedure, including retransmissions. The MME is also involved in bearer activation and deactivation, chooses the SGW for a UE at initial attach and at intra-LTE handover involving Core Network (CN) node relocation, and interacts with the Home Subscriber Server (HSS) to authenticate the user.

Packet Data Network Gateway (PDN GW)

The PDN GW is the point of exit and entry for UE traffic. It performs packet filtering and acts as the anchor for mobility between 3GPP and non-3GPP technologies such as WiMAX and 3GPP2 networks.[1]
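
To tie the four elements together, here is a toy model of the initial-attach sequence, under the simplifying assumption that one call per node suffices; the class and method names are paraphrases for illustration, not the actual S1AP/NAS procedure names from the 3GPP specifications.

```python
# Toy sketch of initial attach through the EPC nodes described above.
# All names and messages are simplified paraphrases, not 3GPP procedures.
class HSS:
    def authenticate(self, imsi): return imsi.startswith("001")

class SGW:
    def create_bearer(self, ue): return f"user-plane tunnel for {ue}"

class PDNGW:
    def allocate_ip(self, ue): return "10.0.0.42"  # illustrative address

class MME:
    """Control plane: authenticates via the HSS, then selects an SGW."""
    def __init__(self, hss, sgw, pgw):
        self.hss, self.sgw, self.pgw = hss, sgw, pgw

    def attach(self, imsi):
        if not self.hss.authenticate(imsi):
            raise PermissionError("authentication failed")
        bearer = self.sgw.create_bearer(imsi)  # MME picks the SGW
        ip = self.pgw.allocate_ip(imsi)        # PDN GW anchors the IP address
        return bearer, ip

mme = MME(HSS(), SGW(), PDNGW())
print(mme.attach("001010123456789"))
```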


Networks Upgrading to LTE

LTE is expected to become the next-generation, or 4G, mobile communications standard, evolving from today's 2G and 3G networks. Technically, the design of LTE is based on today's 3GPP family of cellular networks, dominated by the Global System for Mobile communications (GSM), General Packet Radio Service (GPRS) and Enhanced Data rates for GSM Evolution (EDGE), as well as Wideband Code Division Multiple Access (WCDMA) and High Speed Packet Access (HSPA). LTE therefore offers these existing networks a smooth evolutionary path to higher speeds and reduced latency.

Unlike today's hybrid packet/circuit-switched networks, LTE uses an advanced new radio interface. Harnessing LTE's full potential therefore requires evolving the existing network architecture to a simplified, all-IP environment. From an operator's point of view, this evolution brings reduced costs for a variety of services, blended applications combining voice, video and data, and interworking with other fixed and wireless networks.

Furthermore, since the design of LTE is based on today's UMTS/HSPA family of standards, it will enhance the ability of existing cellular networks to deliver broadband services once associated only with fixed networks. In other words, LTE will unify the voice-oriented environment of today's mobile networks with the data-centric service possibilities of the fixed Internet. From an operator's point of view, a smooth upgrade of existing networks to LTE allows the all-IP concept to be introduced progressively: the operator can retain the value of its existing voice-based service platforms while gaining the high-performance data services delivered by the LTE network.

Carrier Adoption

Most carriers supporting GSM or HSPA networks can be expected to upgrade to LTE at some stage. Notably, several networks that do not use these standards are also moving to LTE:

  • Alltel, Verizon Wireless, the newly formed China Telecom/Unicom and Japan's KDDI have announced that they have chosen LTE as their 4G network technology. This is significant because these are CDMA carriers switching technologies to match what will likely be the worldwide 4G standard.[11] They have chosen the natural GSM evolution path rather than the 3GPP2 CDMA2000 evolution path, Ultra Mobile Broadband (UMB). Verizon Wireless plans to begin LTE trials in 2008.[12]
  • Telus Mobility and Bell Mobility have announced that they will adopt LTE as their 4G wireless standard.[13]
Selengkapnya...