Smartphones: The Pocketable PC

Douglas Adams’s Hitchhiker’s Guide to the Galaxy series is named after a pocketable device that contains everything worth knowing. But that seems almost quaint today, when you can carry the full contents of the Web in your pocket, as well as a telephone, a camera, a radio, a television, and a navigation system. Today’s smartphones are marvels of engineering, crammed with more features than the average PC. They’ve become the prime driver of innovation for both software and hardware.

It took half a century to shrink the mainframe from the size of a living room to that of a suitcase. It took another decade to make it smaller than a wallet. The smartphone has swallowed and assimilated functionality from music players, remote controls, gaming consoles, even printed maps and news publications. And now that smartphones are serving as Wi-Fi hot spots, they can replace wireless routers and modems, too. Smartphones are becoming as essential as keys or a wallet, and they’ll soon replace those as well.

Social Networking: Friended

Bandwidth, digital cameras, and a hunger for connectedness have created a virtual dinner party

A decade ago, it might have taken a new person in town months to make contacts, find places to hang out, and meet like-minded people. Now, with a few clicks of the mouse, you can get the job done through social networking—a communications revolution that began in fits and starts in the late 1990s and reached recognizable form in March of 2003, with the public launch of Friendster.
That ease of connection is exactly the kind of life Friendster promoted, before it was left in the dust by MySpace, which in turn was lapped by Facebook. Yet even Friendster climbed up on the shoulders of still-earlier pioneers. The most notable was Sixdegrees, arguably the first true social network. Twenty-eight-year-old New York businessman Andrew Weinreich launched Sixdegrees at a party in 1997, announcing that “with the click of a button,” the site would revolutionize human networking. Like Friendster, Sixdegrees let users identify their friends, their friends’ friends, and so forth. At its peak in 1999, it had attracted 3.5 million members.
Today, social networks of every flavor have proliferated worldwide. There’s the messaging service Tencent QQ, which is popular in China; Google’s Orkut, which is popular in Brazil; and Twitter, a blogging network that limits posts to 140 characters, which is popular just about everywhere. There’s Flickr for photo sharing and YouTube for video sharing. There’s LinkedIn for job networking and Classmates for finding long-lost school friends. And there’s a whole host of niche networks: Coastr for beer aficionados, Goodreads for bookworms, ResearchGATE for scientists, and Dogster for dog lovers.

Facebook conquered in part because it took to heart the lessons of its predecessors’ mistakes. For instance, its founder, Mark Zuckerberg, expanded from the company’s base at Harvard by adding one university at a time, ensuring that no new customers would come online until the servers could handle the additional traffic.

Facebook also lured Internet users with its sleek, easy-to-use interface and engineering wizardry. One of its most innovative features was Multi-Feed, which searches your friends’ databases for new updates and streams them to your home page as a continuous news feed. Facebook now contends with some 30 billion shared updates a month—a monumental processing feat that requires tens of thousands of servers.
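
Facebook’s actual Multi-Feed infrastructure is proprietary and runs across those thousands of servers, but the core idea is a k-way merge of per-friend update streams into one chronological feed. A toy sketch of that idea (all names and data here are hypothetical):

```python
import heapq

def merge_feeds(friend_feeds):
    """k-way merge of per-friend update streams into one news feed.

    Each stream is a list of (timestamp, author, text) tuples sorted
    newest-first; heapq.merge combines them lazily instead of
    concatenating and re-sorting everything.
    """
    return list(heapq.merge(*friend_feeds, key=lambda u: u[0], reverse=True))

alice = [(1700000300, "alice", "new photo"), (1700000100, "alice", "hello")]
bob = [(1700000200, "bob", "status update")]

feed = merge_feeds([alice, bob])
# feed is newest-first across both friends: alice, bob, alice
```

At Facebook’s scale the same merge has to happen across shards of many machines, which is what makes the feature an engineering feat rather than a one-liner.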

Voice Over IP: Setting Phone Service Free

For generations it was possible to grow up and grow old without outgrowing the telephone of your youth. Handsets stayed tethered to the wall, “long distance” remained a concept, costly service was a given. Then in the 1990s, when competition and cellphones began to free us from wires and monopolistic pricing, we marveled at what seemed a revolution. We were wrong: The real revolution began only in the 2000s, when it became clear that the Internet would be the telephone network of the world.

Look at this one fact: Skype, which launched only in 2003, now has more than half a billion registered users, making it the largest provider of telephony services in history. That’s good for Skype, but better still for its users, who pay exactly nothing when they talk Skype-to-Skype, no matter where on the planet they may be. The company charges only when you call a phone listed on a “real” telephone network.

Mark Spencer balked at paying tens of thousands of dollars for a telephone system—that is, a PBX, or private branch exchange—for his start-up company, so he wrote his own software-based switchboard. He called the software Asterisk, after the Unix wildcard symbol that stands for “everything.” It was only a few years later, in the early 2000s, that it dawned on him that people were more interested in the phone system than in the tech support service.

For the traditional telephone companies, this was the second blow of the old one-two punch: First they’d had some of their business siphoned off by VocalTec and ITXC; now Digium’s open-source software was cutting the cost of equipping a telco in the first place. Now pretty much anybody could set up shop as a VoIP provider.

“Telecom products were really expensive, and there was a real need for customization, especially in other countries [outside the United States],” Spencer recalls. “All these things lined up just right, so that when Asterisk came out, it was able to win a lot of attention.”

LED Lighting: Blue + Yellow = White

Giving LEDs the blues was the key to replacing the incandescent bulb

Back in the 20th century, just about the only LED you normally saw was the one that lit up when your stereo was on. By the noughties, tiny light-emitting diodes were also illuminating the display and keypads of your mobile phone. Now they are backlighting your netbook screen, and soon they’ll replace the incandescent and compact fluorescent lightbulbs in your home.

This revolution in lighting comes from the ever-greater bang the LED delivers per buck. In every decade since 1970, when red LEDs hit their stride, they have gotten 20 times as bright and 90 percent cheaper per lumen; the relation is known as Haitz’s law, and it applies also to the yellow and blue LEDs that were commercialized much later.
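
Compounded over the four decades from 1970 to 2010, those per-decade factors multiply out dramatically. A quick back-of-the-envelope calculation (function name hypothetical, baselines normalized to 1):

```python
def haitz_projection(decades, flux0=1.0, cost0=1.0):
    """Project LED performance under Haitz's law: per decade, light
    output per package rises about 20x while cost per lumen falls
    about 10x (i.e., gets 90 percent cheaper)."""
    flux = flux0 * 20 ** decades
    cost = cost0 / 10 ** decades
    return flux, cost

# Four decades, 1970 to 2010:
flux, cost = haitz_projection(4)
# flux == 160000.0 and cost == 0.0001 of their 1970 baselines
```

A 160,000-fold brightness gain at one ten-thousandth of the cost per lumen is what turned the LED from a power-on indicator into a household light source.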

Multicore CPUs: Processor Proliferation

Back in 1994, programmers figured that whatever code they wrote would run at least 50 percent faster on a 1995 machine and 50 percent faster still on a ’96 system. Coding would continue as it always had, with instructions designed to be executed one after the other.

It’s not that old, single-core CPUs weren’t already doing some parallel processing. When Stanford’s Kunle Olukotun began his pioneering multicore work in the 1990s, most microprocessors had a “superscalar” architecture. In the superscalar scheme, the CPU contained many replicated components, such as arithmetic units, and individual instructions were parceled out to whichever components were waiting. Scaling up such “instruction-level parallelism” meant building in more and more of those components as the years rolled by.
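
The multicore bet reverses that arrangement: instead of hardware hunting for parallelism one instruction at a time, the programmer must divide the work across cores explicitly. A minimal sketch of that division of labor (names hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker handles an independent slice of the data:
    # parallelism the programmer expresses explicitly, rather than
    # the CPU discovering it one instruction at a time.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    chunks = [data[i::workers] for i in range(workers)]  # strided split
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

total = parallel_sum_of_squares(range(1000))
# total matches the serial answer: sum(x * x for x in range(1000))
```

(In CPython, CPU-bound threads share one core because of the global interpreter lock; swapping in ProcessPoolExecutor gives a true multicore speedup. The structure of the program is the same either way, which is precisely the shift in programmer responsibility the multicore era demanded.)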

The insight that a GPU’s many simple cores excel at exactly the parallel workloads that bog down a CPU drove AMD to acquire a leading GPU maker, ATI Technologies, and to start jamming the two companies’ products together. So a future processor, from AMD at least, will probably contain multiple CPU cores connected to several GPU elements that step in whenever the work is of a type that would gum up a CPU core.


Cloud Computing: It’s Always Sunny in the Cloud

Cloud computing puts your desktop wherever you want it

Just 18 years ago the Internet was in its infancy, a mere playground for tech-savvy frontiersmen who knew how to search a directory and FTP a file. Then in 1993 it hit puberty, when the Web’s graphical browsers and clickable hyperlinks began to attract a wider audience. Finally, in the 2000s, it came of age, with blogs, tweets, and social networking dizzying billions of ever more naive users with relentless waves of information, entertainment, and gossip.

Welcome to cloud computing. We’ve been catapulted into this nebulous state by the powerful convergence of widespread broadband access, the profusion of mobile devices enabling near-constant Internet connectivity, and hundreds of innovations that have made data centers much easier to build and run. For most of us, physical storage may well become obsolete in the next few years. We can now run intensive computing tasks on someone else’s servers cheaply, or even for free. If this all sounds a lot like time-sharing on a mainframe, you’re right. But this time it’s accessible to all, and it’s more than a little addictive.

But for you and me, the days of disconnecting and holing up with one’s hard drive are gone. IT managers, too, will surely see their hardware babysitting duties continue to shrink. Cloud providers have argued their case well to small-time operations with unimpressive computing needs and university researchers with massive data sets to crunch through. But those vendors still need to convince Fortune 500 companies that cloud computing isn’t just for start-ups and biology professors short on cash. They need a few more examples like Netflix to prove that mucking around in the server room is a choice, not a necessity.

And we may just need more assurances that our data will always be safe. Data could migrate across national borders, becoming susceptible to an unfriendly regime’s weak human rights laws. A cloud vendor might go out of business, change its pricing, be acquired by an archrival, or get wiped out by a hurricane. To protect themselves, cloud dwellers will want their data to be able to transfer smoothly from cloud to cloud. Right now, it does not.

Drone Aircraft: How the Drones Got Their Stingers

Cruising silently overhead, an unmanned Predator aircraft uses its infrared camera to pinpoint the telltale muzzle flashes from a sniper’s rifle. The plane’s operators, located half a world away, then unleash a Hellfire missile from under its wing, using a laser mounted beneath the craft’s nose to guide the munition into the very window the sniper had been shooting from.

Such missions represent a technological tour de force, but they’ve played out so often over the past few years that they no longer make headlines. What might be news, though, is just how far back the roots of this stunning 21st-century military technology reach.

Planetary Rovers: Are We Alone?

Pete Theisinger stands at the back of the mission control room, his round, mustachioed face frozen in a nervous grin. Hunkered down at long rows of computer consoles, his engineers sit on the edges of their chairs. NASA’s Jet Propulsion Laboratory is hanging on the brink of a jubilant victory—or a devastating failure.

Then the black-and-white images appear on a big projection screen, and the room explodes in cheers. Some 200 million kilometers from Earth, a little robotic rover called Spirit, built here in Pasadena, Calif., has awakened and called home, sending images of what it is seeing. And what it is seeing is the rocky plain of Gusev Crater, in the southern highlands of Mars.

Flexible AC Transmission: The FACTS Machine

Flexible power electronics will make the smart grid smart

Power systems must juggle supply and demand while guaranteeing glitch-free alternating current 24/7. To deliver it, engineers once had no choice but to design grids that were as passive as the Roman aqueducts, which could carry water anywhere, so long as it went downhill. But over the past decade, a confluence of innovations, regulatory change, and sheer watt-squeezing necessity has hatched a marvelous advance, one that has begun to realize the long-standing dream of pushing current where it wouldn’t ordinarily go.

And it’s happening not a moment too soon. These flexible AC transmission systems, or FACTS, promise to save energy in a big way by making possible the smart grid, which utilities hope will reconfigure power flows in real time, maximizing throughput and minimizing losses. They should also make it possible to smoothly incorporate wind, solar, and other intrinsically intermittent sources of energy into the grid.

The key word in the FACTS acronym is the first one: flexible. Modern hydraulic engineers use pumps to push water against the force of gravity, so they save immensely on bridges and tunnels in comparison with their Roman predecessors. And think about how aeronautical engineers can manipulate control surfaces from second to second to keep aloft a plane that would otherwise be only slightly more flyable than a brick. Real-time control of power systems promises similar rewards.

Digital Photography: The Power of Pixels

Ten years ago, photography for the most part meant film. We carried rolls of it on vacation, dropped it off for processing when we got back, picked up our prints, then put them in albums or scrapbooks or, more typically, in cardboard boxes. On occasion, we thought about sending a duplicate to distant relatives, but we’d often forget. Photographs were for documenting our history, for framing, for saving.

What a difference a decade makes! The vast majority of us haven’t handled a roll of film in years—it’s a retro novelty at best. Digital technology has changed the very nature of photography. Digital images are free and easy and can be instantly distributed. As a result, the vast majority of photos are no longer taken to capture special moments; they’re used to communicate the ordinary, with less forethought than a phone call.

Of course, digital cameras didn’t simply materialize in our hands a few years ago, although it may seem like it. You could trace their history back to 1969, when the charge-coupled device (CCD) was invented at Bell Telephone Laboratories, or to 1957, when the first digital image scanner was created at the U.S. National Bureau of Standards.

Class-D Audio: The Power and the Glory

Even in the go-go world of high tech, it’s pretty rare that a technological leap delivers both markedly superior performance and stunningly greater efficiency. That neat trick happened with class-D audio amplifiers, which now dominate the market for applications in car stereos, home-theater-in-a-box systems, television sets, and personal computers.

Their success has been a long time coming. The first commercially available class-D amps came in the 1960s from the British company Sinclair Radionics (now Thurlby Thandar Instruments), but they didn’t work well. Somewhat better ones came along in the 1970s and 1980s: John Ulrick designed a couple of class-D amps for Infinity Systems in the early and mid-1970s, and a decade later Brian Attwood did a series of “digital energy conversion amplifiers” for Peavey Electronics Corp. But in those days class-D theory was ahead of implementation, because the available components simply weren’t good enough to produce high-quality sound.
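
The class-D theory in question is, at heart, pulse-width modulation: the output transistors switch fully on or off at a rate far above audibility, the ratio of on-time to off-time tracks the audio waveform, and a low-pass filter recovers the sound. Because the transistors never linger in their lossy linear region, efficiency soars. A toy modulator sketch (parameters illustrative, not any commercial design):

```python
def class_d_pwm(signal, carrier_freq, fs):
    """Compare each audio sample to a triangle carrier; emit +1 or -1,
    i.e. the output transistor is either fully on or fully off."""
    out = []
    for n, x in enumerate(signal):
        phase = (n * carrier_freq / fs) % 1.0
        tri = 4 * abs(phase - 0.5) - 1  # triangle wave in [-1, 1]
        out.append(1 if x > tri else -1)
    return out

# For a constant input of 0.5, about three-quarters of the pulses are
# high, so the low-pass-filtered average tracks the input level.
pulses = class_d_pwm([0.5] * 1200, carrier_freq=400, fs=48000)
```

What the early components couldn’t do well was switch fast and cleanly enough; until they could, the scheme’s distortion swamped its efficiency advantage.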

Next-to-the-Best Technologies of 2000-2010

E-paper
Paper or Plastic?

Tablets
The New Computing Covenant
Apple brings tablets down from the mountain

MP3
Compress Me a Song
A German researcher took us from albums to algorithms

DVDs
Time to Eject
The rise and fall of the optical disc

HD Radio
The End of Analog
AM and FM go HD

Flat-panel TVs
Underdog LCD went from desktop to wall mount

Digital 3-D
Better, Bit by Bit
With digital cinematography, 3-D finally makes sense




About qianggan

Sr. Software Engineer
This entry was posted in Computers and Internet.
