Friday, January 30, 2009
Solar power will run a ventilation system that can cool the car without help from the engine.
Solar cars still a way off
Toyota's third-generation Prius, due at dealerships this spring, will have an optional solar panel on its roof. The panel will power a ventilation system that can cool the car without help from the engine, Toyota says.
But it's a long way from the 2010 Prius to a solar-powered car, experts told CNN. Most agree that there just isn't enough space on a production car to get full power from solar panels.
"Being able to power a car entirely with solar is a pretty far-reaching goal," said Tony Markel, a senior engineer at the federal government's National Renewable Energy Lab in Golden, Colorado.
In the new Prius, the solar panel will provide energy for a ventilation fan that will help cool the parked car on sunny, hot days. The driver can start the fan remotely before stepping into the car. Once the car is started, the air conditioning won't need as much energy from a battery to do the rest of the cooling.
"The best thing about using solar is that regardless of what you end up using it for, you're trying to use it to displace gasoline," added Markel.
The question is, how much gasoline can solar power offset? Markel said his lab has modified a Prius to use electricity from the grid for its main batteries and a solar panel for the auxiliary systems. He believes the car gets an additional 5 miles of electric range from the panel.
According to recent articles in Japan's Nikkei newspaper, Toyota has bigger plans for harnessing power from the sun. Nikkei reports that Toyota hopes to develop a vehicle powered entirely by solar panels. The project will take years, the paper reported.
When contacted by CNN, however, a Toyota spokeswoman denied the existence of the project.
"At this time there are no plans that we know of to produce a concept or production version of a solar-powered car," said Amy K. Taylor, a communications administrator in Toyota's Environmental, Safety & Quality division.
Motorists don't have to wait for a 2010 Prius to drive a solar-enhanced car, however. Greg Johanson, president of Solar Electric Vehicles in Westlake Village, California, said his company makes a roof-mounted panel for a standard Prius that enables the car to travel up to 15 additional miles a day.
The system costs $3,500, and it takes about a week to make one, Johanson said. Billy Bautista, a project coordinator at the company, said Solar Electric Vehicles gets so many requests for the system that there is a backlog of several months.
The company's Web site says motorists can install the panels themselves, although it recommends finding a "qualified technician."
The system delivers about 165 watts to an added battery, which helps power the electric motor, Johanson said.
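As a rough sanity check on those range figures, here is a minimal back-of-envelope sketch in Python. The sun-hours and watt-hours-per-mile numbers are assumptions chosen for illustration, not figures from the article.

# Back-of-envelope estimate (illustrative only; the per-mile figure and the
# hours of usable sun are assumptions, not numbers from the article).
panel_watts = 165           # claimed output of the roof-mounted panel
sun_hours_per_day = 5       # assumed hours of usable sunlight
wh_per_electric_mile = 165  # assumed energy per mile of electric-assist driving

energy_wh = panel_watts * sun_hours_per_day      # about 825 Wh captured per day
extra_miles = energy_wh / wh_per_electric_mile   # about 5 miles of added range
print(f"{energy_wh:.0f} Wh/day is roughly {extra_miles:.1f} extra electric miles")

Under those assumptions the panel buys roughly the 5 extra miles Markel describes; the 15-mile claim would need sunnier conditions or gentler driving.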
But others said it would take a lot more power than that to replace an internal combustion engine.
Eric Leonhardt, director of the Vehicle Research Institute at Western Washington University, said that even if solar cells worked far better than they do today, they wouldn't generate enough power for driving substantial distances. The best cells operate at about 33 percent efficiency, but the ones used on vehicles are only about 18 percent efficient, he said.
Leonhardt said it would be more practical to use solar power to help charge a car's battery and use the more efficient panels mounted on a roof or over a parking area to supply the rest of the electricity needed to drive the engine.
"Solar panels really need a lot of area," he said.
Leonhardt thinks Toyota's new Prius is a good first step toward using renewable energy. Some cars get hotter than 150 degrees inside when parked in the sun, so reducing the temperature could mean Toyota could use a smaller AC unit, he added.
Johanson of Solar Electric Vehicles said he'd like to see Toyota bring the weight of a Prius down from 3,000 pounds to 2,000. He also hopes for a small gasoline engine and a larger electric motor. That will probably come in the future, when Toyota unveils a plug-in model.
In the meantime, Solar Electric Vehicles sells its version of a plug-in Prius, with a solar panel installed, for $25,000, Bautista said.
Toyota is the largest automaker to incorporate solar power into a mass-produced car. But its solar panel is not the first for a car company. Audi uses one on its upscale A8 model, and Mazda tried one on its 929 in the 1990s.
In addition, a French motor company, Venturi, has produced an electric-solar hybrid. The Eclectic model costs $30,000, looks like a souped-up golf cart and uses roof-mounted solar panels to help power an electric engine. It has a range of about 30 miles and has a top speed of about 30 mph.
About solar vehicles
Borealis III leads the way during the 2005 North American Solar Challenge, passing by Lake Benton, Minnesota.
A solar vehicle is an electric vehicle powered by renewable energy: solar energy obtained from solar panels on the surface (generally the roof) of the vehicle. Photovoltaic (PV) cells convert the Sun's energy directly into electrical energy. Solar vehicles are not practical day-to-day transportation devices at present, but are primarily demonstration vehicles and engineering exercises, often sponsored by government agencies.
Solar cars
Solar cars combine technology typically used in the aerospace, bicycle, alternative energy and automotive industries. The design of a solar vehicle is severely limited by the energy input into the car (batteries and power from the sun). Virtually all solar cars ever built have been for the purpose of solar car races (with notable exceptions).
Like many race cars, the driver's cockpit usually only contains room for one person, although a few cars do contain room for a second passenger. They contain some of the features available to drivers of traditional vehicles such as brakes, accelerator, turn signals, rear view mirrors (or camera), ventilation, and sometimes cruise control. A radio for communication with their support crews is almost always included.
Solar cars are often fitted with gauges as seen in conventional cars. Aside from keeping the car on the road, the driver's main priority is to keep an eye on these gauges to spot possible problems. Cars without gauges available for the driver will almost always feature wireless telemetry. Wireless telemetry allows the driver's team to monitor the car's energy consumption, solar energy capture and other parameters and free the driver to concentrate on just driving.
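To make the telemetry idea concrete, here is a small, hypothetical sketch of the kind of data frame a support crew might watch; the field names and thresholds are invented for illustration and are not taken from any particular team.

from dataclasses import dataclass

@dataclass
class TelemetryFrame:
    """One hypothetical reading radioed from the car to the chase vehicle."""
    battery_voltage: float   # volts
    battery_current: float   # amps drawn from the pack (negative while charging)
    array_power: float       # watts coming in from the solar array
    speed_kph: float

def flag_problems(frame: TelemetryFrame, min_voltage: float = 90.0) -> list[str]:
    """Return warnings the team would want to see immediately."""
    warnings = []
    if frame.battery_voltage < min_voltage:
        warnings.append("battery voltage low")
    draw_watts = frame.battery_voltage * frame.battery_current
    if draw_watts > 2 * frame.array_power:
        warnings.append("drawing more than twice what the array supplies")
    return warnings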
Electrical and mechanical systems
The electrical system is the most important part of the car's systems, as it controls all of the power that comes into and leaves the system. The battery pack plays the same role in a solar car that a petrol tank plays in a conventional car: storing power for future use. Solar cars use a range of batteries, including lead-acid, nickel-metal hydride (NiMH), nickel-cadmium (NiCd), lithium-ion and lithium-polymer batteries.
Many solar race cars have complex data acquisition systems that monitor the whole electrical system while even the most basic cars have systems that provide information on battery voltage and current to the driver.
The mechanical systems of a solar car are designed to keep friction and weight to a minimum while maintaining strength. Designers normally use titanium and composites to ensure a good strength-to-weight ratio.
Solar cars usually have three wheels, but some have four. Three wheelers usually have two front wheels and one rear wheel: the front wheels steer and the rear wheel follows. Four wheel vehicles are set up like normal cars or similarly to three wheeled vehicles with the two rear wheels close together.
Thursday, January 29, 2009
Google has released new online tools that will diagnose your network connection and performance.
The set of tools, at MeasurementLab.net, includes a network diagnostic tool, a network path diagnostic tool and a tool to measure whether the user's broadband provider is slowing BitTorrent peer-to-peer (P-to-P) traffic. Coming soon to the M-Lab applications is a tool to determine whether a broadband provider is giving some traffic a lower priority than other traffic, and a tool to determine whether a provider is degrading certain users or applications.
Think your Internet Service Provider (ISP) is messing with your connection performance? Now you can find out, with Google's new online tools that will diagnose your network connection.
Here's a quick walkthrough on how to make the best of them.
Google's broadband test tools are located at Measurementlab.net. On that page, you'll see an icon that says "Users: Test Your Internet Connection". Click that, and you'll be taken to a page with three tests available and two more listed as coming soon. However, of the three available tests, only one is fully automated and easy to use.
Glasnost, second on the list, will check whether your ISP is slowing down (as Comcast has done) or blocking peer-to-peer (P2P) downloads from software such as BitTorrent. P2P apps are commonly used for illegally downloading software and media content like movies and music, but they are also used for legitimate purposes, such as distributing large software packages to many users at once.
To use the measurement tool, you will be redirected to the Glasnost site. You'll need the latest version of Java installed, and you should stop any large downloads that you may have running before you begin the test. If you're on a Mac, a popup message will prompt you to trust the site's Java applet.
When you're ready to start, you can choose whether you want to run a full test (approximately 7 minutes long) or a simple test (4 minutes long). When I tried to test my connection, Glasnost's measurement servers were overloaded and an alternative server was offered, but that was overloaded as well. After a short while I was able to run the test.
In the tests of my connection (my provider is Vodafone At Home, in the UK) all results indicated that BitTorrent traffic is not blocked or throttled. But I'm looking forward to hearing from you in the comments how your ISP performed in Glasnost's diagnostics. Meanwhile, make sure you keep an eye on the other tests that will be available soon from Measurementlab.net.
Wednesday, January 28, 2009
Britannica reaches out to the web
Under the plan, readers and contributing experts will help expand and maintain entries online.
Experts will also be enrolled in a reward scheme and given help to promote their command of a subject.
However, Britannica said it would not follow Wikipedia in letting a wide range of people make contributions to its encyclopaedia.
User choice
"We are not abdicating our responsibility as publishers or burying it under the now-fashionable 'wisdom of the crowds'," wrote Jorge Cauz, president of Encyclopaedia Britannica in a blog entry about the changes.
He added: "We believe that the creation and documentation of knowledge is a collaborative process but not a democratic one."
Britannica plans to do more with the experts that have already made contributions. They will be encouraged to keep articles up to date and be given a chance to promote their own expertise.
Selected readers will also be invited to contribute and many readers will be able to use Britannica materials to create their own works that will be featured on the site.
However, it warned these would sit alongside the encyclopaedia entries and the official material would carry a "Britannica Checked" stamp, to distinguish it from the user-generated content.
Alongside the move towards more openness will be a redesign of the Britannica site and the creation of web-based tools that visitors can use to put together their own reference materials.
Tech analysts expect Amazon.com (AMZN) to open the cover on Kindle 2
Amazon CEO Jeff Bezos is expected to attend. But Amazon won't reveal the plot. "We are not sharing details," Amazon director of communications Drew Herdener wrote in an e-mail. The company has said there will be a new version of Kindle sometime this year.
Paperback-size e-book readers such as the Kindle or rival Sony Reader let bookworms cart a boatload of titles — more than 200 in the case of the first Kindle. But Kindle's real advance was in its wireless Whispernet network (built on top of Sprint's speedy EV-DO wireless network). Readers could search for and sample books, blogs and periodicals (including USA TODAY) right on the device and purchase new content in under a minute. Best sellers typically cost $9.99.
Amazon won't disclose Kindle sales. Mark Mahaney, director of Internet research at Citigroup Investment Research, estimates Amazon sold about 400,000 units last year and that Kindle hardware and book sales will contribute about $1 billion to Amazon's revenue in 2010. "It's pretty clear this is the iPod of the book world," he says. Mahaney also expects the new Kindle to drop to around $300, from $359. Minor design glitches will likely also be addressed. Pundits have criticized Kindle for its clumsy button layout and homely appearance.
Amazon underestimated demand for the first Kindle, which is still difficult to come by. Amazon's website says Kindle is sold out due to "heavy customer demand." Orders are expected to be shipped in four to six weeks, the website indicates.
What isn't clear, of course, is whether buyers will receive the first Kindle or the sequel. Whatever Amazon trots out, Tim Bajarin, president of the Creative Strategies consulting firm, doesn't expect shortages to be a major issue. "This time they at least know what the sales cycles have looked like," Bajarin says. "I have to believe they're going to be smarter about building and managing inventory."
"We're fairly sure that it will be a new Kindle, one that will feature a color screen and a better battery life," said Richard Doherty, a consumer electronics analyst with the Envisioneering Group.
Doherty, who keeps close tabs on companies that supply parts for the Kindle and other devices, said Amazon had been working for much of 2008 on a successor to its unexpectedly popular reading device. But Amazon's plans to release the product in time for Christmas were derailed when the online merchant was overwhelmed with orders, Doherty said. As a result, those who ordered a Kindle in December were told to wait until February or March for the device.
The Boy Genius Report has some photos it says are of the next version of the Kindle.
Another possible change: a sleeker design that relocates the page-forward and page-back buttons so users would be less likely to hit them accidentally. That's a major complaint about the current Kindle, said Tim Bajarin, electronics analyst with Creative Strategies.
The new device is expected to update the Kindle's rather clunky looks and add some design touches aimed at making it easier to use. It should probably get the color screen treatment but it's unclear if it will go to a touch screen. One of the gripes has been inadvertent page turns, which most observers expect will get addressed.
The Boy Genius Report has pictures from last fall that show a new Kindle with rounded edges and buttons.
Despite its awkward looks, the Kindle has sold well even at its $359 price, down from its original $399 price. Amazon sold more than 250,000 units in the first year and the device is still shipping with a 4-6 week delay.
People have enjoyed the way the Kindle offers easy access to 225,000 books, which can be downloaded wirelessly over a cellular connection. The Kindle, however, faces competition from Sony's eReader and also down the road from devices like the iPhone and iPod Touch.
Hidalgo County, Texas, is considering a $500,000 project that would blanket the city with a wireless Internet system
Negotiations are still in extremely preliminary stages — and both the city and contractor say a timetable isn't set — but leaders have expressed intrigue at the prospect of a system that can seemingly meet their wildest high-tech fantasies.
"The possibilities for the future are really interesting," Pharr City Manager Fred Sandoval said.
Bobby Vassallo, a wireless Internet consultant, has met with the City Commission twice over the last six weeks to help pitch the concept of a wireless Internet "clothesline" that could help the city handle everything from police video surveillance to wireless water meter-reading.
Behind the pitch is Brownsville businessman Oscar Garza, who leads the corporation Valley Wireless Internet Holdings.
Sandoval emphasized that the city hasn't made any decisions yet.
"It's a very interesting concept," he said. "We definitely want to be at the forefront."
REGION-WIDE
Pharr isn't alone in its consideration of wireless systems.
While wireless Internet is already the standard in some large cities, the technology now seems to be taking root in the Rio Grande Valley.
Cities across the region are pursuing high-tech, wireless Internet options that have the potential to promote efficiency in virtually all municipal departments by keeping workers in the field connected to City Hall.
Using wireless "mesh" systems, cities can provide Internet access over a large area to their employees through a series of nodes attached to structures like water towers or streetlights.
That means building inspectors could send reports back to City Hall from a work site, traffic citations could appear in court computers almost instantly, and police could set up surveillance cameras without fear of their cables being cut.
McAllen is already moving forward with plans to install up to 120 surveillance cameras throughout the city, which will be connected wirelessly to a fiber-optic cable running through the city.
The cameras would be served by a downtown wireless network, which could also provide support to other city workers in the area.
Last summer, a pilot program provided wireless to city workers in Bill Schupp Park. McAllen is currently soliciting proposals from vendors and is scheduled to meet with them today.
The focus of McAllen's project would be city usage, but eventually it could be opened up to residents, said Belinda Mercado, McAllen's information technology director.
Meanwhile, Hidalgo leaders are examining the possibility of creating a citywide blanket of wireless Internet similar to the one Pharr is examining. The system would provide access to emergency responders and residents on two separate networks, explained Rick Mendoza, Hidalgo's information technology director.
He said the talks are in preliminary stages and price estimates aren't available. But the city would like to offer Internet service to residents at no cost.
"We want to offer Internet service to members of our community who don't have the means of getting either DSL or cable," Mendoza said.
He added that a citywide wireless network would help Hidalgo compete with neighboring cities.
Edinburg leaders have also discussed the possibility of creating some sort of wireless system that would include various hot spots throughout the city, though they are only in discussions and the city hasn't started talks with any specific vendors.
Brownsville officials, meanwhile, expect their $6.6 million wireless project to be operational within four months, Mayor Pat Ahumada said.
The city is erecting signal towers, which will provide wireless access to city employees, utility workers and emergency responders, though it remains to be seen how much access the general public will have.
COST
The systems don't come cheap, however.
The network being pitched to Pharr could cost as much as $500,000 for the initial infrastructure, $25,000 a month to operate and even more for cameras, wireless water meters and other high-tech equipment needed to actually take advantage of the system.
At a time when cities across the region are struggling financially, at least some have questioned whether the cost of such an ambitious undertaking can be justified.
Pharr is just starting to climb out from under its financial woes after it wiped out its reserves last year.
"I believe the No. 1 question we should be asking, besides ‘Can we afford this?' is ‘Do we need it?'" said Pharr Finance Director Juan Guerra at a city workshop earlier this month. "From what I'm hearing ... I'm not sure if we do."
TIMING
Interestingly, the Valley's pursuit of wireless comes as cities elsewhere are struggling with their Wi-Fi projects.
Internet service provider Earthlink, which has partnered with Philadelphia, Houston and other large cities on wireless programs, announced layoffs within its municipal division in November. The company told shareholders it no longer makes sense for Earthlink to invest in municipal wireless.
As a result, some community wireless projects have been put on hiatus.
Earlier in the decade, companies like Earthlink offered to provide wireless systems at virtually no cost to cities. In exchange, the networks were privately owned, and the companies could charge subscription fees to consumers or hit them with advertising.
That model is changing, as it has become apparent that broadband access is becoming more readily available and affordable to consumers.
Today, cities are designing the systems for themselves to meet their own needs, such as giving support to emergency workers or keeping public works employees connected while in the field.
Those purpose-driven networks — as opposed to ones that are simply designed to give residents Internet access — are the ones that are now poised to succeed, writes Governing magazine's Christopher Swope, an expert on municipal wireless systems.
Vassallo, the wireless Internet consultant, emphasized to Pharr leaders that the city could create some public hot spots, but providing all-encompassing Internet service to residents isn't worth the cost or stress to the city.
Regardless of how, exactly, Pharr's and other cities' projects take shape, advocates say it's high time the Valley embraced wireless.
Tuesday, January 27, 2009
Google will begin to offer browser-based offline access to its Gmail Webmail application
This functionality, which will allow people to use the Gmail interface when disconnected from the Internet, has been expected since mid-2007.
That's when Google introduced Gears, a browser plug-in designed to provide offline access to Web-hosted applications like Gmail.
Gears is currently used for offline access to several Web applications from Google, like the Reader RSS manager and the Docs word processor, and from other providers like Zoho, which uses it for offline access to its e-mail and word processing browser-based applications.
Rajen Sheth, senior product manager for Google Apps, said that applying Gears to Gmail has been a very complex task, primarily because of the high volume of messages accounts can store. "Gmail was a tough hurdle," he said.
Google ruled out the option of letting users replicate their entire Gmail inboxes to their PCs, which in many cases would translate into gigabytes of data flowing to people's hard drives. It instead developed algorithms that will automatically determine which messages should be downloaded to PCs, taking into consideration a variety of factors that reflect their level of importance to the user, he said. At this point, end-users will not be able to tweak these settings manually.
"We had to make it such that we're managing a sizable amount of information offline and doing it well in a way that's seamless to the end-user," he said.
For example, in Gmail, users can put labels on messages, as well as tag them with stars to indicate their importance, and Google can use that information to determine which messages to download. Sheth estimates that in most cases Gmail will download several thousand messages, preferring those that are more recent as well. Depending on the amount of messages users have on their accounts, they may get downloads going back two months or two years, he said.
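Google has not published the selection algorithm, but a toy sketch of the kind of scoring Sheth describes (recency, stars and labels) might look like the following. The weights, the message format and the cache size are assumptions, not Gmail internals.

from datetime import datetime

def score(msg: dict, now: datetime) -> float:
    """Toy importance score: newer, starred and labeled mail ranks higher."""
    age_days = (now - msg["date"]).days
    s = max(0.0, 100.0 - age_days)          # recency dominates
    if msg.get("starred"):
        s += 50.0
    s += 10.0 * len(msg.get("labels", []))
    return s

def pick_for_offline(messages: list[dict], budget: int = 5000) -> list[dict]:
    """Keep the highest-scoring messages that fit the offline cache budget."""
    now = datetime.utcnow()
    return sorted(messages, key=lambda m: score(m, now), reverse=True)[:budget]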
Google will begin to roll out the Gmail offline functionality Tuesday evening and expects to make it available to everybody in a few days, whether they use Gmail in its standalone version or as part of the Apps collaboration and communication suite for organizations.
While the feature was "rigorously" tested internally at Google, it is a first, early release that Google expects to iterate on and improve. That's why it's being released under the Google Labs label. Users are encouraged to offer Google feedback.
Users have been able to manage their Gmail accounts offline via other methods for years, since Gmail supports the POP and IMAP protocols that let people download and send out messages using desktop e-mail software like Microsoft Outlook and others.
However, the Gears implementation will let people work within the Gmail interface without the need for a separate PC application. When offline, messages will be put in a Gears browser queue, and the desktop and online versions of the accounts will be synchronized automatically when users connect to the Internet again. This will come in handy for people who travel a lot and often find themselves without Internet access, Sheth said.
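The offline queue behaves roughly like a local outbox that drains when the connection returns. Below is a minimal, hypothetical model of that flow; it is not Gears code, just an illustration of the behavior described.

class OfflineOutbox:
    """Toy model of the offline send queue: messages composed while
    disconnected wait locally and go out when connectivity returns."""

    def __init__(self, send_func):
        self._send = send_func   # callable that actually talks to the server
        self._queue = []
        self.online = False

    def send(self, message):
        if self.online:
            self._send(message)
        else:
            self._queue.append(message)   # park it in the local outbox

    def set_online(self, online: bool):
        self.online = online
        while online and self._queue:     # flush the outbox on reconnect
            self._send(self._queue.pop(0))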
To activate the offline functionality, users of standalone Gmail service and the standard Apps edition should click "settings" after logging on to their Gmail account. There, they should click on the "Labs" tab, select "Enable" next to "Offline Gmail" and click "Save Changes." A new "Offline" link will then appear in the right-hand corner of the account interface. Users of the Education and Premier Apps versions will have to wait for their Apps administrators to enable Gmail Labs for everyone on the domain first.
Google is also rolling out Gears-based offline access for its Calendar application. However, it will be for now read-only and exclusively available to Google Apps account holders. Previously, Google introduced read-only offline access to the Spreadsheet and Presentation applications in Google Docs, which is also part of Google Apps.
Pegged as an "experimental" feature, the app is aimed at maintaining Gmail's functionality even when you're not online. Built on Google's Gears platform, the feature, once enabled, downloads a cache of your mail to your PC. When you're connected to the Web, it syncs the cache with the Gmail servers.
While you're offline, you can read, star, and label messages. If you send a message when you're offline, Gmail places it in your outbox and sends it as soon as you log back in. A special "flaky connection" setting splits the difference between on and offline modes ("when you're 'borrowing' your neighbor's wireless," says Google), utilizing a local cache while syncing it with the online version.
UCLA researchers have reprogrammed human induced pluripotent stem cells into germ line cells
“This finding could be important for people who are rendered infertile through disease or injury. We may, one day, be able to replace the germ cells that are lost,” said Amander Clark, a Broad Stem Cell Research Center scientist and senior author of the study. “And these germ cells would be specific and genetically related to that patient.”
Theoretically, an infertile patient’s skin cells, for example, could be taken and reprogrammed into iPS cells, which, like embryonic stem cells, have the ability to become every cell type in the human body. Those cells could then be transformed into germ line precursor cells that would eventually become eggs and sperm. Clark cautioned, however, that scientists are still many years from using these cells in patients to treat infertility. There is still much to be learned about the process of making high quality germ cells in the lab.
In another important finding, Clark’s team discovered that the germ line cells generated from human iPS cells were not the same as the germ line cells derived from human embryonic stem cells. Certain vital regulatory processes were not performed correctly in the human iPS derived germ cells, said Clark, an assistant professor of molecular, cell and developmental biology.
So it’s crucial, Clark contends, that work continue on the more controversial human embryonic stem cells that come from donated, excess material from in vitro fertilization that would otherwise be destroyed.
When germ cells are formed, they need to undergo a specific series of biological processes, an essential one being the regulation of imprinted genes. This is required for the germ cells to function correctly. If these processes are not performed correctly, the resulting eggs or sperm are at high risk of not working as they should. This has significant consequences, given that the desired outcome is a healthy child.
“Further research is needed to determine if germ line cells derived from iPS cells, particularly those which have not been created by retroviral integration, have the ability to correctly regulate themselves like the cells derived from human embryonic stem cells do,” Clark said. “When we looked at the germ cells derived from embryonic stem cells, we found that they regulated as expected, whereas those from the iPS cells were not regulated in the same way. We need to do much more work on this to find out why.”
The new president signed an executive order on Friday that ended the ban on giving taxpayer money to international family planning groups that offer abortions or provide related information. The assistance was available from the Agency for International Development during the Clinton administration but banned during the Reagan and both Bush administrations.
Obama also is expected to restore funding for the U.N. Population Fund, which George W. Bush had rejected on the contention that it supported a Chinese family planning policy of coercive abortion and involuntary sterilization, an allegation that the agency vehemently denied. In fact, the lifting of the bans will reduce unintended pregnancies, abortions and the deaths of women from high-risk pregnancies.
The signing came a day after the Food and Drug Administration allowed the world's first clinical trial of a treatment derived from human embryonic stem cells for spinal cord injury. The therapy uses an old embryonic stem cell line that was allowed under the Bush administration, but the approval might have been delayed until Bush left office.
The Bush administration restricted federal financing for embryonic stem cell research because creation of the cells entailed destruction of human embryos, even though they had been destined for the trash. President Obama has pledged to remove some of the financial restrictions.
"Camera Phone Predator Alert Act" to protect citizens from being photographed illegally, without us knowledge
Congress Intros Bill to Force Cell Camera Sounds
The Camera Phone Predator Alert Act (H.R. 414) is the real deal. Fresh off the legislative desk of New York Representative Peter King (R), the bill--currently cosponsored by goose egg--would require an audible tone to accompany all cellular phones with an installed camera that are created in the U.S. This tone, likely a clicking noise of some sort, would sound, "within a reasonable radius of the phone whenever a photograph is taken with the camera in such phone." And don't think that evildoers would be able to conceal their predatory ways by flicking an iPhone-style audio toggle switch. Any mobile phones built after the bill becomes a law would be prohibited from including any way to eliminate or reduce the volume of said noise.
Camera Click Sound to be Legal Requirement
The draft of the legislation also mentions that the click sound should be audible within a "sensible" distance.
Monday, January 26, 2009
The world's coolest ear buds
Other clues: In the teeth of the worst recession in generations, the five-year-old private company is growing like a weed. And it just scored a round of funding, from private-equity shop Goode Partners, at a time when investment dollars are scarce.
If the name Skullcandy doesn't register, it will with your kids (so will the term half pipe, which is a ramp, in this case for skateboarding, shaped like a pipe cut in half lengthwise).
Skullcandy's business is headphones, and it dominates the 12- to 25-year-old demographic with a line-up of gear covered in faux gator skin, gold foil, rhinestones and hip hop-inspired graphics. Pull back the hoody on any kid riding a snowboard in Park City, Utah, and chances are pretty good that a pair of Skullcandy headphones, probably the top-selling "Smokin' Buds," will be pumping music into their ears.
Making electronics cool
From a distant No. 10 three years ago, Skullcandy is now North America's third-largest manufacturer of headphones by unit sales, behind consumer electronics giants Philips Electronics (PHG) and Sony (SNE), according to NPD Group. "We'll be No. 2 soon," predicted Skullcandy president Jeremy Andrus, legs dangling from the office half pipe. "My guess is some time next year."
After that, Skullcandy and the band of snowboarders, skaters, surfers and DJs that founder Rick Alden has assembled in Park City, will be gunning for No. 1. That is, if Alden, the CEO and creative madman to Andrus' operations guru, can figure out a way to do it without diluting the company's cool factor.
Skullcandy didn't invent headphones; what the company has done is make them into a fashion item. Kids don't want one pair, they want five. "We're like sunglasses," Alden said. "Except we sit on top of your head, and you wear them a lot more."
Skullcandy headphones are not the type you will hear audiophiles gushing about. They are mostly solid-sounding pieces of affordable gear that, unlike Sony's grey and black headphones, or Apple's white, don't disappear into the background. On the contrary, they make a statement. The snowboard, surf and skate inspired graphics and colors ask for attention, and speak to a lifestyle, or in most cases, a wannabe lifestyle.
Successful clothing brands are able to evoke that lifestyle magic, but it is the rare consumer electronics company that does it. Apple (AAPL, Fortune 500) with its iPod is the obvious and most successful current example. Skullcandy has pulled it off so far, and in doing so sent revenue from essentially zero to approaching $100 million in just a few years. Sales more than doubled in 2008.
To put Skullcandy's momentum in perspective: when many consumer electronics companies saw sales fall off a cliff in November, Skullcandy's quadrupled year over year, according to Andrus.
That success is obviously gratifying to Alden, but it also has him worried about overexposure. "I was at the mountain riding with my son the other day, and everyone I saw was wearing Skullcandy headphones, I mean they were everywhere," Alden said. "I may go back to wearing black Sonys just to be different."
He's kidding, but his concern is real. Alden and his design team need to keep Skullcandy fresh, so it doesn't fall out of fashion and black becomes the new black. Fortunately the Skullcandy team has a secret weapon when they seek inspiration, design-wise and business-wise.
"We head to the mountain," Alden said, checking for the latest snowfall report on his laptop. "No good ideas ever come from sitting in an office, not around here at least."
There is great disagreement about:
Whether earbuds could potentially sound good, given their small size.
Whether any actual earbuds sound good, or whether the whole idea needs further development.
Which earbuds sound good and which sound bad.
Which of the expensive ($40-$80) earbuds sound so good that the extra cost is justified.
After testing many headphones and earbuds and applying my extensive experience tweaking equalizers, I think that earbuds actually have the potential to sound even *better* than standard headphones. In any case, all headphones and earbuds need a new approach: a calibrated equalization curve built into the player, to yield flat response. Megabass is a step toward such a compensation curve.
Like the Etymotics, earbuds have the potential to have smoother response than even the best popular standard headphones, such as the Sennheiser 580's. I've dialed in some truly vibrant, open sound using equalization together with $10 earbuds. It is easy and straightforward to equalize earbuds; just do anti-rolloff to a greater or lesser degree, and leave the rest flat; there aren't mysterious jags hidden along the entire spectrum that need unique shapes of compensation. I'd rather trust my ears than the common assumption that earbuds are inferior. If the conditions are right and the appropriate, ordinary EQ compensations are made, earbuds can be superior, rather than inferior, to good standard headphones. It's simply a matter of starting with a decent earbud driver, and providing the inverse of the earbud driver's frequency response.
If someone shows me a measured response curve of an earbud and it's rough and jagged, I will change my view somewhat, but in any case, I think that eq-compensated earbuds at least *can sound* unusually smooth and natural. Players need more fancy curves to compensate for specific earbud models.
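Here is a minimal sketch of that compensation idea, assuming a made-up measured response for a cheap earbud: the EQ curve is simply the inverse of the measured curve, which amounts to the anti-rolloff adjustment described above.

import numpy as np

# Made-up measured response of a cheap earbud: flat in the mids, rolled off
# in the bass. Real compensation would start from an actual measurement.
freqs_hz = np.array([60, 120, 250, 500, 1000, 4000, 10000])
measured_db = np.array([-9.0, -5.0, -2.0, 0.0, 0.0, 1.0, -1.0])

# The compensation curve is simply the inverse of the measured response.
eq_gain_db = -measured_db

for f, g in zip(freqs_hz, eq_gain_db):
    print(f"{f:>6} Hz: apply {g:+.1f} dB")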
"Though I like the R3 stock earbuds even better than the 888's, I can't stop seeking for even better sound, as I believe it can be a lot better. If I press against an earbud I get very powerful bass, so it is possible. I will keep on looking, and if I find something interesting I will let you know. Please let me know your findings on this matter." (from a private email to me)
Some people haven't been lucky and haven't heard the one or two models that are really good. No wonder they think earbuds are a poor packaging and sound poor. I was starting to suspect that *some* Sony stock earbuds (included with the player) sound great, and some sound lousy.
Internet Explorer 8 Focuses on Better Security and Privacy
Microsoft's updated browser, Internet Explorer 8, promises an assortment of new features designed to make Web browsing with IE safer, easier, and more compatible with Internet standards. We looked at the first release candidate (RC1) of the new browser, released to the public today. On the surface, IE 8 seems to be a lot like IE 7, but Microsoft has made a number of changes under the hood. You may have seen some of these new features already, however, in IE's no-longer-upstart competitor, Mozilla Firefox 3.
Tabbed Browsing
Perhaps the most novel addition in IE 8 is what Microsoft calls tab isolation. The feature is designed to prevent a buggy Web site from causing the entire Web browsing program to crash. Instead, only the tab displaying the problematic page will close, so you can continue browsing.
Of course, IE 8 RC1 retains some of the features introduced in the first beta, including WebSlices and accelerators; see "Updated Web Browsers: Which One Works Best?" for more details.
Improved Security
Microsoft touts IE 8 as its most secure browser to date, and Microsoft has indeed added a good number of security features to the mix, ranging from phishing detection to private browsing, plus a new feature to prevent clickjacking, an emerging data theft threat.
IE 8 RC1 includes two security features under the 'InPrivate' label: InPrivate Browsing and InPrivate Filtering. Both existed in earlier prerelease versions of IE 8, but IE 8 RC1 lets you use the two features separately, whereas before each relied on the other.
If you enable IE 8's InPrivate Browsing feature, the browser will not save any sensitive data--passwords, log-in info, history, and the like. Afterward it will be as if your browsing session had never happened. This feature is very similar to Private Browsing in Apple's Safari browser, except that an icon in IE's address bar makes InPrivate Browsing's active status more obvious.
InPrivate Filtering--called InPrivate Blocking in earlier IE 8 builds--prevents sites from being able to collect information about other Web sites you visit. This feature existed in IE 8 Beta 2, but you could use it only while using InPrivate Browsing. In RC1, you can use InPrivate Filtering at any time.
The browser's phishing filter--called SmartScreen--improves on its predecessor's filter with such features as more-thorough scrutiny of a Web page's address (to protect you from sites named something like paypal.iamascammer.com) and a full-window warning when you stumble upon a suspected phishing site. SmartScreen relies largely on a database of known phishing sites, so new, unknown phishing sites may slip through the cracks.
IE 8 displays sites' domains in a darker text color, so you can more readily see whether you're visiting a genuine ebay.com page, say, or a page simulating an eBay page on some site you've never heard of. Microsoft could still put a little more emphasis on the domain name (using a different color background, for example), but the highlighting is a welcome addition.
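The paypal.iamascammer.com example boils down to checking whether a familiar brand name appears in the host without being the registered domain. Here is a toy, deliberately naive check along those lines; it is not Microsoft's SmartScreen logic, and the brand list and domain parsing are simplifications.

from urllib.parse import urlparse

KNOWN_BRANDS = {"paypal", "ebay", "amazon"}   # illustrative list only

def looks_suspicious(url: str) -> bool:
    """Flag URLs where a well-known brand appears in the host but is not
    the registered domain, e.g. http://paypal.iamascammer.com/login."""
    host = (urlparse(url).hostname or "").lower()
    labels = host.split(".")
    registered = labels[-2] if len(labels) >= 2 else host   # naive: ignores co.uk etc.
    return any(brand in labels and brand != registered for brand in KNOWN_BRANDS)

print(looks_suspicious("http://paypal.iamascammer.com/login"))   # True
print(looks_suspicious("https://www.paypal.com/"))               # False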
Finally, IE 8 RC1 includes a feature designed to prevent clickjacking, an attack in which a malicious Web page frames or overlays another site's content to trick visitors into clicking on something they did not intend, potentially exposing their information. When you use IE 8 to view such a page, IE 8 can identify an attempted clickjacking and will warn you of the attempt.
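The article does not name the mechanism, but the clickjacking protection IE 8 introduced works with an opt-in HTTP response header, X-Frame-Options, that a site can send to say its pages should not be rendered inside another site's frame. A minimal sketch of a server sending that header follows; the handler and port are illustrative.

from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # Ask browsers that honor the header (IE 8 RC1 onward) not to render
        # this page inside another site's frame.
        self.send_header("X-Frame-Options", "DENY")
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>Not frameable</h1>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()

Browsers that do not recognize the header simply ignore it, so sending it costs nothing.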
Web Compatibility
Creating a site that looks identical in Internet Explorer, Firefox, and Safari can be a challenge. IE 8 Beta 2 offers better support for W3 Web standards--a set of guidelines developed to ensure that a Web page appears the same in all browsers. The downside is that IE 8 will break some pages designed for earlier Internet Explorer versions.
To counteract this problem, Microsoft has added a compatibility mode: Click a button in the toolbar, and IE 8 will display a page in the same way that IE 7 does. In my testing, I found that most pages worked fine with the standard (new) mode, and that most errors were minor cosmetic ones. Unfortunately, the Compatibility Mode toggle button may not be obvious to most users, because it's pretty small; a text label would have helped.
Though it probably won't convince many Firefox users to jump ship, Internet Explorer 8 Release Candidate 1 shows promise, and may be worth considering for people who have not yet solidified their browser loyalties. (Keep an eye out for our report on the final release of IE 8.)
The software maker plans to say more on its Web site around noon, but, as noted by enthusiast site Neowin, the code is already available from Microsoft's download center.
With IE 8, Microsoft is hoping to regain some lost ground by adding features such as private browsing, improved security, and a new type of add-ons, called accelerators.
On the security front, Microsoft is adding a cross-site scripting filter, as well as protections against a type of attack known as clickjacking.
In an interview, IE General Manager Dean Hachamovitch said there will be little change between the release candidate and the final version, though he declined to say when the final version will be released.
"The ecosystem should expect the final candidate to behave like the release candidate," Hachamovitch said.
Internet Explorer 8 will work with Windows XP (Service Pack 2 or later) and Windows Vista. A version of IE 8 is also being built into Windows 7.
However, the IE code in Windows 7 is a pre-release candidate version.
"Windows 7 enables unique features and functionality in Internet Explorer 8 including Windows Touch and Jump Lists which require additional product tests to ensure we are providing the best Windows experience for our customers," the software maker said in a statement. "Microsoft will continue to update the version of Internet Explorer 8 running on Windows 7 as the development cycles of Windows 7 progress.
3D holographic television moves closer to becoming reality
Picture this: you're sitting down for the Football World Cup final, or a long-awaited sequel to the "Sex and the City" movie, and you're watching all the action unfold in 3-D on your coffee table.
The reason for renewed optimism in three-dimensional technology is a breakthrough in rewritable and erasable holographic systems made earlier this year by researchers at the University of Arizona.
Dr Nasser Peyghambarian, chair of photonics and lasers at the university's Optical Sciences department, told CNN that scientists have broken a barrier by making the first updatable three-dimensional displays with memory.
"This is a prerequisite for any type of moving holographic technology. The way it works presently is not suitable for 3-D images," he said.
The researchers produced displays that can be erased and rewritten in a matter of minutes.
He said the University of Arizona team, which is now ten-strong, has been working on advancing hologram technology since 1990 -- so this is a major step forward. He believes that much of the difficulty in creating a holographic set has now been overcome.
"It took us a while to make that first breakthrough, but as soon as you have the first element of it working the rest often comes more rapidly," he said. "What we are doing now is trying to make the model better. What we showed is just one color, what we are doing now is trying to use three colors. The original display was four inches by four inches and now we're going for something at least as big as a computer screen."
There are no more great barriers to overcome now, he said.
The breakthrough has made some long-time researchers of the technology believe that it could now come to fruition.
Tung H. Jeong, a retired physics professor at Lake Forest College outside Chicago who has studied holography since the 1960s, told NJ.com: "When we start talking about erasable and rewritable holograms, we are moving toward the possibility of holographic TV ... It has now been shown that physically, it's possible."
And what might these holographic televisions look like?
According to Peyghambarian, they could be constructed as a screen on the wall (like flat panel displays) that shows 3-D images, with all the image writing lasers behind the wall; or it could be like a horizontal panel on a table with holographic writing apparatus underneath.
So, if this project is realized, you really could have a football match on your coffee table, or horror-movie villains jumping out of your wall.
Peyghambarian is also optimistic that the technology could reach the market within five to ten years. He said progress towards a final product should be made much more quickly now that a rewriting method had been found.
However, it is fair to say not everyone is as positive about this prospect as Peyghambarian.
Justin Lawrence, a lecturer in Electronic Engineering at Bangor University in Wales, told CNN that small steps are being made on technology like 3-D holograms, but, he can't see it being ready for the market in the next ten years.
"It's one thing to demonstrate something in a lab but it's another thing to be able to produce it cheaply and efficiently enough to distribute it to the mass market," Lawrence said.
Yet, there are reasons to be optimistic that more resources will be channeled into developing this technology more quickly.
The Japanese Government is pushing huge financial and technical weight into the development of three-dimensional, virtual-reality television, and the country's Communications Ministry is aiming at having such technology available by 2020.
Peyghambarian said there are no major sponsors of the technology at present, but as the breakthroughs continued, he hopes that will change.
Even if no major electronics company commits itself, there is hope that backers could come from outside the consumer electronics industry, he said.
"It could have some other applications. In training it's useful to show people three-dimensional displays. Also it would be good to show things in 3-D for defense command and control and for surgery," he said.
Sunday, January 25, 2009
Wireless power technologies are moving closer to becoming feasible options.
Fulton Innovations showcased blenders that whir wirelessly and laptops that power up without a battery at the Consumer Electronics Show (CES) earlier this month. The devices are all powered by electromagnetic coils built into the charging surface, and there's not a plug in sight.
10 Wireless Electricity Technologies
ECoupled uses a wireless powering technique called "close proximity coupling," which uses circuit boards and coils to communicate and transmit energy using magnetic fields. The technology is efficient but only works at close ranges. Typically, the coils must be bigger than the distance the energy needs to travel. What it lacks in distance, it makes up in intelligence.
In conjunction with the Wireless Power Consortium, Fulton, a subsidiary of Amway, has developed a standard that can send digital messages back and forth using the same magnetic field used to power devices. These messages are used to distinguish devices that can and can't be charged wirelessly, and to relay information like power requirements or how much battery charge is left in a device.
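The consortium's actual message format is not described here, but the negotiation sketched above, in which a device identifies itself and states its power needs before the pad energizes a coil, can be illustrated with a hypothetical exchange like this; the fields and limits are invented.

from dataclasses import dataclass

@dataclass
class DeviceReport:
    """Hypothetical message a device sends back over the charging field."""
    device_id: str
    max_watts: float        # power the device can accept
    battery_percent: float  # state of charge reported to the pad

def pad_response(report: DeviceReport, pad_limit_watts: float = 60.0) -> dict:
    """The pad only energizes coils for devices it recognizes and can supply."""
    if report.max_watts > pad_limit_watts:
        return {"charge": False, "reason": "device needs more than the pad supplies"}
    if report.battery_percent >= 100:
        return {"charge": False, "reason": "already full"}
    return {"charge": True, "watts": report.max_watts}

print(pad_response(DeviceReport("drill-01", max_watts=25, battery_percent=40)))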
Using this technique, an industrial van parked outside the Fulton booth at CES charged a set of power tools from within its carrying case. The van was tricked out by Leggett & Platt--a diversified manufacturing company based in Carthage, Mo., and an eCoupled licensee--and is designed to solve its customers' biggest headache: arriving at the job site with a dead set of tools. Fulton, which teamed up with Bosch to design the setup, already has test vehicles rolling around in the field and plans to sell them to utility and other industrial companies by the end of the year.
WildCharge's charging mat uses a conductive powering technique, which is more efficient than inductive powering but requires direct contact between the devices and the charging pad. Though most of the mats or pads on display are intended to power only a handful of devices at a time, WildCharge says the product design is certified for up to 150 watts--enough to power 30 laptops.
Across the room from WildCharge, PowerCast displayed Christmas ornaments and floor tiles glowing with LEDs powered by ambient radio waves. The devices harvest electromagnetic energy in ambient radio waves from a nearby low-power antenna. Because of the dangerous nature of electromagnetic waves in high doses, Pittsburgh-based PowerCast is targeting its application for small devices like ZigBee wireless chips, which require little power.
Perhaps the most promising wireless power technology was the latest iteration of WiTricity, the Watertown, Mass.-based brainchild of MIT physicist Marin Soljacic, on display in a private suite high in the Venetian hotel tower.
The technology uses a technique developed by Soljacic called "highly coupled magnetic resonance." As proof that it works, an LCD TV is powered by a coil hidden behind an oil painting located a few feet away. Across the hotel room, WiTricity Chief Executive Eric Giler walks in the direction of another coil holding an iPod Touch in the palm of his hand. Power hungry, it starts to charge when it gets within two meters.
Soljacic has already earned a $500,000 genius grant from the John D. and Catherine T. MacArthur Foundation for his work, but Giler said the technology is at least a year away. In the meantime, WiTricity has obtained an exclusive license from MIT to bring Soljacic's idea to market and hopes to have an estimated 200 patents.
But because Soljacic published his academic paper in Nature magazine, companies like Intel have been able to replicate the effect in their labs based on his principles.
Elsewhere at CES, PowerBeam showcased wireless lamps and picture frames. Located in Sunnyvale, Calif., the company uses yet another wireless-powering approach. Its technology beams optical energy into photovoltaic cells using laser diodes. Although the company says it can maintain a constant energy flow across long distances, the difficulty of targeting a laser means that it's not ideal for charging moving devices.
So, while 2009 may not be the year wireless electricity takes off, the nascent sector is certainly on its way.
Downadup worm replicates itself at astonishing speed!
The Downadup worm made its first appearance two months back, exploiting a critical Windows flaw in the way the Server Service handles RPC requests. A blended threat, the malware relies upon many attack vectors - from brute-force password guessing to hitching rides on USB sticks - for replicating itself to spread throughout a network.
The sheer speed at which the worm replicates has perplexed experts. Security researcher Derek Brown of TippingPoint's DVLabs team said: "The notion of using multiple attack vectors is not terribly new. The unique thing about this worm is the speed at which it has spread and I think that's a result of the big size of the Microsoft vulnerability."
Experts also say that although the Downadup malware got started because of the Microsoft flaw, it later proliferated quickly through users' unpatched Windows operating systems.
Though the malicious worm knows no borders, the hardest-hit countries, according to Symantec Security Response, are China and Argentina. According to Symantec vice president Alfred Huger, China accounts for almost 29 percent of the infections tracked; Argentina was next in line with over 11 percent of infections.
Various major newspapers and television news shows reported Friday morning that the latest computer worm might now infect as many as 10 million computers worldwide.
According to a report in the Detroit Free Press, the worm is so virulent because it seems to “mutate” and launch “brute force attacks” that relentlessly try thousands of letter and number combinations in codes to steal personal passwords and login information.
Because most computer users choose passwords that they can remember easily, the words might also be something the worm can guess easily. Once in control of a computer, the worm can launch spam, phishing attacks, shut down the Internet with massive traffic or access bank records.
According to F-Secure, an antivirus software company, the Conficker worm is spreading at a rate of 1 million new machines a day. It can be spread by USB stick also.
F-Secure has updated its Downadup removal tool, and the United States Computer Emergency Readiness Team has issued Alert TA09-020A, which describes how to disable AutoRun on Microsoft Windows systems in order to help prevent the spread of Conficker/Downadup via USB drives.
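For reference, the AutoRun hardening US-CERT describes comes down to setting the NoDriveTypeAutoRun registry value so that no drive type runs autorun.inf automatically. A minimal sketch of making that change from Python on Windows follows; run it as Administrator, treat it as illustrative, and test it before rolling it out.

# Disable AutoRun for all drive types by setting NoDriveTypeAutoRun to 0xFF.
# Windows-only; requires Administrator rights and takes effect after logoff.
import winreg

key_path = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer"
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0, winreg.REG_DWORD, 0xFF)
print("AutoRun disabled for all drive types.")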
According to Symantec, the top infected countries in order of infection are: China, 28.7 percent; Argentina, 11.3 percent; Taiwan, 6.7 percent; Brazil, 6.2 percent; India, 5.8 percent; Chile, 5.2 percent; Russia, 5 percent; Malaysia, 2.8 percent; Colombia, 2.1 percent; and Mexico, 1.9 percent.
Philip Templeton of PT Technologies in Athens said everyone should keep his or her virus protection and software updates current.
“I have seen in the last four to six months more people getting viruses,” said Templeton. “But no matter what antivirus software you buy, nothing is 100 percent. Make sure your Windows Firewall is on, and it doesn’t hurt to change passwords periodically. I usually advise to make this a quarterly chore.”
Saturday, January 24, 2009
Windows 7 beta to be offered through Feb. 10
Those who start the download process before February 10 will have until February 12 to finish the task.
The deadline applies to the general public, while members of Microsoft's TechNet and MSDN developer programs will continue to have access to the code, LeBlanc said.
CEO Steve Ballmer announced the beta of Windows 7 during his speech at the Consumer Electronics Show in Las Vegas on January 7. After a slight hiccup, Microsoft made the code available on January 10.
Keep your laptop data safe: encrypt it now.
Perhaps the most important advantage of full disk encryption, though -- beyond the peace of mind it gives your business's lawyers -- is the "safe harbor" immunity that accrues under many data privacy regulations. For example, credit card disclosure rules don't apply to encrypted data, and even California's strict data-disclosure statute makes an exception for encrypted records -- provided you can prove they're encrypted. That's trivial with full disk encryption but not so easy with partial encryption techniques, which depend on user education for safe operation.
A key challenge for IT in deploying encryption on its laptops is the sheer number of encryption options available. Some Windows Vista editions, as well as the forthcoming Windows 7, support Microsoft's built-in BitLocker encryption, and numerous third-party encryption products cover the range of mobile operating systems from XP through Windows 7, Linux, and Mac OS X. Encryption granularity is widely variable as well, ranging from protecting individual files to encrypting virtual disks to deploying fully armored, hardware-based full disk encryption. Prices range from free to moderately expensive.
If you've put off laptop data security due to perceived technical shortcomings or high costs, you need to take another look at the field -- before you lose another laptop.
The TPM is a chip soldered on to the laptop's motherboard, providing hardware-based device authentication, tamper detection, and encryption key storage. The TPM generates encryption keys, keeping half of the key information to itself, making it impossible to recover data from an encrypted hard drive apart from the computer in which it was originally installed. Even if an attacker gets the user's part of the encryption key or disk password, the TPM-protected drive's contents can't be read when connected to another computer. Further, the TPM generates a unique digital signature from the motherboard in which it's embedded, foiling attempts to move the TPM chip itself to another machine.
If your laptops have a TPM chip, don't try enabling it without carefully following the vendor's instructions -- otherwise, you could accidentally wipe out the laptop's hard drive. Before enabling the TPM chip in a laptop, you must first take ownership of it, a process that establishes user and management-level passwords and generates the initial set of encryption keys. The management password lets IT administration monitor the inventory of TPM devices, recover lost user passwords, and keep track of usage.
A TPM works with the laptop's resident operating system to encrypt either the entire hard drive or most of it, depending on the OS encryption implementation. (Microsoft's BitLocker, for example, requires a small, unencrypted initial-boot partition.) Alternatively, a TPM can interoperate with encryption-enabled hard drives to perform encryption entirely outside of, and transparent to, the operating system.
The TPM technology isn't perfect, but it provides very solid protection in the most common incident, where a laptop is lost or stolen and the user has not left it logged in. If the laptop is powered off, TPM protection is absolute. Most implementations use 256-bit AES encryption, which is considered uncrackable for the foreseeable future. Powering up the device requires entering pre-boot credentials in the form of a password, a PIN, a smartcard, biometric data, a one-time-password token, or any combination of these. If the lost laptop is powered on (but not logged in), or just powered off, an attacker would have to use extraordinary procedures to recover the encryption keys from live memory.
However, if a lost device is powered up and logged in, a TPM provides zero protection. An interloper can simply dump the data off the hard drive in the clear using ordinary file copies. Thus, it's essential that TPM-protected systems have noncircumventable log-in timeouts using administrator-protected settings.
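To make the 256-bit AES claim above concrete, here is a minimal Python sketch of sector-level encryption using AES in XTS mode, the mode disk encryptors commonly use, via the third-party cryptography package. It is a toy under stated assumptions: in a TPM-backed system the key would be sealed in hardware rather than generated in ordinary memory as it is here.

# Toy sketch of sector-level AES encryption in XTS mode (commonly used by disk encryptors).
# Requires the third-party 'cryptography' package; in a real system the key never lives
# in ordinary memory like this -- it is sealed inside the TPM.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

SECTOR_SIZE = 512

def encrypt_sector(key, sector_number, plaintext):
    # AES-256-XTS takes a 64-byte key; the sector number acts as the per-sector tweak.
    tweak = sector_number.to_bytes(16, "little")
    cipher = Cipher(algorithms.AES(key), modes.XTS(tweak))
    encryptor = cipher.encryptor()
    return encryptor.update(plaintext) + encryptor.finalize()

if __name__ == "__main__":
    key = os.urandom(64)                        # 512 bits total for AES-256-XTS
    sector = os.urandom(SECTOR_SIZE)            # stand-in for one disk sector
    print(len(encrypt_sector(key, 7, sector)))  # ciphertext is the same size as the sector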
You can also roll your own software protection using stand-alone packages such as PGP Whole Disk Encryption.
All these products support a wide range of enterprise-class management tools that let you enforce uniform policies and centrally store encryption keys, including special data-recovery keys that solve the problem of lost passwords and prevent employees from locking employers out of their hard drives.
You might think that plan B involves partial disk encryption, typically deployed by designating specific folders on a laptop as encrypted; as files are moved into that folder, they are automatically encrypted. Apple and Microsoft have long offered this form of encryption, via FileVault on the Mac and the Encrypted File System tools in Windows XP and Vista. But this approach has a major flaw: It depends on users to properly store sensitive data only in encrypted form.
Another form of partial disk encryption is to apply encryption to specific files, typically those residing on corporate servers that users want to open locally. In this approach, users must enter a password every time they open a protected file. IT not only is on the hook to ensure that all sensitive files get encrypted but also has no way to stop users from simply saving the opened file as an unencrypted copy. Still, this protection is better than nothing and is widely available via free disk utilities. But key management can be a problem, and these file-level encryption tools generally don't support multifactor authentication.
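As an illustration of what those file-level tools do under the hood, the Python sketch below derives a key from a passphrase with PBKDF2 and encrypts a single file with Fernet from the cryptography package. The file naming, salt handling and iteration count are illustrative assumptions, and the sketch makes the key-management problem plain: lose the passphrase and the file is unrecoverable.

# Minimal sketch of per-file, password-based encryption: derive a key from a passphrase
# with PBKDF2, then encrypt the file with Fernet (AES-based authenticated encryption).
# File naming, salt handling and iteration count are illustrative only.
import base64
import os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(passphrase, salt):
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))

def encrypt_file(path, passphrase):
    salt = os.urandom(16)
    with open(path, "rb") as f:
        token = Fernet(derive_key(passphrase, salt)).encrypt(f.read())
    with open(path + ".enc", "wb") as out:
        out.write(salt + token)  # store the salt alongside the ciphertext

def decrypt_file(path, passphrase):
    with open(path, "rb") as f:
        salt, token = f.read(16), f.read()
    return Fernet(derive_key(passphrase, salt)).decrypt(token)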
It's true that software-based full disk encryption is less secure than if you have a TPM-equipped laptop: The entire drive can still be encrypted, but a determined hacker will have more opportunities to gain access through compromised keys. For example, if the key-storage token is left with the notebook computer (how likely is that?), the hacker may be able to simply plug the token in and gain access to the drive contents. Even multifactor authentication in this scenario is subject to attack by inspection, since the key token is not tightly bound to the system motherboard.
Still, when TPM-enabled encryption is not an option, pure software full disk encryption can still give you considerable peace of mind, as well as provide the "safe harbor" benefits afforded encrypted systems in data-privacy regulations. Software full disk encryption solutions have also been around long enough that they're available for most mobile computing platforms, including Linux and Mac OS X.
TPM technology changes to come
Although TPM full disk encryption with hardware-based encryption in the hard drive is the best you can do for data protection today, security researchers are constantly testing TPM's mettle and devising improvements to it.
One potential vulnerability of today's separate TPM chip implementation is that keys must be transported across conductors in the motherboard to the CPU for software-based full disk encryption, or to the hard drive for hardware-based full disk encryption. That could provide an entry point for a hacker. That's why a major vendor trend is to move all TPM-oriented data manipulation onto the CPU chipset in the form of customized silicon. Intel has advertised its vPro solution, which is part of the upcoming Danbury processor and Eaglelake chipset. This feature will perform all encryption and decryption for SATA and eSATA drives without involving the CPU, OS device drivers, or even the hard drive itself.
Google Beats Estimates, Profit Takes a Hit
The Mountain View, Calif.-based company reported an 18% jump in fourth-quarter revenue to $5.7 billion for the period ended Dec. 31. That's up from $4.83 billion in the year-earlier quarter.
Excluding commissions paid to advertising partners, Google posted sales of $4.22 billion, better than the $4.12 billion in sales expected by analysts polled by Thomson Reuters.
Google reported fourth-quarter net income of $382 million, down 68% from $1.2 billion a year ago. However, excluding certain charges, such as the cost of employee stock options, the company earned $5.10 a share, much better than consensus estimates of $4.95 per share.
"We had tight control over costs" in the quarter, said Google chief executive Eric Schmidt in a conference call with analysts.
Schmidt pointed to the scaling back of non-profitable Google projects such as Google Video, Google Notebook, and status update service Jaiku. He also mentioned a quarterly decline in costs paid to advertising partners.
"Google continues to take market share, and they continue to have any number of levers to pull on both the revenue and the cost side that makes them very formidable in any economic environment," said Derek Brown, analyst with brokerage Cantor Fitzgerald.
Over the last quarter of 2008, Google said it spent about $368 million on capital expenses - mostly on data centers, servers and networking equipment.
As of Dec. 31, Google said it employed 20,222 full-time workers, slightly up from the 20,123 it employed at the end of September.
In order to retain employees, Google also announced that it would run a stock option exchange program from the end of January through early March.
Revenue for the fourth quarter rose 18 percent from the same period last year to $5.70 billion and three percent from the previous quarter. Google (NASDAQ: GOOG) also suffered significant non-cash impairment charges of $1.09 billion related primarily to its investments in AOL and Clearwire, a wireless broadband service that has partnered with Intel to build WiMax services across the country.
"The results were better than I expected," IDC analyst Karsten Weide told InternetNews.com. "Google is doing great because about half of the online ad spend in the U.S. is search and they have about half that market. They are leveraging the biggest market out there."
On a conference call with financial analysts, Google CEO Eric Schmidt noted "strong search query growth year on year." He also credited "tight control over costs that may have eluded us in the past, but I think we've got the formula down now."
While neither Schmidt nor other Google executives on the call got very specific about new initiatives or product plans, he did say the company is looking at new ways to recognize the contextual meaning of a search phrase, which it would be rolling into its market-leading search engine.
The past year saw Google branch out significantly beyond its original model of text-only results. In 2008 Google tripled the number of non-text results, which include video, images, blogs and books, said Jonathan Rosenberg, Google's senior vice president of product management.
He also said Google's $125 million settlement in October with the Authors Guild and the Association of American Publishers promises to make content from millions of out-of-print books accessible online and even create a new market for the sale of those books.
Weide also said YouTube's been "a sinkhole" for Google, which bought the video site for over $1.65 billion in 2006. "User generated content can work as an advertising source, but it's going to take a while," said Weide. "I think Google is going to have to acquire long form, professional content because big name advertisers want premium content not grainy, amateur video."
Rosenberg said Google continues to experiment with different ad approaches for YouTube. "It's hard to match the right format with the right content," he said. "We have to come up with a standard format to make it easier."
New employee stock options
Google also announced a new stock options plan, beginning January 29, that's designed to help retain employees. Schmidt said about 85 percent of its 20,000 employees had stock options "under water," or priced higher than the current trading price of stock.
Under the voluntary plan, employees can exchange all or a portion of their existing stock options for the same number of new options. Google said it expects the new options to have an exercise price equal to the closing price per share on March 2, 2009. Stock options with exercise prices above the March 2 closing price would be eligible for exchange, though Google said details of how the plan will work could change.
In after hours trading Thursday, Google shares were down $8.18 to $298.32.
Friday, January 23, 2009
New York City & Google starting a new trend in city-oriented tourist Web sites
Just Ask the Locals
The information center at 810 Seventh Avenue offers touch-sensitive horizontal screen tables that also use Google Maps. In a statement on The Official Google Blog, New York City Mayor Michael Bloomberg wrote that the new Web site and information center will "help make it easier for both visitors and residents to explore the energy, excitement and diversity of New York City's five boroughs."
Visitors can move around a table's map of the city's five boroughs. If the user has selected a category such as Museums & Galleries or Dining, the map will flag those places as a token is moved around. Each flagged item can then be opened to reveal photos and more information.
Since there are probably 10 million opinions about the city, no visitor's center would be complete without at least a few virtual New Yorkers. A visitor can browse a Just Ask The Locals section, where famous New Yorkers give recommendations.
'Custom Itinerary Flyover'
Visitors can save sites, recommendations and more to a physical disk and take it to a Video Wall where a "custom itinerary flyover" soars virtually over a detailed, three-dimensional map of the city. The wall also offers yet more advice from celebrities and local experts, and the visitor can send the itinerary to a cell phone or e-mail address, or print it.
Andrew Frank, an analyst with Gartner, said such a high-tech center for visitors could be a marketing tool for other cities.
If done with an eye toward ease of use, as New York's appears to be, Frank indicated that such centers could appeal to the wide range of technological sophistication among visitors and locals in any city. He also said New York's center is another indication of "the evolution of out-of-home" marketing experiences, which increasingly are accompanied by ways to measure how people use them.
But, Frank noted, an issue with these centers -- and even Web sites -- is keeping them up to date, not only with data, but with the latest technology and fastidious, shining surfaces.
NYCGo.com contains not just Google map and search data, but also travel deals from Travelocity and local content from what-to-do powerhouse Time Out New York, nightlife culture magazine Paper, the New York Observer, and eco-living guide Greenopia.
The information center, located on Seventh Avenue between 52nd and 53rd streets, is equally Googly. The city's technocratic mayor, Michael Bloomberg, even contributed a guest post to the official Google blog to announce it: "The Information Center features interactive map tables, powered by the Google Maps API for Flash, that let you navigate venues and attractions as well as create personalized itineraries, which can be printed, emailed or sent to mobile devices," the blog post explained. "Additionally, there's a gigantic video wall that utilizes Google Earth to display a 3D model of New York City on which you can map out personalized itineraries."
Bloomberg has been aggressive about promoting tech initiatives during his time in office, from a wind power plan (part of the much bigger "GreeNYC" project) to a city-run venture firm. Under his watch, the Mountain View, Calif.-based Google opened its New York satellite office, taking over several floors of the historic former Port Authority building downtown.