Wednesday, December 27, 2006

Samsung demonstrates fuel cell laptop

According to Akihabaranews, Samsung is now offering a high-density fuel cell storage system for its Q35 laptop. The power system is contained in a large dock that the laptop sits on top of.

Samsung claims that the fuel cell offers an energy density of 650Wh/L, about four times that of competing offerings. The total energy storage is an impressive 12,000Wh, which, depending on the laptop's power settings and usage, could theoretically power the laptop continuously for a whole month without recharging.
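The month-long figure is easy to sanity-check. Assuming the announced 12,000Wh capacity and a hypothetical average laptop draw in the 15-30W range (the draw figures below are illustrative, not Samsung's):

```python
# Rough check of the month-long runtime claim: announced 12,000 Wh
# capacity divided by an assumed average power draw.
capacity_wh = 12_000

for draw_w in (15, 20, 30):
    hours = capacity_wh / draw_w
    print(f"{draw_w} W average draw: {hours:.0f} h, ~{hours / 24:.0f} days continuous")
```

At a modest 15W average draw the dock would last about 33 days of continuous use, so the "whole month" claim is plausible at conservative power settings.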

One of the big breakthroughs Samsung touts for its new fuel cell system is a major reduction in noise: the new unit is said to be no louder than a conventional laptop. And while the docking station is somewhat bulky, it still qualifies as portable, a claim that other fuel-cell solutions have not always lived up to.

Besides much longer battery life, fuel cells have other advantages, such as near-instantaneous recharging: simply add more fuel. Companies such as Casio were demonstrating laptop fuel cell prototypes as early as 2003. Some even predicted that devices as small as cell phones could use fuel cell batteries, but the technology has taken some time to become ready for consumer use. As fuel cells become smaller, lighter, and more powerful, they may eventually take over from conventional batteries for all kinds of applications, and the sight of mobile computing users rushing to find power outlets may become a thing of the past.

Science's Breakthrough of the Year

As the end of the year approaches, many publications are releasing their top 10 lists for the year, and Science is no exception. Last year Science named evolution as its top breakthrough of the year, but was accused of pandering to the political and religious debates that were (and still are) raging throughout the world, especially in the United States. This year, Science (open access) named a breakthrough with no connection to politics or religion: the proof of the Poincaré Conjecture by Russian mathematician Grisha Perelman.

The Poincaré Conjecture was originally proposed by Henri Poincaré in 1904 and deals with the topology of everyday objects, namely what, in topological terms, defines a sphere. The Conjecture remained unsolved for almost 100 years, though not for lack of trying, and in 2000 the Clay Mathematics Institute (CMI) named it one of its seven Millennium Problems. These problems have eluded mathematicians for years, and each carries a US $1,000,000 prize for whoever solves it, whether in the positive or the negative. As stated in the CMI's official problem declaration, the Poincaré Conjecture asks

If a compact three-dimensional manifold M3 has the property that every simple closed curve within the manifold can be deformed continuously to a point, does it follow that M3 is homeomorphic to the sphere S3?
Clear, no? Well, for those of us who do not hold advanced degrees in topology or geometry, it asks a fairly simple question: can an arbitrary closed surface be turned into a sphere by only stretching it appropriately? This question forms a cornerstone of the field of mathematics known as topology, which studies the geometry of surfaces undergoing deformations. Poincaré suggested that ALL three-dimensional surfaces with no holes can be turned into a three-dimensional sphere without tearing a section apart or stitching two sections together. He also suggested the converse: a surface with a hole can NEVER be turned into a sphere without tearing or sewing. A real-world example from breakfast: a muffin can never be turned into a doughnut, because the doughnut's hole cannot be created by simply stretching the muffin; one would have to tear a hole in it.

Over the years, proofs for specific dimensions were developed. For one and two dimensions the result is trivial; dimensions seven and higher were handled by a proof from Stephen Smale in 1960, and Smale then extended his argument to all dimensions greater than or equal to five, for which he was awarded the Fields Medal in 1966. More than two decades later, Michael Freedman proved the Conjecture for the four-dimensional case in 1982 and was awarded the Fields Medal in 1986. Poincaré had now been proven for ALL dimensions EXCEPT three, the original dimension for which Henri Poincaré first posed the problem.

The final unproven dimension fell in a series of papers published to the web by Grisha Perelman, all freely available to the curious from arXiv.org. Dr. Perelman extended previous work by Richard Hamilton, a mathematician who proposed that an arbitrarily lumpy space could "flow" towards a smooth one (imagine a Koosh ball morphing into a soccer ball) through equations akin to the heat equation, a process Hamilton named "Ricci flow." Under Ricci flow, lumpy and highly curved areas tend to smooth themselves out until the entire surface has a constant curvature. In the absence of any major problems, Ricci flow could show how an arbitrary surface might be morphed by simple stretching into a smooth sphere, providing a method to prove or disprove Poincaré's Conjecture. However, problems arose: singularities, such as necks (thin regions between two larger areas, like the bar of a dumbbell weight), would pinch and close themselves off, forming two separate objects and violating the rules of topological stretching. Perelman worked for many years to find a way around this. In November 2002, he published his first paper on the subject, introducing a new quantity he called Ricci flow "entropy," borrowing a term from statistical mechanics for a quantity that tends to increase until equilibrium is reached. This entropy showed that the singularities in Hamilton's approach could be overcome, yet Perelman still faced other roadblocks before a full proof could be assembled. In his subsequent articles, he showed that these problem areas occur one at a time rather than all at once, and that each can be "pruned" through surgery before it causes a problem with the Ricci flow.
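For the mathematically curious, Hamilton's Ricci flow evolves the metric g of the space in a way formally analogous to heat diffusion (this is the standard textbook statement, not drawn from the article):

```latex
\frac{\partial g_{ij}}{\partial t} = -2\,R_{ij}
```

where R_ij is the Ricci curvature tensor. Just as heat flow evens out temperature differences, Ricci flow tends to even out curvature, which is why a "lumpy" space drifts toward one of constant curvature.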

This series of papers was published in 2002 and 2003, yet it took the mathematics community at large three more years to accept the work as a true proof of the Poincaré Conjecture. The story does not end there, however. In 2006 the International Mathematical Union announced that it had awarded the Fields Medal to Grisha Perelman, but he declined it. In a rare interview with The New Yorker, Dr. Perelman announced that he was retiring from mathematics, stating that he was disheartened by what he viewed as ethical lapses by some of his colleagues. The New Yorker article has caused quite a commotion in the mathematical world, with people claiming their words were distorted and threats of lawsuits being traded, casting a shadow over the greatest mathematical breakthrough of the millennium (so far). Fortunately, mathematical proofs are not affected by the feelings of those who create them, and the proof of the Poincaré Conjecture remains atop the list of the year's greatest scientific breakthroughs.

Thursday, December 21, 2006

Apple Bug-Hunt Begins

Apple Computer will soon be a member of the "month of bugs" club.

On January 1, two security researchers will begin publishing details of a flood of security vulnerabilities in Apple's products. Their plan, they say, is to disclose one bug per day for the entire month.

The project is being launched by an independent security researcher, Kevin Finisterre, and a hacker known as LMH, who declined to reveal his identity.

Some of the bugs "might represent a significant risk," LMH said in an e-mail interview. "Others have a lower impact on security. We are trying to develop working exploits for every issue we find."

The two hackers plan to disclose bugs in the Mac OS X kernel as well as in software such as Safari, iTunes, iPhoto, and QuickTime, LMH said. Some of the bugs will also affect versions of Apple's software designed to run on Microsoft Windows, he added.

LMH was one of the brains behind the recent Month of Kernel Bugs project, which exposed flaws at the core of several different operating systems.

Wednesday, December 20, 2006

Polar Rose wants to recognize faces in web photos

Swedish startup Polar Rose today took the wraps off of their plan to make faces in photos searchable. Searching within images has always been tricky for machines, but Polar Rose hopes that a combination of mathematics and user intelligence will help to build an accurate recognition engine.

The software works by constructing 3D models of the 2D faces found in photographs, a technique that gives a "significant boost to recognition," according to the company. The 3D model makes it easier to construct a unique facial "fingerprint," but does little to match that fingerprint with an individual. That's where users come in.

Polar Rose hopes to make the system match up the faces with names by providing a browser plug-in for Firefox and Internet Explorer. When the plug-in locates people (or what it thinks are people) in online photos, it puts the company's rose logo in a corner of the picture. Clicking it will allow web surfers to tag photos with names and other information. Once the system is trained to recognize a particular person, it can locate that same person in other photos, even if the new photos contain no metadata and are taken from a different angle. All tagging information is kept by Polar Rose, which means that all users get the benefit of everyone's tags.

The goal is to provide a workable search engine for people in photos, but the company is also opening up the technology by releasing a set of APIs early next year. The royalty-free APIs will allow for the creation of new applications that go beyond anything dreamed up during snowy nights in Malmö, where Polar Rose is based.

Thursday, December 07, 2006

Open XML becomes a standard

Microsoft Corp. has won approval for its Office Open XML document format from the international standards body ECMA International.

ECMA's General Assembly voted 20-1 in favor of the standard at a meeting in Zurich on Thursday afternoon, and will now submit the format to the International Organization for Standardization (ISO) for its approval. The sole vote against came from a representative of IBM Corp.

A standardized document format will make it easier for competing software companies to develop products that can interoperate with one another and to edit the same documents. Products meeting the standard could find favor with governments, or other organizations concerned about interoperability.

Interoperability is vital for the preservation of archive information, according to Adam Farquhar, head of e-architecture at the British Library and a member of the committee that worked on the standard at ECMA. The British Library archives electronic documents, but must deal with whatever format they arrive in. The development of interoperable software tools will make that work easier.

To help developers digest the standard, the committee has published a 14-page white paper explaining it.

That accessibility is important if Microsoft is to win developers over to its document format, as the company faces competition in the standards industry just as it does in the software market. A rival document format, OpenDocument Format (ODF), has already won approval from ISO, and was published as an ISO standard last month. ODF is used by office productivity suites such as OpenOffice.org or Sun Microsystems Inc.'s StarOffice, and has gained the support of other companies, including IBM. Government officials in France and the U.S. commonwealth of Massachusetts have recommended adoption of ODF as a government standard.

Monday, December 04, 2006

Open Document Format published as ISO standard

The International Organization for Standardization (ISO) finally published the Open Document Format (ODF) as an official standard last week after approving it as an international standard last May. The ODF file format—the XML-based open format for text, spreadsheet, database, and presentation files—is now published under the standard name of ISO/IEC 26300:2006.

ODF is being championed by accessibility groups as the accessible file format: because it is an open standard, it must face public peer review of its accessibility, unlike proprietary formats such as .doc. And since the specification is a public document, anyone can write software for it, on any platform, at any time, without worrying about royalties or implementation barriers. That benefits users with disabilities, as well as anyone who simply likes being able to move files around without concern for incompatible formats.

This move will likely give an advantage to ODF over other open standards, such as the Office XML standard that Microsoft has been working on. There are already a number of companies and organizations worldwide that support the use of ODF, and various software, such as OpenOffice.org, already generates ODF files as the native file format. Additionally, Google's web-based office apps support ODF, and even Microsoft has conceded that it will support a plugin for Microsoft Office that will convert files to ODF.

Zudeo: a high-def version of YouTube

Another day, another peer-to-peer file-swapping firm trying to go legit. This time it's Azureus, makers of one of the most popular BitTorrent clients on the Internet. The company has just launched its own video sharing site, dubbed Zudeo, which hopes to stand out from a crowded field of contenders by offering high-definition content and storage space for massive files.

The site, still in beta, allows the usual mix of tagging, uploading, and free viewing, though content is quite limited and features are still sparse. Users first need to download the Azureus application to get content, though, adding a layer of complexity to the normal process.

In return for using a BitTorrent client, Zudeo offers full HD-quality videos and gigabytes of storage space. It's free to content producers, who need to use the Azureus Director Edition to upload their creations.

Azureus isn't the only company using BitTorrent to distribute massive video clips; there's also, err, BitTorrent. BitTorrent (the company) has its own plans to become a content distributor, and it just signed deals with 20th Century Fox, Lionsgate, MTV Networks, Paramount, and Starz that will allow it to offer their content for download in the new year (in addition to deals it signed earlier in the year). The company has also closed another $20 million funding round, a sign that investors still have confidence in peer-to-peer distribution models (or that they desperately want to create a YouTube-style success story).

Hacker boots out evolution archive from Google

Webmasters aren't pleased when they run into problems with Google, and Wesley Elsberry was no exception. Elsberry, the webmaster of the Talk.Origins archive, made his displeasure public in a blog post after Google booted his site from their search index. This sort of thing happens frequently, but a public response from Google is uncommon, and it sheds some light on how Google handles its relationships with webmasters. It's also a great example of how far you'll get with a problem if you call Google on the phone.

Talk.Origins is devoted to discussions of evolution and the perceived shortcomings of the Intelligent Design and creationist movements. The group behind Talk.Origins runs several websites, one of which was hacked on November 18. The hacker added a collection of invisible spam links to certain webpages, links that dealt mostly with the less-pleasant sides of human sexual experience. Google's algorithms noticed the spammy links within a few days and delisted the site from the Google Index for 60 days.

Elsberry was angry that he wasn't notified in advance of this decision, and claimed that Google even refused to tell him what the problem was. "So, what, precisely, was causing Google to not like us anymore?" he writes. "The essential lesson here is that Google would not tell us. That isn't mere caprice; that is Google policy." Elsberry's attempts to get access to a Real Live Human™ on the phone proved fruitless as well, but he reports that he luckily found and fixed the problem quickly.

After the story was posted on Slashdot, Google's Matt Cutts investigated the issue and wrote a blog post of his own, outlining the steps that Google had taken. By November 27th, the site was listed as "hacked and spammy" and was flagged as "penalized." The next day, the Google folks emailed several addresses at talkorigins.org about the problem, and included specific links to the problem pages. Elsberry never got the messages.

But Cutts argues that it's ultimately the responsibility of webmasters to safeguard their own domains, and that Google can't provide too much information about its practices or the types of problems it flags (this could make it easier for spammers to evade automated scrutiny). He agrees that Google could do a better job of getting the word out, but concludes, "Please give Google a little bit of credit, because I do think we're doing more to alert webmasters to issues than any other search engine."

Being indexed by Google is all but a prerequisite for running a successful website, which continually puts Google in the position of playing judge, jury, and executioner. Though Google has no control over a site's servers or bandwidth, a single delisting decision can effectively put many sites out of business. Talk.Origins won't be one of them, though; the site has already been reinstated and should soon be fully accessible through the search engine.

Wednesday, November 29, 2006

Google Answers decides to close up shop

Google announced today that it is bidding farewell to one of its first side projects (and incidentally, one of the few Google projects to ever come out of beta), Google Answers. Google Answers will stop accepting new questions by the end of this week and will stop accepting answers to existing questions by the end of this year.

For those not in the know, Google Answers allows anyone on the web to submit any question, be it deep and intriguing or silly and curious, via Google's interface. Each question is then answered within 24 hours by one of more than 500 "carefully screened" researchers. This has been a good way to get fairly definitive, expert answers on oddball topics that might not otherwise be easy to find by searching the Internet with Google (for example, how many tyrannosaurs are in a gallon of gasoline?). Non-researchers are allowed to comment on questions and answers as well, so input from other web users is also included.

The interesting part about Google Answers was that there was a cost to the user, something relatively uncommon among Google services these days. Users with questions could post payment amounts of their choice, starting at $2.50, depending on how difficult the question was and how much they valued an expert's answer. When the question was answered, the researcher (an independent contractor hired by Google) would receive 75 percent of the payment while Google would keep 25 percent; higher-paid questions were usually answered first, for obvious reasons. Those whose questions were answered could also leave tips of up to $100 if they were particularly satisfied with the answer.
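The payment split described above is simple enough to sketch. Note that the handling of tips is my assumption; the article doesn't say whether Google took a cut of tips, so they're modeled here as passing entirely to the researcher:

```python
# Illustrative split of a Google Answers payment: 75% of the question
# price to the researcher, 25% to Google. Tip pass-through to the
# researcher is an assumption, not stated in the article.
def split_payment(price, tip=0.0):
    researcher = 0.75 * price + tip
    google = 0.25 * price
    return researcher, google

print(split_payment(10.00, tip=5.00))  # (12.5, 2.5)
```

So a $10 question with a $5 tip would net the researcher $12.50, with Google keeping $2.50.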

Why would such a seemingly useful service close up shop? There could be several reasons for Google to pull the plug on Google Answers, the simplest of which is that it wants to move engineers to bigger and better projects. However, in the four years that Google Answers was open for business, it attracted criticism that may have affected its popularity. Some said that Google was making money off services that librarians provide for free, and others were concerned that Google Answers enabled plagiarism by students. More controversially, some of the researchers involved were dissatisfied with the unruliness of Google's process.

Intel jumps on 802.11n bandwagon with Centrino

On Tuesday, at the IEEE Globecom 2006 Expo in San Francisco, Intel announced that the company is planning to put a pre-standard version of 802.11n wireless networking into the next release of the Centrino chipset. Alan Crouch, director and general manager at Intel's Communications Technology Lab, told the crowd of engineers and technologists that the new WiFi technology was slated for inclusion in the Centrino chips as early as next year.

802.11n is the latest revision of the wireless networking standard, which has already gone through two major mainstream revisions: 802.11b at 11 megabits per second, and 802.11g at 54 Mbps. However, 802.11n has not yet completed the standardization process, and will not receive its final blessing from the IEEE LAN/MAN Standards Committee until the first half of 2008.

Other companies have jumped the gun on producing 802.11n cards and adapters before. Dell started offering the high-speed wireless networking option back in June, along with Linksys, Belkin, and D-Link.

Tuesday, November 28, 2006

New Opera makes music on cell phones

Opera has announced a major new version of its web browser for mobile phones. Dubbed Opera Mini 3.0, the new version is available as a free download from the Opera web site. The new browser works with a variety of phones (a complete list is available at the site) and can be delivered in three ways: as an SMS attachment, as a WAP download, or as a download to the user's PC that is then transferred over via Bluetooth or USB. The download comes in the form of a Java-based application separated into .jar and .jad files.

Opera Mini comes in two different versions to accommodate phones with different hardware resources. New features in Mini 3.0 include an RSS reader, HTTPS support for secure web sites, the ability to share photos taken by the phone's camera, and a content-folding feature that compresses long menus to a [+] button that can be expanded as necessary. The company has set up a remote server that preprocesses web pages, compressing the content before it is sent to the phone. This makes browsing faster and lowers bandwidth charges. Opera claims that Mini 3.0 renders pages even faster than the previous version.
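The bandwidth saving behind that proxy approach is easy to illustrate. Opera's real servers use their own preprocessing and binary page format, so the sketch below stands in with plain gzip just to show why compressing markup before it reaches the phone pays off:

```python
import gzip

# A repetitive chunk of markup, standing in for a typical web page.
page = b"<html><body>" + b"<p>Hello from a long menu item</p>" * 500 + b"</body></html>"

# What a compressing proxy would actually send down to the phone.
packed = gzip.compress(page)
print(f"{len(page)} bytes -> {len(packed)} bytes "
      f"({len(page) / len(packed):.0f}x smaller)")
```

Real pages are less repetitive than this toy example, but HTML compresses well in general, which is how a proxy can cut both load times and per-kilobyte data charges.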

The browser's small size, modest hardware requirements, standards compatibility, and raw speed have won it much critical acclaim, but Opera has not met with much success on the desktop: most web surveys put its usage at around 1 percent, compared with 10-12 percent for Firefox and 80-odd percent for Internet Explorer. The folks at Opera have decided to focus on the embedded market for future growth, and have already scored deals with Nintendo to provide a browser for the DS and Wii game consoles (the latter is not available just yet).

Monday, November 27, 2006

256GB paper storage claims simply don't add up

A story first posted on ArabNews.com has been making the rounds on the Internet, involving an Indian student who has allegedly found a method of storing compressed digital information on a regular sheet of paper. Sainul Abideen claims that his technique, dubbed Rainbow Technology, can store between 90 and 450 GB on a single sheet of paper.

The system allegedly works by encoding data into small geometrical shapes (circles, squares, and triangles) in various colors, then printing them out on a piece of paper. A scanner is used to read the data back in to the computer. Abideen claims that his storage method is more environmentally friendly due to the biodegradable nature of paper, and envisions magazine publishers printing tear-out sheets of paper containing demos and programs, replacing the traditional plastic-wrapped CD or DVD.

Storing digital information on paper dates back to the earliest days of computing. Anyone remember punched cards? The cards had 80 columns (an artifact that remains with us today as the default width for console-mode applications) and could store a maximum of only 120 bytes, about one-eighth of a kilobyte, per card.
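The 120-byte figure checks out if each of the card's punch positions is treated as one raw bit:

```python
# Raw bit capacity of a standard punched card: 80 columns by 12 rows
# of punch positions, one bit per position.
columns, rows = 80, 12
raw_bytes = columns * rows // 8
print(raw_bytes, "bytes per card")  # 120 bytes per card
```

In practice the cards usually held 80 characters of text, one per column, so even less of that raw capacity was typically used.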

However, despite technological advances in scanning and printing since those days, Abideen's claims quite simply do not hold water. A little bit of math is in order. Start with a scanner with a maximum resolution of 1,200 dots per inch: that gives a maximum of 1,440,000 dots per square inch, or just over 134 million dots on a standard 8.5" by 11" sheet of paper (ignoring margins).

Getting a scanner to accurately pick up the color of a single dot on a page is a difficult affair (it would take near-perfect color calibration, for example, and be prone to errors from ambient light and imperfections in the paper) but let's be generous and say that the scanner can accurately pick out 256 shades of color for each dot. That's a single byte per dot, making the final calculation easy: a maximum theoretical storage of 134MB, which would likely go down to under 100MB after error correction.
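Spelled out, the back-of-the-envelope calculation looks like this (the 1,200 dpi scanner and the 256-shades-per-dot assumption are the generous estimates from above):

```python
# Theoretical capacity of one scanned page at 1,200 dpi, assuming the
# scanner can reliably distinguish 256 shades (one byte) per dot.
dpi = 1_200
dots_per_sq_inch = dpi * dpi            # 1,440,000
page_area = 8.5 * 11                    # square inches, ignoring margins
total_dots = dots_per_sq_inch * page_area
capacity_mb = total_dots / 1_000_000    # one byte per dot
print(f"{total_dots / 1e6:.1f} million dots -> ~{capacity_mb:.1f} MB per page")
```

That's roughly 134.6 million dots, or about 134 MB per page before error correction, a far cry from 450 GB.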

It's a decent amount of storage, but several orders of magnitude smaller than the 450GB claimed by Abideen. The claim that "circles, triangles, and squares" can make up those extra orders of magnitude is easily challenged. There is a word for using mathematical algorithms to shrink the space digital information occupies: compression. No arrangement of circles and triangles could outperform existing compression algorithms: if it could, those formulas would already be in use! Compression could easily increase the 100MB of theoretical paper storage by a factor of two or three, but so could simply compressing the files into a .zip archive before converting them to a color printout.

Ultimately, storage is about bits, and the smaller the bits are physically, the more storage can be packed into a given space. The magnetic bits on hard drive platters and the tiny pits in optical media are orders of magnitude smaller than the smallest dot that can be recognized by any optical scanner, and this is the simple reason why they store orders of magnitude more information. Even if a much higher-density printer were used (such as an expensive laser printer or offset printing process) the limiting factor is still the scanner required to get the information back into the computer. In the end, a picture may be worth a thousand words, but it cannot be worth half a thousand megabytes.

Tuesday, November 21, 2006

Castle built for .Net

Castle, which is an open source project to ease development on Microsoft's .Net platform, is nearing its general release.

Spawned out of the defunct Apache Avalon object-oriented framework, Castle is built as a set of programming libraries. In addition to .Net, it also will work with Mono, an open source implementation of .Net.

"Castle's goals are to avoid code repetition, use convention instead of configuration whenever possible, and handle common tasks on the programmer's behalf," said project creator Hamilton Verissimo, founder of the Castle Stronghold consultancy in Brazil.

A second release candidate of Castle 1.0 became available several weeks ago, and a general release is targeted for January.

Castle can be used for enterprise and Web application development. It features tools including MonoRail, which is a Web framework inspired by Ruby on Rails. MonoRail uses the MVC (Model-View-Controller) pattern for application flow, data, and views. AJAX (Asynchronous JavaScript and XML) is featured as well.

Also in Castle are ActiveRecord, for mapping objects to databases, and Windsor, which is an object lifecycle container for making code more maintainable.

"[Castle is] a collection of many different tools and features primarily designed for easing development and reducing development time," said Kevin Williams, who is a developer involved in Castle.

Ease of use and intuitiveness are the primary draws of Castle, Williams said.

"Of all the software that I've used over the years in .Net, I've never seen anything as easy to use or powerful as the Castle tools," Williams said.

"It's very intuitive to use and you have a lot of control over things and it's very extensible," Williams said. Users can mold Castle to their environment or use it with the Visual Studio IDE.

"Probably the best thing I can say is that as developers improve their practices and learn about new methodologies and techniques and design patterns and all these things, sometimes it's difficult to put all these tools together and make them work," Williams said. "Castle fits all those pieces together for you and gives you a jumpstart in what you want to build."

Castle began as an attempt to bridge Avalon to .Net. Planning already is under way for version 2.0 of Castle. The MonoRail component in version 2.0 may be enabled to work with many view engines at the same time. A domain language to generate JavaScript also is eyed, as is a better template engine in the vein of Ruby. Performance improvements also are planned for MonoRail.

Initialization and mapping improvements are expected for ActiveRecord in Castle 2.0. The IoC (Inversion of Control) microkernel in Castle may be more configurable as well.

Internationalized domain names coming next year

The recent Internet Governance Forum in Athens featured lots of talk about the way that the 'Net is run, but nothing generated more discussion than internationalized domain names (IDNs). In many parts of the world, frustration has built up for years over the need to use Latin characters to access most top-level domains (TLDs), even when they reference sites that are in languages like Tamil, Farsi, or Mandarin. Change is coming, but it's slower than many would like.

ICANN has been urging patience, arguing that if certain countries introduce proprietary systems of their own, the interoperable nature of the Internet could be broken permanently. Systems that are not well-implemented could also cause general confusion and security concerns.

At a recent event in Sydney, ICANN CEO Paul Twomey said that the "political pressure" his organization faces will not make ICANN move any faster. "The Internet is like a fifteen-story building, and with international domain names what we're trying to do is change the bricks in the basement," he said, according to The Sydney Morning Herald. "If we change the bricks there's all these layers of code above the DNS... we have to make sure that if we change the system, the rest is all going to work."

ICANN's roadmap for IDNs shows that a workable system should be ready by the end of 2007, with live root zone tests to be conducted in December 2006. The group has already contracted with a Swedish firm named Autonomica to do lab testing of the new system, which is currently underway. ICANN has actually been developing the IDN system for three years, leading us to believe that their timetable has a good chance of being met. If you've been holding out on that new domain registration until you could have it in Hindi, you've only got a year left to wait.

Monday, November 13, 2006

Samsung Tells Future of Cell Phones

Mobile phones will undergo a dramatic transformation over the next few years, incorporating more powerful processors and more storage, as well as new technologies, a Samsung Electronics research and development executive said today.

The addition of these technologies will dramatically expand the capability of mobile handsets, which will have sensors to monitor a user's health and offer a wider range of entertainment and online services, such as shopping, said Kang-Hun Lee, vice president of Samsung's Next-Generation Terminals Team. Voice will remain a "basic capability" of these devices, he said.

By 2010 or so, handsets will use flexible or holographic displays and could have processors that run at clock speeds up to 5GHz, Lee said. In addition, they may pack up to 10GB of flash memory or hard disks that can hold 20GB of data or more.

Future handsets will also include more advanced cameras, capable of capturing 3-D and holographic images, and rely on fuel cells or solar panels for power. They will also switch seamlessly from one network to another, moving between cellular networks, mobile WiMax, and other networks.

While many of these new technologies have yet to move beyond the R&D lab, Samsung anticipates gradually adding new technologies and capabilities to its handsets. For example, Samsung will next year put a 1GHz StrongArm processor inside a mobile phone.

In June, Samsung revealed plans to add mobile WiMax support to a Global System for Mobile Communications (GSM) handset. That handset is due to hit the market during the first half of 2007, said Hwan Woo Chung, a vice president at Samsung's Mobile WiMax Group, speaking at the time.

Sun opens Java under the GPL

After years of speculation followed by years of waiting, Java has finally, truly been opened. Sun announced today that Java has made the jump to "open source," as Sun says that parts of the Java platform it owns are being licensed under version 2 of the GPL open source license. The choice of the GPL is surprising, because it requires that any and all modifications be released back under the same license, and not all software developers are eager to share their contributions. Nevertheless, in adopting the GPL, Sun is being aggressive in its move into open source. Solaris, for instance, is distributed under the far more restrictive Common Development and Distribution License (CDDL), which is mostly based on the Mozilla Public License (MPL).

The upshot is this: not only has Sun open-sourced Java, but they've adopted a license that they hope will please the "free software" folks along with the hordes of commercial software developers that have been using Java for almost a decade. Java will be distributed with what is known as a "Classpath exception," which will allow Java libraries to link to non-GPLed code, making it possible to continue to use Java with closed-source commercial development projects. Sun hopes it's a win-win situation. Only time will tell.

Sun calls the move "one of the largest source code contributions under the GPL license," but the company is also quick to point out that this is big in another way, too. With 3.8 billion devices using Java, it's the single largest platform for unifying software development for devices. Sun will also continue to sell Java-based software packages despite the licensing change.

All of the source code relating to Java is expected to be opened by the end of March 2007. For now, Sun has made available the first pieces of source code for Sun's implementation of Java Platform Standard Edition (Java SE) and a buildable implementation of Java Platform Micro Edition (Java ME). More details are available at Sun's new open source Java landing page.

More coverage of interest to developers can be grokked at InfoQ, which has a great run-down on the licensing ramifications.

Friday, November 10, 2006

Yahoo to embed instant messaging into e-mail

E-mail was arguably the first killer application of the Internet, but recently it has become somewhat less useful as the war between spammers and antispammers has left some Internet users feeling like innocent victims in a battle between good and evil. The younger generation tends to eschew e-mail in favor of instant messaging. Now, Yahoo has decided to bring the two technologies together for a new version of Yahoo Mail.

Brad Garlinghouse, vice president of communications at Yahoo, said that the reason for the move was to improve the overall user experience, something he claims is lacking from many "Web 2.0" applications. "I would argue that many Web 2.0 applications are already dead," he said. "Web 2.0 as an application is leaving tremendous value on the table for consumers and for us as businesses."

So is the new Yahoo Mail an example of Web 2.5? The company is not the first to have the idea of integrating e-mail and instant messaging. For almost as long as there has been MSN Messenger, Microsoft has enabled access to its contact list within Outlook Express. Google also recently added chat features to its Gmail application. However, the new Yahoo Mail is even more integrated with instant messaging. It does not require any separate instant message program, allowing direct chatting within its web-based interface.

Yahoo and Microsoft recently announced that they would be making their IM services interoperable, so MSN users will have a new method of keeping track of their friends while online. Will the IM integration and MSN compatibility help push Yahoo's services over the top? The battle for instant messaging supremacy has not been a close one, at least in the United States, where AOL Instant Messenger continues to hold a strong lead. However, the web is increasingly international, and the race outside the US is much closer. What prizes await the victor? Untold riches in the form of ad revenues.

Wednesday, November 08, 2006

Firefox 1.5 support ending April 24

Users of Firefox 1.5 should plan to upgrade their browser by April 24 of next year at the very latest, according to Mozilla Corp. That's because April 24 is the date developers plan to stop issuing security and stability fixes for the open-source browser, Mozilla said Wednesday in a note posted on the Mozilla.com Web site.

This notice was included in an alert advertising the latest update to Firefox 1.5. Released Tuesday, the 1.5.0.8 update includes three critical security fixes for the browser. Firefox users should already have begun receiving the software through the browser's automatic update process.

The April 24, 2007, date means that Firefox 1.5 users are being given a six-month window to move over to the Firefox 2.0 browser, released last month. This is the same amount of time that Firefox 1.0 users were given before updates for that product ceased.

Version 1.5.0.8 also includes new software that will eventually allow Mozilla to push out version 2.0 of the browser via Firefox's automatic update mechanism.

Although Firefox 1.5 includes an automatic updater, it does not allow users to decline these software updates. So Mozilla has decided to add this capability in the 1.5.0.8 patch before offering the Firefox 2.0 as an automatic update. That way, users who are not ready to make the move to 2.0 will be able to decline the upgrade.

Firefox representatives could not say when they planned to begin pushing version 2.0 as an automatic update.

Mozilla developers are now beginning discussions on what to include in the Firefox 3.0 browser, which they hope to release one year from now, according to the Firefox road map.

Skype 3.0 beta released

Windows-using Skype fans can now download the 3.0 beta of the venerable VoIP application. Skype 3.0 sports a handful of new features, including Skypecasts, Public Chats, Click-to-call, and a redesigned user interface.

Public Chats allow Skype 3.0 users to create and join large text chatrooms that appear to operate much like IRC chatrooms. Moderators can direct conversation topics, kick users, and determine who can participate in chats and to what extent. Public Chat links can be included on web pages, allowing surfers to jump into conversations via Skype.

Still in beta, Skypecasts are conference calls handled over Skype's network that can handle up to 100 users. Like Public Chats, Skypecasts allow moderators to control the flow of the conference calls with mute and "eject" buttons. Like Public Chats, Skypecast links can appear on web pages to enable 'Netizens to join calls.

Click-to-call is an extension of a feature first found on eBay. It works with both Internet Explorer and Firefox, using a phone number recognition feature to enable one-click Skype calls directly from a web page.
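
Skype has not published its recognition rules, but the kind of pattern matching a click-to-call feature depends on can be sketched with a short regular expression in Python. The pattern below is purely illustrative and would need considerable refinement for real-world pages:

```python
import re

# Hypothetical sketch of the phone-number recognition a click-to-call
# browser extension relies on. Skype's real matching rules are not
# public; this simple pattern is only illustrative.
PHONE_PATTERN = re.compile(
    r"\+?\d{1,3}[\s.-]?\(?\d{2,4}\)?[\s.-]?\d{3}[\s.-]?\d{4}"
)

def find_phone_numbers(html_text):
    """Return all substrings that look like dialable phone numbers."""
    return [m.group() for m in PHONE_PATTERN.finditer(html_text)]

page = "Contact sales at +1 (555) 123-4567 or visit our office."
print(find_phone_numbers(page))  # ['+1 (555) 123-4567']
```

In a real extension, each match would then be wrapped in a link that hands the number to the Skype client.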

Other improvements include a redesigned UI, tweaks to some of the language files, support for the Lithuanian language, and improved video device detection. As of now, Skype 3.0 beta is for Windows only, and that will likely remain the case for the next several months. The official Mac OS X client just hit 2.0 last month, and Skype for Linux is currently at version 1.3.

Tuesday, November 07, 2006

Adobe contributes Flash code to Mozilla

Adobe has opened the source code of the ActionScript Virtual Machine, the high-performance ECMAScript implementation used in Adobe's ubiquitous Flash Player. Adobe has made the source code available under three prominent open source licenses, and contributed it to Mozilla for eventual inclusion in Firefox.

Mozilla has responded to Adobe's contribution by creating the Tamarin Project. Named after a species of tiny arboreal monkeys, the project aims to develop a complete ECMAScript 4 implementation based on Adobe's virtual machine. The Tamarin Project team includes developers from both Mozilla and Adobe.

The Tamarin project roadmap also involves integration of the new virtual machine into SpiderMonkey, the open source JavaScript interpreter used by Firefox. Developed from scratch by Adobe for the recently released Flash Player 9, the virtual machine features a unique Just-in-Time (JIT) compiler that converts ECMAScript bytecode into native machine code.

The JavaScript interpreter currently used in Firefox has been the subject of some criticism, and although it beats Internet Explorer in many benchmarks, it doesn't even come close to matching the performance of Opera's JavaScript implementation. The Tamarin Project developers plan to alter the SpiderMonkey compiler so that it can leverage the native code generation functionality of the new virtual machine, dramatically increasing the runtime performance of JavaScript in Firefox on many platforms. High-performance JavaScript execution facilitated by native code generation could enable web developers to produce rich web applications with an unprecedented level of sophistication.
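
Tamarin's JIT is far more sophisticated than anything that fits in a few lines, but the translate-once idea behind it can be illustrated in Python with a toy stack machine: the same bytecode is either re-interpreted on every run, or converted a single time into a host-language function that then runs directly. All names here are invented for illustration.

```python
# Toy analogy for JIT compilation: a tiny stack-machine program is
# either re-interpreted on every call, or translated once into a host
# function. Tamarin's real JIT emits native machine code; this sketch
# only illustrates the translate-once idea.

PROGRAM = [("push", 2), ("push", 3), ("add",), ("push", 4), ("mul",)]  # (2 + 3) * 4

def interpret(program):
    """Walk the bytecode instruction by instruction, every time."""
    stack = []
    for op, *args in program:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

def jit_compile(program):
    """Translate the bytecode once into a single host-language function."""
    expr = []
    for op, *args in program:
        if op == "push":
            expr.append(str(args[0]))
        elif op == "add":
            b, a = expr.pop(), expr.pop()
            expr.append(f"({a} + {b})")
        elif op == "mul":
            b, a = expr.pop(), expr.pop()
            expr.append(f"({a} * {b})")
    return eval(f"lambda: {expr.pop()}")  # compiled once, called many times

compiled = jit_compile(PROGRAM)
print(interpret(PROGRAM), compiled())  # 20 20
```

The payoff is that the per-instruction dispatch cost is paid once at compile time rather than on every execution, which is where the large speedups come from.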

A trend towards development of interactive web applications with dynamic scripting techniques has recently increased the relevance of JavaScript on the web.

Mozilla developers hope to integrate Tamarin into Firefox in 2008. Source code is already available in the Mozilla CVS, and independent open source developers are encouraged to participate. More information is available from the Tamarin FAQ and on the Tamarin mailing list.

Microsoft beats Apple to the TV

It has been some time in the making, but Microsoft has finally announced their online video entertainment strategy for the Xbox 360.

Beginning on November 22, Xbox 360 owners will be able to buy and rent over 1,000 hours of programming using the new Xbox Live Video Marketplace. The Marketplace will see Microsoft selling programming from CBS, MTV Networks, Paramount Pictures, Turner Broadcasting, Ultimate Fighting Championship (UFC) and Warner Bros. Home Entertainment. "CSI," "South Park," "Batman Forever," and "Nacho Libre" are a few of the titles that will be offered by year's end. At launch, TV shows will be offered as download-to-own, while movies will only be available as 24-hour rentals.

Microsoft's arrival in this emerging market brings with it some impressive firsts. The company has narrowly beaten Apple in the race to find a practical set-top-box-like solution for getting commercial video downloads to the family room TV without having to move hardware around. Apple is expected to launch its own wireless set-top box, dubbed iTV, some time in the first quarter of 2007. Apple has already announced a retail price of $299.

Microsoft's plan to combat Apple's offering rests on the added value of the complete Xbox 360 package. iTV's features are not yet known, but a premium Xbox 360 with an additional wireless network adapter is $480, $181 more than iTV but considerably more capable. Yet, even if it has beaten Apple to the punch, Microsoft's biggest concern will be deflating Sony's offering, which will likely debut next year.

There's another trick up Microsoft's sleeve, as well. The company says that much of the content will eventually be available in High Definition—a welcome development for those of us who don't like paying full price for low resolution video. No other commercial service currently sells HD video, although sources have told us that Sony is planning to launch with HD. Microsoft says that 20 percent of the offerings will be in HD at launch, with more to follow.

Microsoft has also said that users will be able to re-download content for free, in the event that users delete content to make space for new purchases, or even if users want to sign in to Xbox Live and download the show on another console. Shows will be attached to a user's Xbox Live gamertag.

For now, the service will launch in the United States, but Microsoft plans to expand it to other territories over the coming year.

Sunday, November 05, 2006

Apple Releases 8GB RED nano

Apparently, response to Apple's RED nano has been so impressive that the company announced a new 8GB model on Friday. The first model, released in mid-October, had 4GB.

The RED nanos were designed by U2 lead singer Bono and Bobby Shriver of the Kennedy/Shriver family. The nano is part of the RED campaign currently being endorsed by GAP, Converse, Emporio Armani, American Express, Motorola, and Apple. Each sponsoring company has its own RED products, which raise money for The Global Fund to Fight AIDS, Tuberculosis, and Malaria. For each RED nano sold, Apple will contribute $10 to the fund.

"Customer response to the iPod nano (PRODUCT) RED Special Edition has been off the charts," said Greg Joswiak, Apple's vice president of Worldwide iPod Product Marketing in a statement. "We're thrilled to add a second model with 8GB of capacity, enough for 2,000 songs, so customers have yet another choice in supporting this important cause."

The new 8GB model will sell for $249, while the 4GB model will remain $199. Both models have a red aluminum enclosure and a claimed 24-hour battery life.

For more information on (PRODUCT) RED, go to www.joinred.com.

Thursday, November 02, 2006

YouTube downloads for mobile devices by 2007?

Speaking at the OgilvyOne Verge Digital Summit in New York yesterday, YouTube founder Chad Hurley said that he hopes to "have something on a mobile device" by the end of 2007. "It's a huge market and with our video lengths, it's a natural."

It's one of Hurley's first public outings after the Google acquisition of his YouTube service. YouTube already has a mobile service, but only for uploading videos, not watching them. Hurley wasn't sure he could figure out a way to make money from mobile downloads, since it's tricky to insert advertising on small screens without ruining the user experience. "It would be great to make the ad model work on a mobile device," he said. "I haven't seen that work."


Wednesday, November 01, 2006

Dell Launches First AMD-Based Laptop

Dell launched its first notebook PC powered by a processor from Advanced Micro Devices instead of an Intel chip.

With the Inspiron 1501, Dell offers users a choice between AMD's low-end Sempron, mobile Turion 64, and dual-core Turion 64 X2 chips. The product is aimed at home entertainment and small business users, offering a baseline configuration of the Sempron chip, widescreen 15.4-inch display, 80GB hard drive, and 512MB of memory for a starting price of $549.

Dell launched the product today without fanfare, simply listing it on the company Web site without the usual flurry of press releases. The company did not return calls for comment.

Dell had announced in May that it would begin selling AMD-based servers, after remaining loyal to Intel processors for so long that it was the only major PC vendor not offering its customers a choice. By then, AMD had eaten significant chunks of Intel's enormous market share, riding the success of its efficient Opteron server chip to acclaim for its full range of chips. Indeed, by September Dell had also launched AMD-powered desktops.

Dell has shown great confidence in AMD by choosing to use the new chips in its notebook line, the one segment of the PC industry that has been fighting a recent slump in sales growth. Now that AMD has finally climbed aboard the Dell sales machine, it can claim a presence with nearly every vendor in the U.S. notebook market. The only remaining bastions of purely Intel notebooks are Lenovo Group, Sony, and Toshiba.

Monday, October 30, 2006

New Windows attack can kill firewall

Hackers have published code that could let an attacker disable the Windows Firewall on certain Windows XP machines.

The code, which was posted on the Internet early Sunday morning, could be used to disable the Windows Firewall on a fully patched Windows XP PC that was running Windows' Internet Connection Service (ICS). This service allows Windows users to essentially turn their PC into a router and share their Internet connection with other computers on the local area network (LAN). It is typically used by home and small-business users.

The attacker could send a malicious data packet to another PC using ICS that would cause the service to terminate. Because this service is connected to the Windows firewall, this packet would also cause the firewall to stop working, said Tyler Reguly, a research engineer at nCircle Network Security Inc., who has blogged about the issue.

By knocking out the Windows Firewall, a criminal could open the door to new types of attacks, but a number of factors make such an attack scenario unlikely, Reguly said.

For example, the attacker would have to be within the LAN in order to make the attack work, and, of course, it would only work on systems using ICS, which is disabled by default. Furthermore, the attack would have no effect on any third-party firewall being used by the PC, Reguly said.

Users can avoid the attack by disabling ICS, Reguly said. But this will also kill the shared Internet connection.

An easier solution may be for ICS users to simply move their networks onto a router or NAT (Network Address Translation) device, said Stefano Zanero, chief technology officer with Secure Network SRL. "They are so cheap right now, and in many cases they offer better protection and easier administration of your LAN," he said via instant message.

Windows XP appears to be the only platform affected by this attack, which has not been successfully reproduced on Windows Server 2003, Reguly said.

Microsoft's initial investigation into the matter "has concluded that the issue only impacts users of Windows XP," the company's public relations agency said Monday in a statement. "Microsoft is not aware of any attacks attempting to use the reported vulnerability or of customer impact at this time."

Thursday, October 19, 2006

First IE7 Security Flaw Found

Less than 24 hours after the launch of Internet Explorer 7, security researchers are poking holes in the new browser.

Danish security company Secunia reported today that IE7 contains an information disclosure vulnerability, the same one it reported in IE6 in April. The vulnerability affects the final version of IE7 running on Windows XP with Service Pack 2.

If a surfer uses IE7 to visit a maliciously crafted Web site, that site could exploit the security flaw to read information from a separate, secure site to which the surfer is logged in. That could enable an attacker to read banking details, or messages from a Web-mail account, said Thomas Kristensen, Secunia's chief technology officer.

"A phishing attack would be a good place to exploit this," he said.

One of the security features Microsoft touts for the new browser is the protection it offers users from phishing attacks.

Secunia rates the security flaw as "less critical," its second-lowest rating, and suggests disabling active scripting support to protect the computer. The flaw could result in the exposure of sensitive information and can be exploited by a remote system, Secunia said in a security advisory posted on its Web site.

It is hard to exploit the flaw because it requires the attacker to lure someone to a malicious site, and for the attacker to know what other secure site the visitor might simultaneously have open, Kristensen said.

"A quick user browsing through our Web site using IE7 found it failed one of our tests," he said.

The company then verified the information, notified Microsoft and published a proof-of-concept exploit on its Web site.

Update: Microsoft said Oct. 19 that the problem was not with IE 7, but with Outlook Express.

Monday, October 16, 2006

Wikipedia co-founder plans 'expert' spinoff

Larry Sanger, co-founder of Wikipedia, says he will launch a spinoff of the free site, called Citizendium. It will include user registration and editorial controls to govern user-submitted articles, unlike the free-for-all submission process that reigns on Wikipedia. With "gentle" controls in place, Sanger says Citizendium will naturally weed out so-called trolls from posting obscenities or biased information.

"Wikipedia is amazing. It has grown in breadth and depth, and the articles are remarkably good given the system that is in place. I merely think that we can do better," Sanger said. "There are a number of problems with the system that can be solved, and by solving those we can end up with an even better massive encyclopedia."

Sanger said an invitation-only, pilot version of his nonprofit site will launch this week, but wider release has yet to be determined.

Since early 2001, when Sanger helped get Wikipedia off the ground with co-founder Jimmy Wales, the service has become one of the most popular research tools on the Web and one of its fastest-growing sites, with more than 2 million articles in 229 languages. In September, the site attracted more than 33 million unique visitors, up 162 percent from the same period a year earlier, according to research firm Nielsen NetRatings.

Like Wikipedia, Citizendium will evolve with public participation--it will be a "fork" of the open-source code of Wikipedia, meaning that it will replicate Wikipedia's existing database of articles and then evolve, through user participation, into a new compendium of its own. But unlike Wikipedia, Citizendium will have established volunteer editors and "constables," or administrators who enforce community rules.

Citizendium is soliciting experts in their fields to post and oversee articles on any given subject. Another difference from Wikipedia is that Citizendium will require that members register with their real name to post to the wiki. That, Sanger said, should also discourage shenanigans.

iPass to expand remote management

As mobile and remote access become more commonplace over a variety of wireless and wired networks, third-party outsource providers are expanding their management services to the enterprise.

This week iPass will announce Virtual Office and Device Lockdown, management and security services respectively, for companies with heterogeneous remote networks.

Virtual Office will give companies a single piece of client software whether they are connecting over Wi-Fi, cellular, DSL, or T-1 from home or a remote office. Also included is a portal service for configuration and quality-of-service management, in addition to support, training, and billing information.

Device Lockdown quarantines a device attempting to access the corporate network until it is patted down by the iPass policy server, according to Michael Suby, Research Program Director for Stratecast, a division of Frost & Sullivan.

The iPass policy server checks all compliance requirements before it allows a user to access a VPN and touch the customer's LAN. iPass will also white-label the Virtual Office and Lockdown services to other telecommunications providers.
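
iPass has not published how its policy server works, but the quarantine-then-check flow described above can be sketched as a simple compliance function. Every name and rule below is hypothetical:

```python
# Hypothetical sketch of a device "lockdown" posture check of the kind
# described above: a quarantined device must satisfy every policy rule
# before being released to the VPN. All names here are invented for
# illustration; they do not reflect iPass's actual implementation.

REQUIRED_POLICY = {
    "antivirus_running": True,
    "firewall_enabled": True,
    "min_patch_level": 8,
}

def is_compliant(device):
    """Return True only if the device meets every policy requirement."""
    if not device.get("antivirus_running"):
        return False
    if not device.get("firewall_enabled"):
        return False
    return device.get("patch_level", 0) >= REQUIRED_POLICY["min_patch_level"]

laptop = {"antivirus_running": True, "firewall_enabled": True, "patch_level": 9}
kiosk = {"antivirus_running": False, "firewall_enabled": True, "patch_level": 9}
print(is_compliant(laptop), is_compliant(kiosk))  # True False
```

A non-compliant device would stay quarantined (or be pushed the missing patches) rather than being granted VPN access.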


Sunday, October 15, 2006

Intel to launch Quad-Core chips on Nov 13

In a race with rival Advanced Micro Devices, Intel will bring its quad-core chips to market in a new line of Hewlett-Packard workstations due to be introduced on November 13.

HP sent out invitations to the event but did not specify exact models and prices. The computers will probably use Intel's planned Xeon 5300 chip, and will be designed to run high-end applications like seismic analysis and visualization technologies from Ansys, Autodesk, Landmark Graphics, and Parametric Technology.

The launch would mean that Intel brings quad-core processors to market before AMD, a crucial win in a year when Intel has made as many headlines for its layoffs and missed earnings targets as for its technology.

AMD plans to release its own quad-core chips in the middle of 2007, and claims its monolithic design is superior to Intel's approach, which essentially glues two dual-core chips together. But without any hardware to test, analysts are divided on whether this detail will significantly affect the chips' performance.

Rather than ratcheting the clock speeds of conventional chips beyond 3 GHz and 4 GHz, multiple-core chips can accelerate processing tasks in desktops and servers without drawing more electricity and generating extra heat. They can also execute more than one instruction stream at a time, allowing computers to multitask more efficiently.
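
The multitasking claim can be illustrated with a toy Python sketch, with threads standing in for cores and a simulated wait standing in for real work: a batch of independent tasks finishes sooner when several workers handle it at once. This is only an analogy, not a benchmark.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Toy illustration of why multiple cores (here, multiple worker
# threads) finish a batch of independent tasks faster than a single
# sequential worker. The "work" is a simulated 0.2-second wait.

def task(_):
    time.sleep(0.2)

# One worker handles all four tasks back to back.
start = time.monotonic()
for i in range(4):
    task(i)
sequential = time.monotonic() - start

# Four workers handle the four tasks concurrently.
start = time.monotonic()
with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(task, range(4)))
parallel = time.monotonic() - start

print(f"sequential: {sequential:.2f}s, parallel: {parallel:.2f}s")
```

The sequential run takes roughly four times as long, without the parallel version drawing on any faster clock.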

Firefox accepting feature suggestions for V3

The Firefox web browser has come a long way since the project was announced as a fork from the open-sourced Mozilla project. Version 1.0 was released in 2004 and quickly won critical acclaim for its speed, compatibility with web standards, and features. In a couple of years, Firefox managed to reach a milestone that its predecessor never quite reached: hitting 10 percent market share worldwide. Version 2 of the browser recently hit Release Candidate 2, but the team is already making plans for 3.0. The Mozilla organization has set up a feature brainstorming web site that allows everyone to enter their favorite wish lists for the open source browser.

The wish list is long indeed, and it provides insight into both the desires of the browser community and the open source development process. While closed-source projects often ask their user communities for feedback on requested features, the process is not usually open to the public. For Firefox 3, anyone can both suggest new features and comment on other people's suggestions.

The feature requests are divided into categories, such as browser customization, privacy features, security, history, download manager, and other areas. There are suggestions for features found in other competing browsers, such as Safari, IE 7 beta, and Opera. IE7 seemed to be featured most prominently, with requests for "low-rights mode," as well as more cosmetic features like skins that mimic Microsoft's browser.

Customization seems high on the list: floating menus and toolbars, tabs that can be dragged to other Firefox sessions, and the ability to add tag notes to web pages are all present.

For those adventurous folks who want to take a sneak peek at Firefox 3's progress, early alpha builds are now available for download.

Microsoft's PDF-killer heads towards standards body

There's no doubt about it: Adobe's Portable Document Format - better known as PDF - is a choice tool for digital document delivery. Some might say that it's the tool for delivering complex documents to a wide array of users, as its design allows for faithful rendering on any platform that supports PDF - application issues, font problems, layout quirks, and the like need not apply.

Enter Microsoft. The company has been toiling away on its own portable document technology for some time and plans to make a splash with it in 2007. Dubbed the XML Paper Specification (or more succinctly, XPS), Microsoft plans support for the new format in both Windows Vista and Office 2007. In response, Adobe went to EU regulators earlier this year to ask that they bar Microsoft from including XPS support in Windows Vista, fearing that the ability to create XPS documents for free could cut into their ability to sell PDF creation software to Windows users.

Now in a move to appease EU regulators, Microsoft is going to step things up a notch and try to push XPS through as a standard. For Adobe, this could ultimately make XPS more — not less — popular.

Microsoft is looking again at its license in order to make it compatible with open source licenses, which means that the "covenant not to sue" will likely be extended to cover any intellectual property dispute stemming from the simple use or incorporation of XPS. The end result is that using XPS may be considerably more attractive for developers now that the EU has apparently expressed concerns over the license.

The company has not hinted at which standards body it would submit XPS to, but a few things are already clear. First, standards approval will see Microsoft opening XPS to the point that any platform could theoretically support it, including Linux and Mac OS X. If it remains royalty-free, this could mean a proliferation of support for the format. Second, given that the EU is pushing Microsoft to be more open with XPS, we can expect Microsoft to take an approach similar to Adobe's: the specification would remain open but also controlled by the company.

Thursday, October 12, 2006

Treo 680: Affordable, Media Friendly

Palm today announced a new series of Treo smart phones designed to appeal to "price-sensitive" customers. The company also announced the immediate availability of a new, free Google Maps application for Treos based on the Palm operating system.

The announcements of the Treo 680 series and Google Maps for Palm OS-based Treos came at the DigitalLife trade show in New York. Palm CEO Ed Colligan was mum on pricing, but said the Treo 680 would ship in November and would be "lower cost and easier to use than any other Palm." The new Treo is a quad-band GSM/GPRS/EDGE handset for use in the United States and Europe.

The Treo 680 targets today's media-hungry consumers. It comes with music, video, and photo slide-show players, and Palm says for a limited time it will sell unlocked versions of the phones with a Yahoo music bundle that will include a 1GB SD Card, a stereo headset, and a 30-day free trial of Yahoo's music service. Colligan said Palm expects 20 or more carriers around the world to offer the Treo 680 by the end of Palm's fiscal year, next June 1.

Fashion-conscious Treo 680 buyers will be able to choose from four colors: copper, arctic, crimson, and graphite. The Treo 680 has an internal antenna and is smaller and sleeker than previous models. The phone runs the new Palm OS 5.4.

Other features include an enhanced version of Palm's VersaMail e-mail application (version 3.5), which allows for more robust syncing of e-mail and now contacts and calendars as well, the company said. Another update brings a feature that displays SMS text-messaging exchanges as "threaded chats," similar to instant messaging.

Among other bells and whistles is the ability to respond to an incoming call with a preset text message such as "I'm busy now." The bundled Dataviz Documents To Go productivity software now supports viewing Adobe PDF files as well as editing and creating Microsoft Office files.

The Treo 680 smart phone also can function as an MP3 player, and has an integrated digital camera, camcorder, and video player. It sports a 320-by-320 touch screen, a full QWERTY keyboard, and Bluetooth 1.2. The 680 comes with 64MB of usable memory, with expansion available via an SD Card. Palm says the unit's battery is capable of 4 hours of talk time and 300 hours on standby.

Google Docs has some real competition

While the world oohs and aahs over the merging of Writely and Google Spreadsheets into a package called Google Docs and Spreadsheets, it may come as a surprise that Google has some serious competitors in the online office productivity space. What’s more, in some cases they're considerably more advanced than the search leader.

Online software as a service (SaaS) applications have been with us for some time and have been predicted by organizations such as Gartner to gain a sizable chunk of the business applications market by the end of the decade.

In the online office productivity space specifically, several products are emerging; two that readily spring to mind are Zoho and Thinkfree.

Both of these Web 2.0 products, unlike Google, offer the full suite of basic office productivity tools, including a word processor, spreadsheet and presentation application. Zoho also offers a free database, a planner, a project management package and, for a monthly rental of US$12, a CRM package.

Thinkfree probably presents the best-integrated package, with a web-based virtual filing system for documents that simulates the desktop. Thinkfree also offers offline users a Java-based, Microsoft Office-compatible desktop clone for US$50 that runs on Windows, Mac OS X and Fedora Core 3 Linux.

As far as the needs of offline users go, the $50 Thinkfree Microsoft Office clone sounds interesting. However, OpenOffice.org 2.0 is free and has already proven itself good enough for business use – even if Microsoft says it's 10 years behind Office 2007.

Like Google Docs and Spreadsheets, neither Zoho nor Thinkfree has really solved the document storage issue satisfactorily. Google's method of tagging documents is not really how users accustomed to the Windows folder-based filing system are used to organizing their information.

Thinkfree makes the best attempt, with a rudimentary web top filing system that simulates the Windows My Documents folder. However, the comparison is superficial, as it's nowhere near as powerful, not even supporting simple things like folders within folders.

Zoho is reportedly out of beta now, has single sign-on for all its applications and has said that it is working on developing and integrating a web top system.

It's fairly safe to say that any of these online office productivity tools will do the job if your needs are simple. However, none yet appears to have the industrial-strength grunt needed for serious business applications. No doubt they will before too long.

In one respect, however, all of the online office tools totally outshine their desktop equivalents: collaboration. It is so much easier for a group of users to share access to a document stored in a central location. It sure beats passing it around by email – especially if the file is large.

Thus we may look forward to a not-too-distant future in which web access is ubiquitous, we no longer pay through the nose for office productivity software, and we no longer care which operating system we're using.

Wednesday, October 11, 2006

Nokia plans WiMAX cell phones

The tether tying cell phones to cellular networks will be further loosened in 2008, when Nokia introduces its first WiMAX-capable cellular phones. Nokia isn't releasing too many details on the handsets other than the fact that they will work with the mobile version of the WiMAX standard.

Officially known by its 802.16 moniker, WiMAX comes in two flavors: fixed-location and mobile. 802.16REVd handles fixed-position WiMAX, which promises wide-area wireless connectivity with DSL-type speeds. 802.16e-2005 is the mobile version and promises similar speeds for applications and devices that need something more than a fixed-point connection.

WiMAX is slowly spreading its wireless tendrils, with some installations in Europe and a handful under way in North America. Recently, Samsung, Intel, Motorola, and Sprint Nextel joined forces to build a national WiMAX network using Sprint Nextel's spectrum and cell towers with hopes of a 2008 launch.

In addition to announcing the handsets, Nokia said it will begin selling the Flexi WiMAX Base Station in 2007. Targeted at WiMAX network operators, the Base Station uses a small and modular design which Nokia says can lessen the costs of deploying WiMAX networks. "As the world is going wireless, we believe the Nokia Flexi WiMAX Base Station offers broadband operators an easy and trusted way to offer wireless Internet connectivity to their customers anytime, anywhere," said Nokia SVP of Radio Networks Ari Lehtoranta. "Nokia is a strong believer in having a multiradio strategy that gives operators a future-proof solution and the flexibility to choose different technologies as they evolve."

Nokia currently offers a couple of cell phone models that can also use 802.11b/g for VoIP calls. It has also begun a pilot program for what it calls Unlicensed Mobile Access, where calls are routed seamlessly between WiFi and traditional cell networks, depending on which is the best option. With a handset that can work with WiFi, mobile WiMAX, and cell networks, mobile phone users would have even more options for how to place calls, perhaps to the dismay of the cellular providers.

GPS Capability Enhanced in MS Streets & Trips 2007

Microsoft today announced improved GPS (global positioning system) functionality in a new version of its travel and mapping software.

Microsoft Streets & Trips 2007 With GPS Locator includes a new receiver from Pharos Science and Applications, the SiRFstarIII, for mapping locations to GPS coordinates. The new GPS locator is ten times more sensitive than its predecessor in the previous version of Streets & Trips, according to Microsoft.

Microsoft Streets & Trips 2007 With GPS Locator sells for $129 in the United States, while the standard version of the software is available for $40.

To use the new locator, customers can plug the GPS receiver included with the software into a notebook PC's USB (Universal Serial Bus) port, after which they can view maps and travel routes in real time. Buyers who want to use the locator wirelessly can purchase the necessary Bluetooth dock or CompactFlash card adaptor directly from Pharos, Microsoft said.

Besides mapping routes for people traveling by car, Streets & Trips includes points of interest--such as gas stations, hotels, restaurants, and national parks--on the map, in case travelers want to stop along the way.

In a press statement, Helen Chiang, a product manager at Microsoft, said that better GPS capability in Streets & Trips will give users of the software more confidence that the journey they've mapped in the software is the correct one.

Pharos, headquartered in Torrance, California, sells GPS navigation tools and location-based services for mobile devices.

Tuesday, October 10, 2006

IBM Cranks Up Its Server Chip

IBM plans to crank up the speed on its Power6 server chip to 5.0GHz, far higher than competing processors from Intel and Sun Microsystems.

Despite its high frequency, the chip will avoid overheating through its small, 65-nanometer process geometry, high-bandwidth buses running as fast as 75GB per second, and voltage thresholds as low as 0.8 volts, IBM said.

When it ships the chip in mid-2007, IBM will target users running powerful servers with two to 64 processors, said Brad McCredie, IBM's chief engineer for Power6. He shared details on the chip at the Fall Microprocessor Forum in San Jose, California.

By doubling the frequency of its current Power5 design, IBM is swimming against the current of recent chip designs that sacrifice frequency for power efficiency. Instead, IBM cut its power draw by making the chip more efficient, with improvements like computing floating point decimals in hardware instead of software, he said.

The company hopes the Power6 will help it reach new customers in commercial database and transaction processing, in addition to typical users of its Power5 chip in financial and high-performance computing such as airplane design and automotive crash simulation, McCredie said. To win that business, IBM will have to compete with chips like Intel's Montecito Itanium 2 and Sun's high-end SPARC processors.

If this chip works as promised, IBM could be successful in that effort, analysts say. IBM is one of the few remaining alternatives to Intel in the market for 'big iron' servers used in high-end jobs like scientific computing, image processing, weather prediction and defense, said Jim Turley, principal analyst at Silicon Insider, in Pacific Grove, California.

IBM upgraded its current midrange Unix servers in February from 1.9GHz to 2.2GHz Power5+ processors, targeting users of large databases, ERP (enterprise resource planning) and CRM (customer relationship management). The company will ship several versions of the Power6 chip, ranging from 4.0GHz to 5.0GHz in frequency.

Google Blog Gets Hacked

A hacker broke into Google's main official blog and posted a false message on Saturday, saying that the company had decided to cancel a joint project with eBay.

The intrusion marks the second time this year that Google's official blog has fallen into unauthorized hands. In March, Google staffers deleted the so-called Google Blog by mistake and someone briefly took control of the Web address.

In Saturday's incident, someone exploited a bug in Blogger, the Google Web log publishing service on which Google Blog is hosted. The hacker published a note riddled with grammatical and spelling errors that said Google had ended its click-to-call advertising project with eBay because it was "monopolistic."

The next day, Karen Wickre, from the Google Blog team, alerted readers about the false posting and said the Blogger bug had been fixed, without detailing the breach. The eBay project remains alive and well, she wrote on the blog.

The Google Blog is one of the company's main communication tools. As official corporate messages similar to press releases, its postings often trigger news reports, analyst recommendations, and investor decisions.

Monday, October 09, 2006

Crawl the Web with your fingers

If you have a fingerprint scanner hooked up (or built in) to your PC, you've probably thought to yourself, "Self, if this scanner can give me access to my own computer, why can't it log me into websites?" Now it can, thanks to the new TrueMe service from Pay By Touch, one of those firms that has already helped to bring biometric identification into the market.

The new service, announced today, uses certified fingerprint scanners to replace username/password combinations on the Web. "With TrueMe, a simple touch of the finger gives Chief Security Officers the security they demand while giving users the simplicity they desire," said Jon Siegal, a Pay By Touch VP. "TrueMe satisfies both needs without the hassle of multiple User IDs and passwords."

The scanners must be certified because encryption of the fingerprint is done inside the sensor. When a user swipes a finger, the recognition data is compressed and encrypted, then sent to a TrueMe server, which handles authentication. If the user is allowed to visit the website or resource in question, the server sends the verified identity directly to the site.
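The flow described above can be sketched in Python. Everything here is a hypothetical illustration, not Pay By Touch's actual API: the key, the function names, and the use of an HMAC as a stand-in for whatever encryption the certified sensor really performs on-device.

```python
import hashlib
import hmac
from typing import Optional

# Hypothetical: stands in for a secret provisioned inside the certified sensor.
SENSOR_KEY = b"device-embedded-secret"

def sensor_scan(fingerprint_template: bytes, device_id: str) -> dict:
    """Inside the sensor: compress/encrypt the recognition data so the raw
    biometric never leaves the hardware; the device ID rides along so the
    service can spot fraud, as described above."""
    digest = hmac.new(SENSOR_KEY, fingerprint_template, hashlib.sha256).hexdigest()
    return {"device_id": device_id, "payload": digest}

ENROLLED: dict = {}  # server-side store: encrypted template -> identity

def server_authenticate(message: dict) -> Optional[str]:
    """At the TrueMe-style server: match the encrypted template and, on
    success, hand a verified identity to the relying website."""
    return ENROLLED.get(message["payload"])

# Enrollment, then a login attempt:
template = b"ridge-pattern-bytes"
ENROLLED[sensor_scan(template, "scanner-42")["payload"]] = "alice@example.com"
identity = server_authenticate(sensor_scan(template, "scanner-42"))
```

The point of the design is visible even in this toy version: the website never sees fingerprint data at all, only the identity the authentication server vouches for.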

Given the way that crooks have attacked traditional two-factor authentication systems, will fingerprints prove to be more secure? Hopefully. The TrueMe system also records the device ID of the fingerprint scanner used in the authentication attempt, potentially making it easier to spot fraud and to track down malicious users. We imagine that the technology could also be used by businesses to restrict employee access to sensitive internal websites to certain company-supplied PCs, though Pay By Touch says nothing about the way that the ID check will be used.

While Pay By Touch shows its own branded scanner on its site, the ones built into Lenovo T60 and X60 machines will also work. TrueMe isn't free; there's a yearly fee to use the service, which is currently targeted at business users.

Google buys YouTube

It's confirmed now: Google has agreed to buy YouTube for $1.65 billion in stock. The news comes after a cornucopia of press releases announcing Google and YouTube deals to distribute music videos from Universal, Sony, Warner Music, and CBS, paving the way for a relatively risk-free buyout from Google's perspective.

In the conference call accompanying the press release, the founders of both companies unanimously professed their excitement about the deal, saying that it's a great fit on many different levels. Eric Schmidt said that it was about vision, not about business, and that the YouTube guys reminded him of the early days of Google. Sergey Brin added that video content is certainly information, so the acquisition fits with Google's stated mission of organizing all the world's info.

As for the YouTube leaders, Chad Hurley kept repeating that YouTube has been given the opportunity and resources to "sharpen their focus" and build a better "new media platform" than they could have done on their own. Specifically, Google's "revolutionary new advertising platform" inspired ideas of how to improve the media platform, and users who now demand control over what to watch and where and when to watch it will get what they need from the new YouTube. According to Steve Chen, the two management teams just finished a 48-hour brainstorm where they worked up a list of "potential integration points" between the companies, so there's "no shortage" of ideas on how to improve the user experience or how to make money off the combination.

YouTube will remain a separate brand, and Google Video is not going away. It's unclear how the two services will be different from each other, but Eric Schmidt did mention that one of the principal strengths of YouTube was the social networking aspect of it. Google is issuing stock to pay the YouTube owners rather than dipping into its $10 billion war chest, and according to Schmidt that's because it becomes a tax-free deal for the YouTube team. "Our deals are very, very good for our partners," he said to scattered laughter, calling it a Google hallmark.

Thursday, October 05, 2006

Konica Minolta shows wearable display prototype

Konica Minolta is developing a lightweight, holographic wearable display, a prototype of which was on display this week at the Ceatec exhibition in Chiba, Japan.

The Holographic See-Through Browser prototype resembles a pair of eyeglasses and uses a prism with a thickness of 3.5 millimeters and a holographic element to reduce the weight of the display to 27 grams.

Konica Minolta has just begun development of the lightweight display and is looking for an application where the device could be useful, said Hiroshi Itou, an assistant manager at the business development group of Konica Minolta Technology Center Inc. Possible applications under consideration include giving workers access to an instruction manual or allowing commuters to watch a video while riding a train, he said.

In a video demonstration of the technology, Konica Minolta showed how a user could watch a motorcycle race on the display while walking around the house. In this demonstration, the see-through image of the race appeared to float in the user's line of sight.

The display image is produced by a small attachment above the glasses, which contains an LED (light-emitting diode) that projects the image through a condenser lens and a prism. Once the image travels through the prism, it passes through the display where it is projected onto the holographic element.

The display attachment on the glasses is connected by a cable that leads to a small, wearable device.

Best Buy launches iTunes competitor

Best Buy, in cooperation with SanDisk and RealNetworks, is the latest company to join the growing list of competitors to Apple Computer's iTunes music service.

Best Buy unveiled on Thursday an online music service, called Best Buy Digital Music Store, that allows customers to find, manage and purchase music online. It is powered by RealNetworks' Rhapsody 4.0 music service and lets users purchase and permanently download songs and albums, as well as subscribe monthly to listen to an unlimited number of songs, the company said.

As part of the offering, Best Buy also will carry and promote SanDisk Sansa e200R Rhapsody MP3 players, which have been optimized to work with its new music service. Both the players and the service will be available starting Oct. 15, the company said.

Jennifer Schaidler, vice president of music for Best Buy, said the company is differentiating its service from Rhapsody by offering exclusive artist content and tailoring that to what Best Buy customers are purchasing.

"Look at it as Rhapsody 4.0 plus," she said. "You get all the stuff that's there [on Rhapsody], plus more exclusive content."

Selling CDs and MP3 players in its stores and online is already a successful part of Best Buy's business, so offering a music service was a logical next step for the company, Schaidler said. "Customers expect Best Buy to provide them with quality entertainment in an easy way," she said.

Songs on the Best Buy Digital Music Store will cost $0.99, with monthly subscriptions that allow users to play an unlimited number of songs for $14.99 a month.

The news comes on the heels of the formal unveiling last week of the availability and pricing for Microsoft's forthcoming Zune Player and Zune Marketplace service. Microsoft will make the digital media players and service available in the U.S. on Nov. 14.

Like Best Buy's new service, songs on the Zune Marketplace will cost about $0.99 each, though the charges will be according to a points system (i.e., 79 points a song) that will allow users to purchase items on other Microsoft properties, such as Xbox Live. Zune Marketplace's unlimited monthly subscription also costs $14.99 a month. Songs on Apple's iTunes service cost $0.99, but there is no monthly subscription available.
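Microsoft Points traded at 80 points per US dollar at launch, which is why 79 points works out to just under $0.99. A quick sanity check:

```python
POINTS_PER_DOLLAR = 80  # Microsoft Points exchange rate at launch

def points_to_dollars(points: int) -> float:
    """Convert a Zune Marketplace points price to US dollars."""
    return round(points / POINTS_PER_DOLLAR, 4)

song_price = points_to_dollars(79)  # just under the familiar $0.99
```

The fractional-cent rounding in Microsoft's favor was one of the early criticisms of the points scheme.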

Monday, October 02, 2006

Firefox JavaScript security "a complete mess"?

Firefox is loaded with security flaws, according to a hacker duo that presented at this year's ToorCon. Mischa Spiegelmock and Andrew Wbeelsoi used a session at the show to highlight what they have called "a complete mess" that is "impossible to patch" in Firefox's JavaScript implementation. According to the pair, the implementation is home to at least 30 possible exploits, all of which they plan to keep to themselves. CNet's Joris Evers brought the story to light this past weekend, but reports are surfacing everywhere.

The presentation, dubbed "Lovin the LOLs, LOL is my will," actually only focused on one flaw, which the presenters said affects Firefox on Windows, Linux, and Mac OS X. The exploit reportedly causes a stack overflow by merely including a small snippet of JavaScript code on a webpage. Spiegelmock and Wbeelsoi have declined to fully detail the exploit, however, leaving Mozilla a bit in the dark. In fact, after a Mozilla employee exhorted them to report the flaw and collect a $500 reward, Wbeelsoi said "what we're doing is really for the greater good of the Internet, we're setting up communication networks for black hats."

Mozilla's head of security, Window Snyder, indicated that Mozilla believes the exploit to be real. She has also said that the presentation given at the conference contained enough information that other hackers may be able to reproduce the exploit before it can be patched.

Reports of the flaw come less than a week after Symantec's biannual Internet Security Threat Report indicated that the number of browser vulnerabilities is on the rise. Firefox led the pack both in terms of the absolute number of vulnerabilities disclosed in the last six months and in terms of percentage growth over the year. The report also noted that Firefox had the lowest "window of vulnerability," meaning that the time between identification and fix was comparatively shorter than for other browsers. Nevertheless, the current state of affairs has led many readers to start joking, "Firefox: the next Internet Explorer."

Monday, September 18, 2006

Four formats on a single disc?

Does riding the DVD/HD DVD/Blu-ray Roller Coaster of Consumer Confusion make you queasy? Supporting three formats is giving the movie studios indigestion, but so far they haven't found the right antacid. It costs time and money to produce discs in multiple formats, and it takes up valuable shelf space at retail. Surely there must be a better solution than selling three kinds of discs and three kinds of players?

Talk of a hybrid player that could handle both new high-def formats set the tech world buzzing when the chipset was demonstrated early this year, but hardware based on it has yet to materialize. The other approach, making hybrid discs, has so far produced only HD DVD/DVD hybrids. While HD DVD and traditional DVD share enough in common to make the manufacturing processes similar, Blu-ray requires an expensive technology upgrade.

The basic problem is that the different technologies use different types of lasers and store data at different depths. Traditional DVDs use a 650nm red laser, while Blu-ray and HD DVD both use a blue laser at 405nm. DVD and HD DVD share the same data depth, though, at 0.6mm, while Blu-ray's pits are only 0.1mm from the surface.

A recent patent unearthed by New Scientist suggests that Warner has seen the hybrid future and could one day produce discs featuring all three formats, plus CD.

The patent, which lists several top Warner execs as the inventors, describes how "a dual disc may also be formed with two high-capacity data layers, one conforming to the HD DVD format and the other conforming to the BD [Blu-ray disc] format." Warner engineers have figured out a way to use semireflective coatings to allow the two layers to coexist on a single side, using HD DVD's greater depth to position it beneath the Blu-ray data layer.
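A toy model in Python illustrates why the depths quoted above let a Blu-ray layer sit on top of an HD DVD layer, while DVD and HD DVD must live on separate sides. The CD figures are our own addition for context, not from the article:

```python
# Optical parameters: DVD/HD DVD/Blu-ray values from the article;
# the CD entry is an assumption added for context.
FORMATS = {
    "CD":      {"laser_nm": 780, "depth_mm": 1.2},
    "DVD":     {"laser_nm": 650, "depth_mm": 0.6},
    "HD DVD":  {"laser_nm": 405, "depth_mm": 0.6},
    "Blu-ray": {"laser_nm": 405, "depth_mm": 0.1},
}

def can_stack_on_one_side(upper: str, lower: str) -> bool:
    """Two data layers can coexist on one side only if they sit at
    different depths, so a semireflective upper layer can pass the
    deeper layer's beam through -- the trick the Warner patent uses."""
    return FORMATS[upper]["depth_mm"] != FORMATS[lower]["depth_mm"]

can_stack_on_one_side("Blu-ray", "HD DVD")  # the patented combination works
can_stack_on_one_side("DVD", "HD DVD")      # same 0.6mm depth: no stacking
```

This is also why DVD and HD DVD were the easy hybrid to make first: they differ only in laser wavelength, not in where the data sits.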

The patent then describes an implementation in which both sides of the disc contain data layers, which means that we could see discs with CD and DVD layers on one side, HD DVD and Blu-ray on the other. The downside to this approach is that each format gets only one layer, reducing its total capacity. Discs are not especially expensive to produce, so it may make financial sense for studios to begin shipping two-disc movies that contain all three formats. While this would simplify things for consumers and would free up valuable shelf space, it would also make it hard to offer different price points for DVD and high-definition formats. It might well make more sense for studios to release DVD editions in one box, high-definition versions (Blu-ray on one side, HD DVD on the other) in another.

Intel pioneers silicon laser technology

Technology is advancing to a point where copper connections over small distances cannot achieve the throughputs needed for high-performance computing. The electrical characteristics of wire at these high frequencies usually cause more problems than they're worth. To expand bus transmission rates to a terabit scale, optical connections are necessary. However, photonic semiconductor components have traditionally been expensive and difficult to manufacture in high volume. This is why most fiber-optic equipment is costly and used only when transmission distances make copper the more expensive option. It's also why you'd never see an optical bus in most high-performance computing scenarios, let alone on someone's desktop.

Intel has been working on ways to solve these problems, and through a partnership with the University of California Santa Barbara, it has developed a method of producing reliable, cost-effective components using standard semiconductor processes. Intel calls this technology hybrid silicon lasers. A hybrid silicon laser is a laser small enough to be built at modern electronic scales (nanometers), allowing the conversion of electrical signals into highly efficient optical signals.

There are two components to a hybrid silicon laser. The first is a material that, when a charge is placed across it, emits photons. In this instance, Intel is using Indium(III) phosphide. The second component is a substrate of silicon that acts as a waveguide. Depending on how this silicon waveguide is designed, it will affect certain characteristics of the laser's output, such as wavelength. Previously, these two components had to be aligned precisely in a process that was costly and time-consuming. In Intel's new method, the two materials are coated with a thin layer of oxygen plasma and bonded at around 300°C. This creates a layer of what is essentially glass that bonds these two components together. What is important to note, however, is that via this process, the two layers no longer need to be precisely aligned, removing the expensive barriers to mass-production.

This means that the bandwidth-intensive subsystems of large computing projects such as supercomputers, and someday high-performance workstations, could have their copper bus architectures replaced with efficient optical versions. This would remove many of the limiting characteristics of copper technology, which become increasingly bothersome at the frequencies involved in high-bandwidth operations. The new technology should expand data rates between components dramatically: Intel has demonstrated a silicon-based optical modulator operating in excess of 1GHz, while other researchers have demonstrated data transmission rates as high as 160Gb/s. Intel is optimistic that these technologies can expand to "terabit" level connections in the future.
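Taking the demonstrated per-channel rate at face value, a back-of-the-envelope calculation (ours, not Intel's) shows how few wavelength channels would need to be aggregated to reach terabit scale:

```python
import math

def channels_needed(target_gbps: float, per_channel_gbps: float) -> int:
    """Number of wavelength channels a transmitter would have to aggregate
    (e.g. via wavelength-division multiplexing) to hit a target rate."""
    return math.ceil(target_gbps / per_channel_gbps)

# Reaching a 1 Tb/s (1000 Gb/s) bus from the 160 Gb/s demonstrated rate:
channels_needed(1000, 160)  # only a handful of channels
```

That only a handful of lasers would be needed per terabit link is exactly why cheap, mass-producible hybrid silicon lasers matter.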

Sunday, September 17, 2006

Apple iPhone Clues in iTunes 7

An Apple analyst said Friday there is further proof the computer company will soon make its own iTunes-enabled cell phone.

Piper Jaffray analyst Gene Munster said in a research note, “We believe there is more tangible evidence of the existence of the iPhone from the resource files in the new iTunes 7.” He was referring to reports that the latest iTunes software, which was released Tuesday, has messages that suggest features for a mobile phone, such as messages about copying games and synching photos to mobile phones.

Mr. Munster said such features are not on the ROKR phone that was jointly developed by Apple and Motorola and released last October. “This resource file message suggests there will be a phone that will be capable of synching with iTunes, and that the phone will support iTunes and photos,” Mr. Munster said in the report. He added, “We believe this phone is most likely the iPhone, with an outside chance the message is in reference to an upcoming phone from a current phone manufacturer.”

The evidence, rather than the revelation of a phone, is what’s more compelling here. Indeed, UBS analyst Benjamin Reitzes said in a research note Tuesday that he is “still expecting new products in the coming months… including touch-screen video iPods with larger screens and cell phones.”

What’s more, talk of a phone made by Apple has been on the lips of observers for more than a year. Some believed the ROKR was deliberately made clunky—it’s an old-school candy-bar style with room for just 100 tracks—so that Apple could later offer an improved model.

Others believe that due to the extremely enthusiastic following of Apple fans, the Cupertino, California-based company could become a mobile virtual network operator (MVNO), selling its phones as well as Apple-branded cellular service.

Microsoft announces plans for Zune phone

Microsoft plans to release a Zune-based phone sometime in the future, according to Zune's general manager of global marketing, Chris Stephenson. The phone will be part of Microsoft's plans to expand into the digital music player market, although like the Zune itself, there were no details or timeline given for when to expect the Zune phone.

Microsoft, unlike Apple, doesn't seem to like keeping secrets about its plans to bust into the Apple-dominated digital music player market. Although there has been no official confirmation of the long-rumored "iPhone," Apple's CFO Peter Oppenheimer was quoted during an earnings conference call in July saying that "We don't think the phones that are available today make the best music players. We think the iPod is. But over time that's likely to change, and we aren't sitting around doing nothing." Apple's certainly not sitting around doing nothing according to some, who claim that the Apple phone is ready for production.

Without any details regarding either company's tight-lipped phone plans, it's hard to speculate whether one will trump the other or whether either of them will succeed at all. Apple's first foray into the phone business with the Motorola ROKR ended up being somewhat of a failure, and may have been due to Apple not having full control when working with Motorola in determining the phone's design and usability. Will the rumored Apple phone, when and if it ever happens, be able to put the ROKR out of its misery, and will the Zune phone be able to top it?