11/29/2012

Google improvements before the end of the year

Android crushes the competition in China as it passes 90% smartphone market share: Report

Android has established a clear monopoly in China after achieving more than 90 percent market share there, up from 58.2 percent a year ago, according to a new report from Analysys International (via Tech in Asia). The data combines estimates from both device sales and ownership.
Google’s mobile OS soared to 83 percent last quarter, and it has continued its run, capturing an estimated 90.1 percent of the market. It’s possible Android’s overall share is even higher than estimated, as the firm doesn’t count knock-off phones, many of which are powered by the platform.
iOS also dropped from 6 percent to 4.2 percent. However, it may be underrepresented, as Analysys notes that it doesn’t include grey market imports. Since the iPhone 5 is not yet officially available in China, many sellers have resorted to having the device smuggled in from Hong Kong.
Nokia’s Symbian continued its tragic decline, dropping from 6 percent in the second quarter to 2.4 percent in the third. Just a few years ago, Nokia had the dominant position in the Chinese mobile market. With Nokia’s transition to Windows Phone, Symbian is on its way out, but sales of its Lumia devices are still getting off the ground.
Smartphones based on the Windows Mobile, BlackBerry and Linux operating systems took up a negligible share of the market.
Budget smartphones continued to gain momentum, as the average price for Android devices fell yet again, this time to $223 (RMB 1,393), down from $251 (RMB 1,560). The average price for a Symbian smartphone came in at $179 (RMB 1,114), and the average iOS price dropped slightly to $726 (RMB 4,523).
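The dual-currency figures above can be sanity-checked with a quick sketch. The exchange rate below is an assumption (roughly RMB 6.25 per US dollar, in line with late-2012 levels), not a figure from the report:

```python
# Rough sketch of the RMB-to-USD conversions in the report.
# The exchange rate is an assumption (~6.25 RMB/USD, a late-2012 level).
RMB_PER_USD = 6.25

def rmb_to_usd(rmb):
    """Convert an RMB price to US dollars, rounded to the nearest dollar."""
    return round(rmb / RMB_PER_USD)

avg_prices_rmb = {"Android": 1393, "Symbian": 1114, "iOS": 4523}
avg_prices_usd = {os: rmb_to_usd(p) for os, p in avg_prices_rmb.items()}
```

At this assumed rate the converted figures land within a dollar or two of the report's, suggesting the report used a rate close to 6.25.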
With Android taking over almost the whole market in China, at least according to Analysys’ estimates, the country is becoming a stronghold for the platform. Google, however, is unable to properly capitalize on its growing user base there, as many of its services are blocked or restricted.
Local Internet companies have moved in to take advantage of the opportunity. Baidu, for instance, has built its own software that works on top of Android, while Alibaba is pursuing its own mobile operating system that may or may not be based on Android, depending on who you talk to. Chinese smartphone maker Xiaomi has also thrown its lot in with Android for its MIUI ROM and Mi-One and Mi-Two smartphones.
While estimates from Analysys are worth noting for their consistency, its interpretation of the Chinese smartphone market is just one of many. Recent third-quarter data from app analytics firm Umeng put Apple’s smartphone distribution on its platform at 33 percent.

 

Google Launches Ingress, a Worldwide Mobile Alternate Reality Game

 


What’s the wackiest thing you can imagine Google launching? How about a game to fight for control of the minds of everyone on earth?
Or maybe that’s not so wacky.
Meet Ingress, a new free mobile app and alternate reality game made by Google launching today (on Android first, available as soon as it makes it through the Google Play release process).
Ingress is a project of former Google director of geo John Hanke and his Niantic Labs, a start-up team wholly inside of Google.
“This grew out of us thinking about notions of ubiquitous computing,” Hanke told AllThingsD this week. “The device melts away.”
Ingress also aims to get people out in the physical world, both for physical activity and to see their surroundings in a new way.
Users can generate virtual energy needed to play the game by picking up units of “XM,” which are collected by traveling walking paths, like a real-world version of Pac-Man. Then they spend the energy going on missions around the world to “portals,” which are virtually associated with public art, libraries and other widely accessible places.
“The concept is something like World of Warcraft, where everyone in the world is playing the same game,” Hanke said. Players are on one of two teams: “The Enlightened,” who embrace the power, or “The Resistance,” who fight the power. Anyone can play from anywhere in the world, though in more densely played areas there will be more local competition for resources.
Outdoor physical activity is a big component of this, though driving between locations isn’t banned. “You’re like a rat in a maze on the phone,” Hanke said. Then, back at your computer, you can review the larger area and gameplay.
If self-driving cars or computer glasses are a head-scratching fit for Google, Ingress is perhaps even more so, because it’s a content project that’s expressly askew from reality. The company has hired game writers and artists, and hopes to stay a month or two ahead of the audience, Hanke said.
(You have to admit, this might be pretty fantastic to play from the point of view of those Google glasses.)
But Hanke contended that the game will be good for Google’s business from the beginning. That’s because of advertising. Ingress incorporates real physical stores and products in the game, and has brokered relationships with Hint Water, Zipcar, Jamba Juice and Chrome apparel and messenger bags.
And eventually, Google plans to make these real-world game tools available as a platform for developers to make their own.
Hanke said he wants the game to be a living creation that’s shaped by its players. Some early public teasers of the game on a dummy Niantic Project Web site had generated a lot of interest in Russia, Hanke said, so the team wrote some “aspects of Russia” into the game.
Niantic also wants the game to end at some point, or at least have a good stopping point in a year and a half or so.
“We were definitely inspired by JJ Abrams,” Hanke said, “but we don’t want to leave people in a ‘Lost’ situation where they get into the fiction of a world but then it never ends.”
Niantic’s first public product was Field Trip, also a mobile geo app for Android, released in September. Hanke described the factoid-finding Field Trip as “more of a mainstream Web tool” and Ingress as an option for people who are more comfortable with gaming and sci-fi.
The project, which was internally named Nemesis, has been tested by Google employees for the past six months.

Google announces new Shopping experience coming to Europe, Australia, Brazil, and Japan

Google today announced it is bringing its new Google Shopping experience to Australia, Brazil, Germany, France, Italy, Japan, Netherlands, Spain, Switzerland, and the UK. The new version will be rolling out gradually to give merchants some transition time and let them optimize their campaigns.
The first major change (cleaner results for shopping queries and Sponsored results) will take place on February 13, 2013. These new results, which will let shoppers refine a search by brand or price and will feature larger product images, are appearing on a small percentage of searches starting today. Google is hoping to complete the rollout, with all the changes intact, by the end of Q2 2013.
If you’re sitting there scratching your head, let me explain. Yes, Google Shopping is available internationally already, but it was a very basic product search service (in fact, it was formerly called Google Product Search). What we’re talking about here is the company’s commercial model, built on Product Listing Ads, which tweaks the ranking in Google Shopping to be based on a combination of relevance and bid price, and throws in a few genuinely useful features.
Back in May, Google announced a new service that lets shoppers research products better, compare them based on features and prices, and connect with merchants to make their purchase. In other words, what a shopping service should do in the first place. This new model fully launched in the US on October 17, and now the company is pushing it out to the rest of the world.
Why did Google make the change? Here’s what the company says:
We made this transition because we believe that having a commercial relationship with merchants will lead to better, more up to date product data — which will mean better shopping results for users and in turn, higher quality traffic for merchants. We think this will bring the same high-quality shopping experience to people — and positive results to merchants – around the world.
In other words, Google Product Search sucked and the company was getting trampled by the competition (not to mention the lawsuits). Now Google claims to have come up with something that benefits both shoppers (products in one convenient place where they can compare and check out reviews) and merchants (advertisers get more granular control over product listings and traffic). Oh, and Google gets to make more money, of course.

11/15/2012

An Amazon engineer had a little idea that turned into a billion-dollar business




Once upon a time, Amazon was a dot-com-era technology company best known for selling books. Then, in 2003 and 2004, Amazon set out to streamline the internal process between its programmers and hardware engineers. It was a move many other companies were making, but an Amazon engineer had a brilliant idea: Why not use the same project to design an application that could rent chunks of Amazon’s computing facilities to customers?
On August 24, 2006, the public beta of Amazon’s “Infrastructure as a Service” (IaaS) launched, and the ability to rent computing capacity managed by someone else was born. It was a gamble that has, so far, paid off. Amazon includes IaaS revenue in a larger unit called Amazon Web Services, which includes other cloud products. That, in turn, falls under a line of Amazon’s financial reports called “other,” which also covers non-cloud products. Besides Amazon Web Services, Amazon’s other revenue includes non-retail activities, such as marketing and promotional activities, co-branded credit card agreements, and other seller sites. Yet most analysts studying the industry believe that the vast majority of the “other” line is cloud computing, and the growth is stunning:

IaaS at Amazon went from a thought project in 2004 to a startup in 2006, and it is almost certainly heading toward becoming a billion-dollar business, if it is not there already.
Renting computers, called servers, that were managed by someone else, somewhere else, was not a new concept. What was new was pricing that allowed customers to buy servers by the hour with the click of a button. That rental model let businesses with uncertain future demand buy computing capacity rapidly, 24 hours a day, as needed. Other companies followed Amazon’s lead: Rackspace, Terremark, CSC, and Savvis, among others, have similar options now, and technology research firm Gartner estimates that businesses will spend $6.2 billion on IaaS in 2012, representing 45.4% growth.
Here’s why server space matters. In 2002, Friendster was the first site to introduce social networking to millions. The demand grew exponentially as news outlets jumped on the new phenomenon, yet the code behind the site was asking too much of the servers. (In comparison, Facebook uses a programming paradigm called AJAX that taxes servers less.) Its popularity stressed Friendster’s computing capacity and eventually, for a period of time, the site became unusable because it was too slow. It is unknown whether Friendster could have innovated rapidly enough to remain dominant over the more nimble MySpace and Facebook, but it never had a chance. Other companies were forming while it was down. Years later, in 2007, Zynga combined gaming with social networking into online applications that became known as social gaming. It first received venture capital in 2008, and by April 2009 it was the largest app developer for Facebook, with 40 million monthly active users. In 2011, Zynga brought in 12% of Facebook’s revenue. Unlike Friendster, Zynga was able to handle the explosive demand for its service. It did so while not owning most of its own servers or hardware infrastructure. It was almost completely in the cloud. In Amazon’s cloud.
IaaS cloud computing is not always cheaper than owning your own hardware, but it provides liquidity for compute capacity. No longer does a company, researcher, or individual need to invest large amounts of capital to purchase hardware that will be used and amortized over three or four years. Now they can buy computing capacity by the hour. That allows websites to handle rapidly growing demand, as Zynga did. Other sites have been able to serve 375 million page views a month for only $15 a month without going down. In fact, the instructions for avoiding a Friendster-like slowdown are so easy to digest that someone right out of high school or college can use the technique. IaaS means that pioneering companies can start small yet still grow rapidly, and that websites no longer need to go down just because they posted something popular.
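The buy-versus-rent trade-off described above can be sketched as a back-of-envelope comparison. All the prices and the operating-cost figure below are hypothetical illustrations, not actual 2012 quotes from any vendor:

```python
# Back-of-envelope comparison: buying a server outright vs. renting by the hour.
# All dollar figures are hypothetical illustrations.
HOURS_PER_YEAR = 24 * 365

def owned_cost_per_year(purchase_price, amortization_years=3, power_and_ops=1500):
    """Annual cost of an owned server: amortized capital plus running costs."""
    return purchase_price / amortization_years + power_and_ops

def rented_cost_per_year(hourly_rate, utilization=1.0):
    """Annual cost of renting equivalent capacity by the hour."""
    return hourly_rate * HOURS_PER_YEAR * utilization

owned = owned_cost_per_year(6000)          # amortized capital + operations
rented_full = rented_cost_per_year(0.50)   # rented around the clock
rented_partial = rented_cost_per_year(0.50, utilization=0.2)  # needed 20% of the time
```

Rented around the clock, the cloud server can cost more than the owned one; paying only for the hours actually needed, it costs far less. That is the "liquidity" the article describes.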
For startups, Infrastructure as a Service increases the amount of money that can be returned to investors (salvage value), reduces the time to set up equipment, and cuts the amount of capital that needs to be raised. Servers and other computing components that run websites and are used for research no longer need to be bought outright, saving money and time. Indeed, most of the tech startups I know use cloud computing, and it is exciting to see the rate of innovation increase as classical deterrents are removed. And cloud computing is not only changing business: academic and corporate researchers can now rent, through Amazon, the 102nd-fastest supercomputer in the world, according to the Top 500 Supercomputer Sites.
Ten years ago, if asked what company would revolutionize computing, a book merchant with a tech edge probably did not come to mind. However, that is exactly what Amazon has accomplished. Infrastructure as a Service is now widely available in many forms. While other cloud vendors have developed their own systems from scratch or are working with open (free) software, Amazon’s continues to grow rapidly and change the way businesses form and run—while reaping larger and larger revenue streams for vendors in the market.

Meet the PC that will be the death of the PC




 

 

 

Acer C7 Chromebook, the web-based laptop from Google that happens to have the guts of a PC. (Image: Google)

Apple may have invented the tablet computer that now threatens the existence of the PC, but it’s Google, with the help of a variety of hardware manufacturers, that wants to finish off the PC  for good.
Today, Google announced a new $199 PC that’s the latest and cheapest in a line of machines that run the Chrome operating system (OS). Unlike Microsoft’s Windows or Apple’s OS X, Chrome OS hardly deserves to be called an OS; it basically consists of a web browser and not much else. The idea behind a Chromebook, as they’re called, is that you do everything through a web browser, using the growing array of web-based apps that have been built to do word-processing, photo-editing and just about anything else you might want to use a computer for. By moving 100% of your computing to the web, you are no longer tied to any one computer. And because everything you use and store is in the cloud, software need never be updated, and, absent a snafu on Google’s end, data can never be lost.
The new Chromebook is manufactured by Acer, the Taiwan-based manufacturer better known for laptops that run Windows. But here’s the funny thing about it: In many ways it’s inferior to its cousin, an even more web-centric device announced in October, known as the Samsung Chromebook.

Samsung Chromebook, a PC in name only. (Image: Google)
The difference between the two is that the new Acer C7 is still a traditional PC, albeit one not running Windows. It has an Intel processor, the same workhorse that has powered PCs for a generation, and a capacious, spinning, 320-gigabyte hard drive. Samsung’s device has the same kind of processor, based on designs from Intel competitor ARM, that appears in almost every smartphone and tablet on earth, and a mere 16 gigabytes of solid-state (flash) storage, like a smartphone or tablet. In every respect save its laptop-like appearance, the Samsung Chromebook isn’t a PC; it’s a mobile device.
What’s more, the Acer C7 may have a faster processor and bigger hard drive, but it’s bulkier, has a battery life 2-3 hours shorter and, for many tasks, it’s slower. When booting up, loading apps and switching tasks, solid-state drives are noticeably faster than regular ones. Reviewers have observed that Samsung’s laptop does a lot more with its “limited” hardware than a one-to-one comparison with a PC would suggest.
So why did Google just release an inferior machine based on dated technology?
Probably because, although Samsung’s Chromebook has in its short life become the most popular PC on Amazon.com (indeed, it’s sold out), Chromebooks as a whole have been slow to take off since they first went on sale in mid-2011. Most people, it seems, aren’t yet ready to move to Google’s somewhat radical vision of computing’s future. Even if Google is much less likely to lose your data than you are, there’s something comforting about having your own hard drive with your own stuff on it. The thought of relying on the cloud has led to a sort of “range anxiety” for computer users, analogous to the anxiety some drivers feel about the limited range of electric cars—only this is anxiety about being out of range of a wifi signal. For people who are used to trusting the Intel brand and evaluating laptops based, in part, on the size of their hard drives, the Acer C7 is comfortingly familiar, even if it doesn’t run Windows.
In other words, the latest Chromebook is like training wheels, or more cynically, a Trojan Horse. If Google can lure cost-conscious buyers or tech buffs with a spare $200 to spend with this ultra-cheap laptop, it can train them to think of a PC less as a home for storing your life than as a window for viewing it. And that, of course, gives Google a good deal of control over what you see through the window.

6/15/2012

6 Reasons Solid State Memory Is The Biggest Story In Computing



Guest post written by Narayan Venkat
Narayan Venkat is VP, product management, at Violin Memory.
Make no mistake: the sudden boom in solid-state storage is no flash in the pan.
Storage systems based on solid-state flash memory now compete directly against traditional systems using spinning hard drives for mission critical jobs in data centers, particularly at financial institutions and Web companies. Venture capitalists and strategic investors have been injecting capital into the market, fueling a steady clip of multimillion dollar acquisitions. Fusion-IO, which makes flash accelerators for servers, held one of 2011’s most successful IPOs.
Arguably, we’re witnessing the biggest shift in the industry since hard drive-based systems nudged tape storage into the background.
But why is it happening now? And why is the acceptance of flash-based systems occurring so rapidly? Flash memory, after all, isn’t new: Fujio Masuoka led a team at Toshiba that invented flash in the mid-’80s. Flash remains a standard in mobile phones and consumer electronics, and manufacturers have produced flash memory chips in large volumes for years.
Nor has the hard drive industry hit a plateau and created a vacuum for flash to fill. Drive manufacturers have continued to push the pace of innovation in their industry. In fact, both hard drives and flash will likely grow rapidly over the next decade.
Some analysts believed that we’d see a huge uptick in flash notebooks years before solid state memory came to the data center. Instead, the reverse happened: here are six reasons why.
  • Latency
A delay of a few microseconds in a stock trade can mean millions of dollars lost. Fast response time is one of the reasons Google, Twitter and Facebook customers remain happy. Soon, you’ll see fast-growing cloud storage companies like Box, Carbonite and Dropbox needle each other about speed and reaction time. Eliminating latency is also at the root of the strategies of networking and computing companies such as Luxtera and SeaMicro (now part of AMD).
Flash is inherently faster at most data retrieval tasks than drives; latency can be reduced by up to 95 percent. Drives are mechanical devices with motors and moving parts, while flash operates with electrical signals. To achieve a competitive edge in speed, customers need flash arrays and accelerator cards.
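That "up to 95 percent" figure is easy to sanity-check from typical access times. The numbers below are representative ballpark figures for a spinning drive and NAND flash, not measurements from any specific product:

```python
# Sanity check on the latency-reduction claim, using ballpark access times.
HDD_ACCESS_MS = 5.0    # typical seek + rotational latency for a spinning drive
FLASH_ACCESS_MS = 0.1  # typical NAND flash read latency (~100 microseconds)

reduction = 1 - FLASH_ACCESS_MS / HDD_ACCESS_MS
print(f"Latency reduced by {reduction:.0%}")
```

With these ballpark inputs the reduction comes out at 98 percent, comfortably inside the "up to 95 percent" claim once real-world controller and software overheads are accounted for.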
  •  Technology
Flash wasn’t designed to go into data centers. Flash, in fact, shouldn’t even exist. Flash chips store data by inserting (or extracting) electrons into a silicon dioxide, or glass, cell. It is the equivalent of putting liquids into a thermos through the wall. Electrical engineers look back at the invention of flash as a major achievement.
The stresses caused by this process, though, have meant that flash chips historically have lived short lives. That’s not all. The protocols for writing and erasing data, as well as monitoring for errors or performance optimization, can consume several computing cycles if not properly managed. Simply swapping in flash memory for conventional hard drives in a computer or enterprise-class storage system can lead to worse performance.
To get around these problems, some flash systems companies have designed software and/or semiconductors from the ground up. These new technologies can minimize the number of read-write cycles the flash memory chips in a storage array must endure, thereby extending their lives from several months to several years, or manage the chips in a way that circumvents over-use. They can also schedule “janitorial” tasks that flash chips must perform to avoid glitches in performance. If you wondered why my company is called Violin Memory, now you know: it’s because our technology conducts an orchestra of semiconductors.
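The wear-management idea behind such controllers can be sketched with a toy wear-leveling allocator: always write to the least-worn block, so erase cycles spread evenly instead of burning out a few hot blocks. Block counts and the mechanics here are made up for illustration; real controllers are far more sophisticated:

```python
import heapq

class WearLeveler:
    """Toy wear-leveling allocator: always pick the least-worn block for the
    next write, so erase cycles spread evenly across the whole array."""

    def __init__(self, num_blocks):
        # Min-heap of (erase_count, block_id): least-worn block on top.
        self.heap = [(0, b) for b in range(num_blocks)]
        heapq.heapify(self.heap)

    def allocate_block(self):
        erases, block = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (erases + 1, block))  # record one more erase
        return block

    def max_wear(self):
        return max(e for e, _ in self.heap)

wl = WearLeveler(num_blocks=4)
for _ in range(100):
    wl.allocate_block()
# 100 writes over 4 blocks: each block ends up with exactly 25 erases,
# instead of one unlucky block absorbing all 100.
```

Since each flash block tolerates only a limited number of erase cycles, spreading writes this way is what stretches a chip's usable life from months to years.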
  •  Inertia and The Innovator’s Dilemma
If the benefits are so tremendous, and the technological solutions to the limitations of flash aren’t far-fetched, how come established storage powers like NetApp, EMC and IBM didn’t get there first?

 

Because it took a lot of work and quite a bit of risk. The new generation of flash companies had to sit down and develop new semiconductors and software platforms. Silicon in Silicon Valley: how 1980s can you get? It took money, time and creative thinking. I’m not saying EMC isn’t capable of that. I’m saying that when the executives at the large companies looked at the risks and rewards, they chose to continue to sell yesterday’s products.
Most investors and VCs did the same thing. How many VCs have you heard proclaim that they want to invest in hardware companies facing challenging technological hurdles that might take a few years of trial and error to solve? Most turned to social networking instead. They missed out.
  •  Energy
Electricity is now the largest operating expense for large data centers, not including personnel. Power at some data centers can consume 30 percent or more of their budgets. Regulations aimed at reducing power consumption in Europe and parts of Asia and North America add further pressure. Even when carbon taxes aren’t an issue, utilities in large urban centers can and will limit the amount of power a given data center might get.
As a result, companies are faced with giving up growth or getting more efficient. Google, Microsoft, Yahoo and others have begun to build data centers in frigid locations like Buffalo, New York, and Finland to chop their air conditioning bills.
Flash arrays consume a fraction of the energy of drive-based systems. Research firm iSuppli once estimated that replacing just 10 percent of short-stroke hard drives, which account for only a tiny fraction of drives in data centers, with basic, relatively unsophisticated solid-state drives could save 57,000 megawatt hours of power a year. Drives are mechanical systems. Not only do they consume more power directly, they require more AC.
  •  Big Data
You’ve likely heard the term Big Data quite a bit in the past two years, and every time the definition changes. Does Big Data mean large amounts of data that need to be stored and processed? Does it refer to applications capable of harnessing large sets of data? Or does it refer to the science of coming up with applications capable of absorbing large amounts of dynamic data that compiles in real time?
The answer is, all of the above. You are going to see insurance companies overhaul their actuarial processes: policies will be issued with an eye toward crop futures and oil prices. Real estate companies will create markets for parking spots. Retailers will parse digital video streams for data about how consumers, and different segments of consumers, shop their stores. Running these applications will require high-performance data retrieval and storage systems. Otherwise, the results could take decades.
  •  Volume Economics
Cost has been the Achilles’ heel of flash. Historically, flash has cost 3X or more per bit than drives. While drives still cost less, the prices of both technologies have dropped steadily and considerably. Corporate flash arrays can be purchased for $6 to $9 per gigabyte, the cost of drive-based systems a few years ago.
What happens next? The storage industry is massive so you won’t see the market suddenly shift completely to flash. But the cost and performance benefits of starting the migration now are more than apparent. Some new capabilities—such as mixing flash with virtualization technology to enhance the economics of a solid-state array—will begin to be discussed more frequently among CIOs.
Any way you slice it, flash is fast becoming a debate you can’t ignore.

 

IT-centric enterprise BI models unsustainable, says Forrester

Fast-changing business intelligence requirements drive need for self-service BI

By Jaikumar Vijayan
June 15, 2012 06:00 AM ET
Computerworld - Enterprise business intelligence models that are too heavily IT-centric are unsustainable, a new report from Forrester Research cautioned this week.
Increasingly, businesses that want to develop robust business intelligence (BI) capabilities will need to adopt self-service BI tools and methodologies in order to succeed, Forrester noted.
Two major factors are driving the need for self-service BI.
The first is that BI requirements change faster than IT's ability to keep up. Even IT organizations with the latest tools and best practices often have to struggle to keep up with business requirements for BI applications, Forrester researcher Boris Evelson said in the report.
Unlike enterprise applications such as Customer Relationship Management (CRM) and Enterprise Resource Planning (ERP), BI applications have a short lifespan and can become outdated very quickly, he said.
The other major issue is that conventional approaches to software development are poorly suited for today's BI needs, he said. "The traditional waterfall methodology for the software development life cycle calls for collecting user requirements, transforming them into specifications, and then turning these specifications over to developers," Evelson noted in the report.
"While this approach is often successful for traditional enterprise application implementations, it won't work for the majority of BI requirements," he said.
Increasingly, enterprises can benefit from tapping self-service tools for their BI requirements, he said. While IT needs to retain control of complex, mission-critical BI applications, a vast majority of other BI initiatives need to be handled directly by the business units that will be using the applications.
"We maintain that in an ideal BI environment, 80% of all BI requirements should be carried out by the business users themselves," he said.
The key to success with self-service BI lies in choosing the right tools, Evelson noted. To be really useful, a self-service BI tool should enable casual users, technology-savvy users and executives to self-serve for new queries, reports and dashboards, he said.
The Forrester report outlines several features that enterprises should look for in self-service BI tools. Some examples include features such as automodeling, data virtualization, search-like graphical user interfaces and collaboration support.
Self-service BI does not, however, mean eliminating IT altogether from BI projects.
"To do it right, IT still has to setup infrastructure, architecture, tools and policies upfront," Evelson told Computerworld by email today.
Many business organizations try to do an end run around IT by having vendors implement self-service BI capabilities. "But that's not the right way, [because] it won't give them access to the entire enterprise data, just what they themselves can connect to," he said.
Sometimes business units try to enable self-service BI capabilities by signing up with hosted providers. But again without IT involvement, such efforts can be somewhat limited in scope, Evelson said.
"It's OK for situations where IT just doesn't have the time, skills, or budgets," he noted. "But again, this'll just give them access to a subset of enterprise data."

 

Cloud providers aren't selling the real value of the cloud

Cloud providers focus on time to deploy and other tactical claims, not the core strategic value

  I hear this pitch all the time: "Cloud computing provides the shortest time to deploy or time to market because there is no need to purchase and configure hardware and software." That makes sense.
However, the value that comes from speedy deployments is often lost in the process that occurs in most Global 2000 companies as they allocate resources, understand compliance, and deal with security. The advantage of not purchasing hardware and software is significantly diminished, considering the amount of work required to move or create a system wherever it may be sourced. The cloud providers are emphasizing a small advantage of the cloud.
If the value of time to deploy is not the big deal we're told it is, what is the compelling reason to move to cloud computing? You might think it's the efficiency of the public cloud platforms, but that too is a relatively small advantage.
The big advantage is the ability to quickly align with changing requirements, an area where traditional approaches to IT have failed for the last 20 years. In fact, they're getting worse at it.
The trouble is that the value of adaptability, which far exceeds that of other benefits of cloud computing, is both difficult to define as a concept and even more challenging to model for a specific problem domain or a whole enterprise. Nonetheless, it should be the ultimate objective of cloud computing and -- for that matter -- any new technology.
Cloud providers should stop leading their pitches with the tactical values that vary greatly from enterprise to enterprise and instead discuss the core strategic reasons for moving to the cloud. For its part, IT needs to get a clue about this concept so that it can apply cloud computing technologies in the right directions. My fear is that both providers and enterprises don't yet understand the true value of cloud computing, and tactical "quick win" thinking will get us into trouble -- again.

6/06/2012

Transit of Venus, the next in over a hundred years

After watching the astronomical phenomenon, we are posting the IT news that we consider worth knowing:

Indonesian farmers reaping social media rewards



The BBC's Karishma Vaswani meets the Indonesian farmers using social networking to make the most of their produce.
Indonesia is an economy on the move these days - and fast becoming one of the most technology-savvy countries in Asia.
Just look at the facts - Indonesia is one of the world's biggest users of Twitter. It is also home to the world's third-largest group of Facebook users.
Blackberry maker Research in Motion counts Indonesia as one of its most lucrative markets - and other gadget-makers are eagerly eyeing the upwardly mobile Indonesian consumer.
There's even talk of a Silicon Valley-style boom taking place in Jakarta's suburbs, with the likes of US tech giant Yahoo snapping up an Indonesian start-up.
But while urban Indonesians are considered to be as plugged-in as their counterparts in Singapore or Seoul, out in the countryside, it's a different story.
Challenging existence
Just venture a few hours outside Jakarta, the capital, to the village of Kadaka Jaya, and it is easy to see the digital divide.
The further you drive up winding roads, the worse your mobile phone signal gets.
It's hard to spot a telephone tower anywhere, but for miles on end you can see emerald green paddy fields peppering the hills.
It is peacefully quiet - a far cry from the hustle and bustle of Jakarta.
From a distance, I saw a lone Javanese farmer sitting in the middle of his field, presumably for a leisurely afternoon snooze.
Ade, 24, uses technology to monitor demand for his chilli crop
It's thought that almost half of Indonesia's population of 230 million make their living from the land - often a challenging existence that depends on the whims of the weather and the prices of the markets.
It is in Kadaka Jaya that I meet Ade, a dynamic and bright 24-year-old farmer.
Ade has followed in the footsteps of his family, tilling the land the way so many before him have done.
Crop prices
Now though, change is coming to Kadaka Jaya.
"As farmers we constantly need new technology to improve our livelihoods," Ade tells me, as he takes a break from picking ripe chillies.
"There's no way for us to tell what consumers in the cities need, or when the products we want have arrived at the stores in town. We have to keep calling the shopkeepers to find out, and phone networks in this area are patchy."
Mathieu Le Bras is the founder of 8villages, a social network for farmers
This gap between the rural and the urban is where technology start-up 8villages saw an opportunity, using mobile phones.
"8villages is a business social network for farmers," says founder and chief executive Mathieu Le Bras. "It provides them with a link to local buyers, their local sellers - and other farmers who are growing the same crop as them."
Ade is one of 900 Indonesian farmers testing the product for free.
"It allows me to access information about fertilisers, pesticides and the prices of crops," he says. "So now when I need information, all I have to do is wait for an SMS from 8villages."
The plan, according to Mr Le Bras, is to take this nationwide within the next six months - and even further, to farmers in Vietnam and the Philippines.
Close interaction
Mr Le Bras acknowledges he has drawn from his experience with social networks in urban markets - but insists there are already established networks in the countryside that his product is tapping into.
The peaceful Indonesian countryside - not your typical technology hotspot
"Social networks are paramount in the countryside," he says.
"People interact closely here, social status is very important - and the influence of a senior farmer plays a very important role in the community."
That's why 8villages has a service that allows farmers to enter a code on their mobile phones and access product reviews by senior farmers - taking the whole "like" concept offline.
Mr Le Bras says this is key to the success of the product, and why the farmers are now far more efficient.
"What we are doing with this is leveraging the social networks we know - like Wikipedia and eBay for example - and taking it online, so that the farmers have access to the knowledge."
New consumers
It's not just start-ups that are looking to tap the huge potential in the Indonesian countryside.
Global handset maker Nokia offers the Life Tools service, costing about five US cents a day.
For that, farmers with a Nokia handset get a text message about crop prices and weather patterns, a service the company says had more than 600,000 users in Indonesia in 2011.
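Taken at face value, those two figures put a rough ceiling on the money involved. A quick back-of-the-envelope calculation, assuming (as an idealization) that every reported user pays every day:

```python
# Rough arithmetic from the figures above: Life Tools costs about five
# US cents a day, and the service reportedly had more than 600,000
# Indonesian users in 2011. Working in integer cents avoids floating-
# point rounding in the intermediate step.
price_cents_per_day = 5
users = 600_000

daily_revenue_usd = price_cents_per_day * users / 100
print(daily_revenue_usd)        # 30000.0 dollars per day
print(daily_revenue_usd * 365)  # 10950000.0 dollars per year, at most
```

So even at full uptake the service grosses on the order of $11 million a year in Indonesia: meaningful, but clearly a volume play rather than a margin play.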
But Nokia's country head Martin Chirotarrab says the plan is to keep expanding its reach in the countryside.
The 8villages system was tested by its eventual users - the Indonesian farmers
"We have over seven billion inhabitants in the world," Mr Chirotarrab tells me at the Nokia headquarters in Jakarta.
"Roughly half of them have a device in their pockets. But only a billion of these consumers are on the web. Nokia's plan is to connect the following billion consumers to the web."
Both big businesses like Nokia and start-ups like 8villages want many of those new consumers to come from the Indonesian countryside, but this is unlikely to happen overnight.
Phone networks in many districts remain patchy, and telecom providers have yet to make it a priority to extend them.
But this is changing - although very slowly - because of the vast untapped potential that companies are now beginning to see in the Indonesian farmer.

 

Nokia, Apple, Obama, Ubisoft, ETSI: Intellectual Property


Nokia Oyj’s claim of patent infringement on electronics, including mobile phones and tablet computers from Taiwan’s HTC Corp. (2498), will be reviewed by a U.S. agency that has the power to block imports of the goods.
The International Trade Commission agreed to investigate Nokia’s complaint, filed with the agency last month, according to a statement yesterday. No date has been set for a decision.
Nokia said on May 2 it filed lawsuits in the U.S. and Germany over inventions for mobile devices, naming HTC among several manufacturers. The company, which lost its 14-year title as the world’s biggest seller of mobile phones last year to Samsung Electronics Co. (005930), is seeking to expand revenue from its patent holdings.
HTC is using proprietary technology of Espoo, Finland-based Nokia to improve hardware and software functions in its devices, the company said in a statement when it filed the suits.
Nokia has joined Microsoft Corp. (MSFT) to make Lumia smartphones that run using Windows Phone software, which competes with Google Inc. (GOOG)’s Android operating system. HTC makes phones for both Android and Windows Phone.
A final decision in the trade commission’s investigation will be made “at the earliest practicable time,” according to the statement. A hearing will be held and then a commission judge will issue findings in the case. If a violation is found, the six-member commission will then vote on whether to block HTC phones from entering the U.S. market.
A spokesman for HTC wasn’t available to comment. Nokia and HTC have been partners in fighting patent-infringement claims by IPCom GmbH, a licensing company that obtained mobile-phone patents from Robert Bosch GmbH in 2007.
About 10 companies, including Apple Inc. (AAPL) and Research in Motion Ltd., dominate the global industry. There was about $312 billion in worldwide sales of handsets in 2011, a 19 percent increase from 2010, according to Bloomberg Industries.

Apple Copied Samsung Inventions for IPhone Use, U.S. Judge Told

Apple Inc. introduced its iPhone in 2007 using Samsung Electronics Co. technology that it didn’t want to pay for, a lawyer representing the Korean electronics company told a U.S. trade judge yesterday.
Samsung contends Apple’s devices, including the iPhone, iPad tablet computer and iPod touch media player have infringed as many as four patents. All came from two decades of work Suwon, South Korea-based Samsung spent improving mobile phones, the attorney for the company said.
“All of these things that Samsung built up, Apple was using when it entered the market,” Samsung lawyer Charles Verhoeven of Los Angeles-based Quinn Emanuel Urquhart & Sullivan LLP said at the beginning of the trial yesterday at the U.S. International Trade Commission in Washington.
The case before ITC Judge James Gildea, and another patent case by Apple against Samsung that’s in the midst of trial before a different trade judge, are part of a global battle between the two companies for increased share of a market that Bloomberg Industries said was $312 billion last year.
Apple denies infringing the Samsung patents and is challenging their validity, just as Samsung is doing in regard to Apple’s allegations.
Samsung’s case against Apple is In the Matter of Electronic Devices, Including Wireless Communication Devices, 337-794, and Apple’s case against Samsung is In the Matter of Electronic Digital Media Devices, 337-796, both U.S. International Trade Commission (Washington).

Trademark

Obama Campaign Sues Seller of Election Materials

The Obama presidential campaign filed a trademark-infringement suit against a website that sells election-related materials.
It is the second similar complaint the campaign has filed against Washington-based Demstore.com since October 2011. According to the complaint filed June 1 in federal court in Washington, the campaign objects to what it says is unauthorized use of the “rising sun” trademark.
The campaign said it’s damaged because it depends on its sale of authorized merchandise as a fundraising technique for President Barack Obama’s re-election campaign. Also, when people make even a “relatively small” purchase of trademarked merchandise through the official website, the campaign obtains the buyer’s contact information and uses it “to reach out to that individual repeatedly to seek further donations and further opportunities to promote the campaign.”
The earlier trademark suit against Demstore.com was dismissed following a Jan. 25 court filing from the campaign requesting termination of the case. No details of a settlement were available in the court file.
In the new case, the campaign asked the court to bar further unauthorized use of its “rising sun” and other trademarks, and to order the seizure of all unauthorized merchandise. Additionally, the campaign seeks money damages, including extra damages to punish the website for what it says is deliberate infringement.
Demstore.com didn’t respond immediately to an e-mailed request for comment.
The Obama campaign is represented by Barry J. Reingold, William C. Rava and Jeremy L. Buxbaum of Seattle’s Perkins Coie LLP.
The new case is Obama for America v. Demstore.com, 1:12-cv-00889, U.S. District Court, District of Columbia (Washington). The earlier case is Obama for America v. Demstore.com, 1:11-cv-07646, U.S. District Court for the Northern District of Illinois (Chicago).

Copyright

Ubisoft Asks Court to Declare It Didn’t Infringe Beiswenger Book
Ubisoft Entertainment SA (UBI) filed a copyright suit against author John L. Beiswenger two weeks after he dismissed a copyright suit he filed against the French maker of computer games.
On May 15 Beiswenger and Ubisoft jointly filed a court document asking that the case he filed in April be dismissed. According to data compiled by Bloomberg, the parties said they reached a settlement in the dispute.
Beiswenger, a Pennsylvania resident, had claimed that Ubisoft’s 2007 “Assassin’s Creed” games infringed the copyright for his 2002 work “Link: A Novel.”
In the new suit, Ubisoft asked the court to declare that the game doesn’t infringe Beiswenger’s copyrights. His claims are “entirely meritless and were based on patently non-copyrightable elements” contained in the two works, Ubisoft said.
Montreuil, France-based Ubisoft said it filed the new case despite Beiswenger’s dismissal of the infringement suit because “his claim could be refiled at any time.” The company wants to establish “once and for all” that its “Assassin’s Creed” doesn’t infringe Beiswenger’s copyrights directly or indirectly.
Ubisoft argued that the “ancestral memories” element Beiswenger claimed was infringed “has existed in the cultural consciousness for decades -- long before the publication of either ‘Link’ or ‘Assassin’s Creed.’”
In addition to a declaration of non-infringement, Ubisoft asked the court for an award of attorney fees and litigation costs.
The French games company is represented by Stephen S. Smith of Greenberg Glusker Fields Claman & Machitinger LLP of Los Angeles.
The new case is Ubisoft Entertainment SA v. Beiswenger, 3:12-cv-02754-NC, U.S. District Court, Northern District of California (San Francisco). The original case is Beiswenger v. Ubisoft Entertainment, 1:12-cv-00717-CCC, U.S. District Court, Middle District of Pennsylvania (Harrisburg).

Trade Secrets/Industrial Espionage

Saab CEO Claims He Was Target of Industrial Espionage Attempt

The chief executive officer of Saab AB (SAABB)’s defense group said his phone was bugged when he was in negotiations with Switzerland over the sale of 22 of his company’s fighter jets, Agence France-Presse reported.
Hakan Buskhe claimed he was the target of industrial espionage and didn’t identify the person or company behind the action, according to AFP.
He said he had been “closely watched” and “monitored, one way or another,” AFP reported.
Switzerland said in November it would buy the planes, choosing them over aircraft produced by France’s Dassault Aviation SA (AM) and the European EADS (EAD) group, AFP reported.

Industry Standards

ETSI Chooses Apple Standard Over Nokia for Mobile-Phone SIM Card

Mobile-phone makers agreed on a new standard for smaller SIM cards, overcoming a deadlock in which Finland’s Nokia Oyj (NOK1V) and Apple Inc. had competing proposals.
The so-called “fourth form factor” will be 40 percent smaller than the current smallest SIM card design, the European Telecommunications Standards Institute said in a statement on its website, following a meeting held May 31 and June 1 in Osaka, Japan. “It can be packaged and distributed in a way that is backwards compatible with existing SIM card designs.”
ETSI agreed to pick Apple’s SIM card standard, beating a proposal from Nokia, MacWorld said on its website, citing cardmaker Giesecke & Devrient. Spokesmen for ETSI and Nokia couldn’t immediately be reached for comment.
In March, a two-day meeting to adopt a format from competing proposals by Apple and Nokia finished without reaching a decision. The smartcards that identify wireless subscribers are standardized to reduce industry costs and give consumers freedom to switch handsets and networks. Smaller versions permit the design of thinner phones.

Technology for the greater good

These Computerworld Honors laureates benefit society by using low-tech gadgets for high impact.


Computerworld - A mother in Tanzania walks for three days with a sick child on her hip, only to arrive at a rural clinic whose inventory of malaria medicine is depleted.
It's a matter of life and death for the mother and child. But from a business standpoint, it's a straightforward supply chain issue. Antimalarial medicines -- with a 96% cure rate -- are available. Yet far-flung clinics have a hard time keeping them in stock. Having adequate supplies when and where they are needed is critical, because the medication isn't fully effective unless patients take it within 24 hours of contracting malaria.
Novartis -- a company whose innovations include micro-chipped pills that can track whether patients take their medication on schedule -- resolved the crisis in Tanzania by relying on, of all things, SMS text messaging.



Similarly, OhioHealth in Dublin, Ohio, is using text messaging to deliver health and wellness information to patients subscribing to its OH Mobile app. The app can alert obstetric patients of upcoming tests and procedures or remind pre-operative patients to refrain from eating and drinking after midnight the day before their surgery.
"A very important component of patient care delivery is dependent on patient engagement. That idea and the fact that the vast majority of patients had smartphones gave us the idea for the app," says Dr. Mrunal Shah, vice president of physician IT services at OhioHealth. "We wanted something easily deployable and easily updatable," he notes. The result: "Patients are leaving wonderful feedback. They're just hungry for more information, which is a fantastic problem to have," says Shah.
Novartis and OhioHealth are among dozens of 2012 Computerworld Honors laureates that are leveraging low-cost, consumer-oriented technologies to create and deploy systems and applications designed to greatly benefit society, especially in the areas of education and healthcare.
The Computerworld Honors program, now in its 24th year, recognizes organizations that create and use IT to promote and advance public welfare. Award winners will gather at an event in Washington on June 4 to celebrate their achievements.

Necessity Drives Innovation

Usability and affordability are the heart and soul of these innovations, many of which are being deployed in poverty-stricken and remote areas of developing nations where life's basic necessities -- much less state-of-the-art IT and ubiquitous Internet access -- are not readily available.
But what is available is SMS, which in remote areas performs more efficiently than costlier, more complex options, according to Rob James, CIO at Novartis. Working with IBM and Vodafone, Novartis IT came up with a simple idea: Have each remote clinic text four numbers, representing the inventory levels of four different medicines, to distribution facilities in major cities that ship supplies. The application is known as SMS for Life.
"The idea was to take that information centrally and look at inventory levels overall so we could do a better job of forecasting stock-outs," says James.
Initial results of a pilot test at 20 sites across Tanzania were daunting: More than 25% of remote facilities were totally out of stock on all medications.
"The good news is that once we had that data, we could reduce stock-outs to less than 1% in a very short time," James says. "That led to a rollout across Tanzania, then through Kenya, and we're now in the planning stages for Cameroon and the Republic of the Congo," he adds. Over the past decade, Novartis has provided more than 500 million malaria treatments for adults and children.

Developed by an IT team at Novartis, the SMS system comprises an SMS management tool and a Web-based reporting tool. The SMS app stores a single registered mobile phone number for one healthcare worker at each facility. Once a week, the system automatically sends a text message to each of these phone numbers asking for the current stock of medicines at their facility. Stock data is then returned using a short code number at no cost to the healthcare worker.
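The weekly request-and-reply loop described above can be sketched in a few lines. This is a hypothetical illustration, not Novartis's actual code: the reply format (four space-separated counts) and the medicine labels are assumptions.

```python
# Hypothetical sketch of SMS for Life reply handling: each clinic's weekly
# reply is assumed to be four numbers, one stock count per medicine.
# The medicine labels below are placeholders, not Novartis's real ones.
MEDICINES = ["Medicine A", "Medicine B", "Medicine C", "Medicine D"]

def parse_stock_reply(sms_text):
    """Parse a reply like '12 0 45 7' into a medicine -> stock-count map."""
    counts = [int(token) for token in sms_text.split()]
    if len(counts) != len(MEDICINES):
        raise ValueError("expected exactly four stock counts")
    return dict(zip(MEDICINES, counts))

def stock_outs(stock):
    """Return the medicines a clinic has completely run out of."""
    return [name for name, count in stock.items() if count == 0]

stock = parse_stock_reply("12 0 45 7")
print(stock_outs(stock))  # ['Medicine B']
```

Aggregating these per-clinic maps centrally is what lets the reporting tool forecast stock-outs before they happen, which is the "better job of forecasting" James describes.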
"This is one of those unique programs and one of our favorite programs in IT," James says, adding that everyone who worked on the project did so as a volunteer.

Low-Cost Literacy Tools

Keeping user costs low was also a major driver in the development of an application known as Mobile and Immersive Learning for Literacy in Emerging Economies, or MILLEE for short. Designed as a series of English literacy games that are played on cellphones, the application aims to improve English as a second language among poor children living in rural villages and urban slums in the developing world.
Matthew Kam started the project in 2004, when he was a graduate student at the University of California, Berkeley. When Kam moved to Carnegie Mellon University to become an assistant professor in human-computer interaction, he expanded the project with the idea of having students rewrite the software from scratch so that it would operate on very low-end cellphones.
"Before CMU, the application was running on higher-end phones," Kam explains. "What we were really trying to do with the expansion is to target the most affordable phones out there, so as to perform research pilots that reflect more realistic cost conditions. We were looking for the lowest common denominator," he says.
Specifically, Kam and his team were targeting Java Micro Edition (J2ME) phones, which are significantly cheaper than high-end smartphones. Technical barriers included optimizing the application for use on low-resource devices with limited memory and organizing the English-language learning content, including graphics and voiceover files, on the phone's storage system so that file input and output remained efficient.
There were cultural challenges as well. The earliest game designs weren't intuitive to children in rural India.
"This forced us to take a step back and study 28 of their traditional village games and contemporary Western video games," Kam says. The analysis provided the team with a set of guidelines on how to design educational games for non-Westerners.
MILLEE team member Ashton Thomas, who graduated from CMU in May 2011, developed a game called Word Catch, in which a player is presented with an English word and four images, one of which corresponds to the meaning of the word. "You had to stop a ball over the correct image, and the speed of the ball would change. As the words got harder, the speed of the ball got faster," he recalls.
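The round mechanics Thomas describes can be sketched roughly as follows. This is a hypothetical reconstruction, not the MILLEE code: the words, image names, and speed formula are invented for illustration; only the structure (one word, four images, ball speed rising with difficulty) comes from his description.

```python
import random

# Illustrative word data: (word, correct image, difficulty level 1..3).
WORDS = [
    ("dog", "dog.png", 1),
    ("river", "river.png", 2),
    ("harvest", "harvest.png", 3),
]
IMAGES = ["dog.png", "river.png", "harvest.png", "school.png"]

def ball_speed(difficulty, base=1.0):
    """Harder words make the ball move faster, as Thomas recalls."""
    return base * (1 + 0.5 * difficulty)

def make_round(word, correct_image, difficulty):
    """Build one round: the word, four shuffled image choices, ball speed."""
    choices = IMAGES[:]          # four candidates, including the answer
    random.shuffle(choices)
    return {"word": word, "choices": choices,
            "answer": correct_image, "speed": ball_speed(difficulty)}

def check_answer(round_state, stopped_over):
    """The player wins by stopping the ball over the correct image."""
    return stopped_over == round_state["answer"]

r = make_round("dog", "dog.png", 1)
print(check_answer(r, "dog.png"))  # True
```

Keeping the round state in a small dictionary like this also suits the J2ME constraint mentioned earlier: minimal memory, no heavyweight object graph.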
Thomas, who has since launched a fitness software company called Acrinta, recalls that one of the challenges for his MILLEE team was that it was geographically dispersed, with some members in India and others at CMU's campus in Pittsburgh.
"The time zone difference, the physical distance and the communication barriers were all challenges," he says. "The students in India would help maintain the code base and do some development. They would also take the phones and install the games and go to the learners to get feedback and relay all of that information back to us."
As Thomas sees it, one key to the value of the MILLEE project is that "it's a game, and as the students are playing, they're having fun." But he points out that the students are also learning, "and that is creating opportunities that could lead to serious social change" -- an observation confirmed in a recent report from the British Council, which estimates that the salary gap between professionals with and without English skills in some developing countries is as high as 20% to 30%.


Enabling a Livelihood

Improving the economic prospects of villagers in India is the goal of MicroGraam, a project that taps mobile and Internet technologies to enable urban professionals to find, select and provide microcredit to underprivileged borrowers in rural India.
MicroGraam co-founder Sekhar Sarukkai notes that the concept of microfinance isn't new. But as he and co-founder Rangan Varadan saw it, it could be improved.



"A few years ago, Rangan went back to India to run the banking and finance practice for Infosys, and he saw that microfinance was a great model, but borrowers were struggling," he recalls. "They had to start repaying the next month after they borrowed the money," he explains. But it could take several months before a newly launched venture paid enough to begin repaying the original loan.
The two men decided to apply the principles of venture capital to the microfinance market. Rather than having borrowers start to pay back their loans immediately, lenders would begin to receive payments -- plus an agreed-upon amount of interest -- when the new venture became more solvent.
The model required transparency between lender and borrower, which MicroGraam addressed by developing a marketplace platform using open-source technology, including integration with online payment gateways. A key feature of the system is that micro-fund transfer costs are less than 0.5%, compared to the industry-standard 5%.
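To see what those transfer-cost figures mean in practice, here is the arithmetic applied to a typical MicroGraam loan of about $50 (an amount taken from later in the article). The rates come from the text; note that 0.5% is an upper bound ("less than 0.5%").

```python
# Worked comparison of the quoted transfer-cost rates on a typical
# $50 microloan. Both rates are taken from the article's figures.
def transfer_fee(amount, rate):
    """Fee charged to move `amount` at the given fractional rate."""
    return amount * rate

loan = 50.00
print(transfer_fee(loan, 0.005))  # 0.25 -> MicroGraam's platform, at most
print(transfer_fee(loan, 0.05))   # 2.5  -> industry-standard 5%
```

On a $50 loan that is the difference between losing at most a quarter and losing $2.50, a tenfold saving that matters at microfinance scale.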
"Complete transparency is one of the most important ways technology can help these low-cost transactions. But you need to do it in a very low-cost manner," Sarukkai notes. "Open source helps a lot. This is a fully open-source application."
MicroGraam lenders can search through a database that includes descriptions of borrowers, photographs, and information about the purpose of and terms of the loan. Lenders also receive updates about the progress of the businesses they fund. In addition, the system provides scheduled reminders to MicroGraam's nongovernmental organization (NGO) partners that administer the loans on the company's behalf.
"MicroGraam doesn't have any field offices, so we go to select NGOs who are already working in villages and partner with them so we don't have any overhead on our end," explains Sarukkai. "It's the NGOs that go and collect the money, so it's very important for us to have visibility into that."
What has become equally important is providing transparency to the borrowers. This is done via SMS technology.
"Borrowers are very interested in visibility into their progress, and almost all of these people have phones, because they are very low cost," he explains.
In the past two years, MicroGraam has facilitated 836 loans totaling about $230,000. The repayment rate is 98%. A woman in the province of Trichy in India, who borrowed 1,500 rupees (about $50) to buy a mixer to grind flour, is typical of MicroGraam's borrowers, who are mostly women.
"She started making batter and selling the batter to others in the slum," Sarukkai says. "You could think it's not a big deal, but by selling batter she was able to share in profits. It took her a year and a half, but now she gets more than 1,000 rupees a month from selling batter."
"It's amazing how $100 can change lives so substantially."

6/02/2012

Hot June, but the IT news continues

Summer is coming, and the IT news is getting more and more interesting.

First, we are going to talk about Facebook:


Why Facebook is worth a dollar a year ...and falling

Another interesting investigation that helps to understand the current situation is the article written by Ashlee Vance on 16 May.

The Ellison Files: Oracle Strikes Back

Over the course of a few months in 2009, Hewlett-Packard’s (HPQ) top executives engaged in a furious debate. Intel (INTC) was considering halting production of its Itanium chip, a high-powered but expensive workhorse that had fallen out of favor with nearly every company except HP. If it went extinct, so too would HP’s Integrity line of servers, which still made a ton of money for the Silicon Valley hardware manufacturer. The executives decided to hide the outlook from their customers, business partners, and even employees. “The Itanium situation is one of our most closely guarded secrets,” wrote Martin Fink, the head of HP’s high-end server business, to another executive.

That e-mail is part of a stream of private communications between Fink, former HP Chief Executive Officer Mark Hurd, Intel CEO Paul Otellini, and numerous others that have just been made public in connection with ongoing litigation. In March 2011, enterprise giant Oracle (ORCL), which makes database software that runs on about 80 percent of HP’s Integrity servers, announced it would no longer develop Itanium-based products, citing conversations with Intel higher-ups about the chip’s murky future. HP sued Oracle for reneging on an agreement. Oracle countersued.
Snippets of both parties’ arguments have trickled out as the cases wind through court. But more than a dozen previously redacted documents were obtained by Bloomberg Businessweek in their original form. The confidential communications make it vividly clear how badly HP wanted to prolong the life of Itanium—widely considered one of the tech industry’s most costly duds—even, possibly, at the expense of its customers and business partners. Regardless of the legal outcome, the documents show that Oracle CEO Larry Ellison retains a remarkable knack for dragging opponents through the muck.
It was not, of course, supposed to come to this. Itanium was conceived in the mid-1990s as a state-of-the-art chip that would correct decades of past design mistakes. Intel and HP spent billions developing and marketing it, expecting rivals like Sun Microsystems, IBM (IBM), Silicon Graphics Inc. (SGI), and Compaq Computer to recognize the technology’s supremacy and use it in their own servers.
In the years that followed, Itanium became an industry laughingstock, even as its backers poured more than $15 billion into the chip. New versions tended to arrive late and slower than expected, and software makers took their time updating applications for it. SGI, one of the Itanium converts, went bankrupt and abandoned the chip. Most others decided that Intel’s regular Xeon chips, which were advancing at a quicker clip, were the better long-term bet. Critics dubbed the chip “Itanic,” and HP ended up as the only major Itanium backer.
While HP did not sell huge volumes of Itanium-based servers, it did make a lot of money on them, with some configurations costing upwards of $1 million. The buyers tended to be large organizations like banks and government agencies, which also paid HP for multiyear service agreements to support the hardware. But by the late 2000s, the court documents show, Intel wanted a clean break. Due to Itanium’s high engineering costs, Intel barely made any money on the chips, and only then because HP helped pay the R&D bills and guaranteed a certain level of sales. “So what happens if we don’t pay?” one HP staffer asked Fink in a 2009 e-mail. Fink’s response: Intel would “exit Itanium and have a round of high-fives.”
 
Even as Fink wrote that e-mail, HP continued telling its customers that Itanium had a bright future. In fact, the company was plotting ways to dump Itanium and stop paying Intel. One option: Acquire Sun Microsystems. Its server software would give HP flexibility in the event of the demise of its own product, which was “on a death march due to [the] inevitable Itanium trajectory,” argued one executive in 2009. (Oracle ended up purchasing Sun later that year.) In other documents, HP describes a number of secret projects to move its high-end server business over to the Xeon chips the rest of the industry had come to rely upon.
Time and again, Fink, in particular, talks about the imminent “end of life” of Itanium while trying to keep that knowledge from partners like Microsoft (MSFT), which, like Oracle, made software to run on Itanium. Fink urged HP executives to “avoid any (Itanium) conversations” about the chip’s future and instead “extract money from Microsoft” to accelerate engineering work related to the secret projects. 

Secondly, security is important in our homes, but we also have to stay alert about e-security: security on the internet.


Security performance should be part of procurement

Data Art vs. Data Visualization: Why Does a Distinction Matter?

Here is an interesting and personal article about data art and data visualization:

Two distinct approaches to presenting data graphically exist today—data visualization and data art—and rarely do the twain meet. They differ in purpose and in design. When we fail to distinguish them from one another, we not only create confusion, but do great harm as well.
There are as many definitions of data visualization as there are definers, but at the root of this term that has been around for many years is the goal that data be visualized in a way that leads to understanding. Whatever else it does, it must inform. If we accept this as fundamental to the definition of data visualization, we can judge the merits of any example above all else on how clearly, thoroughly, and accurately it enlightens.
By data art, I’m referring to visualizations of data that seek primarily to entertain or produce an aesthetic experience. It is art that is based on data. As such, we can judge its merits as we do art in general.
Either one, done well, is worthwhile, assuming that it fits the task at hand. If the task is to help a particular group of people understand something, then data art is not appropriate, no matter how well it is executed. If the task is to entertain or engage an audience in a particular emotional experience, then data visualization probably isn’t appropriate. If the situation requires that both objectives are achieved, then a deeply informing and aesthetically beautiful visualization would be in order. Although it is quite easy to make any data visualization aesthetically pleasing, it takes a great deal of skill as a visual designer and information communicator to make one beautiful.
People make better decisions when they’re based on understanding. For information to be understood, it must often be presented in visual form. This is because patterns, trends, outliers, and a sense of the whole as opposed to its parts require a picture for the human brain to see and comprehend. Data visualization is essential. Visualizing data effectively is vital. Anything less is frivolous, costly, and harmful.
How in particular is data art—visualizations that strive to entertain or to create aesthetic experiences with little concern for informing—harmful when it masquerades as data visualization?
  1. It suggests that data cannot be visualized without training in the graphic arts. As such, it works against the democratization of data. In fact, anyone of reasonable intelligence and a little training can present data effectively. It’s vital that this ability spreads more broadly across the population, because it can play a role in making a better world.
  2. It features ineffective practices as exemplars of data visualization. It encourages people to present data in ways that are difficult to perceive and understand simply because they are prettier or more entertaining, which is rarely relevant to the task.
  3. It keeps the practice of data visualization spinning its wheels, never able to progress beyond the mistakes of the past. Best practices of data visualization have emerged through many years of research and experience. “Those who cannot remember the past are condemned to repeat it” (Santayana).
I am personally and painfully acquainted with each of these problems. For this reason, I try to differentiate data art from data visualization and encourage others to do so as well.

5/13/2012

Technological US Mother's Day

Today is Mother's Day in the US.
We continue our tour of the technology world, and we have plenty of news to share with you.

First, cloud computing has reached our homes, and politicians know its importance:

Public-sector cloud computing: The good, the bad and the ugly

As state and local governments look to the cloud, everyone can learn from agencies' struggles with compliance and ingrained cultures.

By Howard Baldwin
May 9, 2012 06:00 AM ET
Computerworld - When the second-in-command of one of the most technologically advanced states in the country slams public-sector computing -- publicly -- it's a resounding wake-up call.
"Don't underestimate how far local, state and federal government is behind [in computing]," said California Lt. Gov. Gavin Newsom at a tech conference in Silicon Valley earlier this year. "We have to wake up to the new reality."
The new reality Newsom was referring to is cloud computing -- a versatile way for government agencies of all sizes to solve a variety of technological issues relating to cost, human resources and the ability to respond quickly to constituents' needs. Many government agencies are doing just that -- albeit in limited areas, such as email and data center consolidation.


A Deloitte survey of midmarket businesses found that they are interested in leveraging technology to improve business efficiency.

More information is available in the full article.

Second, I would like to return to the importance of social networks and the business behind them.

Facebook admits in SEC filing that making money off mobile is a problem

By Sharon Gaudin
May 11, 2012 11:52 AM ET
Computerworld - While Facebook executives talk to the country's top investors about its 900 million users and its powerful global reach, analysts say they need to explain how they're going to fix one glaring problem -- mobile.
Facebook executives are in the midst of a roadshow pitching the company's initial public offering to potential investors around the country. The company, the world's largest social network, is eclipsing its competitors and whipping up a flurry of pre-IPO interest in its stock.
However, industry analysts and investors are asking questions about how Facebook will generate revenue from the growing number of members who are accessing the network via their mobile phones from restaurants, park benches and commuter trains.
"Facebook is really struggling with mobile," said Zeus Kerravala, an analyst with ZK Research. "They make almost no money on it. I think mobile makes people use Facebook more, but Facebook hasn't figured out how to monetize that."

Technologies news:

Apple is going to show its potential with its new maps for iOS.

New Maps for iOS? It's not exactly a surprise

Posted May 11th 2012 3:45PM by Megan Lavey-Heaton
Filed under: iOS
The latest rumor to get everyone talking is that Apple will drop Google Maps for its own proprietary software. Sources told 9to5 Mac that a new Maps app will debut with iOS 6 with an Apple-created backend that will resemble the current Maps app, but offer a more amenable solution.
This really isn't the big shocker that everyone is making it out to be. The writing has been on the wall for a couple of years. It was never a matter of if Apple would drop Google but when. The question was asked as early as 2009 when Apple bought Placebase. This was followed by the acquisition of several 3D-technology companies -- Poly9 in 2010 and C3 Technologies in 2011. While Apple and Google did renew their partnership last year, it was most likely on a year-to-year basis. Apple wasn't ready with the technology in 2011. It looks to be ready now, and the leak could be intentional to drum up excitement for WWDC.
The proof is in iPhoto for iOS. When the iOS version of iPhoto debuted in March, Apple was using older OpenStreetMap data instead of Google. This was most likely a testing ground to see how their own maps would function before pushing it out to a wider audience with a major iOS release. While the iPhoto maps aren't anywhere near as full-featured as a new Maps app would likely be, and I hope it looks different because the maps in iPhoto are rather ugly, it was a good place to start.
Maps for iOS has long lacked the features offered to Android users, including a solid integration with voice control. If this Maps app does debut with iOS 6, I hope Siri can be used to voice turn-by-turn directions. The addition of 3D-map technology would of course bring additional benefits. What features would you like to see in a new Apple-originated Maps app?

The last news item I would like to highlight this week is a trend at VMware:

VMware Wants to be Your New SDN

VMware doesn't just want your servers virtualized; it also wants to software-define your network, though it is taking a different approach than Cisco, HP and others that have embraced OpenFlow.

By Sean Michael Kerner | May 11, 2012
VMware helped to lead the revolution that has transformed the data center server space with virtual nodes of compute server infrastructure. Now VMware wants to lead the way in virtualizing networking. It's a movement that is aligned with the newly emerging trend of software defined networking (SDN) that enables programmable networks abstracting networking hardware.

VMware's Software Defined Data Center

Allwyn Sequeira, vice president and CTO of Security and Networking at VMware, explained to Enterprise Networking Planet that while it might take two minutes to set up a virtual machine (VM), it could take an additional five days in a traditional data center deployment to set up the network to support and enable that VM. To get around that, Sequeira is advocating VMware's Software Defined Data Center, a new architectural approach that virtualizes network elements like firewalls and load balancers.
"The whole idea is about delivering a scale-out elastic architecture that is available to apps on-demand," Sequeira said. "You are freeing yourself from the tyranny of having to buy hardware."
In Sequeira's view, it's not possible to scale physical networking hardware to meet the on-demand needs of modern virtualized applications. From a product and technology perspective, the Software Defined Data Center architecture involves applications and specifications available now, as well as work that is coming. With server virtualization there is now the concept of one vSwitch per host and, in that context, a VLAN is how VMs are networked. VLANs traditionally have been limited in their ability to stretch across data center domains, which is where the VXLAN standard comes into play.
The VXLAN specification was initially proposed in September of 2011, and is a multi-vendor effort that includes VMware along with Cisco, Arista Networks, Citrix and Red Hat. The basic idea behind VXLAN is to have a Layer 2 abstraction for virtual machines so they are not restricted to a particular LAN boundary.
"VXLAN is the basis for us untethering ourselves from current network limitations," Sequeira said. "VXLAN is what enables end-to-end elasticity in the data center and allows you to build a software defined network."
VXLAN abstracts the VLAN from the underlying physical network, and it also supports multi-tenancy. VXLAN is not a finalized industry standard; it is available as an Internet Engineering Task Force (IETF) draft under the NVO3 (network virtualization over Layer 3) effort.
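To make the encapsulation idea concrete, here is a minimal sketch in Python of the 8-byte VXLAN header described in the IETF draft (the function names and example VNI are illustrative, not taken from VMware's products):

```python
import struct

VXLAN_FLAG_VNI = 0x08  # "I" flag: the VNI field is valid

def vxlan_header(vni: int) -> bytes:
    """Build the 8-byte VXLAN header from the IETF draft:
    8 flag bits, 24 reserved bits, a 24-bit VNI, 8 reserved bits."""
    if not 0 <= vni < 2**24:
        raise ValueError("VNI must fit in 24 bits")
    # !BBHI packs: flags (1 byte), reserved (1 + 2 bytes), VNI<<8 (4 bytes)
    return struct.pack("!BBHI", VXLAN_FLAG_VNI, 0, 0, vni << 8)

def parse_vni(header: bytes) -> int:
    """Extract the 24-bit VXLAN Network Identifier from a header."""
    flags, _, _, word = struct.unpack("!BBHI", header)
    assert flags & VXLAN_FLAG_VNI, "VNI flag not set"
    return word >> 8

hdr = vxlan_header(5000)
print(len(hdr), parse_vni(hdr))  # 8 5000
```

Because the VNI is 24 bits wide, VXLAN allows roughly 16 million isolated segments, versus the 4,094 usable IDs of a traditional VLAN; that headroom is what lets the overlay stretch across data center domains.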

VMware vs. OpenFlow

Sequeira now sees two views of the SDN world. One is the VMware-style approach with vSwitch and VXLAN; the other is OpenFlow, an open protocol for SDN that is also gaining interest and popularity.
"I want the equivalent of a LAN across a data center and that's what VXLAN does," Sequeira said. "The VXLAN overlay combined with what we have with vSwitch and vCloud networking is what we believe to be the most prevalent form of SDN in the world today."
When it comes to OpenFlow, in Sequeira's view there are a set of vendors that are now building monolithic stacks on top of the OpenFlow protocol, trying to establish control points. As such he expects that SDN silos will emerge over time that will require some form of federation to connect together.
"For us, SDN is a natural extension of our current product lines, extending what we already have for a VMware domain," Sequeira said. "When do we see a world where there is a VMware SDN working with an OpenFlow SDN? I don't think that's in the cards."


5/10/2012

Welcome - Bienvenidos


Today, May 10th, is the first day of this blog.

Its purpose will be to report on trends, new technologies, business intelligence and news from the IT world.

Trends

  • My first words will focus on the importance of the iPad nowadays.

  • Social networks are more and more important, and the political world is taking notice.

Business Intelligence

  • Gartner's 2012 Magic Quadrant for BI Platforms Report.                        Source: Gartner.com

    Magic Quadrant

    Figure 1. Magic Quadrant for Business Intelligence Platforms
    Source: Gartner (February 2012)

    Market Definition/Description

    This document was revised on 10 February 2012. For more information, see the Corrections page on gartner.com.
    Business intelligence (BI) platforms enable all types of users — from IT staff to consultants to business users — to build applications that help organizations learn about and understand their business. Gartner defines a BI platform as a software platform that delivers the 14 capabilities listed below. These capabilities are organized into three categories of functionality: integration, information delivery and analysis. Information delivery is the core focus of most BI projects today, but we are seeing an increased interest in deployments of analysis to discover new insights, and in integration to implement those insights.

    Integration

    • BI infrastructure — All tools in the platform use the same security, metadata, administration, portal integration, object model and query engine, and should share the same look and feel.
    • Metadata management — Not only should all tools leverage the same metadata, but the offering should provide a robust way to search, capture, store, reuse and publish metadata objects such as dimensions, hierarchies, measures, performance metrics and report layout objects.
    • Development tools — The BI platform should provide a set of programmatic development tools and a visual development environment, coupled with a software developer's kit for creating BI applications, integrating them into a business process, and/or embedding them in another application. The BI platform should also enable developers to build BI applications without coding by using wizard-like components for a graphical assembly process. The development environment should also support Web services in performing common tasks such as scheduling, delivering, administering and managing. In addition, the BI application can assign and track events or tasks allotted to specific users, based on predefined business rules. Often, this capability can be delivered by integrating with a separate portal or workflow tool.
    • Collaboration — This capability enables BI users to share and discuss information, BI content and results, and/or manage hierarchies and metrics via discussion threads, chat and annotations, either embedded in the BI platform or through integration with collaboration, social software and analytical master data management (MDM).

    Information Delivery

    • Reporting — Reporting provides the ability to create formatted and interactive reports, with or without parameters, with highly scalable distribution and scheduling capabilities. In addition, BI platform vendors should handle a wide array of reporting styles (for example, financial, operational and performance dashboards), and should enable users to access and fully interact with BI content delivered consistently across delivery platforms including the Web, mobile devices and common portal environments.
    • Dashboards — This subset of reporting includes the ability to publish formal, Web-based or mobile reports with intuitive interactive displays of information, including dials, gauges, sliders, check boxes and traffic lights. These displays indicate the state of the performance metric compared with a goal or target value. Increasingly, dashboards are used to disseminate real-time data from operational applications or in conjunction with a complex event processing engine.
    • Ad hoc query — This capability enables users to ask their own questions of the data, without relying on IT to create a report. In particular, the tools must have a robust semantic layer to allow users to navigate available data sources. These tools should include a disconnected analysis capability that enables users to access BI content and analyze data remotely without being connected to a server-based BI application. In addition, these tools should offer query governance and auditing capabilities to ensure that queries perform well.
    • Microsoft Office integration — In some use cases, BI platforms are used as a middle tier to manage, secure and execute BI tasks, but Microsoft Office (particularly Excel) acts as the BI client. In these cases, it is vital that the BI vendor provides integration with Microsoft Office applications, including support for document and presentation formats, formulas, data "refreshes" and pivot tables. Advanced integration includes cell locking and write-back.
    • Search-based BI — This applies a search index to both structured and unstructured data sources and maps them into a classification structure of dimensions and measures (often, but not necessarily leveraging the BI semantic layer) that users can easily navigate and explore using a search (Google-like) interface. This capability extends beyond keyword searching of BI platform content and metadata.
    • Mobile BI — This capability enables organizations to deliver report and dashboard content to mobile devices (such as smartphones and tablets) in a publishing and/or interactive (bidirectional) mode, and takes advantage of the interaction mode of the device (tapping, swiping and so on) and other capabilities not commonly available on desktops and laptops, such as location awareness.

    Analysis

    • Online analytical processing (OLAP) — This enables end users to analyze data with extremely fast query and calculation performance, enabling a style of analysis known as "slicing and dicing." Users are (often) able to easily navigate multidimensional drill paths. And they (sometimes) have the ability to write-back values to a proprietary database for planning and "what if" modeling purposes. This capability could span a variety of data architectures (such as relational or multidimensional) and storage architectures (such as disk-based or in-memory).
    • Interactive visualization — This gives users the ability to display numerous aspects of the data more efficiently by using interactive pictures and charts, instead of rows and columns. Over time, advanced visualization will go beyond just slicing and dicing data to include more process-driven BI projects, allowing all stakeholders to better understand the workflow through a visual representation.
    • Predictive modeling and data mining — This capability enables organizations to classify categorical variables and to estimate continuous variables using advanced mathematical techniques. BI developers are able to integrate models easily into BI reports, dashboards and analysis, and business processes.
    • Scorecards — These take the metrics displayed in a dashboard a step further by applying them to a strategy map that aligns key performance indicators (KPIs) with a strategic objective. Scorecard metrics should be linked to related reports and information in order to do further analysis. A scorecard implies the use of a performance management methodology such as Six Sigma or a balanced scorecard framework.
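The "slicing and dicing" described under OLAP above can be illustrated with a toy, standard-library-only sketch (the dimensions and figures here are invented for illustration; real OLAP engines run against multidimensional or relational stores at far larger scale):

```python
from collections import defaultdict

# A tiny in-memory "cube": (region, quarter, product) -> sales
facts = [
    ("East", "Q1", "widget", 100),
    ("East", "Q2", "widget", 120),
    ("West", "Q1", "widget", 80),
    ("West", "Q1", "gadget", 150),
    ("East", "Q1", "gadget", 90),
]

def slice_cube(facts, region):
    """Slice: fix one dimension (region) and keep the remaining ones."""
    return [(q, p, v) for r, q, p, v in facts if r == region]

def roll_up(rows, dim_index):
    """Roll up: aggregate the sales measure along one dimension."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[dim_index]] += row[-1]
    return dict(totals)

east = slice_cube(facts, "East")          # slice on region = East
print(roll_up(east, 0))                   # dice by quarter: {'Q1': 190, 'Q2': 120}
```

Multidimensional drill paths are just repeated applications of these two moves: fix a member on one dimension, then re-aggregate along another. In-memory or disk-based storage, as the Gartner text notes, is an implementation choice underneath the same analytical operations.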