The Ten Worst Internet Security Myths

The online world is a very mysterious place to a large number of people. They can navigate to their Gmail and Facebook accounts easily enough, but once you try explaining more complex things to them…well, I’ve noticed the way their eyes drift away to somewhere less geeky.

This sort of willful ignorance has led to a large number of internet security myths popping up, and I’m here to wreck a few of them.

Source: The Ten Worst Internet Security Myths

Rights to all content (text, images, videos etc.) with post source. If you think these are wrongly attributed email us

Who Made That Autocorrect?

“How fast will new Word 6.0 fix typos? How fast can you make them?” asked an advertisement in a computer magazine from October 1993. The newest version of Microsoft’s word processor came with a brand-new feature called AutoCorrect. Type in “SHip teh cartons friday,” and the program would correct your text to “Ship the cartons Friday.”

The original AutoCorrect didn’t use a dictionary. Instead, it checked each typed-out word against a preprogrammed table of everyday mistakes and their proper substitutions: “teh” for “the,” “friday” for “Friday,” and so on. The makers of WordPerfect, Microsoft’s major rival at the time, soon introduced their own version, called QuickCorrect.
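A table-driven corrector like the one described is easy to sketch. The replacement table, function names, and the double-capital rule below are illustrative assumptions, not Microsoft’s actual code:

```python
# Minimal sketch of a table-based autocorrect (illustrative only).

CORRECTIONS = {
    "teh": "the",
    "friday": "Friday",
}

def fix_double_capital(word):
    # Early AutoCorrect also fixed accidental double initial capitals:
    # "SHip" -> "Ship"
    if len(word) > 2 and word[:2].isupper() and word[2:].islower():
        return word[0] + word[1:].lower()
    return word

def autocorrect(text):
    """Check each typed word against the table of everyday mistakes."""
    fixed = []
    for word in text.split():
        word = fix_double_capital(word)
        fixed.append(CORRECTIONS.get(word, word))
    return " ".join(fixed)

print(autocorrect("SHip teh cartons friday"))  # Ship the cartons Friday
```

The key design point survives in modern autocorrect: a lookup against known mistakes is cheap enough to run on every keystroke, whereas a full dictionary check was not, on 1993 hardware.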

In the years that followed, real-time spell-checkers grew more sophisticated.

Source: Who Made That Autocorrect?


If your brain were a computer, how much storage space would it have?

The comparison between the human brain and a computer is not a perfect one, but it does lend itself to some interesting lines of inquiry. For instance: what is the storage capacity of your brain? The answer – how much storage space is there inside the average human head? – varies considerably, depending on whom you ask. Some estimates come in as low as 1 terabyte, or approximately 1,000 gigabytes. These days, you can purchase an external hard drive with twice that capacity for under a hundred bucks.

Source: If your brain were a computer, how much storage space would it have?


iPhone design: Documents from the Samsung trial reveal more than ever about Apple’s secretive design process. – Slate Magazine

Like many of Apple’s inventions, the iPhone began not with a vision, but with a problem. By 2005, the iPod had eclipsed the Mac as Apple’s largest source of revenue, but the music player that rescued Apple from the brink now faced a looming threat: the cellphone. Everyone carried a phone, and if phone companies figured out a way to make playing music easy and fun, “that could render the iPod unnecessary,” Steve Jobs once warned Apple’s board, according to Walter Isaacson’s biography.

Fortunately for Apple, most phones on the market sucked. Jobs and other Apple executives would grouse about their phones all the time. The simplest phones didn’t do much other than make calls, and the more functions you added to phones, the more complicated they were to use. In particular, phones “weren’t any good as entertainment devices,” Phil Schiller, Apple’s longtime marketing chief, testified during the company’s patent trial with Samsung. Getting music and video on 2005-era phones was too difficult, and if you managed that, getting the device to actually play your stuff was a joyless trudge through numerous screens and menus.

That was because most phones were hobbled by a basic problem—they didn’t have a good method for input. Hard keys (like the ones on the BlackBerry) worked for typing, but they were terrible for navigation. In theory, phones with touchscreens could do a lot more, but in reality they were also a pain to use. Touchscreens of the era couldn’t detect finger presses—they needed a stylus, and the only way to use a stylus was with two hands (one to hold the phone and one to hold the stylus). Nobody wanted a music player that required two-handed operation.

This is the story of how Apple reinvented the phone. The general outlines of this tale have been told before, most thoroughly in Isaacson’s biography. But the Samsung case—which ended last month with a resounding victory for Apple—revealed a trove of details about the invention, the sort of details that Apple is ordinarily loath to make public. We got pictures of dozens of prototypes of the iPhone and iPad. We got internal email that explained how executives and designers solved key problems in the iPhone’s design. We got testimony from Apple’s top brass explaining why the iPhone was a gamble.

Put it all together and you get a remarkable story about a device that, under the normal rules of business, should not have been invented. Given the popularity of the iPod and its centrality to Apple’s bottom line, Apple should have been the last company on the planet to try to build something whose explicit purpose was to kill music players. Yet Apple’s inner circle knew that one day, a phone maker would solve the interface problem, creating a universal device that could make calls, play music and videos, and do everything else, too—a device that would eat the iPod’s lunch. Apple’s only chance at staving off that future was to invent the iPod killer itself. More than this simple business calculation, though, Apple’s brass saw the phone as an opportunity for real innovation. “We wanted to build a phone for ourselves,” Scott Forstall, who headed the team that built the phone’s operating system, said at the trial. “We wanted to build a phone that we loved.”

The problem was how to do it. When Jobs unveiled the iPhone in 2007, he showed off a picture of an iPod with a rotary-phone dialer instead of a click wheel. That was a joke, but it wasn’t far from Apple’s initial thoughts about phones. The click wheel—the brilliant interface that powered the iPod (which was invented for Apple by a firm called Synaptics)—was a simple, widely understood way to navigate through menus in order to play music. So why not use it to make calls, too?

In 2005, Tony Fadell, the engineer who’s credited with inventing the first iPod, got hold of a high-end desk phone made by Samsung and Bang & Olufsen that you navigated using a set of numerical keys placed around a rotating wheel. A Samsung cell phone, the X810, used a similar rotating wheel for input. Fadell didn’t seem to like the idea. “Weird way to hold the cellphone,” he wrote in an email to others at Apple. But Jobs thought it could work. “This may be our answer—we could put the number pad around our clickwheel,” he wrote. (Samsung pointed to this thread as evidence for its claim that Apple’s designs were inspired by other companies, including Samsung itself.)

Around the same time, Jonathan Ive, Apple’s chief designer, had been investigating a technology that he thought could do wonderful things someday—a touch display that could understand taps from multiple fingers at once. (Note that Apple did not invent multitouch interfaces; it was one of several companies investigating the technology at the time.) According to Isaacson’s biography, the company’s initial plan was to use the new touch system to build a tablet computer. Apple’s tablet project began in 2003—seven years before the iPad went on sale—but as it progressed, it dawned on executives that multitouch might work on phones. At one meeting in 2004, Jobs and his team looked at a prototype tablet that displayed a list of contacts. “You could tap on the contact and it would slide over and show you the information,” Forstall testified. “It was just amazing.”

Jobs himself was particularly taken by two features that Bas Ording, a talented user-interface designer, had built into the tablet prototype. One was “inertial scrolling”—when you flick at a list of items on the screen, the list moves as a function of how fast you swipe, and then it comes to rest slowly, as if being affected by real-world inertia. Another was the “rubber-band effect,” which causes a list to bounce against the edge of the screen when there are no more items to display. When Jobs saw the prototype, he thought, “My god, we can build a phone out of this,” he told the D Conference in 2010.
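The physics behind both effects is simple to approximate. Here is a hedged sketch; the friction and resistance constants are invented for illustration, and Apple’s actual implementation is not public:

```python
# Toy models of the two gestures described above; constants are assumed.

FRICTION = 0.95  # fraction of velocity kept each frame (illustrative value)

def coast(velocity, frames):
    """Inertial scrolling: the flicked list coasts, losing a little
    velocity every frame, so it comes to rest gradually."""
    distance = 0.0
    for _ in range(frames):
        distance += velocity
        velocity *= FRICTION
    return distance

def rubber_band(offset, limit, resistance=0.5):
    """Rubber-band effect: once the list is dragged past its last item,
    further movement is compressed, so releasing it snaps it back."""
    if offset <= limit:
        return offset
    return limit + (offset - limit) * resistance
```

In this toy model, a fast flick travels far but still stops (coasting at 10 pixels per frame covers just under 200 pixels over 100 frames), while dragging 10 pixels past the edge displaces the list only 5—the visual cue that you have run out of content.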

The company decided to abandon the click-wheel idea and try to build a multitouch phone. Jobs knew it was a risk—could Apple get typing to work on a touchscreen?—but the payoff could be huge: If the phone’s only interface was a touchscreen, it would be endlessly flexible—you could use it not just for talking and music but for anything else, including lots of third-party applications. In other words, a touchscreen phone wouldn’t be a phone but “really a computer in your pocket in some ways,” as Forstall said in court.

Apple is known for secrecy, but Jobs wanted the iPhone kept under tighter wraps than usual. The project was given a codename—“Project Purple”—and, as Forstall testified, Jobs didn’t let the iPhone team recruit anyone from outside the company to work on the device. Instead, Forstall had to make a strange pitch to superstar engineers in different parts of the company: “We’re starting a new project,” he’d tell them. “It’s so secret I can’t even tell you what that project is. I can’t tell you who you will work for…. What I can tell you is that if you accept this project … you will work nights, you will work weekends, probably for a number of years.”

The iPhone team took over an entire building at Apple’s Cupertino, Calif., headquarters. “Very much like a dorm, people were there all the time,” Forstall said in court. “It smelled something like pizza, and in fact on the front door of the Purple Dorm we put a sign up that said ‘Fight Club’—because the first rule of that project was to not talk about it outside those doors.” (Thanks to The Verge for transcribing Forstall’s testimony.)

The iPhone team broke down into two separate but closely integrated groups—the guys who were doing the hardware and the guys who were doing the software. (I can’t find any evidence that there were any women working on the phone.) The software team’s main job was figuring out a way to make a completely novel interface feel intuitive and natural. One way they did this was by creating finger “gestures” that allowed you to get around the phone very quickly. Some of these, like pinch-to-zoom, had been used in multitouch projects in the past (you can see some in Minority Report) but others were Apple’s fresh ideas. For instance, Forstall used a prototype iPhone as one of his main computers, and as he used it, he found that constantly pinching to zoom in on the screen became tedious. In a flash, he thought, why not have the phone figure out how to zoom with just a double-tap on the screen? This was a difficult gesture to implement—the phone had to “understand the structure” of the document it was zooming in on, he explained—but once engineers got tap-to-zoom to work, Forstall found the phone to be much easier to use. “It allowed me to browse the Web much more fluently,” he said.

The hardware team, meanwhile, was trying to figure out what the phone would look like. In court, Christopher Stringer, one of Apple’s veteran designers, explained that the company created the phone through a process of rigorous refinement. A group of about 15 designers would regularly assemble around a kitchen table set up in Apple’s design studio to review, in painfully fine detail, every idea for various parts of the iPhone’s design. Apple has an extensive array of systems to quickly create physical prototypes of digital designs, and the team would handle all of these prototypes and remark on how they felt. “We’re a pretty maniacal group of people,” Stringer explained, pointing out that they would sometimes review 50 different refinements of a single hardware button.

Documents in the trial revealed some of the many iPhone designs that Apple considered. There were thin phones; fat ones; ones with rounded glass on the front and back; some with flat sides and a rounded top and bottom, and others with rounded sides and flat tops and bottoms; and even one with an octagonal shape. Apple also looked to other companies as inspiration. In 2006, design chief Jonathan Ive pulled aside one of his designers, Shin Nishibori, and asked, “If Sony were to make an iPhone, what would it be like? Would you make it for me?” according to Nishibori’s deposition. The result was a skinny phone that looks much like today’s iPhone, except it had volume buttons on the front, rather than the side, of the phone. (Samsung attempted to argue in court that this design proved Apple copied Sony, but the judge barred that argument, which was bogus anyway—the design didn’t look like any actual Sony phone, and was instead only Apple’s take on Sony’s design aesthetic.)

By the spring of 2006, about a year before the iPhone’s release, Ive and his team had settled on a design for the iPhone. Their winning prototype looked similar to Apple’s 2004-era iPod Mini—it was a metallic device with rounded sides, what designers referred to as “extruded” aluminum. You can see it in a 2006 photo unveiled in the trial—it’s the one on the left.

Two iPhone prototypes revealed during the Samsung trial. On the left is a version that was scrapped just months before the phone’s release.

The phone on the right is another prototype, one that looks a lot more like the iPhone that Steve Jobs unveiled in January of 2007. Indeed, the phone on the right seems almost identical to the iPhone 4, which Apple launched in 2010. What happened? Why did Apple go from building the phone on the left to a version of the one on the right?

We can’t know for sure, but we have some clues. One reason Apple switched the design was that the rounded sides seemed superfluous. “I’m really worried that we’re making something that is going to look and be too wide,” Apple designer Richard Howarth argued in an email to Ive. Plus, Howarth argued, if Apple cut volume control buttons into the rounded sides, it would remove “the purity of the extrusion idea.”

There was a bigger problem with the extruded-metal phone: One morning Jobs came into the office and declared that he just didn’t love it. As Isaacson describes it, Jobs realized that the design squeezed the phone’s glass display into an aluminum frame—but because the display was the iPhone’s only interface, the design had to put the screen on center stage. Ive realized instantly that Jobs was right. “I remember feeling absolutely embarrassed that he had to make the observation,” he told Isaacson.

So, around the spring of 2006, a few months before the iPhone’s public debut, the team decided to start all over with something new. Looking through their old designs, they found a prototype they’d sketched a year earlier. This phone was a plain rectangle with rounded corners, a single button on its face, and a glass panel that covered the entire face of the phone. This was the iconic design that would become the iPhone.

Changing the design meant that Apple had to alter all of the phone’s internal components in just a few months’ time. The team would have to work nights and weekends in complete secrecy, and most of them would never, ever be able to take credit for what they helped accomplish. Of course, none of this is a surprise about Apple. In some ways, the trial only added fresh details to a story about maniacal precision and obsession that has long been clear. On the other hand, the story is a powerful reminder of something you tend to forget when you goof off on your iPhone: Nothing about it was obvious. Stuff that seems really small and intuitive about its design—things like inertial scrolling, the rubber-band effect, the simple idea of making the device a rectangle with rounded corners—only came about because Apple’s designers spent years thinking those things up and making them real. As designer Christopher Stringer said during the trial, “Our role is to imagine products that don’t exist and guide them to life.”

Source: iPhone design: Documents from the Samsung trial reveal more than ever about Apple’s secretive design process. – Slate Magazine

Email Will Never Die – The Man Who Invented It Reveals Why

Texting, instant messaging, Facebook, Twitter – we have dozens of ways to pass a message from one user to the next, and yet we keep coming back to email. Why? According to the man who sent the first one, because there’s still nothing quite like it.

Possibly the most revealing statement that can be made about the power and perseverance of email is that – unlike almost everything else in the technology industry – how we use it has remained virtually unchanged for more than 40 years.

According to the Radicati Group, 144.8 billion emails are sent every day, and that number is projected to rise to 192.2 billion in 2016. There are about 3.4 billion email accounts worldwide, Radicati said, with three-quarters owned by individual consumers.

The youngest users of email, however, have an enormous number of different methods to choose from to communicate – and many of them prefer these methods for most communications.

This, in turn, has prompted some to wonder whether email is a dinosaur, among them young people who say they actually mean “Facebook” when they say “email”. In 2010, comScore kicked off a fuss by noting that Web email use had dropped 59% among teens. So why would anyone continue to use email in the age of social media?

“Because none of them really fill the space that email serves, which is you have a specific audience,” answers Ray Tomlinson, a principal engineer at BBN Technologies and the so-called “father of email.”

“A lot [of the alternatives] are like a billboard, with limited utility – you put these things on the billboard, and if they choose to they [your audience] can look and see it.”

“But email has the time difference – that is, you send it now, you read it later – you don’t have to have someone sitting there and ready to respond like you do with instant messaging to make it work and make it effective,” Tomlinson explains. “You can use instant messaging that way, but if they’re not there, nothing happens, and you gotta remember that there may be a message coming back to you and go back to the IM client and look for the response.”

The Birth Of Email

In 1971 Tomlinson worked as an engineer for Bolt Beranek and Newman (BBN), a contractor that had been assigned to develop ARPANET, a communication network that would allow scientists and researchers to share each other’s computer resources.

In the fall of 1971, Tomlinson sent the first network email, using the SNDMSG program that ran on the TENEX time-sharing program for Digital PDP-10 computers. Email on a single computer had existed since the early 1960s, the equivalent of a digital post-it note that could be left to another user. But Tomlinson tweaked the CPYNET file transfer program, then appended it to SNDMSG. That gave one user the power to send a message to another on a remote machine, and email was born.

The first email message has been lost to history; Tomlinson tells ReadWriteWeb that it was one of a number of “entirely forgettable” test messages. But that first email message, sent from one machine physically sitting next to another, functioned as a sort of “hello world” message explaining that, well, network email was up and running. The response was low-key.

“I don’t recall any actual replies” to the first email, Tomlinson says. “I did get some comments from people in the hall.”

Tomlinson was also the first person to use the now ubiquitous “@” symbol – a no-brainer, as it explained that a user was “at” a given host, Tomlinson said. There was one glitch, however: “I was later reminded that the Multics time-sharing system used the @ sign as its line-erase character. This caused a fair amount of grief in that community of users,” he notes on his own website.
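Part of why the convention stuck is that “user at host” is trivially machine-parseable. A sketch (the address shown is illustrative, not Tomlinson’s actual first address):

```python
# user@host: the naming convention Tomlinson introduced.
# The example address below is illustrative.

def parse_address(address):
    """Split a network mail address into (user, host)."""
    user, _, host = address.partition("@")
    return user, host

print(parse_address("tomlinson@host-a"))  # ('tomlinson', 'host-a')
```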

Email began to take hold as both a cultural and a technical phenomenon in 1972, when the next release of TENEX was shipped – on magnetic tape via snail mail – to some 15 other sites scattered around the country. Users could then send messages back and forth. As each site came online, email’s utility increased, Tomlinson recalls.

Even back then, though, email was used in much the same way it is now.

“I think it was mostly used as a replacement for telephone calls,” Tomlinson says. “You got a more immediate response. With time zone differences you didn’t have to have someone there to receive the call.”

Email Today

Forty years later, email use has grown to enormous proportions. But most of it is not legitimate communication, and more than half of it never gets delivered. According to the Messaging Anti-Abuse Working Group (which has re-formed to take on malware as well), between 88% and 90% of all email sent during the first three quarters of 2011 was spam, or unsolicited commercial email. For example, Microsoft’s Hotmail alone processes more than 8 billion messages a day. But only some 2.5 billion messages are delivered to the user’s inbox.

Several methods of dealing with spam have sprung up: blocking or “blacklisting” domains notorious for sending spam; blocking everything except approved “whitelisted” domains; and various filtering techniques that use reputation or text analysis to try to block suspicious emails.
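The three approaches can be sketched in a few lines. Everything below — the domain lists, the word list, the scoring threshold — is invented for illustration; real filters rely on far more sophisticated reputation systems and statistical analysis:

```python
# Toy spam classifier combining the three approaches described above.
# All lists and thresholds are illustrative.

BLACKLIST = {"spamdomain.example"}
WHITELIST = {"trusted.example"}
SUSPICIOUS_WORDS = {"winner", "free", "prize"}

def classify(sender_domain, body):
    """Return 'blocked', 'allowed', or 'filtered' for an incoming message."""
    if sender_domain in BLACKLIST:
        return "blocked"                 # blacklisting: known-bad domain
    if sender_domain in WHITELIST:
        return "allowed"                 # whitelisting: pre-approved domain
    # naive content filtering: count suspicious words in the body
    hits = sum(word in body.lower() for word in SUSPICIOUS_WORDS)
    return "filtered" if hits >= 2 else "allowed"
```

Note how the order encodes policy: a blacklisted domain is rejected even before content is examined, and a whitelisted one bypasses the filter entirely.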

Tomlinson supports whitelisting, where only users who pass through some additional level of security are allowed to send email. “If it’s a person out there he’ll send it again,” Tomlinson said. “If it’s a machine he’ll move on and send it to the other five million.”

But the spam problem is also one of identity. When Tomlinson first sent networked emails into the ether, each address identified a specific person. Today, email senders can use aliases, multiple accounts and even bots to communicate. Should users be forced to tie themselves to a single email identity? The debate has included both Facebook chief executive Mark Zuckerberg, who has promoted Facebook accounts as identity tokens, and 4chan founder Christopher Poole, a strong advocate of privacy and anonymity online. Tomlinson takes a middle view.

“In some ways the lack of an official identity when using email has compounded problems like spam, but I think that’s the convenience versus utility versus functionality,” Tomlinson says. “It’s more convenient if you don’t have to worry about identifying yourself. You don’t have to buy a [security] certificate, or authenticate the senders of email.

“I think completely anonymous email would not be a good idea,” Tomlinson adds. “On the other hand, having email identities that you can link to very specific information is a definite problem. It’s one thing to say I am who I am, but I’m not going to tell you my life history at the same time.”

The Future of Email

In many ways, the future of email is already here. SMS text messages are archived, instant message windows can be left open, and Facebook Messenger treats an instant message to an offline friend as, essentially, an email. This latter model is what Tomlinson sees email evolving into over time.

“Whether the name will persist or not, I suspect email will be around for at least a good long time,” Tomlinson predicts. “We may find that these other forms of communication may be merged with email, so you send an IM to somebody, and if they don’t respond it turns into an email-like thing without any intervention on your part.”

Source: Email Will Never Die – The Man Who Invented It Reveals Why

How the Tech Scene in India is Changing

Two news items from the tech startup ecosystem broke through the clutter of politics, cricket, Bollywood, and more politics news in India earlier this month. The first was good. The second was just grim.

Just Dial, which runs the local business listings site, had raised US$57 million in one of the largest ever pre-IPO rounds of VC funding in India from existing investors Sequoia Capital and SAP Ventures.

This round also made Just Dial one of Sequoia’s single largest investment commitments in the country. It is on its way to a purported US$130 million initial public offering, the biggest ever for a consumer internet company in India.

The second, grim piece of news in a business magazine was that online retailer Flipkart, long heralded as the ‘Indian Amazon,’ was in fact in trouble thanks to lower margins, higher cost pressures, and a host of other problems. The report’s authenticity was argued for and against on all the social platforms. We do know that Flipkart clocked US$18 million in sales in June. Profits, we don’t know.

In a way, the news reflects the current paradox of the Indian tech startup space – doing awesome in services, but not really cutting it so far on the product end of things.

Services? Ok. Product? Ummm…

In 2010, one of India’s top travel bookings sites, MakeMyTrip, listed on Nasdaq, raised about US$80 million, and became the first Indian company to list on the bourse since 2006.

Sanjeev Aggarwal, co-founder of Helion Venture Partners, which is an investor in MakeMyTrip, bus ticket booking site Redbus, and growing online ad platform Komli Media, says that as far as services go, the Indian ecosystem has succeeded.

“When we talk of services such as travel bookings, payments, and classifieds, where the delivery mechanisms are simple, the ecosystem has matured. The product ecosystem still has to come up to scale however,” says Aggarwal.

Aggarwal points out that a host of e-commerce startups are coming up in five or six verticals, such as apparel, footwear, personal care and electronics, and this is a period in which all of them are trying to gain a foothold in a growing population of internet users. In the process, they are stretching themselves beyond safe limits, and investors too are rushing in.

The numbers are seductive when it comes to the consumer internet space in India. Forrester Research predicts that India will have the third largest number of Internet users in the world by 2013, after China and the US. According to the Internet and Mobile Association of India (IAMAI), the number of internet users in India jumped to over 120 million this year and will triple by 2015.

“By 2015, we would also separate the men from the boys. The market would mature by then, and we will see real sustainable models come through. Right now, none of these businesses have their unit economics fine-tuned,” he says, adding that his firm is watching the space to identify such sustainable models.

A bubble?

Sameer Guglani, co-founder at The Morpheus, an incubator that has mentored about 54 startups in all from the technology space, says that current pure multi-brand retail ecommerce businesses do not make the cut.
“I think the space is overheated. There are just too many startups trying to do the same thing, or aping the west. They are spending tons of money on advertising and marketing. Such businesses would not be able to last long,” he says.

Morpheus, which is currently taking in its 8th batch of startups, is therefore actively staying away from such startups. Other incubators and accelerators such as the Startup Centre and Seedfund are also approaching this space with caution, as are government-backed incubators.

Instead, says Guglani, they are looking at online startups that are more engaging and are also working towards a more targeted audience.

According to Guglani, the problem in the space is that people are getting pulled in by the lure of the great Indian internet user base while not acknowledging the problems that exist in the country’s actual infrastructure.
Indeed, one of the most vehement criticisms of the ecommerce space is that the country lacks the shipping and payment infrastructure to sustainably support ecommerce businesses. Some, like Flipkart and growing online retailer Myntra, are investing in their own shipping networks at high cost.

No support infrastructure

Ashutosh Lawania, co-founder at Myntra, tells us that the company is currently investing heavily in building its own shipping network. Funded by the likes of NEA, IDG and Accel Partners, Myntra now has its own shipping network in the major metros and will expand to about 25 cities by year-end.

“The cost of delivery has gone up for us now because of this. But with volumes and time, we expect this cost to come down,” he says, adding that Myntra is expected to break even in another 18 months and would add further product categories such as home furnishings and jewellery to its existing offering of footwear, accessories and apparel.

Guglani thinks that break-even for the entire space is even further away than that. “We have a system where all these top online businesses are offering Cash on Delivery services, which goes against the entire grain of an ecommerce model. That is because payment infrastructures have also not matured.”

Josh Bornstein, co-founder at Footprint Ventures, agrees on this point and adds more. He says that service startups in India have an easier path where the delivery mechanism is simple. But it is much more complicated for product startups who have to find ways to collect money.

“There is no doubt that there is a problem with the physical and electronic infrastructure in the country. While it is convenient to sit in one corner of the country and order something, it is not that easy to deliver it and reconcile the payment as well,” he says.

Footprint Ventures has invested in tech startups such as Canvera, which provides e-commerce solutions to photographers; bus travel bookings site Ticketvala, which was sold to MakeMyTrip; and enStage, a payments provider.

Bornstein says that enStage was an investment that best describes Footprint’s philosophy of investing not in pure users but investing into a company that would make revenue off a growing base of payers. He also adds that most Indian online retailers are poor in their inventory management.

Unit economics? D+

Aggarwal of Helion adds to that, saying that the unit economics of most product startups in this space are poor, which is why they usually end up selling their products with little or no margin. “Logistics and warehousing is one end of it. You should also be able to manage inventory supply and demand. That bit needs maturity and experience,” he adds.

Eventually, says Aggarwal, some of these companies would get the unit economics and revenue models streamlined. “That is when you can expect some correction and leadership to occur in this area.”
Aggarwal says that there are three periods of growth in the consumer Internet space, the first being the making of the business and revenue model, the second of getting the unit economics right, and then the period of making profits. “Right now, most of these startups are stuck in between period one and two.”

Indeed, we have already seen the correction in business models that were earlier backed with much bravado. Most noticeable is the death of India’s Groupon clones. Only a year ago they were a dime a dozen; now hardly a couple of these daily deal sites are still running. The biggest player in the space, Snapdeal, which was all over television advertising a year ago, has now shed its daily deal avatar for a pure retail business model.

Consolidation is coming

As the global slowdown trickles into India again and a lull is expected in the space as purchasing power dips, perhaps a lot more than just a change in business models is in store.

“You can expect a lot of consolidation and correction going forward. My guess is that in a few years, we would see a lot of the big guys gobbling up the smaller ones and eventually a more mature ecosystem,” says Footprint’s Bornstein.

Indeed, earlier this year Flipkart bought a majority stake in Letsbuy, another multi-brand online retailer, said it would continue to operate independently, and then promptly shut it down a couple of months later. More consolidation is expected sooner rather than later. Once that period is over, Bornstein expects a new spate of companies to piggyback on the efforts, lessons, and frameworks their predecessors leave behind. Much like the phoenix.

Whichever way it goes, Bornstein reiterates that the Indian consumer Internet space will eventually flower and grow bigger than its current top players. After all, Amazon did not become Amazon in five years, he says. Helion’s Aggarwal says the demand, the talent, and the investors are there; it’s now a matter of waiting and watching as the consumer Internet in India comes of age.

Source: How the Tech Scene in India is Changing

How Something You’ve Never Heard Of Is Changing Your World

I’ve got a riddle for you. What do Blu-ray discs, military radars and LED light bulbs have in common? Chances are, if you work outside the defense or electronics sectors, you may not easily make the connection. But the common thread is a little-known technology called gallium nitride (GaN for short). GaN is evolving rapidly behind the scenes to transform many aspects of modern-day life, while also serving vitally important roles within our nation’s military.

GaN is a wide-bandgap semiconductor material with properties that are ideal for optoelectronics and for high-power, high-frequency amplifiers. Aerospace and defense innovators have long recognized the critical competitive advantages GaN offers for high-frequency electronics – including significant reductions in cost, size, weight and power – and have spent years refining and continuously pushing GaN technology to new limits. For example, GaN is playing an integral role in developing more reliable military radars that can be five times more powerful than traditional systems, or only half the size. In recent years, technologists across a number of commercial industries have taken notice of these pioneering military innovations, and have started putting GaN to work to power everyday technologies in ways that significantly reduce energy costs and environmental impact.

Take the Blu-ray disc, for example. The next generation DVD is changing the way the world watches movies. Blu-ray discs store video and audio data packets in “pits,” or tiny grooves, which are about half the size of those in traditional DVDs. The tiny, highly accurate Blu-ray laser beam – powered by GaN-based violet laser diodes – can precisely read these hyperfine pits. This enables closer spacing of data packets and up to five times the storage capacity of a traditional DVD (roughly 27 GB of data). GaN technology enables higher resolution for the crystal clear imagery modern movie buffs have come to expect. With support from two of the world’s largest PC manufacturers, HP and Dell, Blu-ray technology is poised as the next-generation optical disc format – with potential to increase PC data storage exponentially in the coming years.
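The “five times the storage capacity” figure can be sanity-checked with a little optics arithmetic. This is a rough sketch, not from the article: the laser spot size scales with wavelength divided by numerical aperture (NA), and the standard format values used below (650 nm / NA 0.60 for DVD, 405 nm / NA 0.85 for Blu-ray) are assumptions I am bringing in.

```python
# Back-of-envelope check of the "five times the storage" claim.
# A smaller laser spot lets pits be packed closer together, so areal
# density scales with the inverse square of the spot size, and spot
# size is proportional to wavelength / NA.

dvd_spot = 650 / 0.60   # proportional to DVD laser spot size (red laser)
bd_spot = 405 / 0.85    # proportional to Blu-ray spot size (violet laser)

density_gain = (dvd_spot / bd_spot) ** 2
print(f"areal density gain: {density_gain:.2f}x")   # roughly 5x

dvd_capacity_gb = 4.7   # single-layer DVD capacity in GB
print(f"implied Blu-ray capacity: {dvd_capacity_gb * density_gain:.1f} GB")
```

The result lands near the claimed factor of five, and the implied capacity comes out close to the roughly 27 GB the article cites for a single-layer disc.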

You’ve likely seen the light bulb revolution that’s taking place, but may not have known gallium nitride is at the center of it. As traditional, century-old incandescent bulbs are slowly phased out by federal mandate, LED light bulbs represent the future of the lighting industry. A GaN-powered LED light bulb can easily outlast traditional bulbs by several years, while consuming a tenth of the power and reducing CO2 emissions by 90 percent. The Department of Energy recently commended Philips Lighting for creating an LED bulb that would last more than 20 years – an innovative design with the potential to save Americans a combined $3.9 billion in annual energy costs and reduce U.S. carbon emissions by 20 million metric tons. A number of young companies, including startup Sorra, remain focused on driving innovations in cost-effective LED lighting for the masses.
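The “a tenth of the power” figure translates into a per-bulb dollar saving that is easy to estimate. This is a minimal sketch under assumed numbers not from the article: a 60 W incandescent replaced by a 6 W LED, 3 hours of use per day, and electricity at $0.12 per kWh.

```python
# Rough annual-cost comparison for one bulb, under the stated assumptions.
incandescent_w, led_w = 60, 6   # LED draws a tenth of the power
hours_per_day = 3
price_per_kwh = 0.12            # assumed U.S. residential rate

def annual_cost(watts):
    """Yearly electricity cost in dollars for a bulb of the given wattage."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

saving = annual_cost(incandescent_w) - annual_cost(led_w)
print(f"annual saving per bulb: ${saving:.2f}")
```

A few dollars per bulb per year may look modest, but multiplied across hundreds of millions of sockets it is consistent in spirit with the billions in national savings the article cites.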

And LCD televisions, backlit by GaN-powered LED lighting, are thinner, lighter and up to 40 percent more energy-efficient than those using CCFL backlighting. In an effort to reduce the price point for consumers, pioneering companies such as Sony are now introducing the next wave of LED televisions, which will use edge-lit LED as the TV’s light source, reducing the number of LED lights required as compared to first generation LED televisions.

For mobile users, GaN can help ensure an affirmative answer to the old question, “Can you hear me now?” The efficiency and resistance to heat and electronic interference of microwave amplifiers built with GaN enables broader, more reliable cellular coverage, while eliminating the need for power-sucking cooling fans required by older cell phone tower technologies. RFHIC Corp of Suwon, South Korea, which makes GaN-based radio frequency and microwave components for telecommunications and broadcasting industries, estimates U.S. carriers could save approximately $2 billion per year by using GaN technology for their wireless infrastructures. Large carriers, including Sprint, have already launched GaN-powered towers in several markets.

While GaN-powered technologies quickly evolve to alter many aspects of modern-day life, GaN electronics are expected to play an increasingly important role within our nation’s military systems. Raytheon has been awarded a contract by the Defense Advanced Research Projects Agency (DARPA) to develop next-generation GaN electronic devices bonded to diamond substrates, an advance expected to triple current GaN circuit capabilities. The application of a markedly more efficient GaN-on-diamond material is expected to significantly benefit next-generation radar, communications and electronic warfare systems that employ GaN-based radio frequency devices.

When you think of how much technology is empowered by a tiny microchip, it’s not hard to imagine how GaN will rapidly accelerate innovation across numerous industries in the years ahead. Undoubtedly, future innovators will find new ways to apply GaN technology to our iPads and smartphones, bringing the networked world to consumers’ fingertips more quickly and effortlessly. Companies from start-ups to larger enterprises looking to revolutionize their industries would do well to consider how GaN can drive innovation within their business models. In the meantime, rest assured, innovators, investors and military engineers are already hard at work, staging the next technical revolution.

John C. Zolper, Ph.D. is the Vice President of Research and Development at Raytheon, an American corporation with core manufacturing concentrations in weapons and military and commercial electronics. So yeah, neat stuff.

How Something You’ve Never Heard Of Is Changing Your World | TechCrunch

Insanely bad: Ten Apple duds of the decade

Apple has had an amazing decade, revolutionizing industries as diverse as music, movies, telecoms, and software with its iPod/iTunes/iPhone products and ruling the OS Wars with its much-copied Mac OS X.
Apple CEO Steve Jobs won PC Advisor’s Person Of The Decade. Fortune magazine named him CEO Of The Decade.
Could the man do no wrong?
Oh, yes he could do wrong. Very wrong.
Here’s our list of ten products—in no particular order of badness—that Apple and Steve Jobs probably wish had never seen the light of day.

Top 7 Disruptions of the Year

Technology is like a dog; each year of it seems like the equivalent of seven human years — at least when you get to the end of it and realize it’s only been 12 months since that now indispensable service first launched.

We spent 2009 documenting technology’s disruption of how we live, entertain ourselves and do business. Looking back on the year from the comfortable perch of December, here are the seven most disruptive developments of 2009.
eklectica - this, that & everything