Wednesday, July 1, 2009

The Latest from TechCrunch

StyleCaster Nets $4 Million For Personalized Fashion Community

Posted: 01 Jul 2009 08:00 AM PDT

Interactive fashion community StyleCaster has raised $4 million in a Series A round of financing from investor Dan Gilbert, the chairman and founder of Quicken Loans.

Launched in February 2009, StyleCaster is a platform that hopes to be the future of online fashion. The site features style tips optimized for each individual, news and content on the latest fashion trends, a niche social network based around style, and a large online retail catalog of brand-name clothing. And the site isn’t just geared towards women—StyleCaster will soon include a community around men’s fashion.

The funding will be used to invest in content creation, web-based and mobile applications, and its advertising network. StyleCaster is an interesting take on combining an interactive community with shopping, editorial, and styling content. The site blends several different experiences into one fashion-based platform, creating a one-stop shop for fashionistas.

Crunch Network: MobileCrunch Mobile Gadgets and Applications, Delivered Daily.


Big Websites Start Running Bigger Display Ads. Big Mistake.

Posted: 01 Jul 2009 06:15 AM PDT

According to MediaPost, the Online Publishers Association yesterday announced that 37 of its members, including juggernauts like The New York Times, Forbes, ESPN, CNN, and MSNBC.com, are running (or will soon start running) the new, larger ad units the organization introduced last March. Since the members running these campaigns for brands like Bank of America and Mercedes-Benz reach about 68% of the total U.S. Internet audience, there’s a good chance you will soon see them, too.

There’s also a good chance you’ll hate them.

The three new ad units are named and sized as follows:

- The Fixed Panel (336×700), which remains in view as the user scrolls up or down the page
- The XXL Box (468×648), an extra-wide side-of-page ad that expands to 936×648 and includes page-turn and video capability
- The Pushdown (970×418), which opens big to display the ad and then, after seven seconds, rolls up to the top of the page (collapsing to 970×66)

The first one was actually supposed to be 860 pixels tall, but the OPA reconsidered and brought it down to 700, reportedly after feedback from publishers.

In order to visualize how big these ads are in their most expanded state, I overlaid the TechCrunch homepage with boxes of the same size:

Here’s how OPA President Pam Horan justified the introduction of the new ad units:

“The real motivation was to provide marketers and agencies with the opportunity to deliver a branded experience directly on the pages of these very rich content sites.”

But what about the children, er, website visitors?

Nobody seems to really care, apart from the fact that the OPA recommends to its members that the frequency of the pushdown ad be capped at once per day per page. Horan says the organization passed that recommendation along because they “want to stay close to consumers”. I puked in my mouth a little.

Is it just me or does anyone else think that display advertising units on websites should become more relevant to them instead of just bigger? What’s next, 1200×600 ads?

Crunch Network: CrunchBoard because it’s time for you to find a new Job2.0


And Now You Know: Enabling Multi-touch in Firefox 3.5

Posted: 01 Jul 2009 05:11 AM PDT

Did you know you can switch tabs in Firefox by making a twisting motion with your fingers on a multi-touch surface? I did. Turns out I've been doing it for months — I thought I was late to the party and was too ashamed to mention it to anybody for fear of an epic internet ribbing ("What, you just figured that out?"). But no, apparently it was top secret and highly experimental. That was in the beta, though; it looks like the official version has reduced it to a hack. Fortunately, mastering this multi-touch-enabling technique will allow you to tweak your gestures, resulting in everlasting glory.
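The hack in question lives in about:config: Firefox maps multi-touch gesture events to browser command strings through `browser.gesture.*` preferences, so remapping a gesture is just editing a string pref. Here is a sketch of the relevant mappings as commonly reported for 3.5-era builds (verify the exact pref names and command strings in your own about:config before relying on them):

```
// about:config — filter on "browser.gesture"
browser.gesture.twist.right = "Browser:NextTab"       // clockwise twist: next tab
browser.gesture.twist.left  = "Browser:PrevTab"       // counter-clockwise: previous tab
browser.gesture.pinch.in    = "cmd_fullZoomReduce"    // pinch in: zoom out
browser.gesture.pinch.out   = "cmd_fullZoomEnlarge"   // pinch out: zoom in
```

Swap the two twist values and the rotation direction reverses; in principle any browser command string can be assigned to any gesture pref, which is where the everlasting glory comes in.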


Venture-Backed Liquidity Going Down, Down, Down

Posted: 01 Jul 2009 04:50 AM PDT

Fifty-seven percent. That’s how much overall venture-backed liquidity decreased in the second quarter of 2009 compared to the same quarter last year: from $6.48 billion to $2.8 billion, if you want the hard numbers. Looking at the chart, I’d say the drop compared to the second quarter of 2007 ($14.6 billion) is even more telling. That’s the bad news from this just-released Dow Jones VentureSource report; the only positive nugget is that three VC-backed companies were able to complete IPOs (raising a total of $232 million), ending a nine-month drought.

Just half an hour ago, we reported separately that the National Venture Capital Association actually counted five IPOs during the quarter, raising a total of $721 million (including DigitalGlobe at $279 million, SolarWinds at $152 million, and OpenTable at $60 million). Also, while the NVCA pegged the number of venture-backed acquisitions in the second quarter at 59, generating $2.6 billion, the Dow Jones VentureSource report says $2.8 billion was reached through M&As of 67 portfolio companies instead.

Either way, it’s looking very bleak out there, as venture capitalists are not only struggling to take their portfolio companies public but also to sell them. According to the Dow Jones report, M&As were down 60% from the $6.48 billion raised via 89 M&As in the same quarter in 2008. This represents the lowest quarterly M&A deal total since 1999. Furthermore, the median amount paid for a VC-backed company in the second quarter of 2009 was just shy of $22 million, a 46% drop from the nearly $41 million median paid during the same period in 2008.

Jessica Canning, Director of Global Research for Dow Jones VentureSource, commented that the market appears to be correcting the “possibly inflated figures” posted in 2007, but sees the IPO window finally opening up again.

I second Erick’s comment that it’s a bit too early to call it a comeback, but I’m sure a lot of people are happy to finally see at least some IPO activity again, even if it pales in comparison to what we saw from 2004 to 2007.

Crunch Network: CrunchBoard because it’s time for you to find a new Job2.0


The Venture-Backed IPO Pokes Its Head Out Of The Water In 2nd Quarter, M&A Still Meh

Posted: 01 Jul 2009 04:30 AM PDT


After four quarters in which venture-backed IPOs have been dead in the water, a handful finally poked their heads up in the second quarter. The National Venture Capital Association counted five IPOs during the quarter, including DigitalGlobe ($279 million raised), SolarWinds ($152 million), and OpenTable ($60 million). A total of $721 million was raised. Just for a little context, two years ago during the same period, there were 25 IPOs which raised $4.15 billion.

So don’t call it a comeback just yet. But any activity is a sign of hope. And this was the most active period since the fourth quarter of 2007. Will it keep building, or will IPO candidates duck their heads back under water?

On the M&A front, the pace remained lackluster. There were 59 venture-backed acquisitions in the second quarter, which was down from 84 deals a year ago (and 62 deals in the previous quarter). The total value of M&A deals was $2.6 billion.

The average M&A deal size shot up to $198 million, no doubt boosted by Intel’s $884 million acquisition of Wind River Systems and NetApp’s $1.5 billion acquisition of Data Domain. However, Internet deals still represented 29 percent of the total count, with lots of smaller deals such as AOL’s purchase of both Going and Patch.


Crunch Network: CrunchGear drool over the sexiest new gadgets and hardware.


Realtime Matching Startup Raises Series A From Dawn Capital

Posted: 01 Jul 2009 03:55 AM PDT

The wave of investments in "realtime" continues with today's announcement from Cognitive Match that it has raised a Series A investment from Dawn Capital. Terms were undisclosed, but it's understood the figure was in the £1m+ ballpark, paid in tranches. The UK company applies artificial intelligence, learning mathematics, psychology and semantic technologies to match content to individuals in, you guessed it, realtime. This content can be products, offers, editorial, or advertising, which of course makes it a very interesting prospect for an outfit like Twitter.


Twitter Grows “Uncomfortable” With The Use Of The Word Tweet In Applications (Updated)

Posted: 01 Jul 2009 01:43 AM PDT

We were just forwarded an e-mail conversation between a member of the Twitter API team and a third-party developer whose web-based service used a UI that was admittedly very similar to Twitter’s web application.

The startup of course has the right to protect its assets and do its utmost to avoid confusing users who might think they’re using a Twitter product rather than that of a developer building on its API.

But something else caught our attention in the thread:

Hi,

Twitter, Inc is uncomfortable with the use of the word Tweet (our trademark) and the similarity in your UI and our own. How can we go about having you change your UI to better differentiate your offering from our own?

Thanks,

First of all, I had no idea that the word ‘tweet’ was trademarked by Twitter, and after browsing the Terms of Service and API documentation I couldn’t find any reference to it on the company’s website either. (Update: a commenter links to the US trademark application, which was filed April 16, 2009, and another claims a trademark application was filed in Europe in June as well.)

Second, I’m assuming that the note about the company being ‘uncomfortable’ with the use of the term was in reference to the combination of that with the closely resembling UI of the web application. If I’m wrong, and this signals that Twitter wants to move forward with actively barring third-party apps from using the word ‘tweet’ in their names the same way it bars them from using the word ‘twitter’, then this could have consequences for a plethora of developers.

Should TweetDeck, TweetMeme, Tweetie, BackTweets, Tweetboard etc. start worrying?

We’ve asked Twitter management for clarification.

Update: Twitter co-founder Biz Stone’s response (emphasis ours):

“The ecosystem growing around Twitter is something we very much believe in nourishing and supporting. As part of this support, we encourage developers of new applications and services built using Twitter APIs to invent original branding for their projects rather than use our marks, logos, or look and feel. This approach leaves room for applications to evolve as they grow and it avoids potential confusion down the line.

As we build our platform team, we will be adding more guidelines and best practices to help developers get the most out of our growing set of open APIs. We have healthy relationships with existing developers who sometimes include Twitter logos, marks, or look and feel in their applications and services. We’ll continue to work together in a fair and flexible way to ensure success for Twitter, developers, and everyone who uses these services.”

It’s a rather vague statement that doesn’t really make it clear whether the use of the word ‘tweet’ is now frowned upon or not. We’ll see when the API team puts forward clear guidelines on this in the future.

Crunch Network: MobileCrunch Mobile Gadgets and Applications, Delivered Daily.


Joost’s Last Hope Isn’t A Promising One

Posted: 01 Jul 2009 01:22 AM PDT

It’s sad to see a company that we were all so excited about fade further into oblivion. Today Joost, one of the most anticipated startups of 2006/2007, is just an also-ran in a sea of big online video sites like YouTube and Hulu. Today CEO Mike Volpi stepped down, and the company is laying off most of its staff and refocusing the business on “white label online video platforms for media companies.”

Om has a good Monday-morning-quarterback overview of why they failed, but to me it comes down to just a few things. They overfunded ($45 million before they even launched), and they ignored the fact that users were quite willing to sacrifice quality in online video for the convenience of Flash in the browser. Joost waited until late last year to go all Flash - until then, users had to use the downloadable Joost software and allow P2P streaming of shows. In the meantime there was no linking to Joost videos. YouTube and Hulu got all the social media and SEO juice that could have gone to Joost.

Founders Niklas Zennstrom and Janus Friis, who previously created Skype and Kazaa, see the world in terms of P2P and downloadable clients. The joke about how everything looks like a nail when you’re a hammer rings very true with Joost. But what worked with Kazaa and Skype a decade ago doesn’t work with online video in today’s world, obviously.

And this new business focus for Joost - white label video platforms - is a very tough market. Yahoo just bailed on it entirely after investing $160 million or so in an acquisition of Maven Networks last year. And competitors like Brightcove and Ooyala aren’t just going to roll over and let Joost take market share in this space.

Here’s what I learned from Joost’s failure: celebrity founders, celebrity CEOs, and tons and tons of cash can be a recipe for disaster. Applying yesterday’s solutions to today’s problems isn’t an interesting business. And finally, knowing when to throw in the towel and just return what’s left of the capital to investors is an important skill as well. That way everyone can move on and focus on real value-add opportunities. There’s no room for Joost in the consumer online video space, and there’s almost certainly no room for them in white label video, either. Time to call it a learning experience and move on.

Crunch Network: CrunchGear drool over the sexiest new gadgets and hardware.


Live Web, Real Time . . . Call It What You Will, It’s Gonna Take A While To Get It

Posted: 30 Jun 2009 07:40 PM PDT

This guest post is written by Mary Hodder, the founder of Dabble. Prior to Dabble, Hodder consulted for a number of startups, did research at Technorati, and wrote her master’s thesis at Berkeley on live web search using blog data.

Real time search is nothing new. It is a problem we’ve been working on for at least ten years, and we will likely still be trying to solve it ten years from now. It’s a really hard problem that we used to call “live web search,” a term coined by Allen Searls (Doc’s son) referring to the web that is alive, with time as an element in all factors, including search.

The name change to “real time search” seems a way to refocus attention on time as an important element of filters, but we are still faced with the same set of problems we’ve had for at least the past ten years. None of the companies that Erick Schonfeld pointed to the other day seem to be doing anything differently from the live web search / discovery companies that came before. The new ones all seem to be fumbling around at the beginning of the problem, and in fact seem to be doing “recent search,” not really real time search. While I’m sure they’ve worked really hard on their systems, they are no closer to solving the problem than the older live web search systems got.

All the new ones give a reverse-chron view, with most mixing Twitter with something else: blog data, other microblog data, photos, some kind of top list of recent trends. Some add context, like a count of activity over a period of time, how long a trend has gone on, or a histogram (CrowdEye), which both Technorati and Sphere experimented with in the early years. Or they show how many links there are to something, or the number of tweets. All seem susceptible to spam and other activities that degrade the user experience, and none seem to really provide the context and quality filters one would like to see if this were really to work. All seem to suffer from needing to learn the lessons we already learned in blog search and topic discovery.
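The “recent search” pattern Hodder describes — reverse-chron matches plus an activity count over a time window — fits in a few lines. A minimal Python sketch (the data model and sample posts are invented for illustration; real systems add exactly the spam and quality filters she says are missing):

```python
from datetime import datetime, timedelta

# Hypothetical posts: (timestamp, text). Real systems ingest millions.
posts = [
    (datetime(2009, 6, 30, 12, 0), "massive squid found off Australia"),
    (datetime(2009, 6, 30, 12, 5), "giant squid photos"),
    (datetime(2009, 6, 29, 9, 0), "old squid story"),
]

def recent_search(posts, term, window=timedelta(hours=24),
                  now=datetime(2009, 6, 30, 13, 0)):
    """Reverse-chron matches within a time window, plus an activity count."""
    hits = [(ts, text) for ts, text in posts
            if term in text and now - ts <= window]
    hits.sort(reverse=True)   # newest first: the reverse-chron view
    return hits, len(hits)    # results + "count of activity over a period"

hits, count = recent_search(posts, "squid")
```

Note that a literal term match like this does nothing for the semantic gap: a query for “sea creature” returns nothing here even though every post is about one, which is the discovery problem in miniature.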

Publicly available publishing systems, starting in 1999, took the value of time and incorporated it into what was being published (think Pyra, which is now Blogger, Movable Type, WordPress, and Flickr, among many others), as did search and discovery systems for those published bits, like Technorati, Sphere, Rojo, BlogPulse, Feedster, PubSub, and others, to walk down memory lane . . . (By the way, for disclosure purposes I should state that I worked for Technorati in 2004 for 10 months, and consulted for or advised most of the others in one form or another.)

I started working on this problem in 1999, at UC Berkeley, and eventually did my master’s thesis on live web data search and topic discovery at SIMS (or the iSchool, as it’s now known). From 2000 to 2004, people at SIMS would say to me, “What are you doing with blogs and data? It’s just weird. Why does it matter?” But the element of time was the captivating piece that was missing for me from regular search. It’s the element that makes something news, as well as the element that can group items together in a short period to show a focus of attention and activity that legacy news outlets often missed (until more recently, when they decided that live web activity was interesting).

At Burning Man in 2005, under a shade structure during a hot, quiet afternoon, I remember having a four- or five-hour conversation with Barney Pell (who would later found Powerset) about the Live Web and Live Web Search: how to do it, what it meant, how to understand and present time to the user, how much was discovery and how much was search, how structured was the data you could get, how reliant on time could you be with the data, what meaning you could make from that data, etc. Sergey Brin was sitting and listening, and finally, after a couple of hours, he asked me, “What is the live web and what is live web search?” Since Barney and I had already been doing a deep dive, I assumed Sergey knew what we were talking about, so the question surprised me, but I explained why I thought time was a huge missing element of regular search, and that this was the type of search I worked on. Barney and I continued for a couple more hours. Then it got cooler, so it was time to go admire the art, and that was the end of that. But I have wondered over the years where Google is with the live web and when they might do something with time. Twitter seems to be prodding them.

In 2006, “The Living Web” Newsweek cover story by Steven Levy and Brad Stone poked at this issue for the first time in a national forum.

When I look at the latest crop of search startups, I think: Why are we doing it all the same way again? Reinventing the wheel? Is anyone doing anything original either with data or interface? Is anyone building on what we’ve learned before about the backend or UI’s?

Frankly, our filters suck. And I suppose that if a name change gets us to think anew about better filters, well, I should rejoice. I’m partly to blame for the bad filters we have to date: having worked on this problem, I’ve contributed to some of the various live web or real time (or whatever the word of the moment is) attempts to solve it. We are very good at publishing our thoughts and visions, with time stamps, but not very good at the filtering side of things. The old method of information search and discovery was to open the paper or magazine, turn the pages of editorially filtered and placed information, and when you finished, say, “Okay, I’m informed” (whether you really were or not). But the media got complacent and missed stories, and with the ease of blog publishing and sites like Flickr for photos, we could replace paper and supplement our information needs with the whole web. The only problem is, it’s the whole freaking web. An avalanche. We feel anxiety on the web from the lack of filter and editorial grace that one or two printed news sources used to give us.

I did a study in 2002, which I repeated in 2004 and again in 2008. I asked users to track their online information intake for one week. There were only 30 people in each study, chosen randomly from Craigslist ads, but what I found across each group of 30 was that the average time spent online with news and information sites was 1.25 hours in 2002, 1.85 hours in 2004, and 2.45 hours in 2008. These people are not in Silicon Valley, but they all have broadband at home and live in the US. Every one of them reported some level of anxiety over the amount of data they felt they needed to take in in order to feel informed. They often dealt with it by increasing the time they spent staying informed. They didn’t know that better filters might actually reduce their anxiety.

As Erick noted, the tension in solving this problem is between memory and consciousness, or, as Bob Wyman and Salim Ismail called it at PubSub, retrospective versus prospective search. And that is part of the issue. But there is more.

Discovery does mean you have to introduce time as an element. The user cannot be expected to know what is bubbling up, or the specific phrases that will name the latest thing.

Some people will say “michael jackson” and some will say “MJ” and some will say “king of pop.” And Michael Jackson as a topic is actually pretty easy. I remember once doing usability tests for a live web search and discovery system in 2003, where we asked users to search Google News and various live web systems for an incident in Australia where a “giant sea creature” was found. But since all the media covering it originated in Australia, and they’d all called it a “massive squid,” and all the follow-on American sources, including bloggers, had copied the Aussie language, there were no recent hits for “giant sea creature.” Testers had to think creatively about how to get to the info they knew was there, and yet it was a semantic leap. One search tester actually cried as she refused to give up, she was so determined to find the result in any of the live web systems we were testing. We begged her to stop; it was painful. Good discovery could have helped.

Another key element of discovery and live web search is getting structured data, because spidering, which Google uses to gather data from the web for its regular retrospective web search, makes understanding the time of a published work more difficult. It’s hard to work with time if all you know for sure is when you spidered the page. Twitter, on the other hand, has structured data: everything is published in its silo, so the sites that receive its complete stream get it in a structured format and know the time of each tweet. Not to mention the data is available through APIs. This is the most efficient way to draw out meaning for search, because you know for sure the context of each piece of data, with time as one of the pivots, for search and discovery.
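The structured-versus-spidered distinction is easy to see in code. A minimal Python sketch (the `created_at` format is modeled on the Twitter API of that era; all field names and values are illustrative):

```python
from datetime import datetime

# Structured item (e.g. from a streaming API): the publish time travels
# with the data, so time-based search can trust it.
api_item = {"text": "hello", "created_at": "Tue Jun 30 16:00:00 +0000 2009"}
published = datetime.strptime(api_item["created_at"],
                              "%a %b %d %H:%M:%S %z %Y")

# Spidered page: all we know for certain is when *we* fetched it.
crawled = {"url": "http://example.com/post",
           "fetched_at": datetime(2009, 7, 1, 8, 30)}
# The true publish time is unknown unless the page exposes reliable
# metadata, so any freshness ranking built on fetched_at is only an
# upper bound on how old the content really is.
```

With the structured item, `published` is an exact, timezone-aware pivot for ranking; with the crawled page, any “recency” signal inherits the crawler’s schedule rather than the author’s.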

You also need to get the data model right for the backend search database in order to extract meaning and link metrics. And you need to understand the different corpora of data to know what things mean to users (not engineers), and figure out the spam and bad-actor problems. There is the original context the data had, and there is the UI, which is so difficult when trying to make time understandable to many users. In fact, some think that communicating the time element to regular users is so hard that time-focused search is really an “advanced search” problem.

If designed poorly, the system can push users into the unnatural production of skewed data. If the system involves some sort of filter for authority or popularity, it is subject to power-law effects (Technorati calls its metric “authority,” but inbound link counts from blogs are not authority; they’re just a measure of popularity). What’s a power-law effect? It’s when a system drives activity that unnaturally reinforces the behavior that put something there in the first place. For example, if one of a filter’s metrics counts the number of people clicking on a top search, then the more clicks, the longer the item will stay at the top of the list of searches, even if it would naturally have fallen off the list earlier. Conversely, if a filter’s metric involves a spontaneous act driven by imagination, like writing a tweet, then exposing those items at the top of the filter might be less likely to drive up activity. However, if you show the results to users, then upon seeing a popular topic they might begin tweeting about it without having thought of it before. In other words, by revealing the metrics you focus on, you can push users to change their behavior. By driving behavior, power-law distributions keep things with some power at the top because they are at the top, or drive them higher. It becomes a loop. And because no distinction is made between the quality or strength of a unit, or what that unit might mean to a group of users in a topic area, straight number counts just aren’t very smart.
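The click-driven loop above is simple to demonstrate. A toy Python simulation (all numbers are invented; the only assumption is that the item shown first gets most of the attention):

```python
import random

def simulate(rounds=10000, seed=1):
    """Two items with near-identical appeal compete for a ranked list.

    Each round, the current click leader is shown in the top slot, and
    the top slot receives 90% of clicks — a crude stand-in for how
    exposure, not quality, attracts attention.
    """
    random.seed(seed)
    clicks = {"a": 51, "b": 50}   # nearly equal starting popularity
    for _ in range(rounds):
        leader = max(clicks, key=clicks.get)
        trailer = min(clicks, key=clicks.get)
        item = leader if random.random() < 0.9 else trailer
        clicks[item] += 1          # the click feeds back into the ranking
    return clicks

final = simulate()
# Item "a" started one click ahead; after 10,000 rounds it holds the
# overwhelming majority of clicks. Raw counts rewarded position, not merit.
```

A one-click head start becomes a roughly nine-to-one split, which is the “it becomes a loop” problem in miniature: the metric measures its own past output.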

For example, if we made a system that counted Om Malik’s inbound links and called it authority, no matter the topic, I think Om would agree that even he wouldn’t have great authority and insight on, say, modern dance or metalworking, if he happened to mention those words in a blog post. But on broadband issues he is most definitely an authority. Technorati, OneRiot, and other services that take a metric count and apply it to all topics, all circumstances, and all search-result matches, without context, randomize the quality of the information the user sees. They may provide a filter across the whole web, but they don’t give us any real help in judging what is useful or not. It’s why topic communities are helpful, and why, once you find a good editorial filter driven by the human touch, you glom onto it for dear life: it’s such a time and energy saver.

I’m under no illusions that we’re remotely close to solving Live Web or Real Time search, or even recent search. We are not. Nor are we near solving discovery. But I hope we will. Sooner rather than later. Because I need it now. The opportunity is huge. It means algorithmically building the editorial filters we have today in the form of people, while balancing the mob’s activities. Solve that and the prize will be big.

Crunch Network: CrunchBase the free database of technology companies, people, and investors


Say What? ‘Dial Directions’ Acquired By Arabic Language Specialist Sakhr Software

Posted: 30 Jun 2009 05:55 PM PDT

Bet you didn’t see this one coming. Back in 2007 we wrote about a service called Dial Directions that lets you call a special phone number and verbally ask for directions, which are immediately sent to you via SMS. Today comes news that the company has been acquired by Sakhr Software, a development house specializing in Arabic natural language processing (NLP). And with their powers combined, they’re building a real-time voice translation service that will allow users to translate phrases from their mobile phones on the fly.

It’s a better fit than it sounds. Dial Directions has spent the last few years building mobile applications (it has an app for the iPhone on the App Store), and has also built out the technology required to efficiently transfer voice input to servers, where it can be processed (this server-side processing is also used by Google Voice Search and a number of other apps). Once it makes it to the cloud, this speech will be routed through Sakhr’s software, which is capable of translating English to Arabic and vice versa. Translated audio and text are then sent back to the mobile phone, all within a matter of seconds.

The companies have jointly produced a beta version of the application for the iPhone and BlackBerry, which you can see in the video below. The application is currently in testing with select enterprise customers, with plans to release a consumer version around the end of the summer.

Sakhr’s customers include the Department of Defense, Department of Homeland Security, and the Department of Justice, so it wouldn’t be surprising if the technology makes its way out to defense personnel. A Dial Directions spokesperson says that most translation devices in the field abroad rely on a set library of phrases, and says that the new Sakhr translation software should be more flexible. That said, it sounds like this will come with one significant drawback — if your phone can’t reach the network, the software won’t work.

Terms of the deal were not disclosed. Dial Directions intends to keep its service running for now, though it may not run indefinitely.



Crunch Network: CrunchBoard because it’s time for you to find a new Job2.0


SkyGrid Links Its Financial Firehose To Twitter

Posted: 30 Jun 2009 04:26 PM PDT

SkyGrid, the nifty, free financial news aggregator, is now publishing a stream of news on Twitter, letting users follow breaking business news headlines via the microblogging network.

The news aggregator, which only features stories about publicly traded companies, not only has a comprehensive Twitter feed for news stories, but also offers feeds broken down by sector. So users can follow SkyGridHealth or SkyGridEnergy for sector-related news; SkyGrid currently has separate Twitter feeds for eight different industries. SkyGrid says the Twitter feeds may be especially useful to users who want to access SkyGrid on their mobile devices.

Similar to Techmeme and Google News, SkyGrid clusters related news stories based on keyword analysis, what they’re linking to, and so on. SkyGrid also tries to determine the sentiment of each article: red for negative, green for positive.
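SkyGrid doesn’t publish its algorithms, but the approach described — group stories that share keywords, then color each by sentiment — can be approximated naively. A Python sketch (the word lists, overlap threshold, and headlines are all invented for illustration):

```python
# Naive keyword-overlap clustering plus dictionary sentiment.
POSITIVE = {"beats", "surges", "record", "growth"}
NEGATIVE = {"misses", "falls", "lawsuit", "layoffs"}

def keywords(headline):
    return set(headline.lower().split())

def cluster(headlines, min_overlap=2):
    """Greedily group headlines sharing at least min_overlap words."""
    clusters = []
    for h in headlines:
        for c in clusters:
            if len(keywords(h) & keywords(c[0])) >= min_overlap:
                c.append(h)
                break
        else:
            clusters.append([h])   # no match: start a new cluster
    return clusters

def sentiment(headline):
    """Return SkyGrid-style color: green positive, red negative."""
    words = keywords(headline)
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "green" if score > 0 else "red" if score < 0 else "neutral"

stories = [
    "Apple beats earnings estimates",
    "Apple earnings beat estimates again",
    "Oracle misses revenue targets",
]
groups = cluster(stories)
```

Here the two Apple headlines share enough words to land in one cluster, and the dictionary lookup colors them green while the Oracle story goes red; production systems would add link analysis, stemming, and a far larger sentiment model.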

As we wrote in our earlier review of the service, SkyGrid is an incredibly useful tool, especially now that it is free. But the one element missing from SkyGrid is coverage of larger private companies, like Facebook. To become a serious competitor to popular aggregators like Techmeme (which also has a Twitter firehose), the site will need to expand its range of coverage. But especially for people in the financial services industry who use Twitter as a news source, SkyGrid is on the right track to providing users with valuable real-time financial news.

Crunch Network: CrunchGear drool over the sexiest new gadgets and hardware.


iPhone OS 3.1 Beta And SDK Already Rolling Out To Developers

Posted: 30 Jun 2009 04:23 PM PDT

It looks like Apple has already started sending out the beta version and SDK for the next iteration of the iPhone OS, 3.1, to developers.

The iPhone 3.0 software was released just about two weeks ago, ahead of the iPhone 3GS launch. By most accounts it’s pretty stable, though some users have been having battery issues. That’s why it’s a bit odd that Apple would iterate up to version 3.1 already, instead of something like 3.0.1, which it tends to do for minor updates and bug fixes. Could we be seeing a larger update to the OS already?

I’m told that there’s nothing much of interest in the release notes to indicate anything major that is new or changed.

Here’s the text of the email being sent from Apple:

iPhone SDK 3.1 beta and iPhone OS 3.1 beta are now posted to the iPhone Dev Center. These versions are for development and testing only and should be installed on devices dedicated to iPhone OS 3.1 beta software development. Please read the iPhone OS Pre-Install Advisory and the iPhone SDK 3.1 beta release notes before downloading and installing.

Update: I’m hearing a few reports that one change is that MMS is turned on by default. Perhaps AT&T is getting closer to turning it on as well in the U.S.

[thanks Michael]

Crunch Network: CrunchGear drool over the sexiest new gadgets and hardware.


Meebo Tries to Fill “Moments Of Boredom” With An Ad Network For Partner Sites

Posted: 30 Jun 2009 03:51 PM PDT

How do you advertise on a Web-based instant messaging service without interrupting conversations and annoying the hell out of users? Meebo CEO Seth Sternberg thinks he has the answer: “There is a moment of boredom while they are waiting for a response, that is when they click on ads.” He’s observed this based on how people interact with the ads that began appearing on Meebo.com last March. Today, Meebo is creating an ad network across partner sites that use its new Community IM service, which adds a Meebo IM bar at the bottom of participating sites.

Visitors to one of the 85 partner sites that have implemented the Community IM product (including Current TV, DailyStrength, Flixster, and Webs.com) can chat with their IM buddies without leaving the sites. Today, Meebo is introducing new ad units that pop up along the bottom left of the browser, beginning with ads for the Toyota Prius and AT&T. For the Toyota ad, a little car icon pops up on the left of the Meebo IM bar, away from all of the chat activity on the bottom right. If you click on the car, a larger 900×400 pixel rich ad overlay opens up, which can show a video or any number of interactive ads. “When they click we do not take them away from the conversation,” says Sternberg. The whole time people are watching the ads, they can still chat with their friends through the Meebo IM column on the right.

These ads are similar to VideoEgg’s Twig Ad bar, except they are integrated directly into each site rather than using a frame overlay. But the opt-in nature of both types of ads is part of a general trend of giving consumers control over when and how marketing messages are presented to them.

Meebo says its IM service reaches 50 million people a month and can target ads by age, gender, or location. Sternberg says Meebo is seeing 1 percent clickthrough rates on the ads. But he is not without competitors. AOL is planning to offer its own IM bar to external sites through its Socialthing for Websites service, which presumably will also be connected to its ad network. The deal for sites is that they get social IM features without having to reinvent the wheel, plus a share of any IM-based ad revenue.


Crunch Network: CrunchBase the free database of technology companies, people, and investors


Twitter Rolls Out UI Changes To Simplify Your Social Connections

Posted: 30 Jun 2009 03:49 PM PDT

Twitter has just quietly rolled out a set of changes to its user interface on the “Following” and “Followers” sections of its website. These changes will clearly make it easier to manage who you follow as well as take actions, such as @replying someone or direct messaging them, directly from the page.

There are two new views for looking at these areas. “List” is a compact list of the followers, while “Expanded” offers more details including that user’s last tweet and their real name and location. On the Followers page, there is also a button that allows for one-click following of users who already follow you.

What’s interesting about the new interface icons is that they look exactly like icons Apple uses in OS X, including some of the ones on the iPhone. Could this mean that we’re about to see a revamped mobile version of the Twitter site? Who knows, but it could sure use an overhaul.


Crunch Network: MobileCrunch Mobile Gadgets and Applications, Delivered Daily.


ClackPoint Brings Voice, Document Sharing To Google Friend Connect

Posted: 30 Jun 2009 01:52 PM PDT

Over the last six months Google has been ramping up Friend Connect, its social online identity platform that’s a direct rival of Facebook Connect (both products opened up to the public last December). Since then Google and third party developers have released a slew of gadgets and features, including the Social Bar, Recommendations, and Comment Translation. One of the latest to join the fray is ClackPoint, a powerful new gadget that integrates realtime text chat, voice conferencing and basic document sharing with Friend Connect.

The gadget works as you’d expect. Clicking on the ‘Call’ button will activate your microphone, and your voice can then be heard by anyone else in your chat room. Alternatively, you can dial in from a phone to one of the site’s dedicated lines (hit the button in the upper right hand corner for a list of numbers). There’s also a standard text-based group chat.

As far as sharing goes, you can participate in a group-edited notepad, import PDF slides that can be viewed by other chat members, and quickly send out a poll to everyone else in the chat room. You can try out the gadget for yourself here.

While the gadget could probably be used in a business setting, I suspect most businesses will stick with products like WebEx for their serious calls. That said, this would be perfect for more casual group meetings where real identities are still important (for example, a meeting discussing plans for your child’s soccer team). For more, check out the Google blog post introducing the gadget. You can also find a full directory of gadgets available here.




Digg Tries Again To Bury Dupes

Posted: 30 Jun 2009 01:41 PM PDT

Since its inception, one of the biggest problems with Digg has been that users often submit the same content over and over again. This makes it harder for cool content to become popular because some users digg one submitted story, while some digg another. Today, Digg is releasing “several major updates” to its duplicate (known as a “dupe”) detection system.

The solution sounds fairly intensive. “To better understand the nature of the problem, we analyzed the types of duplicate stories being submitted. Most common are the same stories from the same site, but with different URLs. Our R&D team came up with a solution that identifies these types of duplicates by using a document similarity algorithm,” Digg’s Director of Product Chris Howard writes in a blog post. He goes on to say that there will be a follow-up more technical post to explain a bit more about how this actually works, but says that it has proven to be a reliable system so far.
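Digg hasn’t published the algorithm’s details yet (the follow-up technical post is still to come), but a document-similarity check of the sort Howard describes can be sketched with word shingles and Jaccard similarity. The function names, sample texts and the 0.6 threshold below are illustrative assumptions, not Digg’s actual implementation:

```python
# Hypothetical sketch of a document-similarity dupe check: compare
# k-word shingles of two story texts with Jaccard similarity and flag
# likely duplicates above a tunable threshold.

def shingles(text, k=3):
    """Return the set of k-word shingles in a lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two sets, from 0.0 (disjoint) to 1.0 (equal)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def looks_like_dupe(story_a, story_b, threshold=0.6):
    """True if the two texts share enough shingles to be a likely dupe."""
    return jaccard(shingles(story_a), shingles(story_b)) >= threshold

original = "Apple releases iPhone OS 3.1 beta and SDK to developers today"
reposted = "Apple releases iPhone OS 3.1 beta and SDK to developers today via its Dev Center"
unrelated = "Digg rolls out major updates to its duplicate detection system"

print(looks_like_dupe(original, reposted))   # True
print(looks_like_dupe(original, unrelated))  # False
```

The same idea handles the “same story, different URL” case Howard mentions: the URLs differ but the shingle sets of the two pages overlap almost completely.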

But the really tricky stuff comes when people submit the same story from a different site. This is a gray area because, of course, some sites have different takes on the same topic, and who’s to say which is more Digg-worthy than another? Digg now says it will scan descriptive information such as the story’s title to see if something very similar is already in the system. But still, it’s a gray area.

At least the submission process should be faster now. Digg will run these dupe checks after you enter the URL but before you enter the description, which saves a step in the process. It claims this dupe detection will take only “a few seconds.”

And if you ignore the dupe algorithms and submit dupe stories anyway, Digg is watching: “We'll also be monitoring when certain Diggers choose to bypass high-confidence duplicates and will use this data to continue to improve the process going forward.”


[photo: flickr/yogi]



Flickr And Twitter are Now Officially Sucking Face

Posted: 30 Jun 2009 01:24 PM PDT

Earlier this month, Flickr started flirting with Twitter integration by allowing users to link their Flickr accounts to their Twitter accounts. The experiment was only for email uploads, which simultaneously created a Tweet with a short http://flic.kr link back to the photo on Flickr. Now that integration is an official feature called Flickr2Twitter.

In addition to email uploads, Flickr now lets you Tweet out any photos directly from the site. After linking your accounts, whenever you click on the “Blog this” button on any photo on Flickr, your Twitter account will be one of the distribution options. This works for both photos you’ve uploaded and other photos you find on the site. I have a feeling you are going to be seeing a lot of http://flic.kr links on Twitter pretty soon.

Developers who want to add Flickr as a photo option to desktop and mobile clients can use Flickr’s existing APIs. (You can learn more here). Once that happens, Twitpic and yFrog will have some company on those clients as a pull-down option.
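Flickr’s developers have said those http://flic.kr links are simply the numeric photo ID encoded in base58, a compact alphabet that drops easily confused characters like 0, O, I and l. Here is a minimal sketch of that encoding, assuming the commonly cited flic.kr alphabet; `short_url` is an illustrative helper, not part of Flickr’s API:

```python
# Sketch of base58 encoding as used for flic.kr style short photo URLs.
# The alphabet below is the one commonly cited for flic.kr; treat it as
# an assumption rather than an official reference.

ALPHABET = "123456789abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ"

def base58_encode(n):
    """Encode a non-negative integer as a base58 string."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, rem = divmod(n, 58)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))

def base58_decode(s):
    """Decode a base58 string back to the integer photo ID."""
    n = 0
    for ch in s:
        n = n * 58 + ALPHABET.index(ch)
    return n

def short_url(photo_id):
    """Build a flic.kr style short link from a numeric photo ID (illustrative)."""
    return "http://flic.kr/p/" + base58_encode(photo_id)
```

Because the encoding is reversible, a client that knows a photo’s numeric ID can build the short link itself, with no extra API call to a URL shortener.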



What Went Down At Rackspace Yesterday? A Power Outage And Some Backup Failures.

Posted: 30 Jun 2009 01:21 PM PDT

As many of you know, a lot of the sites that use Rackspace as their hosting provider were down for about an hour yesterday. That’s because Rackspace itself went down. According to an incident report we’ve obtained, the cause was a power outage at one of its data centers.

While Rackspace has backup systems in place, a series of events apparently caused those backups to fail, resulting in the servers going down. Here’s the key nugget:

The breaker on the primary utility feeder tripped, initiating a sequence of events that ultimately caused a power interruption in Phase I and Phase II of the data center. All systems initially came up on generator power without customer impact. The 'A' bank of generators, which support UPS clusters A and B in Phase I and UPS cluster E in Phase II, then experienced excitation failure which escalated to the point where the generators were no longer able to maintain the electrical load. Rackspace then attempted to switch to our secondary utility feeder, but was unable to do so due to an issue in the Pad Mounted Switch (PMS). At approximately 3:15pm CDT, power supply through UPS clusters A, B and E was lost when the batteries in those clusters discharged, and equipment receiving power through those clusters experienced an interruption in service.

The company says only one of its nine data centers was affected by this failure, but many high-profile sites collapsed as a result, including EventBrite, Justin Timberlake’s site and Michelle Malkin’s popular political blog. As Rackspace noted yesterday: "We owe better, and will deliver."

Below, find the full incident report.




FriendFeed Feels Pretty, Oh So Pretty

Posted: 30 Jun 2009 01:10 PM PDT

FriendFeed now lets you individualize your account with six new designer themes. When you select a theme, you will always see your FriendFeed pages in that theme, while other people looking at your profile page will see it in whatever theme they have chosen.

FriendFeed says that it plans to allow users to customize themes down the line as well as give users the ability to create an entirely new theme. Twitter and Gmail also let you add themes and designs to your homepages but some of FriendFeed’s themes have a nicer design, in my opinion. On the other hand, Gmail has a good amount of variety when it comes to choosing a theme. The advantage to Twitter’s themes is that you are able to choose multiple designs in different colors.

Crunch Network: CrunchBoard because it’s time for you to find a new Job2.0


High Gear Media Scores $5.5 Million For Auto Media Network

Posted: 30 Jun 2009 12:25 PM PDT

High Gear Media, the publisher of automotive media sites, has secured $5.5 million in Series B funding led by DAG Ventures with Accel Partners and Greylock Partners participating. The company raised $6.5 million in Series A funding in November 2007 from Accel Partners and Greylock Partners.

High Gear will use the funds to expand its media network and acquire other media properties. High Gear owns and operates 38 auto websites including TheCarConnection.com, GreenCarReports.com, AllCarsElectric.com and AllAboutPrius.com.

The network’s sites aggregate automotive content from around the web and syndicate content to other automotive websites and news sites such as Yahoo! Autos, The San Francisco Chronicle and Glam Media, among others.



Firefox 3.5 Soars Past A Million Downloads. Approaching 100 Downloads A Second.

Posted: 30 Jun 2009 12:16 PM PDT


Mozilla today released Firefox 3.5 into the wild. Not surprisingly, it’s flying off the virtual shelves. And unlike when Mozilla released Firefox 3.0 last year, its servers are staying up and reliable, so the rate of downloads is pretty incredible. This site, run by Mozilla, shows the download stats for the new browser. Overall downloads are now approaching 1.3 million worldwide, with over 350,000 of those in the U.S. But even more amazing is the number of downloads occurring each second: it’s ranging from 59 to 95 right now. Again, that’s every second.

Outside of the U.S., the browser is moving quickly in Germany, France and the UK. The claim is that it’s much faster than the previous iterations of Firefox, and based on just a quick run-through of my favorite sites, I’d say that is in fact the case. Though, to be fair, it’s hard to know if that has something to do with the fact that just about all my browser plugins are not yet working with this version.

Not surprisingly, the emphasis on speed in this version of Firefox is on its JavaScript performance. Both Google’s Chrome and Apple’s Safari have been making headlines recently claiming to be the fastest browsers in this regard. As you can see in the SunSpider test chart below, it appears that Firefox has made huge strides since the slow days of Firefox 2, and has now more than doubled performance over even Firefox 3. As Apple recently touted in a press release: “Safari quickly loads HTML web pages more than three times faster than IE 8 and three times faster than Firefox 3.”

So how does 1.3 million downloads in a few hours stack up against its rivals? Well, the most recent browser to offer a major upgrade was Safari, which claimed 11 million downloads in 3 days. But those numbers are tricky because Apple includes Safari updates in its regular OS X software updates, so pretty much all OS X users were at least asked to upgrade after its launch. Still, Apple claimed that of the 11 million, some 6 million were users on Windows machines. And Firefox also pings users to do auto-updates when a new version is available.

Despite its launch hiccups, Firefox 3 set the Guinness World Record for software downloads last summer. In just 24 hours, over 8 million people downloaded the browser around the world. We’ll see how this version stacks up.
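For a sense of scale, the record day’s figures reduce to a per-second rate in the same range Mozilla’s counter is showing today. A quick back-of-the-envelope check of the numbers quoted above:

```python
# Back-of-the-envelope rates for the figures quoted above: Firefox 3's
# record day (8 million downloads in 24 hours) versus the 59-95 per
# second currently observed for Firefox 3.5.

record_rate = 8_000_000 / (24 * 60 * 60)
print(round(record_rate, 1))  # 92.6 downloads per second on average

# Projecting today's observed range over a full day: at the high end,
# Firefox 3.5 is roughly on record pace; at the low end, well under it.
low_day = 59 * 86_400
high_day = 95 * 86_400
print(low_day, high_day)  # 5097600 8208000
```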

You can watch the live-updating chart and map for Firefox 3.5 downloads here.




TBD’s Deadpool Date Finally Determined

Posted: 30 Jun 2009 12:00 PM PDT

Back in 2007 I did a column on TeeBeeDee, a social network aimed at baby boomers. I’d spent some time looking at the space, and thought TBD was the best-designed site, avoiding Eons’ age restrictions and fascination with death and building something a bit broader than Gather. The site borrowed heavily from what worked on sites like Yelp and Facebook, the design was delightful and it gave you fun, addictive get-to-know-me activities. I was also incredibly impressed by its founder, Robin Wolaner (pictured).

But there was still a central question: Would a social network aimed at baby boomers appeal to the demographic? As it turned out, no. The site is shutting down. Below is the letter to users from Wolaner.

A Message I Didn’t Want to Send
June 30, 2009

I regret to have to inform you that TeeBeeDee will be shutting down by July 13, 2009. We thought we had raised sufficient money to get us to a sustainable business, but many factors changed in the 2 years since our launch. As you have no doubt noticed in the past few months, we lacked the resources to continue developing the product to meet the needs of our community.

We will have much to say to you, and to each other, in these next two weeks. Just as we've shared the experiences of our lifetimes here at TeeBeeDee, we will be sharing this goodbye. For me, I can say that the people I have met at this site, and those with whom I've worked these past years, have been a revelation. I have learned so much from so many of you. We have thrilled to marriages, and romances, and lifelong friendships, and support to those in need. Anyone who says "virtual" friendships are less than real ones, didn't spend time in this community.

Kat has posted tips about how to save what matters to you at TBD. And 500 TBDers have already joined a network at Ning: http://www.teebeedee.ning.com to stay connected.

As the founder, I'd like to close by saying that while our business opportunity proved disappointing, the contributions from our members rarely disappointed. I am proud to call so many of you my friends, and thank you for caring about TeeBeeDee.

Robin

Founder/CEO



Attention Executives: 73% of You Need to Fire Yourselves

Posted: 30 Jun 2009 11:38 AM PDT

I still think "Enterprise 2.0" is a meh business trend with a horrible name. It's not that social media/collaboration tools don't have a role in business, and I agree there are some situations where consumer tools aren't the right fit. A great example is Twitter versus Yammer. (Oh, if you only saw the conversations that happen on TechCrunch's Yammer feed…)

But I don't see Enterprise 2.0 becoming a big area of corporate spending. The tools are too cheap and easy to replicate with tons of free alternatives, and many of the vendors are just not ready for prime time. One exception might be blogging software, but don’t most companies who want a corporate blog have one by now? Rather than the next Oracle (who by the way was one of the study's underwriters) or even Salesforce.com emerging from this space, I'm betting that existing software-as-a-service companies incorporate the functionality themselves or you get a lot of built-in-house code.

There's also the problem that nearly 20% of executives have no idea what “Enterprise 2.0” is. That comes from a new study that's actually talking up the adoption of Enterprise 2.0. It points out that 40% didn't know what it was at the beginning of the year, so at least that’s progress. What's more, it says that 50% of those surveyed consider Enterprise 2.0 to be "very important" to their business success. (Of course, I think working out every day is “very important” to my weight loss goals…doesn’t mean I actually do it.)

Still, given that number is so high, it stunned me that the study also said only 7% of people over the age of 45 think that Twitter is an important rapid-feedback tool for business. Sadly, it's not much better among younger folks: only 27% of those between the ages of 18 and 30 say Twitter is an important rapid-feedback tool for business. What? Really? You may think we obsess about Twitter too much on TechCrunch, but clearly most business folks aren’t getting the memo.

Let's put aside for a moment that there are pretty well proven test cases on how Twitter utilization has helped companies like Dell and Comcast. Paying for outreach or collaboration tools without first checking out what a free, easy tool like Twitter could do is missing the entire point of the cheap flexibility and ubiquity of social media. Put another way (and to paraphrase James Carville): It's a recession, stupid. Try the free tools first.



The MySpace Exodus Continues: SVP Engineering Allen Hurff Jumps Ship

Posted: 30 Jun 2009 11:32 AM PDT

Around this time last year we saw a stream of high-ranking employees leaving Yahoo as the web portal reached new lows following the fumbled Microsoft deal. Now, we’re beginning to see a similar trend at MySpace, the once-shining social network that has recently been hit by stagnating growth, waves of layoffs (both in the US and abroad), and the ousting of its co-founder CEO. The latest member of the executive team to leave is SVP Engineering Allen Hurff, who announced his decision to leave the company on Friday. A tweet he sent out that day confirms his departure.

Hurff was with the company for over four years, where he and former SVP Operations Jim Benedetto were largely responsible for building up MySpace’s technology team (Benedetto left the company in March). Hurff also played an integral role in MySpace’s adoption of OpenSocial, serving as Chairman of the foundation. The OpenSocial platform, which allows for the integration of third-party applications in MySpace, will likely play a key role in the site’s success moving forward, so this is a big loss for the social network.

Other key MySpace execs to leave in the last few months include CEO/Co-Founder Chris DeWolfe; E.J. Hilbert, MySpace’s Director of Security Enforcement; Amit Kapur, COO; and Steve Pearman, SVP Product Strategy. Kapur, Pearman, and Benedetto have teamed up to launch Blue Rover Labs, a still-mysterious venture funded startup.





Can Open Government Be Gamed?

Posted: 30 Jun 2009 10:09 AM PDT

If information is power, the first step to gaining power is to get the right data. The Obama administration is a big proponent of opening up government data and making it digitally available. Today at the Personal Democracy Forum in New York City, the government’s new chief information officer Vivek Kundra announced USAspending.gov, a new site which launched today that tracks government spending with charts and lists ranking the largest government contractors (Lockheed, Boeing, Northrop Grumman, etc.) and assistance recipients (Department of Healthcare Services, New York State Dept. of Health, Texas Health & Human Services Commission, etc.). There is also the Data.gov project, which is attempting to digitize government data and make it available in its raw form for citizens and companies to sift through.
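The ranking lists USAspending.gov surfaces are, at heart, simple aggregations over raw award records; once data like this is published in machine-readable form, anyone can reproduce them. A toy sketch of that kind of roll-up, using a hypothetical CSV schema (the invented `recipient`/`amount` columns stand in for whatever the real data exposes):

```python
# Toy aggregation over hypothetical award records, mimicking the
# "largest contractors" ranking USAspending.gov surfaces. The CSV
# columns and figures are invented for illustration, not the real schema.
import csv
import io
from collections import defaultdict

SAMPLE = """recipient,amount
Lockheed Martin,35000000
Boeing,22000000
Lockheed Martin,10000000
Northrop Grumman,18000000
"""

def top_recipients(csv_text, n=3):
    """Sum award amounts per recipient and return the top n, largest first."""
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["recipient"]] += int(row["amount"])
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

print(top_recipients(SAMPLE))
# [('Lockheed Martin', 45000000), ('Boeing', 22000000), ('Northrop Grumman', 18000000)]
```

This is exactly the sort of thing startups could build on top of Data.gov’s raw feeds: the hard part is the data release, not the analysis.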

While Kundra agrees in principle that all public government data should be online, he also cautions that in reality government data sits in more than 10,000 different systems, many of them written in COBOL or still locked in dusty paper archives. But at least the government is starting to tackle the problem. The government collects a wealth of data, and the more accessible it becomes, the more transparent government itself will be (not to mention the opportunities for startups that can tap into this data to offer new services). The State Department is also using the Internet, and YouTube specifically, to reach out directly to citizens of other countries every time Obama or Hillary Clinton travels abroad. They record video messages to citizens of other countries, which are distributed in multiple languages. Call it YouTube diplomacy.

In addition to releasing government data in digital form, the Obama administration is learning to listen to direct feedback from citizens through its Open Government initiative where people can suggest and vote on policy initiatives. These are then further refined and discussed on the Open Government blog and using tools such as wikis. Beth Noveck, the United States Deputy Chief Technology Officer for Open Government, says the administration has gone from using the Internet to broadcast and amplify its message during the campaign to the realization that it can get information back as well, which it is trying to fold back into policy. She says, “What we've seen is enormously thoughtful suggestions that no small group of people in the White House could have come up with themselves.”

Digital tools are bringing participation back to democracy, or at least that is the idea. But once all of this data becomes free and new modes of influencing government policy are deemed to be effective, attempts to manipulate the data and game the system will emerge. Well-funded lobbyists and special interests will descend on these nascent institutions of “open government” like SEO consultants on Google. People from both sides of the aisle will also participate. Todd Herman, the GOP’s Director of New Media, admits as much. The GOP has learned from how the Obama campaign “changed the way community organizing works,” he says. The GOP’s failure in the last election was because “we did not use the tools” of digital organizing and outreach.

Speaking of SEO, Herman suggests that Democratic political activists are better than Republican ones at using SEO techniques to promote stories on Google News. He uses the example of the American Medical Association opposing Obama’s healthcare proposal and shows a screenshot of a Google News search where you wouldn’t know that was the case by scanning the headlines. He shows that as proof of the left’s SEO tactics, without really explaining how they are manipulating Google News. The problem with his example is that if you go to Google News right now and search for “AMA Obamacare,” the top result is a Forbes story with the title “Will Doctors Buy Obamacare?” So maybe the Republican SEO experts are fighting back, or Google News is self-correcting.

What his comments reveal is the lengths to which political operatives and activists are going already to shape public opinion and policy online. They will do the same with open government because they are the most motivated and the most organized. The lobbyists and political parties have never suffered from a lack of data. The new open data initiatives will arm them further, but they will also arm regular citizens. If information is power, we might be about to see a leveling of the playing field.

Except there is one big problem: indifference. Most people will not do anything with that data. The ones who are most motivated to use the data about to be unleashed are exactly the special interests who run Washington today. They will use any new data or two-way policy mechanisms to further their own interests or those of their well-heeled clients. Do the rest of us stand a chance? Just because government is edging towards more openness doesn’t mean it can’t still be gamed. People will try.

Will the rest of us let them?


