
Eric Enge interviews Live Search’s Mike Nichols


Mike Nichols


Mike Nichols is Group Program Manager for Live Search. He is responsible for the planning and design of verticals such as Image search, Video search, and News search, as well as Query Suggestions to help people get to exactly what they’re looking for.
Nichols joined Microsoft in 1996. Prior to joining Live Search in late 2006, he was a Senior Director responsible for strategic and organizational initiatives for Microsoft’s search, portal, marketplaces, and advertising groups. Before that, Nichols was the Director of Product Management for online services such as Messenger, Hotmail, and MSN Internet Access. Prior roles also included product management for Windows, Internet Explorer and Microsoft Project.
Nichols holds a degree from the University of Michigan and lives in the Seattle area.

Interview Transcript

Eric Enge: You recently announced your new video search engine. How has it been received?
Mike Nichols: Really well so far. The people who’ve tried it seem to really like it, and they continue using it. That’s a key indicator for us. If you try it, and then you come back and use it again, that’s a good sign that we’re hitting on something good. We felt like that might happen as we were working on it, because there are a few features in particular that, once you start using them, are hard to go back from. For example, the new “smart previews” feature. But it’s good to see the usage happening in real life.
Eric Enge: Whose video search were you using before?
Mike Nichols: We were using a white-labeled version of Truveo.
Eric Enge: Do you have any data on the engagement level in the Truveo video search versus your new one?
Mike Nichols: Our usage from before the launch has gone up 15-20x. Part of that is that we’ve introduced new features, like a video answer, that weren’t available before. There are lots of different dynamics that go into it, but overall our engagement with the new product, as well as the feedback we’ve gotten from customers, has been really encouraging.
Eric Enge: That’s great. You mentioned the smart previews feature a couple of moments ago; I think of it as smart trailers. The way you built it is pretty unique. Can you tell me a little bit about that?
Mike Nichols: The goal that really drove the feature was that we wanted to help people save time browsing through videos. Even though a lot of online clips are just a few minutes long, when you add up the time it takes to watch a few different ones, you can get to some pretty big numbers before you even watch a whole video. What we wanted to do is give people a quick snapshot of what a video is all about before they decide to view it. I like your name, smart trailers, because we really did think about concepts people are already familiar with that we could borrow for this feature.
One of them was the movie industry’s use of trailers. It’s really nice to be able to check out a thirty- or sixty-second clip that gives a rough representation of what the movie is all about. Then you can decide: do I want to invest a couple of hours to go watch it? That’s really what we’ve tried to do for online videos. All you have to do as a user is hover over a video result, and for the majority of them you’ll get one of these smart previews of that video.
To develop this, we partnered with Microsoft Research to create a smart representation of the video. We use a whole bunch of different techniques, such as shot boundary detection, to determine where one shot or scene ends and the next begins, and audio detection, to tell whether somebody is talking at a particular time. We then identify the chief scenes in a video and put them together in a short representative clip. Notice how, as you preview a video, you see the scenes fading in and out.
Sometimes that works really well, and sometimes we still have a little bit of tweaking to do. But, it’s really all about trying to provide that movie trailer kind of feeling. My favorite recent example is one where you search for a basketball player, and you get a 1-2 minute video that starts off with some game footage where a player is just dribbling the ball at the top of the key as the clock is winding down. And then, he drives in for a slam-dunk. With the smart previews technology, what happens is you hover over that video, and it immediately shows you the slam-dunk, because it’s been able to tell oh okay, that seems to be the most important scene in the video.
One of the unique things this helps you do is browse around a bunch of (ten or more) different videos in the time that it takes you to watch a full video. In that way, you can make sure that you are investing your time in the videos that you really want to watch.
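To make the idea concrete, here is a minimal sketch of the kind of pipeline Nichols describes: detect shot boundaries, score each shot using audio cues, and stitch the highest-scoring shots into a short preview. The data structure and scoring heuristic are illustrative assumptions, not Live Search’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Shot:
    start: float          # seconds into the video
    end: float
    audio_energy: float   # e.g., a loudness spike such as crowd noise
    has_speech: bool

def score_shot(shot: Shot) -> float:
    """Toy importance score: favor loud, non-speech moments (action scenes)."""
    score = shot.audio_energy
    if shot.has_speech:
        score *= 0.5  # conversation is usually less preview-worthy than action
    return score

def build_smart_preview(shots: list[Shot], budget_seconds: float = 30.0) -> list[Shot]:
    """Greedily pick the highest-scoring shots until the time budget is used,
    then return them in original order so the clip plays like a trailer."""
    chosen, used = [], 0.0
    for shot in sorted(shots, key=score_shot, reverse=True):
        length = shot.end - shot.start
        if used + length <= budget_seconds:
            chosen.append(shot)
            used += length
    return sorted(chosen, key=lambda s: s.start)
```

In the basketball example, a shot with a big crowd-noise spike (the slam-dunk) would score highest and lead the preview.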
Eric Enge: Right. The crux of this is the ability to recognize what’s most important in the video. Presumably what happened there is you recognized the big spike in crowd noise at the moment when he dunks.
Mike Nichols: Right.
Eric Enge: I think the most common thing other video search engines do with these sorts of previews is to just show the first thirty seconds.
Mike Nichols: Right.
Eric Enge: That’s a fairly simple algorithm, anybody could do that. But, once you start trying to investigate the content, and provide a smarter trailer, you have a lot of different algorithmic scenarios you need to deal with, right? Not all of them are as convenient as a slam-dunk. It must have been quite an investment to do that across millions of different scenarios.
Mike Nichols: Yes. That was one of the challenging parts – figuring out a solution that would work for a wide range of scenarios. It still doesn’t work perfectly for all of them, but it works for many, and we’re going to keep improving it. We were fortunate because we have all of these rocket scientists at Microsoft Research. They are thinking about these things all the time, and they were already working on a project that we saw had application to this customer problem we were trying to solve.
We worked with them on questions such as: what happens if the video is short versus long? What if there is conversation versus action in the video? You were right on; there are just so many different dynamics that you have to account for.
Eric Enge: Are there any other unique features of video search that you want to highlight?
Mike Nichols: Sure. Aside from the smart previews, we’ve focused on a few other areas with our first-generation video search product. One was selection: just making sure that we had a big, representative index of the videos on the web. There are certain video search engines that simply crawl a few different sites. We think selection is a big part of what search is all about, so we work to index videos from all the major sites: YouTube, MySpace, AOL, Yahoo, MTV, the major television networks, CNN, ESPN, etc. When you do a typical query, or a wide range of queries, you will notice a lot of videos from a lot of different sites. That was one really big area of investment for us.
The second one was the viewing experience. We really wanted to borrow this idea that we have from the image search product, which was that once you’d actually decided that you wanted to view an image, and you clicked on the image, we opened up our viewing pane on the left-hand side that would still have the actual search results there for you. So, as you were going from one image to another on the host page, you could very easily change from one result to another. We’ve done the same thing with videos. The application here is that after you’ve decided that you are going to watch a video, and it’s playing over in the right-hand side, you can be scrolling on the left through different videos that were also in your search results.
Within your mind, you can decide okay, after this one is done, here is the next one that I am going to watch. When you marry that with the smart preview technology, you really have something cool, because right after you are done watching a video on the right-hand side, you can preview all the other videos on the left-hand side before you decide which one to click on. So again, we are trying to save your clicks.
Another example is that if you search for a person, you will see on the right-hand side some suggestions for related celebrities. So, if you search for Britney Spears, over on the right-hand side you might see Christina Aguilera or other related celebrities. The reason we did that is that when we were watching user behavior, we’d oftentimes see somebody browsing not only within the query that they had done but also other related concepts and topics. That whole browsing behavior is a really interesting dynamic that we’ve noticed with images, news, and other areas of search, where people start off with one query and then find themselves trying out all kinds of other related topics.
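Nichols attributes the related-celebrity suggestions to observed browsing behavior. One common way to turn that behavior into suggestions, offered here purely as an illustrative assumption rather than Live Search’s actual method, is to count which queries co-occur within the same user sessions:

```python
from collections import Counter
from itertools import combinations

def related_queries(sessions, top_n=5):
    """Suggest, for each query, the queries that most often appear in the
    same user session. Purely illustrative; not Live Search's method."""
    co_counts = {}
    for session in sessions:
        for a, b in combinations(sorted(set(session)), 2):
            co_counts.setdefault(a, Counter())[b] += 1
            co_counts.setdefault(b, Counter())[a] += 1
    return {query: [other for other, _ in counts.most_common(top_n)]
            for query, counts in co_counts.items()}

# Example: two hypothetical sessions of celebrity searches
sessions = [["britney spears", "christina aguilera"],
            ["britney spears", "christina aguilera", "jessica simpson"]]
print(related_queries(sessions)["britney spears"])
# -> ['christina aguilera', 'jessica simpson']
```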
Eric Enge: Right, they wander. So, with videos is there an aspect of being better able to handle long-tail queries that you get by looking at more sites?
Mike Nichols: Yes. That is one of the reasons to make sure that you are searching more sites.
Eric Enge: Right. Ultimately, if somebody does a video search for a Britney Spears video, they are going to find it on any of the major video sites. But if they do something much more esoteric, that might not be on YouTube.
Mike Nichols: That’s right. By searching multiple sites, we will provide better results, and it will particularly help on tail queries. This will become more evident as more and more videos are posted online across different sites. And of course, even with head queries, many companies have content that they want you to watch on their site instead of YouTube or similar.
Eric Enge: Right. Let’s talk a little bit about the image search. What are some of the recent changes there?
Mike Nichols: We are really proud of our image search product — we think it’s a great reason for people to try Live Search. It consistently tests very favorably relative to the competition in terms of relevance and includes a number of unique features. We did recently update it in a few ways. We increased the size of the index quite a bit, and we introduced image search in several new markets around the world. But perhaps most interestingly, we again used some Microsoft Research technology to deliver some unique innovation. We really want to help you get to exactly the image you are looking for, and a popular application of image search is using it to find close-ups of people, perhaps for your Messenger ID, Facebook photo, etc. We have therefore launched some really nice face detection technology to limit your results to images with faces in them. For example, if you do a standard image search for Tiger Woods, you’ll see results with him swinging a golf club, walking the fairway, etc. But if you limit the results to faces only using this feature, then you just see lots and lots of pictures of his face. The technology isn’t yet perfect – sometimes you’ll see pictures of his wife’s face, for example – but directionally it is a powerful concept. Now, not only do we provide the most relevant image results for a typical image query, we have features like this to quickly get to exactly the image you want. To check it out, search for a celebrity – say Tiger Woods – and under the image answer at the top of the page choose “close-up.”
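As a rough illustration of the “faces only” filter, here is a sketch that keeps only image results in which a detected face covers a meaningful fraction of the frame. It uses OpenCV’s stock Haar cascade as a stand-in for the Microsoft Research detector, and the 15% area threshold is an arbitrary assumption:

```python
import cv2  # OpenCV's stock Haar cascade stands in for the MSR detector

_face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def is_close_up(image_path, min_face_fraction=0.15):
    """Return True if a detected face covers a sizeable fraction of the image,
    i.e. the image plausibly qualifies as a face close-up."""
    image = cv2.imread(image_path)
    if image is None:
        return False
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = _face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    image_area = image.shape[0] * image.shape[1]
    return any((w * h) / image_area >= min_face_fraction
               for (_x, _y, w, h) in faces)

def close_up_results(result_paths):
    """Filter a list of image result paths down to the face close-ups."""
    return [path for path in result_paths if is_close_up(path)]
```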
Eric Enge: One thing that looked unique to me is the zoom level slider that allows you to decide whether you are going to look at smaller snapshots or larger ones. Smaller ones, of course, allow you to see a lot more on the screen, all at once, and that’s very handy if you are looking for a particular shot to view.
Mike Nichols: Definitely. The idea there was to help people browse through large collections of photos, either by making the images smaller so more fit on the screen, or through this concept we call ‘infinite scroll’, where you can simply scroll through the entire image collection rather than clicking to go to page 2 and then page 3, etc. It makes it really easy to browse through an entire photo collection.
Eric Enge: Right. Then there is the Scratchpad, which I think is pretty neat too; it seems like a little scrapbook that you can keep for yourself. Why don’t you tell me a little bit about that?
Mike Nichols: The reason we introduced that was to keep people from needlessly having to switch context over and over. Oftentimes when you find an image, you want to do something with it. You either want to make it your Messenger ID, share it with a friend, or save images for viewing later. Or let’s say you’re looking for images to use in a presentation. When you are looking for images, it’s nice to be able to drag thumbnails of those images into a collection, instead of finding an image, going to PowerPoint, and then returning to search. With the Scratchpad, once you have found, say, five or six images that are potential contenders for use in your presentation, you can take those images, figure out which one is most appropriate, or maybe all of them are, and do what you want with them.
Eric Enge: You can basically do a lot of browsing without ever leaving the image search engine, and collect a whole group of things in one place.
Mike Nichols: Exactly. Some people retain very large lists of their favorite images that they can always get back to.
Eric Enge: That’s pretty cool. Another thing that stood out from your recent announcements was the enhancements you made to celebrity search. One of the things that you’ve talked about is that when people are searching on celebrities, they really want to browse through a lot of information. You must have done quite a bit of research into that. Can you talk a little bit about what you did, and how it affected what you decided to do?
Mike Nichols: A very large percentage of queries happen to be celebrity queries, so we decided to look into what people were doing with them. We started to see that there are many different scenarios. Sometimes people are looking for something very, very specific about a celebrity. For example, they might be watching a movie on TV, know the name of a celebrity in that movie, and be wondering what other movies that person was in. They then go into the search engine, type in the name, and want to be able to get to very specific things like the list of films they’ve been in. Sometimes they are really just looking for celebrity gossip and news. Britney Spears is often a top celebrity search for us, and it’s largely because of news and gossip about her. And sometimes people are looking for images and videos related to a celebrity. There are a large number of scenarios that involve celebrity queries. After looking at that, we started to draw some conclusions. For example, for younger, more attractive celebrities, not surprisingly, images and gossip tended to be what people were looking for. For other types of celebrities who are better known for the work they have done rather than gossip, such as Frank Sinatra or Elvis Presley, people would be looking for what movies they have been in, what music they have done, etc.
We developed these theories and then validated them with the research we were doing. We decided that when people searched for these different types of celebrities, we should give them some good, smart results that complement the traditional algorithmic web results. So, when you type in a younger celebrity, such as Jessica Alba or Britney Spears or George Clooney, you’ll tend to get images and news as an answer in the results. If you typed in Frank Sinatra, you’d get links to his filmography and discography. Now, you can still get to images for Frank Sinatra and movies for Jessica Alba, but the research showed that the odds were that people tended to be looking for this information. We are just trying to save people a click here and there.
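The “save a click” logic Nichols describes amounts to showing, by default, the answer type users most often want for a given celebrity. Here is a toy sketch, with made-up intent distributions standing in for the statistical models he mentions:

```python
# Hypothetical per-celebrity intent distributions; the categories and numbers
# are illustrative assumptions, not Live Search data.
INTENT_ODDS = {
    "jessica alba":  {"images": 0.45, "news": 0.30, "filmography": 0.15, "bio": 0.10},
    "frank sinatra": {"filmography": 0.40, "discography": 0.35, "images": 0.15, "bio": 0.10},
}

def default_answer(celebrity: str) -> str:
    """Pick the answer type to show by default: the one users most often want."""
    odds = INTENT_ODDS.get(celebrity.lower(), {"web": 1.0})
    return max(odds, key=odds.get)

print(default_answer("Jessica Alba"))   # -> images
print(default_answer("Frank Sinatra"))  # -> filmography
```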
Eric Enge: Right. If you type in Tom Brady, you’ll get some of his stats.
Mike Nichols: Exactly. That’s a great example because a popular scenario with a football player is that people want to see what statistics he had last week to help them with their fantasy football team. That’s why we have the statistics for Tom Brady and other athletes.
Eric Enge: Right. Going back to music and film celebrities, you created this thing called xRank. What is it, and how is it calculated?
Mike Nichols: Sure. xRank was an incubation project that we started a few months ago. It actually came out of a lot of the research we were doing into celebrity queries, where we found people were oftentimes going to sites that had popularity information about celebrities. It’s sort of a who’s hot and who’s not list that we put together. There are so many TV shows these days about which celebrities are hot or not, because people want this information.
We thought that as a search engine we probably have among the most useful real-time data to share back with our customers about what the community of users happens to be searching for. The goal of xRank was to provide a really fresh, trustworthy source of celebrity popularity and trending data. We have five or six thousand celebrities who are ranked based on who is being searched for most, and we update this list multiple times throughout the day.
It actually catches spikes rather quickly. For example, the other day Porter Wagoner, who I think was a country music star, popped up really high into the top five most-searched-for celebrities. What had happened was that our users had heard on the news that he had passed away, and they searched for information about him. This capitalizes on the activity on the internet, which is so real-time these days, whether it’s people posting something on Wikipedia or searching for somebody. We can use that data to provide value back to users. That’s really the idea behind xRank. People can check it out by typing in www.xrank.com.
Eric Enge: Right. So, the celebrity that has the highest search volume has an xRank of one. Is it a daily number, or the last twenty-four hours, or something like that?
Mike Nichols: Yeah, it’s primarily who has been searched for most over the last 24 hours. It gives you the rank over that period, and then it will tell you who’s up or down, as far as their place goes. On the xRank page for a celebrity, you also see interesting info about them as well as a bunch of people they are related to. You can see all kinds of interesting relationships between different celebrities.
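A minimal sketch of the ranking Nichols describes, assuming a simple query log of (celebrity, timestamp) pairs: count searches over the last 24 hours, rank by count, and report each celebrity’s movement against a previous ranking. This is illustrative only, not the actual xRank formula:

```python
from collections import Counter
from typing import Optional

def xrank(query_log, now, previous_ranks: Optional[dict] = None,
          window_hours: float = 24.0):
    """Rank celebrities by how often they were searched for in the last
    `window_hours`. query_log is a list of (celebrity, unix_timestamp) pairs.
    Returns (name, rank, movement) tuples; positive movement means moved up."""
    cutoff = now - window_hours * 3600
    counts = Counter(name for name, ts in query_log if ts >= cutoff)
    previous_ranks = previous_ranks or {}
    ranking = []
    for rank, (name, _count) in enumerate(counts.most_common(), start=1):
        movement = previous_ranks.get(name, rank) - rank
        ranking.append((name, rank, movement))
    return ranking
```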
Frankly, the product is a bit addictive. You can cut the data in a bunch of different ways, whether it’s who is top-ranked or who the fastest movers are, and you can look historically and see what a person’s xRank was last week versus six months ago, and so on.
Eric Enge: As I searched on some celebrities, I saw a little menu bar appear, which essentially allows you to select the domain in which you are searching, whether it’s images, videos, news, movies, or biography. It’s an interesting implementation, a little different from what some of the other folks do. Google obviously mixes these results right in with the web results, and Yahoo is very much in the smart answers mode. But I think the little menu bar was interesting. What made you do that?
Mike Nichols: When people search for a particular celebrity, they could be looking for any number of different things. We have statistical models that show us what the odds are for a certain type of celebrity. But at the same time, we want to show the user that there is other useful information for that celebrity as well. Let’s use Jessica Alba as an example: by default you’ll get imagery and news, if there is anything in the news for her, just because that’s what the average query for her is looking for. But at the same time, she’s been in a bunch of movies, and there is bio info for people who are interested in her background. We just want to expose to users that this other useful information is there as well.
We are experimenting a little bit with the exact UI, whether it’s that menu bar, or whether there is another way to communicate to users that there is this other interesting information. Also, if I type in Jessica Alba movies, I get the movies by default with the option of going to images and news.
Eric Enge: Right. I did a search on Jessica Alba. When the images came up I clicked videos, and you automatically modified the query box to add the word videos. Is that to reinforce to people where they are?
Mike Nichols: It does reinforce where they are, but even more than that we want to teach them that they could get back there by just typing in the name and videos.
Eric Enge: People tend to browse around quite a bit when they are searching for celebrities. With Tom Brady for example, in the results, his name is a link over to a co-branded Fox Sports page, which is his detailed profile. So it looks like you will also provide different resources based on the genre of the search.
Mike Nichols: Sometimes when we see people browsing for more detailed celebrity information, and we know of a high-quality resource for it, we provide the link directly in the answer.
It’s part of this whole concept of browsing for all of these different things related to this celebrity, and we are testing in a whole bunch of different ways to help people get to all of the information that they want. There is this other scenario where people have searched for all the information on a particular celebrity, and they want to see other related celebrities.
When you do the Tom Brady query, you’ll notice on the right-hand side that there are related searches, including his current girlfriend and Bridget Moynahan, and other ways to refine your query.
Eric Enge: You mentioned earlier that celebrity search is a very significant amount of total search queries. Do you have a percentage number on that?
Mike Nichols: Sure. It ranges between 2% and 3% of total queries, which is actually a rather large percentage.
Eric Enge: I’ll bet you a buck right now that the majority of that searching happens during the workday when people are supposed to be working.
Mike Nichols: You’re right.
Eric Enge: Great Mike, thanks for speaking with me today.
Mike Nichols: Thank you. It’s been great to chat with you.


Eric Enge

Eric Enge is part of the Digital Marketing practice at Perficient. He designs studies and produces industry-related research to help prove, debunk, or evolve assumptions about digital marketing practices and their value. Eric is a writer, blogger, researcher, teacher, and keynote speaker and panelist at major industry conferences. Partnering with several other experts, Eric served as the lead author of The Art of SEO.
