It was 2006 when Netflix opened a contest to programmers around the world. The challenge, a lofty one at the time, was to improve the accuracy of the Netflix recommendation algorithm by 10 percent. Teams were given anonymized ratings data with which to build their new and improved systems. Netflix paid out the prize in 2009, but curiously, the company recently announced that it never put the winning algorithm into production. Technology marches on, and this kind of raw algorithmic approach is increasingly being left in the dust as the social web closes in around us.
According to Netflix, not only does it have much more data from streaming titles to analyze, but it also has social signals to weave into recommendations. Netflix CEO Reed Hastings knows this is the future of content recommendation, and that is why the video provider is working so hard to overturn a 30-year-old video privacy law in the US.
In general, there’s nothing wrong with a social signal being used in recommendations. Humans have been seeking the opinions of friends since long before the internet existed. We trust what we hear from our friends and family more than the cold, emotionless suggestions of a machine. People want a little social interaction in these things, but there is a danger that the serendipity of machine-generated recommendations will be lost if social cues become the core mechanism.
The more the web relies on your social circle to pipe content to you, the more sameness you will experience. Common sense tells us that we tend to associate with people who hold interests and beliefs similar to our own, and that can limit the content we are exposed to through recommendations on sites like Netflix. You may end up seeing only the art you already agree with, not the art that challenges you to think.
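To make the point concrete, here is a toy sketch in Python (the titles, scores, and weighting are invented for illustration and have nothing to do with Netflix's actual system) of how leaning harder on a friend signal can crowd out the niche picks a pure taste model would surface:

```python
# Hypothetical catalog: each title has a machine-predicted taste score
# and a count of friends who liked it. All numbers are made up.
catalog = {
    "indie_documentary":  {"model_score": 0.91, "friend_likes": 0},
    "foreign_drama":      {"model_score": 0.85, "friend_likes": 1},
    "blockbuster_sequel": {"model_score": 0.60, "friend_likes": 9},
    "popular_sitcom":     {"model_score": 0.55, "friend_likes": 7},
}

def blended_score(title, social_weight):
    """Mix the machine prediction with a crude friend-popularity signal."""
    info = catalog[title]
    social = min(info["friend_likes"] / 10, 1.0)  # rough normalization to 0..1
    return (1 - social_weight) * info["model_score"] + social_weight * social

def top_picks(social_weight, k=2):
    """Return the k highest-scoring titles for a given social weighting."""
    return sorted(catalog, key=lambda t: blended_score(t, social_weight), reverse=True)[:k]

print(top_picks(social_weight=0.1))  # ['indie_documentary', 'foreign_drama'] -- the machine's niche picks
print(top_picks(social_weight=0.9))  # ['blockbuster_sequel', 'popular_sitcom'] -- whatever friends watched
```

The more weight the social signal carries, the more the list converges on what your circle already likes, which is exactly the sameness described above.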
If social signals do take over search and recommendation algorithms, those signals will more than likely come from Facebook. Of course privacy is a concern with the king of social networks, but the closed nature of Facebook is another thing altogether. Facebook is happy to swallow up all that data, and it is nigh impossible to export it anywhere else.
Nothing in Facebook’s walled garden is searchable from the outside, so it would be problematic if all the effort to improve recommendations revolved around that platform; the web would be at Facebook’s mercy. We have seen throughout the history of technology that when one solution gains control of a market, innovation slows down. If most services outsource recommendations to Facebook, everything else suffers. There might come a day when you try to get out of the social web only to find that none of the recommendation and search algorithms are any good without Facebook.
The purity of low-bias, machine-generated algorithms could be forever tainted by the move to social signals. Right now we see reasonably clear indications when our search results and recommendations are based on a social connection. There are Facebook friends staring at us from sidebars, and Google+ buttons emblazoned with a friend’s name, but it is the behind-the-scenes tweaking and organizing that could one day steer us all to less insightful, opinion-confirming art and news.