Pinterest is announcing plans to expand its product tagging beta for shoppable Story Pins, which allow advertisers and creators to tag products in photos. Pinterest is also adding to its Try on platform with an AR eyeshadow launch. This news continues a series of features from the company, including the introduction of its facial AR technology exactly one year ago.
Pinterest senior VP Jeremy King talked with VentureBeat about how the company’s data strategy made these technical changes possible. At a high level, Pinterest is powered by computer vision technology, and the company’s focus is on advancing its AR capabilities and making the site shoppable.
This interview has been edited for clarity and brevity.
VentureBeat: Could you tell me about Pinterest’s decision to expand Try on with eyeshadow and add shoppable Story Pins? You mentioned that this technology has been developing for about a year. I’m wondering if there are any considerations at the data level that drive this innovation.
Jeremy King: One of the hardest things about these kinds of AR technologies is really making sure that you don’t introduce bias into the data. So we have some wonderful technologists, both on the machine learning side and the computer vision side, that can really help us test all kinds of different skin tone ranges and different lipstick colors.
And it’s funny, there has been a huge amount of face mask Try on during the pandemic, and you can imagine that adds its own bias into the system, because the computer doesn’t know whether it’s a regular face or somebody with a face mask on. And so it has changed the algorithm pretty dramatically. We have billions of pictures of people in our system, and we can use that data to make sure we shake out all the bias in the system.
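To make the bias-testing idea concrete, here is a minimal, hypothetical sketch of how a team might check an AR try-on model across skin-tone groups and face-mask conditions. The group names, error values, and the disparity threshold are invented for illustration; this is not Pinterest’s actual pipeline.

```python
# Hypothetical sketch: checking an AR try-on model for bias across skin-tone
# groups and face-mask conditions. Group names, error values, and the 5%
# disparity threshold are illustrative only.
from collections import defaultdict
from statistics import mean

# (skin_tone_group, masked?, landmark_error) for a batch of evaluation images
evaluations = [
    ("light", False, 0.021), ("light", True, 0.034),
    ("medium", False, 0.023), ("medium", True, 0.041),
    ("deep", False, 0.025), ("deep", True, 0.038),
]

errors = defaultdict(list)
for group, masked, err in evaluations:
    errors[(group, masked)].append(err)

# Use the average unmasked error as a baseline and flag groups that drift above it
baseline = mean(e for (g, m), errs in errors.items() if not m for e in errs)
for (group, masked), errs in sorted(errors.items()):
    avg = mean(errs)
    flag = "REVIEW" if avg > baseline * 1.05 else "ok"
    print(f"{group:>6} masked={masked!s:>5} avg_error={avg:.3f} [{flag}]")
```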
The number one request on Pinterest is: once I’ve found this beautiful thing, I want to be able to get it. We call it inspiration to action. It has a lot to do with sometimes seemingly boring technology: ingesting millions of catalogs from thousands of retailers, making sure you’ve got pricing and inventory correct, making sure the latest images are there. And from a hero image (what we call a lifestyle-type image) you want to identify the 20 different items that are in that picture. If you take a picture of your living room, there are probably 20 or 30 items sitting in it, and we want to identify each one of those items, because that picture may be added to dozens, if not hundreds, of boards, and it was added to those boards for different reasons. Sometimes we think of Pinterest as a giant human labeling system, where people use boards and tag items inside of them. That empowers our computer vision technology to get better and better at item identification.
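Here is a rough sketch of the “giant human labeling system” idea: treating the boards an image is saved to as weak labels and counting which item tags co-occur most often. The board names and tags below are made up for illustration, not real Pinterest data.

```python
# Hypothetical sketch: aggregate the boards an image appears on into weak
# labels for item identification. All names below are invented.
from collections import Counter

# image_id -> boards it was saved to; board -> tags pinners applied
image_boards = {"pin_123": ["cozy-living-rooms", "mid-century", "lamps-i-love"]}
board_tags = {
    "cozy-living-rooms": ["sofa", "rug", "floor lamp"],
    "mid-century": ["walnut coffee table", "floor lamp"],
    "lamps-i-love": ["floor lamp", "pendant light"],
}

def weak_labels(image_id, top_k=3):
    """Count tag occurrences across every board an image was saved to."""
    counts = Counter()
    for board in image_boards.get(image_id, []):
        counts.update(board_tags.get(board, []))
    return counts.most_common(top_k)

print(weak_labels("pin_123"))  # e.g. [('floor lamp', 3), ('sofa', 1), ('rug', 1)]
```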
VentureBeat: How do you get to that inspiration point with the data? Are there any particular types of frameworks or languages that you use?
King: We use a lot of open source and machine learning technology that we’ve enhanced over the years. Our core technology that powers Pinterest is a system called GraphSage, a giant graph database that uses a number of techniques (nearest neighbors is the common term) to identify pictures and images that are like each other. The more you use Pinterest and the more you add things to a board, the better we get, of course. We’ve also open-sourced a number of those technologies.
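As a rough illustration of the nearest-neighbors idea, here is a minimal sketch of retrieving visually similar pins by cosine similarity over embedding vectors. The random vectors and brute-force search are stand-ins; a production system in the spirit of GraphSage/PinSage would use learned graph embeddings and an approximate-nearest-neighbor index.

```python
# Minimal sketch of nearest-neighbor retrieval over pin embeddings.
# Vectors are random stand-ins for learned embeddings.
import numpy as np

rng = np.random.default_rng(0)
pin_ids = [f"pin_{i}" for i in range(1000)]
embeddings = rng.normal(size=(1000, 64)).astype(np.float32)
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)  # unit-normalize rows

def nearest_pins(query_vec, k=5):
    """Return the k pins whose embeddings have the highest cosine similarity."""
    q = query_vec / np.linalg.norm(query_vec)
    scores = embeddings @ q            # cosine similarity, since rows are unit-norm
    top = np.argsort(-scores)[:k]
    return [(pin_ids[i], float(scores[i])) for i in top]

print(nearest_pins(embeddings[42]))    # the query pin itself should rank first
```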
If you do a search on Pinterest, oftentimes you’re very open; you don’t use very specific queries. On Google it’s not uncommon to have 7- to 10-word queries. On Pinterest, oftentimes you’re saying “inspiration,” or “inspiring living room,” or “shabby chic bedroom set,” so we have lots of opportunities to show many different things. We can branch pretty quickly into several different directions on the inspiration track and then get people to narrow down the results with images, which is what we’re after.
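To illustrate the broad-query-to-narrowing flow, here is a small, hypothetical sketch that fans a vague query out into several directions and then narrows as the user taps images. The theme map and matching rule are invented for illustration.

```python
# Hypothetical sketch: fan a broad query into several inspiration directions,
# then narrow based on what the user taps. The theme map is invented.
themes = {
    "living room": ["shabby chic living room", "mid-century living room",
                    "minimalist living room", "boho living room"],
}

def fan_out(query):
    """Broad query -> several candidate directions to show side by side."""
    return themes.get(query, [query])

def narrow(directions, tapped_theme):
    """Keep only the directions consistent with the image the user tapped."""
    matching = [d for d in directions if tapped_theme in d]
    return matching or directions

options = fan_out("living room")
print(narrow(options, "mid-century"))   # -> ['mid-century living room']
```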
VentureBeat: Pinterest builds and releases new features often. How has your existing tech infrastructure allowed your engineers to add on extra capabilities like the ones launched today?
King: The underlying framework of Pinterest not only includes this graph database, but it also includes an experimentation platform that we built from scratch. We’re running hundreds of experiments at a time and slicing the user base so that different users try different things. As a result, we can rapidly iterate on features and then launch the things that actually do well. We have about a 30% to 40% success rate for features as we launch them. And that’s pretty common, to throw away 70% of the work that you’re doing because it doesn’t work.
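As a concrete, simplified example of slicing a user base into experiments, here is a sketch of deterministic hash bucketing so each user consistently sees the same variant. The experiment name and three-way split are illustrative, not a description of Pinterest’s actual platform.

```python
# Hypothetical sketch: assign users to experiment variants with a stable hash
# so the same user always lands in the same bucket for a given experiment.
import hashlib

def assign_variant(user_id, experiment,
                   variants=("control", "treatment_a", "treatment_b")):
    """Hash (experiment, user) into a bucket in [0, 1] and map it to a variant."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return variants[int(bucket * len(variants)) % len(variants)]

for uid in ["u1", "u2", "u3"]:
    print(uid, assign_variant(uid, "story_pin_product_tags"))
```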
VentureBeat: You mentioned that with the experimentation system you have a 30% to 40% success rate. How do you measure success?
King: We have hundreds of different metrics that we’re tracking. Sometimes it can be user engagement. Time on site isn’t a metric we’re really driving toward. We want you to discover what you want and actually go do it, whether that’s painting your walls, finding something creative for your kids to do, or figuring out what to cook for dinner. We want you to find the thing and actually go out and do it. So oftentimes, success is getting to the exact right item and actually going and doing it.
VentureBeat: And how do you adjust your tech and that backend data to make Pinterest more inspirational and relevant with the right advertisements?
King: Our computer vision technology really allows us to make sure that even when you’re flowing through an organic experience, with billions of pins coming onto your board, the ads we’re injecting are relevant ads. We have hundreds of thousands of ads, and we can show ads that are very relevant to what you’re thinking about. And what we find is that most of the time, when we get it right, you don’t even know which ones are ads and which ones are not. We’re deliberately engineering Pinterest in a way that leaves people feeling positive and inspired, and we get a lot of feedback on that. We have manual override features where you can say “hide this pin,” “hide this ad,” or “don’t ever show me this again,” and that sort of thing. And that feedback loop is very deliberate.
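Here is a minimal, hypothetical sketch of the two ideas in that answer: ranking candidate ads by relevance to the user’s current interests and honoring “hide this ad” feedback. The tag-overlap scoring and field names are stand-ins, not Pinterest’s actual ads system.

```python
# Hypothetical sketch: keep only ads relevant to the user's current interests
# and skip anything the user has explicitly hidden.
def relevant_ads(ads, interest_tags, hidden_ad_ids, min_overlap=1):
    """Rank ads by tag overlap with current interests, excluding hidden ads."""
    scored = []
    for ad in ads:
        if ad["id"] in hidden_ad_ids:          # respect "hide this ad" feedback
            continue
        overlap = len(set(ad["tags"]) & set(interest_tags))
        if overlap >= min_overlap:
            scored.append((overlap, ad["id"]))
    return [ad_id for _, ad_id in sorted(scored, reverse=True)]

ads = [
    {"id": "ad_1", "tags": ["eyeshadow", "makeup"]},
    {"id": "ad_2", "tags": ["lawn mower"]},
    {"id": "ad_3", "tags": ["makeup", "lipstick"]},
]
print(relevant_ads(ads, ["makeup", "eyeshadow"], hidden_ad_ids={"ad_3"}))  # ['ad_1']
```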
VentureBeat: How would you describe Pinterest’s overall data strategy? And what has that journey looked like since you joined Pinterest a couple of years ago?
King: At a high level, Pinterest used to be completely about an image signature. So everything we did was built around a pin or an image. And it turns out that’s not always the best when you’re doing something like a catalog, a canonical catalog, because you might have 20 different colors of a T-shirt, or 50 different colors of a vase, or hundreds of different colors of lipstick. When you identify a particular item, you actually want to nail that item, so it turns more into a canonical database. So we’ve really built two parts of Pinterest now: the inspiration part I was describing, which uses our image technology, and the shopping part, which uses more traditional data structures. What we’ve been doing is spending time ingesting hundreds of millions of items from Etsy and eBay and so forth, from all these large and reputable retailers, to make sure that when you find an item, whether it’s a table or a couch or a lamp, you can find not only that exact item but items that are very similar to it, things that may have color variations and that sort of thing. And that’s really what’s made the shopping experience so much better in the last 18 months.
We still have a long way to go. There are hundreds of billions of pins, and so many of them have items in them, so we’re rapidly going through as many companies and catalogs as we can in order to product tag. This is part of the reason we’re announcing product tagging with Story Pins, because we really feel like the creator can actually identify items.
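To make the canonical-catalog idea from the last answer concrete, here is a minimal sketch of a canonical item that groups retailer SKUs differing only by attributes like color, alongside the image signature the inspiration side keys on. The class and field names are invented for illustration, not Pinterest’s actual data model.

```python
# Hypothetical sketch of a canonical catalog item with color variants,
# linked back to the image-signature world by a signature field.
from dataclasses import dataclass, field

@dataclass
class Variant:
    sku: str
    color: str
    price: float
    in_stock: bool

@dataclass
class CanonicalItem:
    item_id: str
    title: str
    image_signature: str          # ties the catalog item back to pins/images
    variants: list = field(default_factory=list)

    def available_colors(self):
        """Colors the shopper can actually buy right now."""
        return sorted({v.color for v in self.variants if v.in_stock})

lamp = CanonicalItem("item_9", "Arc floor lamp", "sig_ab12")
lamp.variants += [Variant("sku_1", "brass", 129.0, True),
                  Variant("sku_2", "matte black", 129.0, False),
                  Variant("sku_3", "white", 119.0, True)]
print(lamp.available_colors())    # ['brass', 'white']
```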