Is your phone giving you away? Velocity uses advanced sensor data processing techniques to identify a mobile phone user’s motion, technology that has been integrated into the analytics of advertisers in industry verticals like beverage, automotive and retail. “We started our business by identifying risk of injury for athletes, so we have highly developed analysis of body symmetry, an important indicator of healthy athletic form,” says Brett Bond, President, Velocity. “Today, we can sense the asymmetry associated with pulling luggage, which indicates an upcoming trip. So we can tell if a person is about to head to the airport even before they call their Uber.” Bond offers more details on an intriguing technology – point by point.
What It Is
Motion sensing can interpret subtle human activities. Velocity’s mission is to decode these activities as a means of identifying all human motion.
Velocity data identifies activities that GPS alone cannot. This includes differentiating between a driver and a passenger in a car, between a passenger on a train and one on a bus, and between someone pushing a shopping cart and someone waiting in line.
Complex use cases can be built by combining this data with location, enabling marketers to go beyond GPS. For example, detecting when a visit to a car dealership has turned into a test drive flags an in-market customer.
Motion data can be used to disambiguate inexact location coordinates. That is, we can tell whether a person riding an exercise bike is actually at the gym rather than at the Starbucks next door. Since location data is only about 70% accurate, this is a big advantage, and potential customers include all location data providers, some of which we are already working with.
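The disambiguation idea above can be sketched as a simple filter: given several nearby place candidates from an imprecise location fix, keep only those consistent with the activity the motion sensors detected. The place names, activity labels, and mapping below are illustrative assumptions, not Velocity's actual model.

```python
# Hypothetical sketch: use a detected motion activity to narrow down
# which of several nearby place candidates a user is really at.
# All names and mappings here are illustrative assumptions.

EXPECTED_ACTIVITY = {
    "gym": {"cycling", "running", "lifting"},
    "coffee_shop": {"sitting", "standing"},
}

def disambiguate(candidates, detected_activity):
    """Return the candidate places whose typical activities
    match what the motion sensors observed."""
    return [place for place in candidates
            if detected_activity in EXPECTED_ACTIVITY.get(place, set())]
```

With this sketch, a coordinate that falls between a gym and a coffee shop resolves to the gym as soon as a cycling motion is detected.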
We have focused on several common human motions and built models that capture subtle differences in each: Are you walking faster than normal? Perhaps you are in a rush, or stressed. Our work on shoe type helps identify or confirm high heels versus sneakers.
Our motion models seek signature behaviors: many drivers pick their phone up as they approach a red light and put it away when the light turns green (hopefully!). Drivers also sometimes clip a phone onto their dash. Passengers have different signature usage patterns that we can distinguish.
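One way to picture a signature like "driver picks up the phone at a red light" is a long stretch of near-zero device motion (car and phone at rest) followed by a sharp movement spike (the pickup). The function below is a minimal sketch of that pattern over accelerometer magnitudes; every threshold and name is an illustrative assumption rather than Velocity's actual detector.

```python
# Hypothetical sketch: flag "pickup after a stop" events in a stream of
# accelerometer magnitudes (gravity removed, units of g). Thresholds are
# illustrative assumptions, not Velocity's real model parameters.

def pickup_while_stopped(accel_mag, stop_threshold=0.05,
                         pickup_threshold=1.5, min_stop_samples=20):
    """Return sample indices where a sharp motion spike immediately
    follows a sufficiently long stationary stretch."""
    events = []
    still_run = 0  # consecutive samples below the stationary threshold
    for i, a in enumerate(accel_mag):
        if a < stop_threshold:
            still_run += 1                  # device (and vehicle) appear at rest
        elif a > pickup_threshold and still_run >= min_stop_samples:
            events.append(i)                # spike right after a long still period
            still_run = 0
        else:
            still_run = 0                   # ordinary motion resets the stop window
    return events
```

A real system would of course fuse this with gyroscope data, vehicle context, and learned models, but the shape of the signature — stillness, then a spike — is the same.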
How It Works
Our Active Impression for advertising produces a simple red, yellow, or green rating for each moment. Red impressions should be avoided; they tend to convert poorly because the user is in a distracted state, paying little or no attention to the device (for example, it may be resting on a table). Yellow impressions avoid fraud but are not highly attentive moments. Green impressions are the best times to reach a person: the user is both attentive to the device and in a receptive state, such as a lean-back moment.
Today we have platform partners either currently integrated or in a pilot phase: Criteo, Adobe, Factual, LiveRamp, Foursquare, Beeswax, Sito, Nativo, YouAppi, and Smadex.
The Cynsiders column is a platform for industry leaders to reach out to colleagues, followers, and the public at large. In their own words and in targeted Q&As, columnists address breaking news, issues of the day, and the larger changes going on in the ever-evolving world of television, video and digital. Cynsiders columns live on Cynopsis’ main page and are promoted across all daily newsletters. We welcome readers’ comments, queries, and column ideas at Lynn@Cynopsis.com.