Technology

Introducing The First AI Engine for Content Understanding

AnyClip’s team of world-class deep learning experts has spent years developing patented deep learning and image recognition technology that automatically cuts premium content into scenes. It then tags, analyzes, and categorizes those scenes according to official IAB categories, sentiments, celebrities and more, while filtering out non-brand-safe scenes.

SCENES

AnyClip’s AI engine ingests short or long-form video content. A proprietary scene detection algorithm analyzes the video and leverages patented technology to identify the exact beginning and end timestamps of each scene, effectively cutting any video content into shorter, thematic clips/scenes.
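The detection algorithm itself is proprietary, but the cutting step it feeds can be sketched: given the boundary timestamps it produces, a video is split into consecutive (start, end) scenes. The function name and timestamp format below are illustrative assumptions, not AnyClip's actual API.

```python
# Hypothetical sketch: turning detected scene boundaries into clip
# spans. Timestamps are in seconds; the scene detection algorithm
# that produces `boundaries` is proprietary and not reproduced here.

def cut_into_scenes(duration, boundaries):
    """Split a video of `duration` seconds at the given boundary
    timestamps, returning a (start, end) pair for each scene."""
    points = [0.0] + sorted(b for b in boundaries if 0 < b < duration) + [duration]
    return [(points[i], points[i + 1]) for i in range(len(points) - 1)]

print(cut_into_scenes(120.0, [34.5, 78.2]))
# [(0.0, 34.5), (34.5, 78.2), (78.2, 120.0)]
```

Each resulting pair is a thematic clip that the tagging stage processes independently.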

TAGGING

Each clip is then processed by proprietary video and image recognition technology. By applying the most advanced image recognition and deep learning available, the AI engine essentially identifies everything in a given scene – from body parts and still objects to colors, locations, text, brand names, celebrities, and uniquely, even emotions and actions.

INSIGHTS

An unlimited number of tags are analyzed by an advanced Natural Language Processing (NLP) engine, and then statistically weighted and matched against AnyClip’s proprietary taxonomies. Three taxonomies are currently operational, and others are in development:

  1. Brand Safety
  2. Advertising Category
  3. Sentiment
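One plausible reading of "statistically weighted and matched" is that each tag carries a detection confidence, and a clip's score for a taxonomy category sums the confidence of the tags that category covers. The taxonomy entries and scoring rule below are illustrative assumptions, not AnyClip's proprietary taxonomies.

```python
# Hypothetical sketch of weighting tags against a taxonomy: each tag
# is a (label, confidence) pair, and a clip's score for a category is
# the summed confidence of the tags that category's vocabulary lists.
from collections import defaultdict

TAXONOMY = {  # illustrative entries only, not AnyClip's real taxonomy
    "Travel": {"beach", "airplane", "suitcase"},
    "Automotive": {"car", "engine", "road"},
}

def score_clip(tags):
    """tags: list of (label, confidence). Returns per-category scores."""
    scores = defaultdict(float)
    for label, conf in tags:
        for category, vocab in TAXONOMY.items():
            if label in vocab:
                scores[category] += conf
    return dict(scores)

print(score_clip([("beach", 0.9), ("car", 0.4), ("suitcase", 0.7)]))
```

The highest-scoring categories would then characterize the clip, as the sections below describe.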

1. Brand Safety

The brand safety taxonomy analyzes tags against 14 brand safety violations such as nudity, profanity or drugs. Each clip is marked as either safe or unsafe, along with the reasons for concern.
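The safe/unsafe verdict with attached reasons can be sketched as a set intersection between a clip's tags and the violation list. The violation names below are examples taken from the text; the full 14-item list is not public, so this is an assumption-laden illustration.

```python
# Hypothetical sketch: a clip is unsafe if any tag maps to a brand
# safety violation; the matching violations are returned as reasons.
VIOLATIONS = {"nudity", "profanity", "drugs"}  # real taxonomy has 14 entries

def brand_safety_check(tags):
    """Return ('safe'|'unsafe', list of violated categories)."""
    reasons = sorted(set(tags) & VIOLATIONS)
    return ("unsafe" if reasons else "safe", reasons)

print(brand_safety_check(["beach", "drugs"]))  # ('unsafe', ['drugs'])
print(brand_safety_check(["beach", "sun"]))    # ('safe', [])
```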

2. Advertising Category

The engine identifies the advertising categories that most accurately characterize the content in each clip. AnyClip’s taxonomy follows the official Interactive Advertising Bureau (IAB) Tech Lab Content Taxonomy, updated in 2017, and spans three levels of categories – primary, secondary and tertiary. Each clip is ultimately matched with categories such as “Travel” or “Automotive” from the IAB’s primary shortlist, which includes 29 categories.
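A three-level taxonomy of this kind can be modeled by storing each leaf category with its primary > secondary > tertiary path, so any match rolls up to the 29-category primary shortlist. The entries below are an illustrative fragment, not the actual IAB Content Taxonomy.

```python
# Hypothetical sketch of a three-level category lookup: each entry
# records its (primary, secondary, tertiary) path, so a match at any
# level can be rolled up to a primary-shortlist category.
IAB_PATHS = {  # illustrative fragment only, not the full IAB taxonomy
    "Auto Repair": ("Automotive", "Auto Repair", None),
    "Adventure Travel": ("Travel", "Travel Type", "Adventure Travel"),
}

def primary_category(leaf):
    """Roll a matched category up to its primary (top-level) category."""
    path = IAB_PATHS.get(leaf)
    return path[0] if path else None

print(primary_category("Adventure Travel"))  # Travel
```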

3. Sentiment Analysis

AnyClip’s sentiment analysis is based on a prototype approach proposed by Professor Phillip Shaver and his colleagues in the Journal of Personality and Social Psychology. At the basic level of the emotion hierarchy are six concepts — love, joy, anger, sadness, fear, and surprise — most useful for making everyday distinctions among emotions. The taxonomy pairs each of these six primary emotions with one of 24 secondary emotions (secondary emotions for Joy, for example, include Cheerfulness and Zest).
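The two-level hierarchy described above can be sketched as a simple mapping from secondary emotions up to their primary emotion. Only a small fragment of the 24 secondary emotions is shown, and the entries are illustrative, drawn from the examples in the text and from Shaver's published hierarchy.

```python
# Hypothetical sketch of the two-level emotion hierarchy: each
# secondary emotion rolls up to one of the six primary emotions.
# Fragment only; the full taxonomy has 24 secondary emotions.
EMOTION_HIERARCHY = {
    "Cheerfulness": "Joy",
    "Zest": "Joy",
    "Irritation": "Anger",
    "Horror": "Fear",
}

def primary_emotion(secondary):
    """Map a secondary emotion to its primary emotion, if known."""
    return EMOTION_HIERARCHY.get(secondary)

print(primary_emotion("Zest"))  # Joy
```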
