The U.S. Federal Communications Commission (FCC) oversees compliance with the Children’s Television Act, which in part regulates the advertising that appears alongside children’s television programming. The Act prohibits the use of “program talent or other identifiable program characteristics to deliver commercials” during or adjacent to children’s programming featuring that talent or those characteristics. In other words, the talent or other identifiers within a children’s program cannot also appear in commercials aired during that program. As such, U.S. television broadcasters must ensure that their children’s shows and the advertisements that run alongside them do not contain the same people or characteristics. Many other countries have similar rules.
Compliance with the Children’s Television Act is an industry-wide concern because broadcasters who fail to comply can face significant fines. Staying in compliance requires diligence that consumes considerable time and resources, a challenge even for the largest media companies.
This brief explains how Veritone’s digital asset management solution, Veritone Core™, powered by aiWARE™, a SaaS operating system for artificial intelligence, enables broadcasters to automate a previously manual process to save time and resources, ensure accuracy, and help avoid fines.
Any media company that runs children’s television shows in the U.S. must distinguish between the talent in the programming and the talent in the advertising by reviewing the commercials and identifying faces and other traits. Through the years, most broadcasters have relied on Ad Sales or Standards and Practices teams to perform this review manually. That is, people scrutinize every incoming ad spot and log identifying information into the traffic system. Over time, some media vendors began offering automated metadata-processing tools to make the process faster and easier. This technology vastly improved the logging workflow and the amount and quality of metadata, but because such tools cannot recognize faces, a person must still take the time to look for faces in every eligible ad.
With some media companies receiving hundreds of ads per day, sometimes in multiple versions, keeping track of the talent can be a daunting task. For instance, the ad department might receive 20 versions of the same movie trailer cut for different audiences. Each one is a slightly different edit from the last, so the faces in each version may differ. Even if there were only 50 eligible ads each day, watching every 30-second spot closely enough to identify each face is nearly impossible. Most media companies simply don’t have a large enough staff to handle the volume.
Evolving Programming. Evolving Personalities.
Another challenge in complying with the Children’s Television Act is the constant re-examination of older ad spots still in rotation. Here, teams must look for previously “unknown” actors featured in older ads that are still being placed against contemporary programming. An actor who was unknown when the ad was created may have since become a star who now appears in the new programming. Relatedly, shows that feature guest stars in every episode must be scrutinized continually for compliance as well.
The point is, things change, so processing the content once is not enough.
Making Compliance Easier and More Dependable With AI-Driven Digital Asset Management
Applying AI to the compliance-verification process augments both automated metadata-processing tools and the human workflow, providing another set of “eyes” that can watch and identify known actors in the video faster than humans ever could. Taken a step further, combining artificial intelligence with cloud-native digital asset management transforms the way broadcasters handle Children’s Television Act compliance and improves the efficiency of their operations.
Broadcasters can accomplish this task through Veritone Core™, an enterprise cloud-native asset management solution. All of the assets in Core are processed by aiWARE running multiple cognitive engines, including Facial Recognition, Facial Detection, Logo Recognition, Speech-to-Text, Optical Character Recognition, and Object Recognition — all in a secure cloud environment for live and archived broadcasts. This results in the generation of metadata, which may include the names of identified actors, sponsor or advertiser logos, and more.
As a result, Core helps broadcasters proactively avoid conflicts between children’s programs and the ads that run alongside them. aiWARE is continuously trained to recognize new actors too, enabling broadcasters to reindex and examine their archives for potential conflicts.
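The effect of that reindexing step can be pictured with a small sketch. Everything below is illustrative, using made-up names and a stand-in recognize_faces function rather than aiWARE’s actual retraining or reprocessing mechanism: when the set of recognizable actors grows, archived ads are run through recognition again and only newly discovered names are reported.

```python
def reindex_archive(archived_ads, recognize_faces, known_actors):
    """Re-run face recognition over archived ads after the known-actor set grows.

    archived_ads: dict mapping ad ID -> actor names found when the ad was first processed.
    recognize_faces: callable that returns the actor names found in an ad today.
    known_actors: the current, expanded set of identifiable actors.
    Returns newly discovered actors per ad, i.e. potential new conflicts.
    """
    new_findings = {}
    for ad_id, old_names in archived_ads.items():
        current = set(recognize_faces(ad_id)) & set(known_actors)
        newly_found = current - set(old_names)
        if newly_found:
            new_findings[ad_id] = sorted(newly_found)
    return new_findings

# Example: an actor unknown in 2019 is recognizable today, so an old trailer now conflicts.
archive = {"AD-2019-0456": ["Sam Sample"]}
print(reindex_archive(archive,
                      recognize_faces=lambda ad_id: ["Sam Sample", "Jane Example"],
                      known_actors={"Sam Sample", "Jane Example"}))
# -> {'AD-2019-0456': ['Jane Example']}
```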
With Core as the platform where assets live and aiWARE enriching the metadata behind the scenes, broadcasters get the best of both worlds — a sophisticated, powerful cloud-native asset management system and cognitive engine that go far beyond what any manual logging workflow could ever achieve.
Peeking Under the Hood: How aiWARE and Core Deliver Compliance Data with Fewer Resources
As the digital asset management system, Core ingests an advertisement from a broadcaster as a high-resolution video file in its native format and stores whatever metadata the broadcaster provides — ad number, category, advertiser name, brand, etc. — as structured data in the system. From there, Core kicks off an automated workflow whereby aiWARE runs several different cognitive engines against that file.
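As a rough illustration, the broadcaster-supplied metadata attached to an ingested spot might be modeled something like the record below. The field names are hypothetical and do not reflect Veritone’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class IngestedAd:
    """Hypothetical record for an ad spot as it enters the asset manager."""
    ad_number: str                      # broadcaster's internal spot ID
    advertiser: str                     # advertiser or sponsor name
    brand: str                          # brand the spot promotes
    category: str                       # e.g. "movie trailer", "toy", "snack food"
    video_uri: str                      # location of the high-resolution native file
    ai_metadata: dict = field(default_factory=dict)  # filled in later by the engines

spot = IngestedAd(
    ad_number="AD-2024-00123",
    advertiser="Example Studios",
    brand="Example Movie",
    category="movie trailer",
    video_uri="s3://broadcaster-assets/ads/AD-2024-00123.mov",
)
```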
Once the file is in Core, aiWARE quickly inspects the video to help identify critical information about its content. For example, upon ingestion, aiWARE applies face detection engines to determine whether faces exist. If so, facial recognition engines evaluate the detected faces against a database of known actors and return the actor’s name to Core whenever there is a match. Meanwhile, optical character recognition engines scan the text that appears on screen and store it as metadata, which is useful when an actor’s name appears on screen. A transcription engine converts speech to text, making it easier to spot phrases used in both the program and the ad. Object recognition engines identify things such as lingerie or beer that should not be shown during children’s programming, while logo recognition engines identify the logos for such products. These metadata results arrive in near real time, much faster than manual tagging.
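The conditional cascade described above, cheap detection first and deeper engines only when warranted, can be sketched as follows. The function names (detect_faces, recognize_faces, and so on) are stand-ins for whichever engines a platform exposes, not aiWARE’s actual interfaces:

```python
def enrich_ad(video_uri: str,
              detect_faces, recognize_faces, run_ocr,
              transcribe, detect_objects, detect_logos) -> dict:
    """Hypothetical engine cascade: run detection first, deeper engines only when needed."""
    metadata = {}

    # Step 1: face detection decides whether facial recognition is worth running.
    face_regions = detect_faces(video_uri)
    if face_regions:
        # Step 2: match detected faces against a database of known actors.
        metadata["actors"] = recognize_faces(video_uri, face_regions)

    # On-screen text may also carry an actor's name (e.g. title cards in a trailer).
    metadata["on_screen_text"] = run_ocr(video_uri)

    # Transcription helps spot phrases shared between the program and the ad.
    metadata["transcript"] = transcribe(video_uri)

    # Objects and logos catch products (beer, lingerie, etc.) that conflict
    # with children's programming even when no recognizable face appears.
    metadata["objects"] = detect_objects(video_uri)
    metadata["logos"] = detect_logos(video_uri)
    return metadata

# Usage with stub engines (real engines would call the platform's services):
result = enrich_ad("s3://broadcaster-assets/ads/AD-2024-00123.mov",
                   detect_faces=lambda v: ["face@00:12"],
                   recognize_faces=lambda v, faces: ["Jane Example"],
                   run_ocr=lambda v: ["IN THEATERS FRIDAY"],
                   transcribe=lambda v: "coming soon to theaters",
                   detect_objects=lambda v: [],
                   detect_logos=lambda v: ["Example Studios"])
print(result["actors"])  # ['Jane Example']
```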
Once aiWARE has delivered the enriched metadata into Core, users can automatically populate that information into their traffic systems via the Core API. From there, the user’s traffic server will match this information against the data for TV shows in its lineup and, if there’s a name match, either cancel or flag any offending or conflicting commercials that are set to run in a given window.
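On the traffic-system side, the name-matching step could work roughly along these lines. This is a simplified sketch that assumes both the ad metadata and the program lineup expose plain lists of names; it is not a description of any particular traffic server:

```python
def find_conflicts(ad_actors, lineup):
    """Flag ads whose recognized actors also appear in a show airing in the same window.

    ad_actors: actor names returned by facial recognition for one ad.
    lineup: list of dicts like {"show": ..., "window": ..., "cast": [...]}.
    Returns (show, window, actor) tuples that should be flagged or pulled.
    """
    conflicts = []
    ad_cast = {name.lower() for name in ad_actors}
    for slot in lineup:
        for actor in slot["cast"]:
            if actor.lower() in ad_cast:
                conflicts.append((slot["show"], slot["window"], actor))
    return conflicts

# Example: a trailer features an actor who also stars in a scheduled children's show.
lineup = [
    {"show": "Example Kids Show", "window": "Sat 08:00-08:30",
     "cast": ["Jane Example", "Sam Sample"]},
    {"show": "Cartoon Hour", "window": "Sat 08:30-09:30",
     "cast": ["Voice Actor A"]},
]
print(find_conflicts(["Jane Example"], lineup))
# -> [('Example Kids Show', 'Sat 08:00-08:30', 'Jane Example')]
```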
Core stores all the results from the aiWARE engines chronologically along the asset’s timeline. When users conduct a search, Core returns the matching assets. Clicking on an asset opens it in a player with a progress bar at the bottom and dots along the bar marking every moment that matches the search. Clicking on a dot jumps the user directly to the corresponding search result, with the timecode value displayed on the right side of the screen.
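The timeline behavior described here, a dot at each matching moment, amounts to storing every engine result with a timecode and filtering on the search term. A minimal illustration, again with made-up data rather than Core’s internal format:

```python
# Hypothetical time-coded engine results for one 30-second spot.
timeline = [
    {"timecode": "00:00:03", "engine": "facial_recognition", "value": "Jane Example"},
    {"timecode": "00:00:07", "engine": "ocr",                "value": "IN THEATERS FRIDAY"},
    {"timecode": "00:00:12", "engine": "logo_recognition",   "value": "Example Studios"},
    {"timecode": "00:00:21", "engine": "facial_recognition", "value": "Jane Example"},
]

def search_moments(timeline, query):
    """Return the timecodes whose metadata matches the query: the 'dots' on the player bar."""
    q = query.lower()
    return [entry["timecode"] for entry in timeline if q in entry["value"].lower()]

print(search_moments(timeline, "jane example"))   # ['00:00:03', '00:00:21']
```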
Besides feeding data from Core automatically into their traffic servers, broadcasters also get an easy and convenient way to review commercials for compliance via the Core search bar.