Goodbye, Outstream: The Digital Video Classification Shakeup

When an advertiser places video content on a publisher’s site, the opportunities are manifold.

Do they place the ad between content, next to content or floating next to content? Is it an adhesion unit, or do you scroll past the ad? Does the player loop ads, or does the video ad precede content? Is that content custom or AI-generated? Is the player click-to-play or autoplay? Sticky or not?

Until recently, buyers' understanding of these options was murky, often reduced to a simple binary: instream or outstream. Buyers haven't always known exactly what video ad experience they were buying. Changing the definitions, as the IAB Tech Lab did, gives buyers more transparency.

We hear from our associate editor, Anthony Vargas, who is reporting on the changes wrought by this new technical standard.

GPTs Come For Media Buying

Then, we've all seen what ChatGPT can do: anyone who has chatted with the AI bot has witnessed its uncannily human ability to pull together information and transform it. But what can ChatGPT do for media buying?

Ad tech companies are adopting the large language models underpinning ChatGPT to create new contextual targeting models, and they are adopting its front-end functionality as well. Imagine being able to paste a creative brief into a chat function and letting AI do the rest. AdExchanger Senior Editor Hana Yoo shares more about the ad tech companies that are using the latest large language models to classify content and improve media buying’s contextual targeting capabilities.
