The marketing stack is shifting again—but this time, the changes feel connected. Search is starting to answer more like a colleague than a catalog. Video production moves from storyboard to publish in the space of a single creative sprint. Social experiments hint at new ways to blend conversation with conversion. Together, Google’s Search Generative Experience (SGE), Runway’s text-to-video capabilities, Descript’s AI-powered editing, and TikTok’s testing of its “Tako” chatbot are creating a new rhythm for how brands show up across channels.

For marketers, the opportunity isn’t in chasing each feature on its own. It’s in finding the through-line—how search snippets, cinematic ad assets, streamlined edits, and conversational AI can reinforce the same narrative in different contexts. That’s where the gains compound: faster cycle times, a higher share of on-brand outputs, more visibility in generative snippets, and deeper engagement in short-form and social spaces.
The change starts in search, where SGE is surfacing direct, conversational answers above traditional links. Content that is structured, credible, and easy to quote is finding its way into these new generative snippets. For B2B teams, that means tightening how solution pages and thought-leadership posts present expertise—building clear clusters of related content and answering key questions concisely. In B2C, product and lifestyle copy benefits from a softer, more conversational tone so it feels natural when pulled into an AI-driven answer, increasing the chance of being the snippet that drives discovery.
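One concrete way to make answers "structured, credible, and easy to quote" is schema.org FAQ markup, which states key questions and concise answers in a machine-readable form. The sketch below is illustrative only (the helper name and sample copy are invented for this example, not taken from any vendor's tooling); it uses Python's standard library to emit FAQPage JSON-LD that could be embedded in a solution page.

```python
import json

def build_faq_jsonld(pairs):
    """Turn (question, answer) pairs into schema.org FAQPage JSON-LD.

    Hypothetical helper: one way to surface concise, quotable answers
    alongside long-form content for generative search experiences.
    """
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }, indent=2)

# Example copy is a placeholder, not real product messaging.
faq = build_faq_jsonld([
    ("What does the platform integrate with?",
     "It connects to most major CRMs through a documented REST API."),
])
print(faq)
```

The same pairs can double as the concise Q&A blocks in the page body, keeping the human-readable and machine-readable versions of each answer in sync.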
That same attention to tone and clarity now extends to video. Runway’s Gen-2 has brought cinematic text-to-video creation to the desktop, turning written prompts into branded visuals in minutes. The real win is iteration speed—producing multiple ad narratives for a product launch without tying up an entire production crew. B2B marketers can adapt this approach for event promos or explainer clips that look premium without requiring a premium timeline. B2C brands can launch social-ready trailers, swapping scenes to match the interests and styles of different audience segments.
Once the visuals are created, Descript’s AI-powered editing keeps the production line moving by handling repetitive, time-consuming cuts. A campaign manager can take a long-form brand video, create shorts for social, and even prepare an audio-only version for a podcast, all in one workspace. What once took a week can now be completed in a day, freeing teams to focus on story development and creative direction instead of file wrangling.
Even in the social space, the AI layer is moving closer to the point of interaction. TikTok’s “Tako” chatbot introduces interactive, AI-driven conversation inside the platform itself. Early uses include influencer Q&A, script suggestions for creators, and guiding followers toward a product page without leaving the chat. This creates a bridge between discovery and decision, allowing brands to shape the customer journey while it is happening in real time.
The same principle is now finding its way into blogs and owned content. Tools like YOU-TLDR make it possible to embed short, branded AI-generated video summaries directly into long-form posts. These visual hooks help extend on-page engagement, keep visitors scrolling, and give time-pressed readers a quick way to share the key points with others. Across search, video, and social, the pattern is the same: AI isn’t replacing the marketer—it’s removing the gaps between an idea, its execution, and its delivery to the audience.
Best Practice Spotlight
CitizenM Hotels took an unconventional path to stand out in the crowded hospitality market. Partnering with creative agency KesselsKramer London, the brand produced what was billed as the world’s first fully AI-generated video advertisement using Runway’s Gen-2 text-to-video technology. Instead of polished travel visuals, the ad depicted surreal, unsettling versions of generic hotels — imagery designed to contrast sharply with CitizenM’s own modern, human-focused luxury experience. The team used text prompts to direct Gen-2 toward creating nightmarish yet memorable sequences, positioning the brand as the opposite of the outdated settings shown. While the campaign’s measurable results weren’t disclosed, the approach demonstrated how AI video can be used to create a strong emotional hook and a clear point of differentiation in brand storytelling.
Creative Consulting Concepts
B2B – A consulting firm audits its knowledge library, rewriting top articles into snippet-friendly formats while keeping depth intact. SGE begins surfacing its answers in decision-stage searches, bringing in higher-intent traffic and shortening sales cycles. The risk? Cutting too much detail in pursuit of brevity—balanced editing is key.
B2C – A consumer brand uses Runway for launch visuals, Descript to create multiple cutdowns, and Tako for influencer Q&A during launch week. Sales pages see a sharp lift in traffic, but the real gain is in retention—the brand stays in conversation mode long after launch day.
Non-Profit – A global non-profit hosts a live Q&A with Tako, uses YOU-TLDR to summarize it into a 45-second clip, and embeds it in an SGE-optimized blog post. The result: more shares, more site visits, and a noticeable uptick in small-donor conversions. The watch-out? AI summaries need human review to avoid oversimplifying complex issues.
References
CMSWire. (2023, June 9). Google’s Search Generative Experience: Shaking up the SEO game.
Maginative. (2023, June 11). Runway introduces Gen-2: Pushing the boundaries of generative AI.
TechCrunch. (2023, June 27). AI’s impact on video production: Descript and the new wave.
TechCrunch. (2023, May 25). TikTok is testing an in-app AI chatbot called ‘Tako’.
Google Blog. (2023, May 10). Supercharging Search with generative AI.
YOU-TLDR. (2023, August 20). AI for YouTube video summary.
Creative Bloq. (2023, June 23). The first advert made with AI video will give you nightmares.