"Every routine process will be automated – we'll all become entrepreneurs"
Anna Zdorenko of INCYMO.ai chats with us about using AI to create creatives that outperform human-made ads.
Hello there! If this is your first time, congrats. You’ve found the weekly newsletter that quizzes leaders from the games industry about the practical approaches they’re taking with AI. Don’t worry about what you’ve missed: you can read six months of these interviews for free in the archive.
Your Q&A this time is with Anna Zdorenko, co-founder and CEO of INCYMO.ai. She discusses how her platform is changing the way studios approach marketing, using AI to analyse thousands of video creatives, spot what works, and generate briefs that often outperform ads conceived by humans.
As always, scroll to the end for a short round-up of the latest news and links.
Anna Zdorenko, INCYMO.ai

Meet Anna Zdorenko, co-founder and CEO of INCYMO.ai, a platform that uses AI to boost revenue for games through data-driven video ads.
With a background in medical engineering and startup consulting, Anna launched INCYMO.ai in 2021, just before the recent gen AI boom. She brings a decade of experience in sales, product management, and business consulting to her mission of making AI-powered marketing accessible to both indie developers and major studios.
Top takeaways from this conversation:
AI excels at analysing vast amounts of market data that humans simply can’t process. Anna's platform can identify subtle elements and correlations in thousands of video ads that dramatically impact performance.
The most valuable part of AI in advertising isn't necessarily generating the final creative, but identifying the winning concepts.
While generative video tech is advancing rapidly, Anna believes fully automated gameplay video generation is still some time away. She predicts the next breakthrough will come through playable ads generation.
AI Gamechangers: Please tell us a little bit about yourself and your own background. How did you get involved in this business?
Anna Zdorenko: When I was at school, I dreamed of creating innovations, and that's why I graduated from a technical university as a medical engineer. After that, I joined a startup where we created a hardware solution, and later I consulted for more than 100 startups.
I realised I wanted to build my second company. Through my network, I found a mentor who told me about gaming. He told me about his own need to personalise in-game purchases. I started to research games, and I realised that it's a very interesting industry that is growing rapidly.
I went to Cyprus, where I talked to many game developers, founders, and product managers, and they confirmed the need for AI for game personalisation.
We built this platform, and after one of our clients asked for prediction models for marketing, I started to talk to other marketing managers. All of them were struggling with creating ads that performed well.

The first idea was about creating prediction models for marketing, but some of them had already tried to build these for themselves. The biggest pain point was getting the best-performing creatives. Through my network, I found my co-founders, who had already solved this problem.
I love the gaming industry so much. I'm pleased to be here because I have worked in other industries, and we're so lucky to be in games!
You started INCYMO.ai in 2021. That was before ChatGPT launched and before AI really took off in the last couple of years. So you were quite early on! How has your vision evolved since you launched?
Our team has actually been building AI solutions for businesses for the last 10 years. AI was around long before ChatGPT, of course.
We built our core infrastructure before ChatGPT. After all the hype, we started to use different LLMs in our pipelines to make explanations for our clients smoother and faster. But here's the interesting part: it's not enough to rely solely on them. You still need to analyse all the market data. You have to capture video creatives and their performance data, break them down into many meaningful gaming and ad creative elements (tags and their combinations), and teach your models how to work with video inputs to deliver valuable insights. Only after that can you translate insights into credible briefs.
It was great for us because we created the first part, and then we could streamline the second part. We were like, "Woo-hoo, we don't need to build it from scratch!"
Can you talk us through how your system works? How does your platform analyse game footage and market data? How does it generate the briefs? What is the experience of working with your tools?
First of all, clients provide us with their game link. From this link, we analyse what is on their store page, the main tag of their game genre, and the sub-genre. We also have our own database that we're ingesting from the market – ad creatives linked to their games, their genres, and their performance marketing metrics.
That's how we can find similar games with similar tags and analyse their ad creatives' performance. After that, we gather these specific videos and analyse each second: what elements they have, what game mechanics, sub-mechanics, characters, emotions, texts, voice lines, whatever.
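To make the shape of that analysis concrete, here is a minimal Python sketch of the kind of aggregation Anna describes: tag each second of a competitor creative, then average a performance metric over every creative a tag appears in. All of the names, tags, and metrics below are hypothetical illustrations; INCYMO.ai's actual models, tag taxonomy, and schema are not public.

```python
# Hypothetical sketch: aggregate which tagged elements co-occur with strong
# performance across market creatives in a given genre. Illustrative only.
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Creative:
    game_genre: str                                 # e.g. "match-3"
    ipm: float                                      # installs per mille, a common UA metric
    tags_per_second: list[set[str]] = field(default_factory=list)

def aggregate_tag_performance(creatives: list[Creative], genre: str) -> dict[str, float]:
    """Average a performance metric over every creative in which a tag appears."""
    totals: dict[str, float] = defaultdict(float)
    counts: dict[str, int] = defaultdict(int)
    for c in creatives:
        if c.game_genre != genre:
            continue                                # only compare against similar games
        seen = set().union(*c.tags_per_second) if c.tags_per_second else set()
        for tag in seen:
            totals[tag] += c.ipm
            counts[tag] += 1
    return {tag: totals[tag] / counts[tag] for tag in totals}

# Usage: rank the elements that currently perform best in this genre.
market = [
    Creative("match-3", ipm=12.0, tags_per_second=[{"fail_moment", "character:grandma"}]),
    Creative("match-3", ipm=4.5, tags_per_second=[{"pure_gameplay"}]),
]
ranked = sorted(aggregate_tag_performance(market, "match-3").items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked)
```

In practice the second-by-second tagging itself would come from video models; the point of the sketch is only the aggregation step that no human could perform across thousands of creatives.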
“Game development will be faster. Game analysis will be faster, and insights will come faster. We'll help with different features so they can think more about great game mechanics and worry less about the monetisation and user acquisition parts”
Anna Zdorenko
That's how we get this information – which elements are currently performing best in the market. After that, our clients choose how they want to get results, be it UA/gameplay creatives, mislead creatives, or other formats. Then, they provide their initial gameplay video.
From this video, our platform gets information about the real gameplay, what characters they have, and what UX/UI elements they have. It then combines the best-performing elements from the market with the existing gameplay video from the game. That's how it generates new, unique ideas for top-performing creatives.
If clients have previous creatives, they can also provide them to our team, and we will split each creative by performance – well-performing, mediocre, and poor. The platform then analyses all these elements across their own creatives and starts to make predictions about which elements are most likely to perform well and which aren't. Don't waste your money or your time on non-performing ads! Focus on what performs and combine that with the market data for more unique ideas.
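As a rough illustration of that prediction step, the sketch below fits a simple classifier over one-hot encoded element tags from a studio's own labelled creatives and ranks the element weights. It is only a hedged approximation of the idea, not INCYMO.ai's actual method, and every tag and label is invented.

```python
# Hypothetical sketch: given past creatives labelled by performance tier,
# estimate which tagged elements are likely to perform. Illustrative only.
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.linear_model import LogisticRegression

past_creatives = [
    ({"fail_moment", "voice_over"}, "good"),
    ({"fail_moment", "meta_layer"}, "good"),
    ({"pure_gameplay"}, "bad"),
    ({"pure_gameplay", "voice_over"}, "not_enough"),
]

tags = [t for t, _ in past_creatives]
labels = [1 if lbl == "good" else 0 for _, lbl in past_creatives]  # binary: good vs the rest

mlb = MultiLabelBinarizer()
X = mlb.fit_transform(tags)                 # one-hot encode the element tags
clf = LogisticRegression().fit(X, labels)

# Elements with the largest positive weight are the ones worth keeping in new briefs.
weights = sorted(zip(mlb.classes_, clf.coef_[0]), key=lambda kv: kv[1], reverse=True)
for tag, w in weights:
    print(f"{tag:15s} {w:+.2f}")
```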
You did a case study with Game Gears. It showed impressive results, with the AI-generated ad concepts outperforming the studio's own benchmark. Can you talk us through what your platform was able to do that the human team couldn't?
The core of it, I suppose, is that humans cannot analyse all the videos. It's impossible! We can try to find the top-performing creatives of our competitors, figure out what works there, watch them with our eyes, make some notes, and come up with ideas. But after all that, we cannot connect them to all the other ad creatives. We cannot analyse all the poorly performing creatives.
In these correlations, we miss a lot of information – even small elements. We have seen cases where you change a sub-mechanic or one small, specific element and performance becomes completely different. It's exactly for these details that automation can help: analysing all these parts in the right way, on the right timeline, with the right small details.
“Every routine digital process will be automated. What will we be doing in the future? Probably, we'll have to be more like founders and entrepreneurs, creating our own small solutions based on AI rather than creating big teams with big IT processes”
Anna Zdorenko
After that, they shared their results with us, creating a feedback loop. They tested four creatives and gave us feedback — one of them performed the best. We thought, "Cool." So we fed that information back into the INCYMO.AI system. It analysed which elements didn’t perform — based on visuals, tags, emotions, and game mechanics — and which ones worked well. Then it focused on the high-performing elements, asking, “What else is similar to these?” The system generated the next batch. In the second generation, two out of four creatives performed well; in the third, even three out of four. That was really exciting — a strong validation of the approach.
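A hedged sketch of that feedback loop might look like the following: each round of test results re-weights the elements that appeared in winning or losing creatives, and the next batch of concepts is sampled to favour the current winners. The 1.5/0.7 multipliers and the element names are arbitrary placeholders, not anything Anna describes.

```python
# Hypothetical sketch of the iterate-and-refocus loop. Illustrative only.
import random
from collections import defaultdict

weights: dict[str, float] = defaultdict(lambda: 1.0)

def update(batch_results: dict[frozenset[str], bool]) -> None:
    """batch_results maps a creative's element set to whether it beat the benchmark."""
    for elements, performed in batch_results.items():
        for e in elements:
            weights[e] *= 1.5 if performed else 0.7

def next_batch(candidate_elements: list[str], size: int = 4, k: int = 3) -> list[list[str]]:
    """Sample new concepts, biased towards the currently well-weighted elements."""
    return [
        random.choices(candidate_elements,
                       weights=[weights[e] for e in candidate_elements], k=k)
        for _ in range(size)
    ]

# Round 1: one of four concepts beat the benchmark, so its elements gain weight.
update({
    frozenset({"fail_moment", "character_closeup"}): True,
    frozenset({"pure_gameplay"}): False,
    frozenset({"meta_layer"}): False,
    frozenset({"ui_walkthrough"}): False,
})
print(next_batch(["fail_moment", "character_closeup", "pure_gameplay", "meta_layer"]))
```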
Your platform can entirely automate the generation of briefs, and you've also got some semi-automated video production. How do you balance a fully automated process with human oversight?
It's a very good question because not everyone in the market right now understands the possibilities of AI. We have to explain that. Today, we can generate the entire video on our platform. We have the pipelines, we have the models, but what we see is that it doesn't perform the same; it doesn't provide the same quality or the same user engagement.
That's why we highly recommend using brief generation for ideation, to get the best-performing ideas. In ad creatives, what we see from our experience is that the idea is much more important than the video visualisation. Of course, visualisation is important, but we have seen examples where you can create less polished visuals, spend less money on production, and then, once you find your top-performing idea, improve it.

An interesting case is that you can sometimes convert a creative from 2D into 3D, and it will get you even higher performance metrics. So it's super important to find the right idea and the right brief. After that, we recommend producing the video manually because, in this case, we see that the results can be great.
For everything else, we probably need to wait a bit longer for generative models that can produce real gameplay videos, because it's super hard. In my opinion, we will not see it this year; it's unique, specific stuff that will probably arrive first through playable ads generation.
If you can generate a mini-game, it means it can reproduce your gameplay mechanics, and you will probably have the option to screen-record it to produce generated gameplay videos.
The technology changes all the time! How do you ensure you stay up to date with all the latest possibilities of AI? Is your platform capable of being updated depending on new models?
We're doing this almost every week or two. We update it because we have many pipelines inside, and for specific needs we see there are new models we have to test. Our R&D team tests them: "Okay, this works... this doesn't work... this doesn't provide enough context to integrate..." After that, if we see something works internally, we start to integrate it into the platform.
It's fast. Sometimes I think, "Will we be doing this forever, updating each element?!" Probably yes, because we see better results.
How does your approach differ between big and small companies? Can a small indie team benefit from this sort of AI tool as much as a big multinational company?
They probably have different approaches. I noticed that small indie studios didn't have a lot of ad creatives before. Even on a demo call, one founder generated a first brief based on their game link and their gameplay video. They had some previous creatives, but it probably wasn't efficient for them to provide us with all the previous data. So, they tried the online self-service platform and generated a brief. After that, they ordered a manual production of their video. In three days, they got it, put it into their campaigns, and gave us feedback that it hit the same benchmark and metrics as their previous top-performing creatives.
Because of this, I assume that small indie studios can just generate briefs. Even with the free trial on our platform, they can create videos and get good metrics for their user acquisition right from the beginning.
Big corporates probably need to provide more data because they have already tested some approaches. They should share their previous data – the thousands of ad creatives they've tested before – so we can analyse all of them. We can then find the best-performing creatives based on their historical data.
I think this is the biggest difference. Bigger companies have more experience and more tested creatives and probably need more help adding specific rules for different platforms and specific situations. However, in general, all companies can use the self-service platform for their own needs or use it with all the advanced options.
It was one of my goals to create a platform that would be affordable for small companies. I can see that the majority of solutions in the market come with very high prices aimed at big corporations. I understand them; it's a go-to-market approach. But we try to do our best to make it affordable for everyone.
There are still some sceptics about AI. People worry that the information they share with your platform will be used to train it for their competitors, for instance. How do you reassure clients that working with an AI platform is safe?
It's a great question. Before, we wanted to connect our platform to all the ad networks, and we even had an integration with Meta, where we could launch your campaign, analyse the data, get metrics back, and approve. But after talking with customers, I realised that they don't want to provide this access. They're very cautious about their internal data. That's why we trained our platform on market data. We don't really need their own creatives if they don't want to share them; they can keep them to themselves.
“Humans cannot analyse all videos. It's impossible! We can try to find the top-performing creatives of our competitors, watch them with our eyes, make some notes, and come up with ideas. But after all that, we cannot connect them to all the other ad creatives [without AI]”
Anna Zdorenko
We also trained our platform to work with anonymised data, so they don't need to provide us with their spend, CPI, or ROAS. We only ask them to provide their top- and medium-performing creatives in this case. We use this data only for that specific client, and the algorithm is trained only for that specific client.
We see more or less the same from the market creatives. That's why the legal departments of even big, public companies can pass it through their compliance checks. It's already open market data, and they don't provide us with anything very specific.
You've worked in a number of industries. What areas do you think are going to be most affected by AI in the next couple of years?
I think that every routine digital process will be automated. Even programming. Before, we had to write a lot of code with a lot of explanations. In the future, the only thing people will need to do is clearly describe what they need, and it'll be automated.
So, any content creation, analytics, programming. All this will be done even faster and better. What will we be doing in the future? Probably, we'll have to be more like founders and entrepreneurs, creating our own small solutions based on AI rather than creating big teams with big IT processes. The world will be shifting.
What's next on the roadmap for your company?
Our R&D department is already preparing the process for generating playable ads. We believe that it'll be done this year, hopefully in the next several months.
We're also building a deep analytics platform for big companies' in-house teams who want to see results for themselves. They don't only want detailed briefs: first of all, they want insights, and then briefs. So we'll help them more by providing more control, so they can choose specific genres if they want to create something new.

When you analyse the market data, you can generate new videos for new game types, test them, get metrics, and then create games as well as generate playable ads based on those videos. You get a mini-game to test for user acquisition and metrics, and after that, you can add more correlated game mechanics to that game.
That's why game development will be faster. Game analysis will be faster, and insights will come faster. We'll help with different features so studios can think more about great games with great mechanics and worry less about the monetisation and user acquisition parts. We're focusing on the industry, helping increase revenue and monetisation with AI. That's the name of the company: INCYMO.ai – increase your monetisation with AI.
When machine algorithms can beat top human benchmarks after just a few iterations, that is how the market will change dramatically. I go to events like PGC and GDC to tell more people about us, to find more partnerships, to work together, to scale our solution, and to become the market leader in this field.
Further down the rabbit hole
What’s been happening in AI and games? Here’s your essential news round-up:
Krafton announced it had sold 1,000,000 copies of InZOI, the Sims-like simulation game, after just a week on sale. InZOI includes generative AI for users to create content for the game.
Meanwhile, Krafton CEO Kim Chang-han and Nvidia CEO Jensen Huang met in California on Thursday to chat about potential areas of cooperation in AI.
Black Mirror’s new season is accompanied by a standalone game. Thronglets ties in with episode 4, “Plaything”, which satirises the artificial life craze of the 1990s, with a horrific sci-fi conclusion. You can play the game on your phone now.

Welsh generative AI games studio 10six Games, founded by Tiny Rebel’s Susan Cummings and Lee Cummings, has come out of stealth with a six-figure total investment.
The AI-powered Storycraft platform (where players create and share interactive story worlds) has raised a $3 million seed round, led by Khosla Ventures.
Tickets are available for the Dubai GameExpo Summit 2025, which features a games-focused Investment Summit on 7 May and a Practical AI track on 8 May, amongst other topics and fringe events.
OpenAI is planning to phase GPT-4 out of ChatGPT and replace it fully with GPT-4o.
Forget Ghibli-style memes. Last week, the fun trend was getting 4o Image Gen/Sora to make pictures of ’90s video game characters playing their own games.
Microsoft has created an AI-generated version of Quake II. Muse (the name for its WHAM model) was announced in February. You can try this Copilot Gaming Experience out, if you can face playing the FPS classic at an unpredictable pace. John Carmack, lead programmer of the original Quake, even defended it as “impressive research work!”