Blog #19: Predicting the Future of Filmmaking
As a fun thought experiment, I’d like to predict the future of filmmaking. In the last few years, we’ve seen shifts in filmmaking like never before, with different formats and workflows popping up and taking over. I’m going to attempt to do the impossible (unless you’re a Simpsons writer) and predict how things will continue to change in the future.
Real-time 3D mapping that can be utilised in post to create whatever shot you so desire.
Here’s the concept: you press record and walk around someone, creating a 3D environment. Your camera will be able to tell where the subject starts and ends. In post, you import this as a 3D environment you can move around in. You could create a dolly shot, a pan, whatever you like; the possibilities are endless. This would be useful for reusing the same take from different angles, so you could cut from behind the subject to in front of them. You could also swap in a different subject or object in place of your original actor. It would be super versatile. Go nuts with it.
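To make that a little more concrete, here’s a rough sketch in Python (plain NumPy, with a made-up point cloud standing in for the capture) of how a post tool might re-project a 3D scan through a virtual camera and dolly it forward. This is purely illustrative; a real system would need dense reconstruction and proper rendering.

```python
import numpy as np

# Illustrative only: pretend this is a point cloud captured by walking
# around the subject (N points, each with an XYZ position and an RGB colour).
points = np.random.uniform(-1, 1, size=(5000, 3))
colours = np.random.uniform(0, 1, size=(5000, 3))

def project(points, cam_pos, focal=800, width=1920, height=1080):
    """Project 3D points through a simple pinhole camera at cam_pos,
    looking down the +Z axis. Returns pixel coordinates and a visibility mask."""
    rel = points - cam_pos                      # points in camera space
    in_front = rel[:, 2] > 0.1                  # ignore points behind the camera
    rel = rel[in_front]
    u = focal * rel[:, 0] / rel[:, 2] + width / 2
    v = focal * rel[:, 1] / rel[:, 2] + height / 2
    return np.stack([u, v], axis=1), in_front

# A virtual dolly: slide the camera from 3 m back to 1.5 m back over 48 frames.
for frame, z in enumerate(np.linspace(-3.0, -1.5, 48)):
    pixels, visible = project(points, cam_pos=np.array([0.0, 0.0, z]))
    # ...rasterise `pixels` with `colours[visible]` into an image here...
```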
AI-Generated Backdrops. Over the last few years, virtual environments have become the latest craze. They’re powered by LED panels connected together like Lego, displaying the work of the visual effects team on the project.
Here’s the concept: instead of hiring an expensive post-production studio, why couldn’t you click a button and generate an environment instantly with AI, shaping the end product with a few small prompts? You could even prompt the system for a few animated elements to move through the scene, like wind-blown leaves, adding depth and realism. Of course, this might live inside your post-production software for a smoother experience, but it’s something to think about…
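Just to show how little it could take, here’s a loose sketch of what that button press might look like under the hood, using an off-the-shelf text-to-image model via Hugging Face’s diffusers library. The model and prompt are only examples, not a claim about how any studio tool actually works.

```python
# Sketch only: generating a backdrop plate from a text prompt with an
# off-the-shelf diffusion model via the Hugging Face `diffusers` library.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # example model; swap in whatever you prefer
    torch_dtype=torch.float16,
).to("cuda")

prompt = "misty pine forest at golden hour, volumetric light, cinematic backdrop"
backdrop = pipe(prompt).images[0]        # a PIL image
backdrop.save("backdrop_plate.png")      # feed this to the LED wall or the comp
```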
Post-lighting.
Here’s the concept: you want to add another light to your shot, but it’s too late because you’ve already wrapped. That’s where this comes in. You 3D-track your shot, creating a depth map, which you can then use to throw in a virtual light and add more nuance to your frame. It’s also a good way to pre-viz a setup, dialling in the look you want before you pull up a single C-stand.
Edit: Blackmagic released this in Resolve 20.2, in an effect called AI Environment.
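For the curious, here’s a toy version of the underlying maths in Python: take a depth map, estimate surface normals from it, and add a simple virtual point light on top of the frame. It’s a sketch of the idea, not how Resolve or any other tool actually implements it.

```python
import numpy as np

def relight(frame, depth, light_pos, intensity=0.6):
    """Add a crude virtual point light to `frame` (H x W x 3, floats in 0..1)
    using `depth` (H x W, metres) to approximate the scene geometry."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Rough 3D position per pixel (ignores real camera intrinsics).
    positions = np.stack([xs.astype(float), ys.astype(float), depth * 100.0], axis=-1)

    # Estimate surface normals from the depth gradients.
    dzdx = np.gradient(depth, axis=1)
    dzdy = np.gradient(depth, axis=0)
    normals = np.stack([-dzdx, -dzdy, np.ones_like(depth)], axis=-1)
    normals /= np.linalg.norm(normals, axis=-1, keepdims=True)

    # Lambertian shading from a single point light.
    to_light = light_pos - positions
    to_light /= np.linalg.norm(to_light, axis=-1, keepdims=True)
    shade = np.clip((normals * to_light).sum(axis=-1), 0, 1)

    return np.clip(frame + intensity * shade[..., None], 0, 1)

# Toy usage: a flat grey frame and a tilted-plane depth map, lit from the upper left.
frame = np.full((1080, 1920, 3), 0.3)
depth = np.tile(np.linspace(2.0, 5.0, 1920), (1080, 1))
lit = relight(frame, depth, light_pos=np.array([200.0, 100.0, 50.0]))
```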
I’ll admit, I’ve run out of ideas and asked ChatGPT for help on this one. But it’s good, so I wanted to share it:
AI Cinematographer’s Assistant. Using AR technology, an app will scan your scene and suggest lighting ideas, for example an overhead, a cool backlight, or diffusion on the key light you’ve already switched on. This will make sure you don’t look back in regret and get the best out of every shot. It could also suggest shots, or show how a certain move might look in post with different lighting styles. This one is exciting and one I would have a lot of fun with, as someone always looking to improve.
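I have no idea how such an assistant would really work under the hood, but even a crude version could start with simple exposure heuristics: flag crushed shadows, clipped highlights or a flat image, and suggest a fix. A toy sketch (the thresholds are entirely made up):

```python
import numpy as np

def lighting_suggestions(frame):
    """Very naive lighting hints from a single frame (H x W x 3, floats in 0..1).
    Thresholds are arbitrary; a real assistant would use learned models."""
    luma = 0.2126 * frame[..., 0] + 0.7152 * frame[..., 1] + 0.0722 * frame[..., 2]
    tips = []
    if (luma < 0.05).mean() > 0.25:
        tips.append("Over a quarter of the frame is near black: consider a soft fill light.")
    if (luma > 0.95).mean() > 0.10:
        tips.append("Highlights are clipping: add diffusion to the key or pull exposure down.")
    if luma.std() < 0.08:
        tips.append("The frame is flat: try a cool backlight to separate the subject.")
    return tips or ["No obvious issues spotted."]
```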
Smart reframing for different aspect ratios. So picture this:
You’re on TikTok/YouTube/Instagram and watching a vertical video that you’d rather watch in landscape. With a touch of AI, smart reframing lets you flip it to landscape without losing any detail. How it does that is up for debate. Does it stretch the image wide, creating a funny-looking result? Or does it use AI to predict and fill in the extra detail at the edges? That’s up to the engine, but it’s nothing a good AI engine couldn’t do in the future…
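For a feel of the two options, here’s a minimal sketch using OpenCV: the stretch is a one-liner, while the “predict the extra detail” version is stubbed out, because that’s exactly the part that needs a generative model (the outpainting call is hypothetical).

```python
import cv2

def stretch_to_landscape(vertical_frame, width=1920, height=1080):
    """Option 1: brute-force stretch. Fast, but faces get comically wide."""
    return cv2.resize(vertical_frame, (width, height))

def outpaint_to_landscape(vertical_frame, width=1920, height=1080):
    """Option 2: keep the original pixels centred and ask a generative model
    to invent the missing left/right detail. The model call is a placeholder."""
    scale = height / vertical_frame.shape[0]
    resized = cv2.resize(vertical_frame, (int(vertical_frame.shape[1] * scale), height))
    pad = (width - resized.shape[1]) // 2
    canvas = cv2.copyMakeBorder(resized, 0, 0, pad, width - resized.shape[1] - pad,
                                cv2.BORDER_CONSTANT, value=(0, 0, 0))
    # generative_outpaint(canvas, mask=...)  # hypothetical AI fill of the black bars
    return canvas
```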
This was interesting for me to write and think about. I hope it got your creative juices flowing and thinking differently about the possibilities out there. It’s a wild time to be alive.
Thanks for reading, and if you have any comments, let me know down below.
Good luck out there.
-D.C.