Film to File: Film Studies Edition

Digital technologies have dramatically transformed the filmmaking process, and, a bit slower on the uptake, film studies scholarship is also modifying its tried-and-true approach. Until fairly recently, scholars presented their ideas by doing what I'm doing right now: typing words. If we are lucky, we may get an image to accompany our text, but that luxury is rare and expensive in the world of academic publishing. A massive disjuncture stems from utilizing a text-based medium to analyze a visual and auditory one. We've all had those moments when we want to bring our words to life, but unfortunately one can't embed videos into a print essay (for now, at least), and even if we could, there are many roadblocks to getting permission to do so. The issue of "permission" is still a contested one, but thankfully things are moving in the right direction: film copyright owners are seeing scholars as less of a threat to their income than they did before. (Note: we present ZERO threat to motion picture revenues. In fact, we inadvertently provide free advertising.) 

The growing trend of the video essay format has the potential to revolutionize film studies scholarship. Yes, we still need words (lots of them) but the ability to combine them with visual and auditory demonstrations is a game changer. As a particularly strong example, I present to you "Joining Up: Scotland, Cinema, and the First World War." This short film utilizes documentary footage from the Scottish Screen Archive. By employing montage for criticism (as it was intended to be), the authors have arranged clips in such a way as to visually demonstrate their thesis about this particular moment and place. The authors guide the viewer through their claims by way of voice-over. Once you experience an illuminating video essay such as this one, it becomes clear that this practice needs to expand among cinema studies scholars. 

Beyond serving as a solid example of the video essay, the project also demonstrates additional important points:

  • Copyright: The authors make use of extraordinary footage that may or may not be owned by the Scottish Screen Archive. Intellectual property law, however, makes allowances for criticism. By utilizing the privileges granted by fair use, these scholars combine glorious visual examples to express a scholarly point.
  • Digitization: Making something like this was entirely possible before film became file, but it was really hard to do and took forever. The ease of combining clips (not to mention obtaining them: yay for the digital archive!) and adding scholarly voices to make sense of it all is new to this era. Additionally, all film students can watch this anytime and anywhere. Before we stop making this point because it seems tired (or soon will), it is important to recognize how digitization has not only facilitated innovations in scholarship but has also dramatically expanded access to the works created therein.

Can a computer write a screenplay? Yes!

I'm not sure cinema can get more digital than this short film. Yes, it was shot on film and yes, it utilizes a little CGI, but the deeply digital quality stems from the writing: it was written by a neural network, an algorithm.

The film begins with some confusing yet expository words: 

Just above your smartphone keyboard lives an artificial intelligence. It was trained on lots of texts and emails. And tries to guess what you'll type next. We were curious what would happen if we trained this kind of software on something else; science fiction screenplays. So we fed a LSTM Recurrent Neural Network with these: [long list of sci fi screenplay txt files]. Then gathered a cast & crew for one day. Then we fed in random seeds from a sci-fi filmmaking contest...[contest prompts]....and turned it on. This was the screenplay it wrote:

In other words, the screenplay was sourced from three ingredients: neural network software, sci-fi screenplays, and solicited user contributions. The creation of this film is innovative in its use of software, databases, and collective intelligence. Now, that's one complex digital "author."
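
For readers curious about what "training an LSTM on screenplays" actually involves, here is a minimal character-level sketch in Python with Keras. It is a generic illustration of the technique, not the Sunspring team's actual code; the corpus file name, network size, training budget, and seed prompt are all placeholder assumptions.

```python
# Minimal character-level LSTM text generator (illustrative sketch only --
# not the Sunspring team's pipeline). Assumes a plain-text corpus of
# screenplays saved as "screenplays.txt" (placeholder name).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Tiny slice of the corpus so the demo fits in memory.
text = open("screenplays.txt", encoding="utf-8").read()[:20000]
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

seq_len = 40
# One-hot encode overlapping sequences; each sequence predicts the next character.
X = np.zeros((len(text) - seq_len, seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(text) - seq_len, len(chars)), dtype=np.float32)
for i in range(len(text) - seq_len):
    for t, c in enumerate(text[i:i + seq_len]):
        X[i, t, char_to_idx[c]] = 1.0
    y[i, char_to_idx[text[i + seq_len]]] = 1.0

model = Sequential([
    LSTM(128, input_shape=(seq_len, len(chars))),
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=10)

def generate(seed, length=500):
    """Start from a seed prompt and let the network guess what comes next."""
    out = seed
    for _ in range(length):
        x = np.zeros((1, seq_len, len(chars)), dtype=np.float32)
        for t, c in enumerate(out[-seq_len:]):
            if c in char_to_idx:
                x[0, t, char_to_idx[c]] = 1.0
        probs = model.predict(x, verbose=0)[0].astype(np.float64)
        probs /= probs.sum()
        out += chars[int(np.random.choice(len(chars), p=probs))]
    return out

print(generate("INT. SPACESHIP - NIGHT. "))  # placeholder seed prompt
```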

The outcome is a 9-minute short film that is just as incoherent as you might expect. What's surprising, though, is that the combined sci-fi source material, user input, and smart direction create something that does indeed look and feel like a sci-fi movie. The chaos of so many authors (seriously, consider the authorship that went into all of those screenplays) churned through an impressive predictive-text AI software package does indeed produce a genre film. Sci-fi is a generous genre in that its very core is about innovation, boundary-pushing, and exploring the unexplorable. This would have been a less successful test if the genre of choice had been, say, romantic comedy.

Most of this blog focuses on the use of digital technologies in production and post-production. What's so valuable about this film is that it is a rare case in which the digital dominates in pre-pre-production. I, for one, look forward to seeing further experiments in digital film writing.

Click here to watch Sunspring.

Deep Content: Advances in Indexing Visual Media

You know that movie...the one with the explosions...and Tom Cruise...lots of action...What is that movie?! 

If you asked that question a decade ago, you'd have a few sites and a Leonard Maltin book as your resources. Now, of course, you can just google it. But if the search engine fails you there's another option. A beta-stage site offering "Deep Content" searching might be able to help you out. The site is called, fittingly, "What is my movie?" 

It prompts you to use conversational language, your own wording, to describe what you remember about the film. By way of extremely complicated algorithms, it produces an answer. I tested it with a number of vague phrases and so far it's been eerily spot-on. For example, I typed "a movie in which James Spader plays a rich jerk." The site gave me my answer: Pretty in Pink. I thought that might be too easy so I upped the game with this: "show me the movie about conquistadors that ends with monkeys on a raft." Much to my surprise, the site correctly gave me Aguirre, The Wrath of God. Amazing!

How does this work? Well, the engine was created by computer science researchers and engineers from a university in Finland. Here's what the site says about all the fancy tech involved:

“Whatismymovie.com has been developed by the tech team of Valossa that has its roots in the Computer Science and Engineering research conducted at the University of Oulu. We have an extensive research background on automatic content recognition and video data analysis. The demonstrations on this site have been developed for research purposes and Proof of Concept for the industry. Deep Content technology has also been piloted with the broadcasters for TV content.”

"Video data analysis" is what piques my interest. I know that programs have become able to recognize certain shapes with startling accuracy (this goes beyond Facebook recognizing faces).

The site says a little more about "Deep Content" as well.

"Deep Content technology can be used to discover and recommend semantically related content from large video collections."

This explains what it does, but not how. What is certain, however, is that we are entering a new stage in terms of indexing visual data. I can see great potential for this technology, particularly for students who want to search for visual details across an entire film, or an auteur's oeuvre. 
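
The site doesn't disclose its pipeline, but one plausible, purely illustrative way to approach "describe the movie in your own words" search is to treat it as text similarity: index a plot summary for every film and rank them against the query. The sketch below uses scikit-learn's TF-IDF and cosine similarity as a stand-in for whatever Valossa actually does; the miniature catalogue of summaries is invented for the example.

```python
# Toy "describe the movie you half-remember" search (illustrative only --
# not Valossa's actual Deep Content engine). Ranks plot summaries by
# cosine similarity to a free-text query.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical mini-catalogue of plot descriptions.
catalogue = {
    "Aguirre, the Wrath of God": "Spanish conquistadors raft down the Amazon; "
                                 "the mad leader ends alone with monkeys on a raft.",
    "Pretty in Pink": "A working-class teen is courted by a kind rich boy while "
                      "his wealthy friend, played by James Spader, sneers at her.",
    "Top Gun": "Tom Cruise plays a hotshot navy pilot; lots of jets and explosions.",
}

titles = list(catalogue)
vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(catalogue.values())

def what_is_my_movie(query, top_n=1):
    """Return the catalogue titles whose summaries best match the query."""
    query_vector = vectorizer.transform([query])
    scores = cosine_similarity(query_vector, doc_vectors)[0]
    ranked = sorted(zip(titles, scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:top_n]

print(what_is_my_movie("conquistadors and monkeys on a raft"))
```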

Manufactured Authenticity: Invisible Effects

I've posted on VFX a couple of times previously, but both cases were about spectacular effects. Effects of this ilk are showpieces and stand-outs, achievements in bringing fantasy to life. Most VFX work, however, is invisible. Nearly every movie undergoes some image tweak in the DI (digital intermediate). For example, a director may be unhappy with the color of a given scene, which is an easy fix by way of color correction. These "effects" are not evident for a reason: they are supposed to seem real in a way that doesn't dazzle you, but goes unnoticed instead. (Here we enter the murky territory of "realism" in the digital age.)
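
To make "an easy fix by way of color correction" slightly more concrete, here is a toy grade in Python with Pillow and NumPy: a per-channel gain plus a gamma adjustment, the kind of global tweak a colorist might rough in during a DI pass. It is a deliberately simplified illustration, not professional grading software, and the file names are placeholders.

```python
# Toy color correction: per-channel gain and gamma on a still frame.
# Illustrative only -- real DI grading is far more sophisticated.
import numpy as np
from PIL import Image

def grade(frame, gains=(1.05, 1.0, 0.92), gamma=0.95):
    """Warm up a frame slightly: boost red, pull blue, lift midtones."""
    img = np.asarray(frame).astype(np.float32) / 255.0
    img = img * np.array(gains, dtype=np.float32)   # per-channel gain
    img = np.clip(img, 0.0, 1.0) ** gamma           # simple gamma curve
    return Image.fromarray((img * 255).astype(np.uint8))

# Placeholder file names.
frame = Image.open("scene_before.png").convert("RGB")
grade(frame).save("scene_after.png")
```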

Click here to watch a VFX reel that showcases many invisible effects in Black Mass. Make note of the following changes: color correction, removing leaves from a tree, covering up (and in one case removing) a new building for a period film, adding snow, making the greyish Florida coastline blue, and adding crowds. These effects are not revolutionary, but that's part of the point. Movies such as this one, period films in particular, aim toward authenticity. As I've discussed elsewhere, that authenticity is a style which mimics photographic realism. This example, like many others, reveals the manipulation and handiwork that creates something so seemingly real. We're in a tough spot in which we need to problematize realism just as its creation grows both faker and more convincing as technology improves.

VR + Documentary = Yes please.

Have you ever watched a movie using virtual reality goggles? It's pretty cool, but also pretty limited. Other headsets may differ, but the ones I've tried ran Oculus software. You are placed into a "theater," which incidentally is the coolest part: you get to look around a virtual theater that actually feels large and open. After selecting your film, you watch it on two tiny screens (in the goggles) that create the feel of an actual auditorium. My most recent experiment was watching The Lego Movie in 3D. The experience was pretty amazing, actually, but as the technology stands right now, it's also awkward and fairly low-res. From my few forays into this arena, however, I can say with confidence that this will be a "thing" and perhaps even a game-changing one.

A new studio, Scenic, has recently launched with the express goal of fostering the creation of non-fiction (yes!) films for VR viewing. Some of the directors on board include Amir Bar-Lev (The Tillman Story), Liz Garbus (What Happened, Miss Simone?), and Sam Green (The Weather Underground), to name a few. The first round of films will be released this summer, with a purported 40 in the studio's first year.

A lot about this venture is innovative and extraordinary, but the standout, for me at least, is the focus on non-fiction media. Filmmaker Magazine's Paula Bernstein quotes filmmaker Gary Hustwit, saying:

“VR is kind of dominated by gaming and CGI stuff. I think there is a real potential for documentary film to be made with this technology. But again, it’s about getting a lot of filmmakers to try it out and wrap their minds around it and figure out how it fits into their creative process."

Bernstein notes that VR is fundamentally about taking us to places we otherwise can't go, real or virtual. One could argue that documentary shares a similar goal. It's also a genre that's embracing innovation. Two 3D docs come to mind: Pina (Wenders) and Cave of Forgotten Dreams (Herzog). Both bring you into places you really can't go otherwise. The first takes you onto the stage during a dance performance, and the latter takes you into a restricted-access French cave containing 30,000-year-old paintings.

The marriage of VR and documentary is an exciting and fitting one, to be sure. We should all keep our eyes on Scenic's upcoming releases.

For more information, click here to read Bernstein's article.

Interactive Film: The Twilight Zone

I can't think of a better candidate for interactive cinema adaptation than The Twilight Zone. This brilliant show was sort of horror meets sci-fi meets the surreal. In short, the whole schtick was "surprise me." The audience was rewarded with some kind of twist or paradigmatic shift by each episode's end. 

With narrative progression upended, this was truly a show that defied expectations. Interactive cinema, movies that allow viewers to make choices that impact the movie's progression and direction, is the ideal playground for expanding The Twilight Zone's fundamental refusal to meet narrative expectations.

This article in Wired introduces us to the very beginnings of the project, which is truly in its infancy. As you can imagine, the biggest challenge lies in crafting a film whose interactivity actually feels interactive. This means going a few steps beyond Choose Your Own Adventure stories, in which the options (which did seem revolutionary to me at the time) were fairly obvious and which ultimately read like scripted novels with some room for reader "choices."
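
To see why this is harder than it sounds, it helps to look at the simplest structure underneath a Choose Your Own Adventure story: a graph of scenes, each offering labeled choices that lead to other scenes. The sketch below is a generic illustration of that structure in Python, not anything from Levine's project; the scenes are invented.

```python
# A bare-bones branching-narrative graph (generic illustration, not
# Ken Levine's actual design). Each scene names its choices and the
# scene each choice leads to.
story = {
    "opening": {
        "text": "A stranger knocks at midnight.",
        "choices": {"open the door": "confrontation", "hide": "basement"},
    },
    "confrontation": {
        "text": "The stranger wears your face.",
        "choices": {},  # terminal scene
    },
    "basement": {
        "text": "The knocking follows you downstairs.",
        "choices": {},  # terminal scene
    },
}

def play(scene_id="opening"):
    """Walk the graph interactively from the given scene."""
    while True:
        scene = story[scene_id]
        print(scene["text"])
        if not scene["choices"]:
            break
        for option in scene["choices"]:
            print(" -", option)
        pick = input("> ").strip()
        # Invalid input keeps you on the current scene.
        scene_id = scene["choices"].get(pick, scene_id)

if __name__ == "__main__":
    play()
```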

The creator, Ken Levine, has this to say about striking that delicate balance:

“Interactivity is a spectrum, it’s not binary,” he says. “I think of it as the viewer’s angle in the chair. When you watch something, you’re sitting back in the chair. When you’re gaming, you’re leaning forward in the chair. This is an interesting place in between … your brain is forward in the chair.”

Levine values the empathy that The Twilight Zone established between the viewer and characters who never recur and are only on screen for 30 minutes. His goal is to add "agency" to the mix. Frank Rose touched upon this in The Art of Immersion, where he argues that enhanced engagement both requires and creates empathy. The holy grail of interactive cinema is creating an empathy beyond that of film and TV by allowing the viewer some agency, some say in the trajectory. 

This is one of many projects seeking this goal, but I would surmise that by adapting The Twilight Zone, Levine stands a solid chance of getting there, whatever "there" looks like.

Cinemetrics

The art of statistics reveals fascinating things about the art of cinema. Seriously, these number-crunching cinephiles do amazing things to track the changes of the art form. The best resource, IMHO, is Cinemetrics. Check out the site to see info on shot duration averages for individual films and for the industry as a whole.

People tend to theorize that shot duration is shrinking as the medium changes over time. That's true, but the details of how and when those changes take place are interesting. For example, some believe that shots shortened with the early advent of VFX. This was, apparently, due to the poor quality of the effects: filmmakers would not want to linger on imperfect elements. This only applies to effects films, though, so you can see that the theory doesn't, or shouldn't, hold for all movies of that period.

The site provides a ton of data, and great instructions on how to interpret it, from acronyms to graphs and more. You can also conduct your own analyses by using the site's free software and following the instructions provided.
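
If you want a feel for the math behind those numbers, the core Cinemetrics statistic is simply shot length: the time between cuts, summarized as an average, a median, and a distribution. A minimal sketch, assuming you already have a list of cut timestamps in seconds (the data below is made up):

```python
# Compute basic shot-length statistics from a list of cut timestamps.
# A minimal illustration of the kind of numbers Cinemetrics reports;
# the timestamps below are invented.
from statistics import mean, median

def shot_lengths(cut_times, film_end):
    """Turn cut timestamps (seconds) into individual shot durations."""
    boundaries = [0.0] + sorted(cut_times) + [film_end]
    return [b - a for a, b in zip(boundaries, boundaries[1:])]

# Hypothetical data: cuts at these points in a 60-second clip.
lengths = shot_lengths([4.2, 9.8, 12.0, 19.5, 31.0, 44.7], film_end=60.0)

print(f"number of shots: {len(lengths)}")
print(f"average shot length: {mean(lengths):.2f} s")
print(f"median shot length:  {median(lengths):.2f} s")
```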

Finally, don't miss the Measurement Theory page. It provides an impressive list of scholarly works that take statistical analysis into account within film analysis.

Click here to have fun with movie stats.

And Chill: A bot, not an innuendo.

No, this post is not about the symbolic meaning of "Netflix and Chill." The clever designers of this bot have named it as such in order to conjure the implication, but that part is not that interesting to me. This bot's functionality is what has my attention.

One of the greatest challenges of information glut is organizing it. We've all had that vertiginous sensation of knowing that there's SO MUCH available, but digging through multiple collections is stultifying. The big and small media companies are hard at work finding solutions to this problem, and while they scramble we'll probably continue to see projects like this one.

And Chill is a bot that utilizes a little input from you, and a lot of input from whoknowswhere, to recommend movie choices. You text the app (213-297-3673) asking for a suggestion, tell it something that you've liked, and presto: you get a recommendation. According to an article in Engadget, the app "uses a few different frameworks to detect patterns, attributes, and other factors" to produce suggested films.

The site tells you to either send a text or use Facebook Messenger. Hmm. Something's up with that. The developer is Sense Technologies, which does not appear to be a Facebook affiliate, but something tells me that it is. Well, Engadget straight up makes the connection, but I don't see factual evidence. Such a link would answer my main question, though: how does it do it? It's entirely possible that this app is collecting enough user data to be able to tailor responses, but it doesn't appear to have the massive audience it would need. If it's culling data from Facebook, in collusion with the site of course, *that* would provide an answer. If this bot is reading user data for media references and preferences (they ask you what you like when you sign up!), it would certainly have a lot of opinions to work with. If we are to believe Engadget's implication, Facebook is hoping to offer more bots for media consumption in the future, and this is either one of them or an example of what's to come.
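
The developers haven't published how the recommendations are generated, so the following is a purely illustrative stand-in for "tell it something you've liked, get a suggestion": item-to-item similarity over tagged attributes. The tiny catalogue and its tags are invented for the example and have nothing to do with And Chill's actual data.

```python
# Toy movie recommender: suggest the catalogue title whose tags overlap
# most with a movie the user says they liked. Illustrative only -- not
# how And Chill actually works.
catalogue = {
    "Blade Runner": {"sci-fi", "noir", "dystopia", "slow-burn"},
    "Moon": {"sci-fi", "isolation", "slow-burn"},
    "John Wick": {"action", "revenge", "stylized"},
    "Drive": {"noir", "stylized", "crime"},
}

def recommend(liked_title):
    """Rank other titles by Jaccard similarity of their tag sets."""
    liked_tags = catalogue[liked_title]
    scores = {
        title: len(liked_tags & tags) / len(liked_tags | tags)
        for title, tags in catalogue.items()
        if title != liked_title
    }
    return max(scores, key=scores.get)

print(recommend("Blade Runner"))  # -> "Moon" with this toy data
```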

Deadpool and VFX

As a person who pretty much can't stand superhero movies, I think I've found my entry into the genre: Deadpool. My usual sense of agitation that develops after too many action sequences simply never happened when watching this particular movie. The intelligence of the writing, and the surprisingly strong performance by Ryan Reynolds (I wasn't a fan until now), kept me interested and delighted during and between action-heavy sequences. Take note, superhero filmmakers: make them smarter, like Deadpool!

While reveling in smart, dark ideas and words, I was also dazzled by the effects. This video compiled by Visual Effects: Behind the Scenes showcases just a few of the truly astonishing achievements by RodeoFX. You'll see examples of all kinds of VFX techniques in the video. The demonstration of additive/combined layers to create Colossus is particularly illuminating. 
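
That Colossus breakdown is, at its core, a stack of semi-transparent layers combined one over another. Here is a bare-bones sketch of that "over" operation in Python, with placeholder layer files, purely to illustrate the idea; RodeoFX's actual pipeline is vastly more involved.

```python
# Composite a stack of RGBA layers bottom-to-top with the standard
# "over" operator. A bare-bones illustration of layered compositing;
# the layer file names are placeholders.
import numpy as np
from PIL import Image

def over(top, bottom):
    """Alpha-composite 'top' over 'bottom' (both float RGBA in [0, 1])."""
    a_top = top[..., 3:4]
    a_bot = bottom[..., 3:4]
    a_out = a_top + a_bot * (1.0 - a_top)
    rgb = (top[..., :3] * a_top + bottom[..., :3] * a_bot * (1.0 - a_top)) / np.maximum(a_out, 1e-8)
    return np.concatenate([rgb, a_out], axis=-1)

def load(path):
    return np.asarray(Image.open(path).convert("RGBA")).astype(np.float32) / 255.0

# Placeholder layers, listed bottom-to-top.
layers = [load(p) for p in ["plate.png", "body.png", "chrome_pass.png", "highlights.png"]]
result = layers[0]
for layer in layers[1:]:
    result = over(layer, result)

Image.fromarray((result * 255).astype(np.uint8)).save("composite.png")
```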

Click here to view a series of VFX befores, durings, and afters. One warning: for some reason, people often like to overlay lame rock scores onto compilations. This one has a track that's particularly grating and repetitive, so hit mute before pressing play.

Analog Aesthetics in the Digital Era

I was recently watching an episode of The Goldbergs when I got this feeling that something wasn't right. It took me a while, but then it dawned on me: the image quality is too digital, too perfect for a show about the 1980s. It's clear that painstaking detail goes into recreating the 80s for authenticity, but this one glaring yet hard-to-notice (yes, I just said that) signifier of the present remains. Those of us who lived through the low-def era know the texture and quality of analog video intimately. And, although I noticed this contradiction in the program, I'm not at all suggesting that they revert to the previous paradigm. However, I do think a one-off episode that looked, sounded, and felt like VHS is not a bad idea (take note, showrunners!).

To meet the nostalgic needs of us older folks, developers have created a ton of "downgrading" filters and apps to make pictures and videos look like analog video and/or VHS. The same tools are, obviously, available to filmmakers. It happens all the time that a movie will show footage from the past that is clearly brand spanking new, but that perfectly resembles that old, hazy, muffled, scan-lined footage.
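
For the curious, the basic moves behind those downgrading filters are simple: throw away resolution, mute the colors, and add noise and scanlines. A rough illustrative sketch with Pillow and NumPy, not any particular app's algorithm, with placeholder file names:

```python
# Rough "VHS-ify" filter: downscale, desaturate, add noise and scanlines.
# A crude illustration of what downgrading apps do, not any specific product.
import numpy as np
from PIL import Image

def vhs_ify(frame, out_height=240):
    # 1. Throw away resolution, then blow it back up (soft, low-def look).
    w, h = frame.size
    small = frame.resize((int(w * out_height / h), out_height), Image.BILINEAR)
    soft = small.resize((w, h), Image.BILINEAR)

    img = np.asarray(soft).astype(np.float32)

    # 2. Mute the colors toward a washed-out analog palette.
    gray = img.mean(axis=-1, keepdims=True)
    img = 0.6 * img + 0.4 * gray

    # 3. Add video noise and darkened scanlines.
    img += np.random.normal(0.0, 8.0, img.shape)
    img[::2, :, :] *= 0.85

    return Image.fromarray(np.clip(img, 0, 255).astype(np.uint8))

# Placeholder file names.
vhs_ify(Image.open("frame_hd.png").convert("RGB")).save("frame_vhs.png")
```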

Some clever pranksters have taken these tools to contemporary programs that we're used to seeing in HD/UHD. Check out the videos linked from the Boy Genius Report article on this phenomenon. Game of Thrones as an 80s/90s TV show is the most precious and hilarious thing ever. They not only downgraded the footage, but they also applied the formal properties of 80s/90s TV to the current credit sequence. This is a lesson not only in the flexibility of digital media, but also in the dramatic changes in televisual formats over time.

Click here to travel back in time with Game of Thrones.

Practical Effects, For a Change

Digital visual effects have gotten, quite literally, spectacular. If you've done some reading on this topic, you know that they still rely heavily on real world resources. In fact, many VFX producers and engineers have spoken loudly about the importance of working with plates (raw, unedited, photographic footage) and physical 3D models. In other words, the real is still an integral component of even the most fantastic visual effects sequences.

What we talk about less is the role that practical effects play in contemporary cinema. You'd be surprised (I constantly am) by how many amazing sequences are shot with real people or objects in real places. This video provided by Screen Rant (by way of Boy Genius Report) demonstrates current and recent (by that I mean the last 20 or so years) examples of astonishing practical effects. Fun fact: the two actors pictured above are being held up by wires! From BB-8 (a real robot!?) to men dropping from a plane onto another plane in Inception, this video provides a solid array of impressive practical effects.

Hot Cinema: Literally, Digital Projectors and Heat Dissipation

This piece by Engadget provides an excellent and clear (thankyouverymuch) description of how contemporary digital projectors work. Whether a technology has moving parts or not, heat is always an issue. For digital projectors, that means literally millions of mirrors in constant motion. The cooling approach is twofold: the mirror-covered microchips have a "spiky-looking heat sink" as well as liquid coolant piped throughout. While the piece covers heat management, it also provides general and user-friendly projector details. 

The "must" of this post is the video. Using CGI (naturally) Science Channel has modeled the inside of these complex machines.

Click here to read Engadget's article.

Participant Cinema: Make Your Avatar a Star

Ready Player One is one of the most enjoyable and intelligent trade sci-fi books to appear in years. Without giving away too much, the premise is about regular users occupying and hacking a virtual world named OASIS. Themes of user engagement, activity and collective intelligence run throughout the novel. 

Warner Brothers and Steven Spielberg are working on a film adaptation right now. As I read the novel, I could clearly imagine what a movie would look and feel like. It's a cinematic novel. When I heard that Spielberg is at the helm, I felt a pang of sadness, as I wasn't sure his heavy-handed directorial style would work with the intelligence and subtleties found in the book. But then I remembered Minority Report. Spielberg did not ruin that adaptation; in fact, I think he did a pretty decent job of depicting Philip K. Dick's complex story.

One indicator that this movie has hope is the manner in which the folks involved (including the book's author) are including regular users, just as the OASIS engineers/owners do in the book. Follow this link to see the CFA (Call for Avatar). They're inviting gamers and designers to create their own avatars, a few of which will be chosen to appear in the film. The old commie in me sees a red (haha) flag in that the media companies are now asking us to produce content that we will then pay to consume later. But, another view, a less dark one, sees this gesture as one wholly appropriate to the story and our current context. 


Inexplicable Magic of Cinema: Internet Edition

I love listening to Werner Herzog. He has the most bizarre take on the world, one that is often dark and poignant. He's also unintentionally funny (although he knows this about himself, as he appeared as himself/notself in an episode of Parks and Recreation). Whether he's talking about the madness of a man who lived and died amongst grizzly bears (Grizzly Man) or unlocking the mystery of 30,000-year-old caves by way of digital 3D technology (Cave of Forgotten Dreams), his approach is unique, profound, and perplexing.

I also love watching his movies. The man literally moved a huge steamship over a mountain to demonstrate a point about madness in Fitzcarraldo. He expressed a similar sentiment about the Spanish conquistadors in Aguirre, The Wrath of God. The threads of civilization, destruction, madness, wonder, and beauty run steadily throughout his works. And this is why I'm excited to see what he has to say about the internet. The trailer teases us with visions of crazy ("the internet is evil!"), technophobia ("the internet is killing us!"), and visionary possibility ("Elon, I'll take a ride to Mars!"). One thing is for certain: this won't be a simplistic depiction of civilization on the brink of innovating itself out of existence (well, not entirely, it appears), or of how the internet will save us all. In true Herzog fashion, the trailer depicts a complex, fascinating, and confounding film.

Click here to view the trailer.

VFX Demo: Jurassic World

Behind-the-scenes footage is widely available for those interested in examining the making of VFX. This lot from Industrial Light and Magic (the Edison of VFX) is particularly valuable for the purposes of analyzing digitally manipulated footage. In particular, I value the wide array of techniques presented in this 3-minute compilation. Therein you'll find visual examples of compositing, 3D modeling, color correction, digital mattes, DI passes (a million of them), chroma key, green screen, and much more. You'll also see the various stages of compositing VFX into plates, process by process.

There's even a nice little lens flare at the beginning, completely manufactured of course, as there are no cameras, lenses, or real light involved in that part of the clip.
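
Of the techniques listed above, chroma key is the easiest to show in miniature: find the strongly green pixels and swap in a background plate. The sketch below is deliberately crude (real keyers handle spill, soft edges, and motion blur) and uses placeholder file names; it is an illustration of the concept, not ILM's tooling.

```python
# Crude green-screen key: replace strongly green pixels with a background
# plate. Illustrative only -- production keyers are far more forgiving.
import numpy as np
from PIL import Image

def chroma_key(foreground, background, threshold=40):
    fg = np.asarray(foreground.convert("RGB")).astype(np.int16)
    bg = np.asarray(background.convert("RGB").resize(foreground.size)).astype(np.int16)

    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # A pixel counts as "green screen" if green clearly dominates red and blue.
    mask = (g - np.maximum(r, b)) > threshold

    out = np.where(mask[..., None], bg, fg)
    return Image.fromarray(out.astype(np.uint8))

# Placeholder plates.
comp = chroma_key(Image.open("raptor_greenscreen.png"), Image.open("jungle_plate.png"))
comp.save("composite.png")
```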

Click here to view the behind the scenes clip.

VR is coming to a theater near you!

It can be hard for old-timers like me to not think of cheesy Lawnmower Man and (awesome but simple) Battletech when the topic of VR emerges. We clearly need to don some goggles, open our eyes, and pay attention to this rapidly changing and expanding technology. The apparatuses and available content are definitely on the upswing but we're also clearly in a transitional moment.

This article from Variety reports on the likely (upcoming?) addition of location-based VR to IMAX movie theaters. In large part this seems like an obvious coupling: IMAX and VR are both about immersive media experiences. On the other hand, they are quite different and from what I can see the IMAX attributes are taking up the foreground space here.

VR has the promise to transport you elsewhere from wherever you are. Anchoring VR to a specific location makes sense for exploring a new technology, but it's really beside the point. IMAX, on the other hand, has to be location-specific. Special auditoriums are decked out with specific equipment from massive screens to audio to projection (and glasses of course). You can't move an IMAX theater without a substantial investment. 

In terms of content, movies are arguably less interactive than other digital media formats. VR is designed to place you in a virtual context in which you can interact. These two forms of media engagement are not in diametric opposition, but they are vastly different. Filmmakers have been trying to figure out how to make film more interactive for a while now, and this new move from IMAX may be a gesture toward a more interactive cinema.

What IMAX is proposing seems to pull aspects of VR into the theater context, but the experiential goal is unclear. Would you wear goggles to see the movie, rendering projection and a big screen unnecessary? Would you play a game related to a film right after seeing the movie (this seems like the idea, given the reference to the John Wick game)? We'll have to wait and see. For now, the plan raises compelling questions about content, exhibition, synergy, convergence, and engagement.

Read Variety's article by clicking here.

Streaming: It's a genre now?

This New York Times article argues that streaming media, TV in particular, is not just a medium but a format as well. Poniewozik makes a solid case for his argument that streaming has introduced something entirely new to a medium that hasn't changed much in the last 80 or so years. The immediate release of an entire season for binge consumption certainly is new. Scholars are debating the merits of this new approach at this very moment. Do you have enough time to process a densely packed hour of television if you start another episode 20 seconds later? The weekly format may be irritating, but it did build in time for contemplation. TV has also gotten smarter (thank you, HBO), which should be coupled with even more time and space for analysis.

The word "genre" occupies a very specific place in film and media studies. For film its often a retroactive term utilized to examine trends and thematically significant moments within specific cultural and historical contexts. Genres are specific ways to package ideas that best expresses the content which always has social and cultural significance. With this discipline-specific definition in mind, is streaming a "genre?" I'm not convinced that it is. However, I do think Poniewozik has a point in regards to streaming's foundational impact. Existing genres may in fact transform in anticipation of a new viewing context. Do we need epic cliff hangers if we can just sit back and wait for the next episode to load, rather than holding onto that narrative question for a week? Poniewozik anticipates this by asking content producers to step up their game: "streaming needs to learn to use its supersized format better, not fight against it."

Read Poniewozik's NY Times article by clicking here.

From Shaping Minds to Shaping Faces: Disney's FaceDirector

We don't normally think of Disney as a major player in the photographic plate software business, but they are clearly invested. FaceDirector synthesizes two facial expressions into one. Engadget explains how it works in greater detail and focuses on using it to lower production costs. Didn't get that expression quite right? No need to reshoot: just make it happen by morphing existing footage. While there are many implications to be drawn, I'm most interested in the fact that a studio identified mostly for animation has invested resources into something specifically for photographic filmmaking. One thing is for sure, FaceDirector is yet another indication that actors really don't have to fear being replaced by VFX programmers.
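
Disney's published system aligns facial landmarks and blends expressions in both space and time, which is well beyond a blog post. But the core intuition, interpolating between two takes of the same framing, can be shown with a simple weighted blend of aligned frames. The sketch below is that intuition only, with placeholder file names, and should not be mistaken for the actual FaceDirector method.

```python
# Naive stand-in for expression blending: a weighted cross-dissolve of two
# pre-aligned frames of the same shot. FaceDirector itself warps facial
# landmarks before blending; this sketch skips that entirely.
import numpy as np
from PIL import Image

def blend_takes(frame_a, frame_b, weight=0.5):
    """weight=0 returns take A, weight=1 returns take B."""
    a = np.asarray(frame_a.convert("RGB")).astype(np.float32)
    b = np.asarray(frame_b.convert("RGB").resize(frame_a.size)).astype(np.float32)
    mixed = (1.0 - weight) * a + weight * b
    return Image.fromarray(mixed.astype(np.uint8))

# Placeholder frames: two takes of the same line reading.
take_a = Image.open("take_neutral.png")
take_b = Image.open("take_smiling.png")
blend_takes(take_a, take_b, weight=0.35).save("blended_take.png")
```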

Read Engadget's article about Disney's FaceDirector.