Introduction
Caveat: I’m working on a Hackintosh, so that may have something to do with this issue, but I am not completely sure. I have yet to test this out on my MBP but would be interested to hear if anyone else is experiencing this issue.
I’ve been working with ScreenFlow since version 1, and am pleased with how the software has grown and developed over the past few years. It occupies a nice space: more powerful and intuitive than iMovie, but less resource-hungry than FCP.
Recently I’ve been making videos for classes and to document my own teaching as part of my pre-certification service, and I ran into a quirky little bug that had been driving me crazy for a couple of weeks. Thankfully, I found a simple solution that I’m happy to share in case anyone else is running into the same problem.
The Problem
I first noticed the problem when working with an MP4 scrape of a YouTube video that I was editing down for time and content for class [Original Video: Lose Yourself Analysis by Unabashedly Reggie]. I thought his way of presenting the rhyme scheme of Eminem’s most famous song was clear, creative, and adequately detailed for my grade 9 class, but, well, not completely class-appropriate. So I edited the video and exported it, only to find the image completely garbled and barely visible. It was as though the video had been run through a colour-inversion filter and then a “find edges” filter. The results looked like this:
Completely useless. And strange. I tried every possible combination of export settings, exporting to my desktop and publishing straight to YouTube, trying to figure out what magic combination would sort me out. In the end, I couldn’t find a solution and had to go with another song to demonstrate meter instead of rhyme scheme, while still highlighting the relevance and importance of rap to contemporary poetry.
I set this project aside and moved on to another video in which I inserted some JPEG photos as stills. At first they weren’t appearing at all; other times they came out garbled, and I realised this looked very familiar. Indeed, it was the same sort of rendering error I had experienced with the rap-analysis video. Aha! That meant it wasn’t merely related to the video encoding I had been working with before, and that the corruption wasn’t necessarily caused by the way I was re-encoding the footage.
I decided to experiment. If the images were being inverted on export, what would happen if I inverted them in the project first? Would they come out right?
Interestingly, they were exported as clear, inverted images with no artifacts. This led me to wonder whether the act of putting a filter on the images was enough to counteract the weird processing glitch that happened on export. To test this, I removed the filter from all but one of the photos; on the next export, that one photo was the only one that rendered correctly.
Aha! That seemed to be the factor. Next, I looked for a non-destructive filter I could use, and settled on the exposure filter with no adjustment applied.
The Solution
Through a little analysis and experimentation, I discovered that adding a filter – seemingly any filter – caused the image to be processed differently on export and rendered as the filter specified. By applying the exposure filter without actually adjusting the exposure, I got the images to render correctly.
Buoyed by my success with the photographs, I tried the same solution on the video that had been giving me headaches for the last two weeks, and sure enough, it rendered correctly, too.
I was thrilled to have worked out a solution, so I tweeted at Telestream about the issue and am sharing my experience here with you, in case anyone else is having the same problem, as I could find no other documentation online.
If it helps you out, please let me know by leaving a comment below.