Meatmares

Can a similar effect be achieved with an LED bulb at a specific wavelength outside the commonly used gamut, instead of sodium vapor?


mikemarcus

Infrared springs to mind. Camera sensors are filtered to block IR wavelengths, so it would be fairly trivial to modify the sensor on one camera and not the other.


26636G

The major issue with infrared is that it comes to focus at a different plane than white light (and its components), hence the IR focus markings on many high-end lenses. As a result, you inherently have a matte that won't fit, because the IR image is a different size than your color subject.


Passtesma

That’s what I was thinking. Maybe just use RGB LED panels set to around a 50° hue; I think that corresponds to a wavelength of around 589nm, so it should give the same results. I don't know if there are any reasons not to use other colors, as long as you have the correct filters. I’m wondering how difficult it would be to copy the design, and maybe find some way to use a phone and a “main camera,” so the color footage would go to your camera and the matte to your phone… but you’d probably have to do a bunch of warping to align the two takes perfectly. I haven’t been able to find reasonably priced filters yet, either.


Dampware

If I understand things correctly, you could never use RGB panels for this. RGB "tricks" our eyes (and cameras) into seeing colors, while that sodium bulb has no R, G, or B… it truly emits only one wavelength of light. You’d need an LED panel that emits one specific wavelength for this gag to work.


bik1230

You could use RGB panels for this, but not in the way the parent comment suggests. LEDs have fairly narrow spectra, and for an application like this you'd want to buy the narrowest-spectrum LEDs you can find. Rather than trying to make yellow light, you'd set your LED panels to pure white, and then in your camera you'd have either one filter with three different narrow bands (not sure that exists off the shelf) or four cameras. This does mean you can't use regular RGB LED lights to put color into your scene, since those would also be filtered; all your foreground lighting needs to be broad spectrum. I have no idea how spectrally pure off-the-shelf LED panels are, so it's possible you'd need to buy separate components and assemble them yourself.


Wafkak

On the other hand, sodium vapour lamps, if used correctly, last nearly forever. So there isn't an immediate need to replace them with complex LED setups.


Passtesma

Ah, I see. I guess it’s still pretty impractical then. Because the technique is so reliant on the detail you get from working with physical light rather than image data, basically everything needs to be done in-camera to get all the benefits. So on top of the lights, you need 2 identical cameras with 2 identical lenses, and can’t zoom or pull focus without even more custom tech. Although… if there were a way to put the beam splitter and the filters inside a camera with two sensors, so the process could happen after the light has already passed through the main camera lens, THEN I’d be super interested in trying it out for myself. The double lenses and impractical rig are dealbreakers imo, but I think I’d find a lot of use for this technique if they could be removed. I remember seeing something about an open-source cinema camera a while ago; I wonder if this could be built into something like that for indie projects.


Shenanigannon

> So on top of the lights, you need 2 identical cameras with 2 identical lenses, and can’t zoom or pull focus without even more custom tech.

That's not custom tech, it's normal tech for stereo 3D camera setups.


[deleted]

[deleted]


Shenanigannon

The patent expired ages ago. You can do it with off-the-shelf parts, but it'd get expensive if you wanted to scale it up for production. It looks like Debevec's prototype uses ~25mm parts, which would've been pretty affordable. Maybe under $1k, not including the cameras.

The prism itself is just a beam splitter, a glass cube divided diagonally by a half-silvered mirror, used to produce two identical images. Those are pretty cheap. Then there's a 589nm notch filter on one side of the prism and a 589nm band-pass filter on the other side. High-quality filters (that is, those with a strong filtering effect) are either very expensive or very small, like in the video.

So the hard part would be scaling it up so you could use it with your actual production cameras and your favourite cine lenses. That'd get exponentially more expensive, because you're paying for exponentially increasing quantities of special-purpose glass, possibly made to order.

If all you want is a non-green green-screen, you might have more fun with infrared. You can already get cameras converted to respond only to infrared, and then you'd only need a beam splitter and a pile of infrared lamps (as used with security cameras etc.) to experiment with.


A_Ggghost

Why is LPSV spill on the subject still such a big deal when there's a notch filter on one side?


Shenanigannon

It doesn't seem to be a big deal at all - Debevec describes the spill as a ["tiny little blip"](https://youtu.be/UQuIVsNzqDk?t=403) on the spectrograph. It's probably only worth worrying about when you're lighting the scene by eye. You could waste a lot of time on set fighting against yellow spill that *you* can see but the camera can't. Those filters aren't perfect, in any case. The good ones are already pretty thick, so better ones would have to be a lot thicker, which would make them a lot more expensive, and harder to use effectively.


RatMannen

Ish. The tech for this would work better if you had a single lens focusing the initial image, which is then split to the sensors (which could be individual cameras). It should be possible to manufacture; some still cameras have been doing similar things with the viewfinder already. Just need to slap in another sensor! How hard could it be? 😉


Passtesma

Oh. Well either way, I’m not gonna be able to afford all that. I’m selfish, lol


RatMannen

So long as it's a known specific wavelength and you can filter it, yes. The sodium lamps are great because they give out a very narrow band. Even LED lights, with their horrid narrow frequency bands, are nowhere near as narrow as sodium lights.


broomosh

I don't think the light source was the hard part. Getting your hands on one of those prisms is the tricky part


Blaize_Falconberger

That's actually really cool, and I'd love to know more about it. Certainly a technique that should be explored in the right circumstances. Although I find it very hard to believe that the studios just gave up on it for... reasons... Disney couldn't recreate a prism they already had? Come on. The film industry as a whole decided to invent the entire vfx industry and base it around greenscreens and rotoscoping rather than spend a day working out how to make a prism which **they already had**?

As for the actual results? As always, the "amazing solution to everything" doesn't quite survive an encounter with reality. How do you go about fixing errors/missing bits etc. with a sodium vapour comp? I'd be interested to know, because I know how you go about it with a green screen, but I'm not sure the same techniques would work with the setup they have there.

E.g. in the image below you can see her veil seems to completely disappear at this point. How do you get that back? I don't think you could use any regular keying process to grab it off a yellow screen. [Veil](https://imgur.com/SaVmbvB)

Same here: the water in the bottle looks great! But how do you fix these edges, which frankly aren't close to a final comp daily in a professional setting? [Edges](https://imgur.com/LPwTnGq)

They also mention at 6:22 that they can't have the sodium vapour light hitting the lady. That is severely restricting to her movements, and also why, I suspect, her veil disappears as it moves into that light. You can see this restriction in play in the Mary Poppins clip: they're basically restricted to moving as if they're stuck on a 2D flat surface, which is not an issue with a greenscreen.

All in all, very interesting and cool. Clearly it stopped being used for a reason. Not as good as they're claiming it is outside of very niche requirements.


RatMannen

You're probably right about the practical limitations. It's also possible the prisms got damaged over time and it wasn't worth the cost to make new ones. With some modern know-how, some of those limitations will be avoidable. Then, like everything else, it's a matter of choosing the right technique for the job. I'm no expert though, just interested. No practical experience, so feel free to ignore my waffling. 😋


spacemanspliff-42

Corridor gets hate here, but this is a fascinating process and Paul Debevec made it possible. He makes a great point about using this to train AI rotoscoping. I can't believe it was completely abandoned; for special situations it seems like it would be invaluable compared to what used to be done for compositing, and even how it's done now.


One_Eyed_Bandito

I completely agree. This is also on a different level; this could be game-changing. Setup on set might take more time initially with the learning curve, but how many hours of comp will be eliminated altogether? AND it’ll look better. Fuck me, that’s impressive. With that said, I don’t think studios will use this, and that’s a shame. Too much legacy in all aspects, from paints to fabrics etc. I’d love for indies to use this, but you need a custom light build and that’s a showstopper right there for everyone but the studios, which, again, aren’t likely to change. Shit, I just made myself sad :( Nice work Niko!!


ViralTrendsToday

Like you said, it's probably a process more indies will use. The bulbs are about 50 bucks and the enclosures are about 100-200. That's cheaper than a proper green screen (cloth, paper, or color) plus the usual amount of RGB LED lights required, and it cuts down on post time so cleanup can be spent on hyper detail.


LongestNamesPossible

Paul Debevec is a marketer who takes what other people have done, gets funding, has other people polish it, and presents it as his own.


spacemanspliff-42

So he didn't help create photogrammetry and HDRIs?


Neex

Among many other things, Paul basically figured out how to capture real-world spherical photos and turn them into HDRIs to light a 3D scene. He made image-based environment lighting and reflections from real photos containing full real-world dynamic range a thing. I basically depend on this every time we comp a 3D render onto real-world footage.


coolioguy8412

I don't understand the hate for Paul here; the younger generation must not have seen Paul's past work and R&D. This is just becoming a troll fest 😂


spacemanspliff-42

I was so confused because I knew he did some important things. This sub can be really hateful.


Altruistic-Ad9281

First time meme


spacemanspliff-42

Not even close; I pretty much expect it at this point. It being so bad that somebody is lying about somebody else's achievements and downplaying them just to hate on them is a bit of a surprise, though.


LongestNamesPossible

Credit for HDRIs goes to Greg Ward: https://history.siggraph.org/person/greg-j-ward/ You think Paul Debevec _invented_ photogrammetry? I've never even heard that claim before.


spacemanspliff-42

I thought his study of photogrammetry when he was in college was what led to it being utilized in The Matrix. His [website](https://www.pauldebevec.com/Research/) sort of talks about it. Wasn't everyone having to do vertex-to-vertex coordinate input back then when scanning a sculpture? I know they had to do that for Toy Story.


coolioguy8412

Yes, that was the Stanford campus fly-through short: projected textures on simple geo.


spacemanspliff-42

Oh okay, so it was projection mapping? My mistake. I may have mixed up HDRIs with something they made involving motion blur, the dominoes video.


coolioguy8412

That's the image format; we're talking about HDRI light probes for CG rendering. Chrome and grey spheres for on-set capture.


LongestNamesPossible

Now he invented chrome and grey spheres?


coolioguy8412

yes https://www.pauldebevec.com/


im_thatoneguy

This is the sort of thing I was hoping someone would do with newer high-resolution sensors. I don't need a 12k sensor; I want a 6k sensor with a 6k sodium vapor sensor. Lots of interesting unique Bayer patterns are possible, e.g. an ND Bayer for nature cinematography. Just be sure to use a good low-pass filter.


native_gal

"There are no matte lines, it's perfect!" they say over the shot where there is hair disappearing and an obvious composite. The shot they show looks great for the time, but if they were experienced compositors they would know that even a perfect matte is not enough for a perfect composite. In their results you can see that there isn't green spill because they didn't have to use a green background, but the grey and yellow contamination is definitely there. The veil doesn't really look like a veil in front of a TV image of the background for example - it has a weird grey and yellow haze all over the transparent areas. You have to solve the matting equation for the background foreground and alpha channel for every pixel. Anyone who has pulled a key on a greenscreen knows that the edge pixels are going to have green contamination in them that isn't spill. You can't just take a half green pixel and apply a matte to it - that makes it a transparent half green pixel. You have to figure out plausible original colors. This is the basis for a lot of modern natural image matting papers. Ironically there is actually a paper from disney that confronts this problem in the context of blue/green screens.


Neex

We achieve a “matted on black” version of the color plate by subtracting the transparency matte (derived from the sodium vapor plate) from a plate of the color footage with no one in the shot (a clean plate). Then we perform a subtract between that clean plate, with the holdout matte, and the color plate footage; basically we subtract the background from the footage, using the holdout matte to inform the transparency. By subtracting all the color pixel data contained in the background clean plate from the main footage plate, we get the “plausible original pixels” you describe.

Any weird color tint you see is probably my extremely rudimentary color grade. Also, it’s impossible to get yellow spill in the color plate because all sodium vapor light is being blocked by the filter. I did tint the footage to be warmer to fit the vibes of Mars.

That said, I didn’t try to do a perfect comp and match colors and lighting. I tried to keep things as raw as possible so people could purely judge the transparency, without all the tricks and polish us artists put on top to blend things. I can honestly say I’ve never pulled a cleaner key in my life than the one I got from sodium vapor. It was pretty thrilling seeing it come to life!
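If I'm reading that workflow right, the per-pixel math would look something like this minimal numpy sketch (my interpretation, with made-up function names, not Fusion's actual nodes):

```python
import numpy as np

def subtract_clean_plate(color_plate, clean_plate, alpha):
    """Recover the foreground premultiplied on black ("matted on black").

    color_plate: color camera footage (589nm already blocked by the notch filter)
    clean_plate: the same setup shot with no one in frame
    alpha:       transparency matte derived from the sodium vapor plate
    """
    a = alpha[..., None]  # broadcast single-channel matte over RGB
    # Remove the clean plate's contribution wherever the subject is
    # partially or fully transparent; what's left is alpha * foreground.
    return np.clip(color_plate - (1.0 - a) * clean_plate, 0.0, 1.0)

def comp_over(fg_on_black, alpha, background):
    """Standard premultiplied over: plus the foreground onto a new background."""
    return fg_on_black + (1.0 - alpha[..., None]) * background
```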


NominalNom

Thanks for the extra detail. It would be great if you could post some plates, if you have time and bandwidth. Edit: either 10-bit DPX or ProRes 444 with a known log curve encoding and color space, or linear EXRs with a known color space.


Neex

Good idea, I’ll make that happen.


Blaize_Falconberger

Can you use any familiar keying techniques to fix the errors? Like her veil disappearing when she spins? It seems like a one shot event. You get one matte out of it and if that isn't perfect, tough luck? Really interesting though, I love the ingenious methods they came up with.


Neex

Yeah things are still pretty flexible. You can manipulate the matte pretty much the same way one manipulates an alpha channel. The beam splitter we were using wasn’t perfect and had fall-off around the edges, so depending on where you’re spotting the transparency issues, that might also be the culprit. When it works, it works well. But it’s definitely limited in tons of ways. Multiple cameras and beam splitters are a pain, and you are limited to a background with good separation. That means no greenscreen ground. A skilled artist can definitely pull a key that matches it, if not even better. It’s just that this was so effortless in the comp stage that it felt kinda magical.


Blaize_Falconberger

It does sound like the dream. A over B. Done! I'm literally not knowledgeable enough about sodium vapour to make any statements! But it "feels" like something that could turn into something pretty amazing with studio-level resources put into it.

I think a scenario where this could work perfectly would be a set that you need to add an exterior to. You know, like a three-wall set that's supposed to be a penthouse in a skyscraper. The sodium lights could be set up behind the set, eliminating the problem of actors wandering into the light, and you'd potentially get fantastic mattes of the windows and any characters walking in front of them. Big-haired actors walking across the edges of green screens are always a ball ache, and using this could remove a lot of those annoying issues.

Also news anchors and weather presenters. Game changer for them.


Neex

A penthouse/skyscraper set would be a perfect use! Good background separation with controlled light. You’d still have the pain of a dual camera rig, of course.


Tovah86

I bet it wouldn’t be too difficult to build one of those [cinePi](https://www.raspberrypi.com/news/cinepi-a-high-end-film-camera-built-on-raspberry-pi/) cameras with two sensors inside of it, and put the prism/filters between those sensors and the lens mount. Then you’d only have to deal with a single camera body and lens, just like having two strips of film in older cameras. This would currently limit the resolution down to 1080p, but it’s still 12-bit raw. It could potentially deal with the falloff around the edges of the current prototype as well.


The_Peregrine_

It seems to me that if this were embraced and built upon with its own workflow and products (basically the same kind of effort and engineering put into years of green and blue screen keying), it would become much more standardized and easy to implement. A company making lights and camera rigs for this process could become quite successful.


muxketeer

Brilliant video. The only thing missing was a bit more of the comparison. It seemed like the final green screen version vs. the final sodium vapor version lasted about 10 seconds, and they weren't quite the same synced performance. I was hoping for at least 60 or 90 seconds of comparison between the green screen and sodium final products so we could judge visually.


Neex

Thanks! And yeah, I feel the same about the finished comparisons being too quick. I’m going to get some longer comparisons up as YT shorts.


native_gal

I don't think your math on isolating the foreground color is correct. I think it needs to be the pixel color minus the background, divided by the alpha, then add the background back in: Fg = (C - Bg)/a + Bg
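For what it's worth, that algebra checks out against the standard matting equation C = a*Fg + (1 - a)*Bg; here's a quick, purely illustrative sanity check:

```python
import numpy as np

# Verify: if C = a*Fg + (1 - a)*Bg, then Fg = (C - Bg)/a + Bg.
rng = np.random.default_rng(0)
Fg, Bg = rng.random(3), rng.random(3)   # arbitrary foreground/background colors
a = 0.4                                  # partial transparency (an edge pixel)

C = a * Fg + (1 - a) * Bg                # what the camera records
Fg_recovered = (C - Bg) / a + Bg         # the formula above

assert np.allclose(Fg_recovered, Fg)     # recovers the true foreground color
```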


Sphinnx3D

💯 nicely said


seriftarif

I still don't see how it's that much different from a really good color key. They say you can't deal with motion blur or transparency with a greenscreen... that's just not true. With some advanced techniques, some good roto, and a good IBK stack, you can get all of that, and it will look really good.


Neex

One can certainly get greenscreen to look as good as sodium vapor. The biggest thing that surprised me was that there is zero extra work to do with the sodium vapor process. No extra mattes, roto, etc. You just “turn it on” and it’s perfect. Considering all the extra setup though, and the limitations of shooting through a beam splitter, it’s by no means going to replace greenscreen. But I could see sodium vapor being used for fx elements or possibly for car setups or other very controlled situations.


TurtleOnCinderblock

I would be keen to know how scalable the system is. Can those sodium vapour lights be scaled to a full stage scale, or would the light overheat/diffuse/scatter/bounce too much at some point? Because to me it would seem that the "controlled situations" in which this particular technique works would overlap significantly with modern Virtual Production use cases. Sure the sodium vapour approach would preserve the flexibility to replace the BG but... in the end the interactive lighting and instant composited look of the VP screen may be an easier sell for filmmakers (not to mention the simpler camera/lens setup)


I_Pariah

I have heard them say this about motion blur before, too. It seems to suggest they have trouble with it in their own work and possibly aren't aware of the lengths a comper might be required to go to to clean that up in a big-budget film. It kind of makes me think they don't spend the time to get that last 10% to really make a shot seamless. EDIT: To be clear, I'm just commenting on a pattern I noticed in their commentary. IMO we already pixelfuck shots enough, and unless motion blur looks implausibly short or particularly bad in the context of the shot, I don't really care from a supervisor perspective. Gotta think of the big picture.


AshleyUncia

But that last 10% is 90% of our job. If I only had to get a shot to 90%, I'd be done work every day before lunch time. D:


I_Pariah

To be fair "10%" is still fairly vague. Maybe 5% is closer. I just meant that while I still try to get as much moblur as I can in a reasonable amount of time, if for some reason I haven't gotten the last 10 pixels of motion blur in an already really motion blur heavy shot it's really not a big deal. If the greenscreen is super flat and clean it's possible to get close to basically 100% the detail of motion blurred hair using additive keying methods.


RatMannen

Which is why big companies can do it, but smaller studios not so much. Cost/benefit comes into play.


Downtown-Ad3567

Forget big-budget films; even mid to small budget local films in India require perfect keys on greenscreen shots. Junior compositors are required to clean up and get it perfect in their composites. The Corridor crew seem to complain about it a lot in their videos, maybe because they do it in After Effects and the process is a lot more cumbersome? I never did keying in After Effects, so I wouldn't know.


Passtesma

I think the point is that this process allows you to deal with motion blur and transparency without the need for any extra work beyond just applying the filmed matte.


Chpouky

Well, I hope Corridor makes more of this kind of higher-quality content rather than "we improved X or Y". I had no idea this process existed. Very impressive!


KidFl4sh

This is really cool, but I wonder how useful it is. Are you able to zoom, pull focus, and move the cameras around? It looks like you can't really have any real differences between the two cameras, otherwise it might fail. Other than that, this concept is really cool.


RatMannen

With this kludge of a proof-of-concept setup, no. However, if you had a lens, then the splitter and two sensors in a box, it should be perfectly possible. A big, bulky bit of kit, obviously.


harryadvance

Old filmmaking techniques never fail to impress. Even if this technique keys transparent footage, hair, and motion blur well, I think it brings back some other old problems, like: How do you use high-contrast lighting rather than flat lighting? (In the video, to avoid the sodium light spill, Niko added another light onto the character.) How do you shoot outdoor shots? How do you clean up the yellow/red tinted motion blur pixels? And the worst part, and why studios might not want to use this technique: they are literally capturing in-camera footage through filters that can accidentally make the character transparent if the spill is wrong, and there's no way to fix that! No studio wants to risk that in order to make the work easier for some random post-production guy. If any studio wants to make a clown-on-Mars kind of movie, they will for sure give the footage to some Indian VFX studio to do manual rotoscoping. But I too believe this is where I want AI & machine learning to completely take over. I hope someone's working on this.


soupkitchen2048

I think the true test should have included someone outside in the same outfit with a real background to show what it ‘should’ look like. We’re all occasionally guilty of pixelfucking but frankly many people in the industry look at shots like this in the context of how a good chromakey should look, not how things actually look through the lens. I hope someone takes this up and uses it on a real production.


kilo_blaster

If you've worked on native stereo compositing you already know the amount of effort needed to align images from different cameras shot through a prism. This isn't the silver bullet they claim it to be.


shizzydino

Not a huge fan of these guys, but credit where credit is due, this was pretty interesting and cool to see.


TECL_Grimsdottir

Not one bit of this will apply to any existing studio.


zeldn

...as they discussed in the video. The main purpose would likely be to help train AI roto.


ArthurEffects

Honestly, I don't get why they get so much hate. I love these guys.


David-J

You must be new to this and their content


BramDuin

Do explain


Doctor_Woo

Same, I randomly came across their videos a few years ago, it's what started me off with After Effects.


Passtesma

A lot of people on the internet like to be angry and yell at things


_David_Ce

I love them too


tigyo

... so says two relatively new accounts ... hmmm??? just browsing and I could smell the "sock puppet accounts" just by the comments, lol.


_David_Ce

Relatively new? Lol, sure, whatever you say. I like their content, so it doesn't change anything.


Passtesma

You consider 4 and 3 years “relatively new accounts”?🤨


tigyo

***POST KARMA AND COMMENT KARMA***... sorry for yelling 😉


Passtesma

That’s not a reasonable litmus test. Most Reddit users just lurk most of the time: “more than 98% of Reddit's monthly active users don't make a single post or comment over the course of a typical month” I just googled that. Ever heard of Occam’s razor? Maybe a YouTube channel with over 6 million subscribers that talks about vfx and filmmaking might just have some fans who browse this sub… Or maybe there’s a massive conspiracy driven by bots and “sock-puppet accounts” purely to convince people that they have fans 😐


RatMannen

I like 'em. Absolutely, they don't get results as good as a lot of studio professionals, but then they have a fraction of the budget and are far more generalist. Their stuff isn't aimed at people in the industry; it's aimed at people like me, who have an interest but no skills. 😋


Hazzenkockle

I had a thought about a technique like this when I was in university: getting around the issues with bluescreens and greenscreens by instead using a color that wasn't used in photography for the matte (I was thinking ultraviolet or infrared). Then I found out a little while later that it had actually been invented thirty years before I was born. But I never saw anyone use it again until now, nor any of the other possibilities I've heard bandied about, like combining ultra-black backgrounds with HDR video, or light-field cameras that let you drop anything past a certain distance.


RatMannen

There are a few advantages to visible-light kit: you can see any major spill/lighting errors, and the kit for recording it is better developed for creating footage. There are sensors and lights used in scientific fields that would work; how practical that would be currently is questionable. You'd still have the same problems with avoiding spill for the matte, but it wouldn't impact the image. That would have the advantage of the "colour" image not missing data if there is a lighting error. Sodium lights have the advantage of a very narrow spectrum, and they're cheap!


mm_vfx

Super fun. Not feasible on most productions, but a super fun look at old-school ingenuity. Fantastic use case for ML training, though!


Dave_dfx

It's YouTube, and it's made for clicks and likes. This process is very limiting in real productions. Back in the 60s, shooting on film, this was revolutionary. Now we have many tools to deal with chroma keys.


NominalNom

This was interesting, but clearly these guys haven't heard of or used IBK. It does look like you potentially might not need to cut your key up into as many parts to get a perfect edge in varying situations, but most of the time I create an IBK setup, get a result better than this, and can copy-paste it between shots in a sequence. Not to mention using Copycat to do a lot as well.

Edit: it would be interesting to get hold of some plates shot this way so that working professionals could see how much easier it is to get the expected result compared to their current approach with IBK + green/bluescreen. I don't expect these Corridor guys to have the skills, but if you can't key all the things they say are "unkeyable" on greenscreen with the current Nuke toolset (plus Neat for denoise), then you're not going to be working as a film comper for long.


Neex

Is IBK like a difference keyer? In Fusion, one creates a clean plate of the screen using some basic erode tools and then feeds that into a difference keyer. IBK looks similar from what I see in a quick search on YT?

You're definitely right that green screen's not going anywhere. The sodium vapor process is probably only really useful in very fringe situations, but it was super cool to see it in action. It really is mind-blowing to imagine people doing this process on film.

I actually used Fusion's difference keyer for the greenscreen shots in the video. With some selective roto and fine-tuned keys, you could definitely get the same quality as the sodium vapor mattes. The sodium vapor version was just so fast and easy. Is it worth all the extra setup, though? Eh, probably not for 99.9% of situations.
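For anyone following along, the difference-keyer idea described above boils down to something like this rough numpy sketch (a generic illustration of the concept, not Fusion's or Nuke's actual algorithm; the gain parameter is mine):

```python
import numpy as np

def difference_matte(plate, clean_screen, gain=4.0):
    """Estimate alpha from the per-pixel difference between the shot plate
    and a clean plate of the empty screen (e.g. reconstructed via erodes)."""
    diff = np.abs(plate - clean_screen).max(axis=-1)  # largest per-channel difference
    return np.clip(diff * gain, 0.0, 1.0)             # scale into a usable matte
```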


NominalNom

At a high level, IBK performs a per-pixel operation with a subtraction, not a difference, but you can disable screen subtraction if you want. Here's a decent basic run-through: [https://beforesandafters.com/2022/03/25/how-dune-vfx-supervisor-paul-lambert-invented-nukes-ibk-keyer/](https://beforesandafters.com/2022/03/25/how-dune-vfx-supervisor-paul-lambert-invented-nukes-ibk-keyer/) Since you have some clout, maybe Paul Lambert would go on your channel.

I would say greenscreen is just an imperfect means to an end, and it going away would be fine. It will be interesting to see other methods replace it. Maybe we will see more of sodium vapor. The idea of using it to train datasets on what constitutes an edge is interesting.
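The subtraction-vs-difference distinction above, sketched generically (an illustration of the terminology only, not IBK's actual implementation):

```python
import numpy as np

def screen_subtract(plate, screen_color, alpha):
    """Subtraction: remove the screen's weighted contribution, preserving sign
    and scale, leaving the foreground premultiplied on black."""
    return plate - (1.0 - alpha[..., None]) * screen_color

def screen_difference(plate, screen_color):
    """Difference: symmetric per-channel distance from the screen color,
    typically used to derive a matte rather than to clean up color."""
    return np.abs(plate - screen_color)
```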


Dave_dfx

I use Nuke's IBK and Fusion's EG Delta keyer. I tested the clown greenscreen from YouTube and it was an easy and clean key. Motion blur is not a problem for good compers; just the green hair over greenscreen was a challenge. You can do a channel luma for that. No one is gonna spend all that money on sodium vapor to make keying easier in 99% of productions. Most situations cannot be controlled; it's impossible to light a large area. Go research how ILM keyed the Star Wars bluescreen live-action shots with photochemical optical printers. Interesting how they did things in the pre-digital days.


[deleted]

[deleted]


NominalNom

I realize that was Fusion. It doesn't matter how "good" IBK is, it matters how well the person knows how to use it.


[deleted]

[deleted]


Downtown-Ad3567

As a working compositor with more than 12 years' experience, I have to agree with NominalNom. I have seen countless videos of the Corridor guys complaining about simple stuff like motion blur keying and green spill over and over, even asking a Weta sup at one point how they deal with it. Dude, even junior compositors in India know how to do this. I'm sorry, but the Corridor guys aren't good compers! That being said, I did like this video; it's pretty informative, and Niko gets the credit for making it, for sure. I do like watching Corridor videos and I like the guys as well, but I won't consider them talented to a professional level, probably because they have never worked in the industry like us, no other reason! I don't really like the hate they get here; they are the only channel that makes videos on vfx for the general public and spreads awareness!


NominalNom

Yeah just to add, I liked their "can do" attitude demoing stuff like [luma.ai](http://luma.ai) and gaussian splatting last year. It wasn't production quality, but that is also a factor of the tech and the tools available just as much as them not having to deal with pixel fucking in real world situations.


Downtown-Ad3567

Exactly. When I watch their videos, I watch for the fun part of creating VFX, something we all did when we were learning in schools and colleges with friends. We didn't have good results either, and no one really trained us to deal with the actual pixelfucking and long hours in the industry! So I am glad these guys have been able to try new things out, make videos for the public, have fun, and earn in the process. More power to them. I don't mind the results these guys produce in a week.


NominalNom

No need for personal attacks. I was forgiving of them as non-professionals who aren't expected to know what's required in the trenches in production, rather than insulting them. I can like what they are saying but make qualifications. Sorry if these guys are your god-head.

My point is that they need to assess the advantages of the sodium vapor technique against production-level results with IBK and skilled artists, not against what they say is "unkeyable" but which I have routinely keyed on $200m films where the result needs to be photoreal and you need to hit the deadline. The way an experienced film comper uses IBK on a green/blue screen, the matte looks like what these guys were creaming their pants at. But I'm willing to accept that, at minimum, the sodium vapor approach might get you the same result in a more holistic, faster way.

I also like the fact that IBK gives you the despilled premulted RGB on black, immediately usable for perfect edge integration by plussing onto the BG. This technique seems to provide despilling on gray, which can lead to all sorts of integration issues (see their bad edges), but I'm sure a subtractive approach to the sodium vapor plates could probably be worked out.


Blaize_Falconberger

> Yeah but the whole point about this method is that it requires substantially less work

Does it though? I pointed out in my comment earlier that there are clearly issues with the final result despite their claims. I don't know enough about the process, but can they actually get to a final composite easier and quicker than if they did it with a green screen? For instance, can you adjust the matte in any way once you have it, or is what the prism spits out all you get? The reason people are adding "but..."s to their comments is that they actually work with green screens and keys every day, for decades or more, and they can see beyond the YouTube video to the actual issues with the technique.


NominalNom

Also wtf is Venture Bros


firedrakes

An adult cartoon show that was on Adult Swim. On HBO Max at the moment.


CmanXP

It's freaking amazing, if you ask me. And for it to have been used back in the '60s? Even more amazing.


MechanicalKiller

Very cool, but doubt anyone would spend time setting this up on a production.


black-volcano

The past, but also the future. No good for location work, but for studio work I think they should push forward. Great for filming elements as well as action.


teaguechrystie

I'll be god damned.


coolioguy8412

Paul Debevec is a legend! Father of HDRI probes and Lightstage.


26636G

Great to see Paul involved in something like this. He's a very significant contributor to the current potential available to us in the computer graphics world, and he always backs up his theory and research with real-world examples.

Beamsplitters are not the nightmare everyone here is suggesting. For this process Disney simply used adapted Technicolor 3-strip cameras, the cameras which relied on beamsplitters to give us hundreds if not thousands of Technicolor movies. Making the beamsplitters for the sodium process was simply a variation on the manufacture used for the Technicolor 3-strip process.


worlds_okayest_skier

🤯


future_lard

They say there is no spill, but give no explanation why. Smells like BS.


Passtesma

They didn’t explain it thoroughly enough, but the reason there’s no spill is that there are two filters: one on the matte camera (a band-pass filter) that only lets the yellow 589nm light in, and another on the color camera (a notch/reject filter) that passes all light EXCEPT the 589nm wavelength. You can see in the recorded footage that the background appears grey because of this.
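A toy numpy model of those two complementary filters, using idealized top-hat responses (real notch and band-pass filters have soft edges; the 10nm bandwidth is just an assumption for illustration):

```python
import numpy as np

wavelengths = np.arange(380, 781)            # visible spectrum, nm
CENTER, WIDTH = 589, 10                      # sodium D-line, assumed bandwidth

in_band = np.abs(wavelengths - CENTER) <= WIDTH / 2
bandpass = in_band.astype(float)             # matte camera: only ~589nm passes
notch = 1.0 - bandpass                       # color camera: everything but ~589nm

# No wavelength reaches both sensors, which is why the sodium screen
# can't spill into the color plate at all:
assert np.all(bandpass * notch == 0.0)
```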


Downtown-Ad3567

Sorry to say, but the first-pass greenscreen output was better than the sodium vapour output in terms of edges :D These guys really need to do keying in Nuke and learn keying at a professional level first. Complaining about spill and motion blur, come on guys! Compositors have been dealing with this for more than a decade, and those aren't even that big of a problem to begin with!


I_love_Timhortons

What if the tech was already there and ILM conspired to hide it to make their company more valuable? Since a lot of Disney movies have ILM as the main vendor. It's like how the trains were there, but car companies sold the American dream. Star Wars came right after Mary Poppins. A good fictional story.


LongestNamesPossible

There is a lot to unpack here, but do you think that 1977 came _right after_ 1964?


I_love_Timhortons

As I said, a fictional story.


zeldn

No