

4K display

Peter, I find your example rather confusing. Maybe you could edit it to make your point clearer?

As it's written right now, I'd disagree with you, but I want to make sure I understand what it is that you are saying.
 
Interesting. As a gadget geek and early adopter, I'll be first in line to get a 4K monitor and 4K projector (hell, I already picked up a 4K projection screen recently).



At one point IMAX considered using Sony 4K projectors for 4K 2D and 3D, but after test screenings where few noticed any difference, they went with 2K Christies for their digital theaters.

It's the viewing distance plus the quality of the source material and display. Sit close (1.0x screen width or less) and projected 4K is just as dramatic. That would be the first 3-4 rows in your average commercial theater. One has to sit 1.0-1.2x screen width in order to reap the full benefit of 4K. For a 1.78:1 40" screen (34.9" wide), that's about a 3ft viewing distance. 65" = 5ft. 4K necessitates larger displays and/or closer viewing distances, especially in the living room. Just as 1080p did.

[Chart: viewing distance vs. screen size at which 720p, 1080p, and 4K become fully resolvable]


So for a 50" TV, the difference between 1080p vs 720p is only fully apparent at 6.5ft. How many people do you know with 50" TVs sit that close?
And as the resolution goes up, the closer you'll need to sit to get the full benefit of that resolution.
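The math behind that chart is simple enough to reproduce. Here's a minimal sketch, assuming the standard 20/20-acuity figure of one arcminute per pixel and 16:9 panels; it lands on the same ~6.5 ft figure for a 50" 1080p set:

```python
import math

# Assumes the usual 20/20-vision figure behind charts like the one above:
# one pixel per arcminute (1/60 of a degree) of visual angle.
ARCMIN = math.radians(1 / 60)

def full_benefit_distance_ft(diagonal_in, horizontal_pixels, aspect=16 / 9):
    """Farthest viewing distance (feet) at which every pixel is still resolvable."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # panel width from diagonal
    pixel_in = width_in / horizontal_pixels                  # size of one pixel
    return pixel_in / math.tan(ARCMIN) / 12                  # pixel subtends 1' at this distance

for diag in (40, 50, 65):
    for name, px in (("720p", 1280), ("1080p", 1920), ("4K", 3840)):
        d = full_benefit_distance_ft(diag, px)
        print(f'{diag}" {name}: full benefit within ~{d:.1f} ft')
```

Exact figures shift a little with the acuity assumption, but the trend is the point: every doubling of resolution halves the distance at which it pays off.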


I sit 3 feet away from the GT25 42" plasma I use as a computer monitor. 4K and higher would make a big difference. 1080p is not enough workspace for most media and media-editing applications.
 
Interesting. As a gadget geek and early adopter, I'll be first in line to get a 4K monitor and 4K projector ... So for a 50" TV, the difference between 1080p and 720p is only fully apparent at 6.5ft or closer. How many people do you know with 50" TVs who sit that close?

This is exactly what I meant: the science says this, and then count in how many people actually can't see, or don't care about, the difference. We notice because we care about high resolution; most people care more about story and emotion.

I sit 3 feet away from the GT25 42" plasma I use as a computer monitor. 4K and higher would make a big difference. 1080p is not enough workspace for most media and media-editing applications.

I have a 32" TV as my screen too, but how many people use a TV as their monitor and sit close? There isn't a market for a 4K television. Closest thing to something that could work is a projector or a screen the size of a wall. Then we could speak 4K as something that the market want, but not even 1080p screens come that big.

I think there's a lot of unrealistic speculation about 4K. I think Red's approach is good: 4K is a very good mastering resolution, and for theatre releases it's perfect. But if people can't see the difference between 720p and 1080p at a normal distance in their home, it's useless as a delivery format for home theatres. TV won't broadcast 4K for a long time, because the market still isn't demanding more than 1080p, and it's a small percentage who actually demand anything higher than 720p TV.

I think most people just scream 4K!!! 8K!!! but they don't understand the science behind it. The human eye is limited; why demand more than you can see? It's just weird.
The industry needs to create a screen that turns a wall in your home into a display; then we can talk 4K. But most TVs out there stop at around 50" or 65".
Something else must be brought to the market in terms of cheaper large screen sizes before we can make use of 4K as a viable resolution in people's homes.

Be realistic about what you're talking about. We love 4K, and would love to have a supersized screen with 10-bit color etc. Does the general market want it? No, they don't even care about it.

Technological advancements are only made when the market wants them. 3D started to be developed for the entire market after people saw films like Avatar. When people demand and want something, technology starts to develop (or rather, gets the funds to be developed). There isn't a demand for 4K, so it will not happen any time soon.

This is like Thunderbolt. How many will really use all that bandwidth? And then they speak about a fiber version ten times faster; what will be streamed through it? A very small percentage of the market makes use of Thunderbolt's bandwidth; filmmakers and editors might be those who use it the most. But when the common man plugs his 7200 RPM hard drive into the Thunderbolt port, he is only using a fraction of its bandwidth. So why isn't Thunderbolt "taking off" as the grand big and only port to ever be used? Because no one sees how it benefits them.

Those who develop and manufacture new technology need a market to get their income from. If they develop technology that only a very small percentage has any need for, I understand why it doesn't take off. Why is HP not convinced about Thunderbolt, sticking with USB 3.0 instead? Because it's closer to what consumers and most professionals need. Even USB 3.0 offers far more bandwidth than most HDDs are capable of.

I love technology and I think I'm a bit of a futurist, but I try to keep my feet on the ground and be realistic. 4K is a dream that comes true only when screens are larger and used in other ways than just as a TV.
Think of a wall opening up into a 3D image of an office far away, enabling you to hold a conference as if everyone were in the same room. You would need high-resolution, glasses-free 3D the size of a wall. That, at a good price, would be used by the market and even people at home... think social networking in a new way.
 
Alex might be right about the inability of the GP to differentiate between lower & higher resolutions, but he's highly underestimating the "geek" factor (mostly male) in the purchasing of new products and driving of new technology.

For example, Canon & Nikon each sell something like 10 million DSLRs per year. In theory, millions and billions of beautiful photos should now be taken per annum, which all these cameras and lenses are certainly capable of producing. However, as anyone who has ever browsed flickr or a photography forum knows, this is really, really far from the case! So why don't these consumers purchase $150 point & shoots commensurate with their talent or ambition? It's due to their inexplicable (mostly male) geek need to own something with the best specs as determined by their peer group, so that they would have the potential to shoot many award-winning photos if I, I mean they, so deigned.

This is the actual market for Canon or Nikon, not the professional photographers or cinephiles who shoot video. However, those people are the influencers of the consumers, so they are still important to Nik/Can-on.

It certainly works exactly the same for televisions and computer displays. Even more so, because unlike DSLRs, for example, it's actually possible for John Q. Public to learn to enjoy and have a great experience with his new 1080p HD, soon to be 3K & 4K, displays. Since he paid "a goddamn fortune" for it, he's going to damn well squint at it until he can see each pixel individually. Ask that guy after a year of that to do your 1080 vs. 720 test and see how he'd do.

If you think I'm wrong, ask yourself how many people on this forum actually own a Red camera, and how many actually use one to produce good, or paying work? And how many of us are men? Other than that one girl from Red team, I don't think I've seen one woman who's ever posted. (Full disclosure: I'm not one of the talented, Red-owning, or female readers.)

All these knowledgeable, savvy, and untalented (or unambitious) consumers probably influence the aforementioned older people and other television consumers who don't really give a fuck about pixels and K, about 50% of the time, at least in my experience. You can always tell when they don't, because Gramps ends up buying a Magnavox at CostCo.

That's why 4K displays will eventually become de rigueur. It's a cosmic inevitability, of which this discussion itself is prima facie evidence.
 
Alex might be right about the inability of the GP to differentiate between lower & higher resolutions, but he's highly underestimating the "geek" factor (mostly male) in the purchasing of new products and driving of new technology.

Well, I agree that male geekiness and the "beat the Joneses" syndrome might make for an uptick in 4K adoption.

I don't think it's going to be significant, and I don't think it will overcome the limits of human perception.

That isn't why I posted.

If you think I'm wrong, ... And how many of us are men? Other than that one girl from Red team, I don't think I've seen one woman who's ever posted. (Full disclosure: I'm not one of the talented, Red-owning, or female readers.)

I know only four women who want to be directors of photography. One is actually a Local 600 member as a DP/operator. One is a Red One owner and works a lot as a DIT. Another is just wrapping up film school and has worked for me as 2AC on occasion. One worked with me as a 2AC ... but I was sound recordist on that shoot. She worked her way into camera starting in electric, carrying rolls of banded that were actually both bigger and heavier than she is.

All have worked with Red.

In speaking with them ... I find that very few women are given an opportunity to work in this business. It's not a choice. When they get on a set, as a PA or whatever, people mostly push them towards craft services and makeup.

My 1AC was being shuffled towards makeup, and possibly leaving the set early, when I snatched her up as DIT and dailies on a P2 shoot. She's operated for me now, including shooting Red One, but she likes 1AC better.

I've only had one female work for me on G&E. That was actually an actress on the show we were shooting. She had a couple of dead days, so she threw in. Worked hard, did really well and liked it. To be fair she liked acting better - very few wouldn't.

I've had one other woman work with me as 1AC ... but she wants to work on set sound as her career.

So ... don't make an assumption that women don't want to work in this business or that they aren't able to.

For myself, I prefer females when faced with choosing between two equally qualified candidates, one female and one male. So far, this has served me well.
 
I remain dubious.

I had an argument with Jim Jannard, the upshot being that I don't see evidence for the notion that ordinary people (i.e. not image-making professionals and enthusiasts) can tell the difference between 720p and 1080p in motion images, and I am dubious that they can appreciate 2K vs. 4K delivery unless we are looking at IMAX-size displays (and I mean "real" IMAX screens, not the IMAX-certified ones).

Back in 2003, people said the same thing about 480p vs. 720p and 1080i... they couldn't tell the difference. There was no need for HD. There are still some old forum posts around that talk about it. Hell, I remember even in the DVX100 days there were industry professionals trying to sell the same argument on DV vs. HD.

Displays get better... and cheaper. We are fortunate to have a bit of a crystal ball on vendors that are preparing 4K displays for the home. Google has some pretty cool plans on 4K to the home. It's not a matter of "if"... it is "when."

Whoever is first to make the affordable home 4K display may or may not win the game... but you can bet that once it begins it won't end.
 
I saw a demo of a 50" QHD display showing 4K medical imaging. The difference over 1080p was very obvious to me at a 6 to 10 foot viewing distance. My eyes aren't that good either. I don't know that you are actually seeing that much finer detail above 2k, but as Graeme's MTF chart comparing Epic and Alexa showed, there was much higher MTF in the 1k-2k range.
 
Back in 2003, people said the same thing about 480p vs. 720p and 1080i... they couldn't tell the difference. There was no need for HD. ... It's not a matter of "if"... it is "when."

I'm all for technology evolving, but I'm not so sure about what the mass market really wants. There are still people out there who can't see the difference between 480p and 720p/1080i.
What I mean is that the common man can see the difference between 480p and 1080p if he watches TV and movies a lot (those who don't, don't see or care about resolution), but it's harder to see the difference from 1080p and up. Yes, the math about what you can see at certain distances is correct, in a movie theatre and on a normal TV screen alike. You can see the difference between 480p and 1080p on a normal TV at a certain distance, but it's much harder to see the difference between 1080p and 4K on a display the same size at a normal TV-viewing distance.

Also, the only images where you could really see the difference up close are images with deep depth of field, little movement, and a perfectly sharp lens. If not, there's a lot of blur reducing the effect of the extra sharpness.

I'm just saying that I'm skeptical that common people will see the need for 4K. It's hard enough to get them to take 3D into their homes, even though that "step" is much easier for them to "see".

I think the use of 4K will be notable in other areas: commercial screens outdoors that blend into the real world, photographers working with images (a big Retina-like display would be revolutionary for those who work in Photoshop), IMAX and digital theaters, etc. But I'm not sure it will ever make it into the homes of regular people. Not until screens get as large as a wall.
1080p hasn't even become any kind of standard in most people's homes. It might take too long for 4K to become one. Some other technology might come along that makes 4K a thing of the past before it even reaches people's homes, like a "floating point resolution" that doesn't use pixels but a "detail-focusing" fluid instead of fixed dots. Just a theory, but it might happen, as with holographic images.

So I'm not so sure about the roadmap for how 4K will become a standard, when even 1080p still has a hard time after ten years.
 
If you make the assumption that people will continue to watch small screens from across a large room, then it makes sense to doubt that 4K will become a mainstream standard of any sort. I'm not sure why you'd make that assumption, though, since viewing habits have already changed significantly as a result of 1080p.

Also, I think it's an erroneous assumption that people need to be able to appreciate the difference to be willing to upgrade. There's a lot of high-end audio equipment that has been developed and marketed and sold that would fail any sort of double-blind test. 4K may not become as mainstream as 1080p (which in turn is unlikely to become as mainstream as 480 in the near future), but as long as the price can come down you can be sure that there *is* a market for it. 1080p is still only 2 megapixels, after all. If 15% of the US population decides they want a 4K system (whether or not they can appreciate the difference) and the price point comes within reach, that's still a huge market, and I would consider that a fairly modest goal if I were at Sony, etc.
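For scale, the pixel math and the market math are both easy to sanity-check; a quick back-of-the-envelope sketch (the ~310 million US population is a rough figure, and the 15% adoption rate is just the hypothetical above):

```python
# 1080p really is only ~2 megapixels; 4K is four times that.
mp_1080p = 1920 * 1080 / 1e6   # ~2.07 MP
mp_4k = 3840 * 2160 / 1e6      # ~8.29 MP
print(f"1080p: {mp_1080p:.2f} MP, 4K: {mp_4k:.2f} MP ({mp_4k / mp_1080p:.0f}x)")

# Hypothetical market: 15% adoption out of a US population of ~310 million.
us_population = 310e6          # rough figure (assumption)
print(f"15% adoption: ~{0.15 * us_population / 1e6:.0f} million potential customers")
```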
 
The perceived value will be what sells it. Add in some fancy marketing words (4K is undoubtedly not what it will be called) and some commercials ridiculing 1080p as yesterday's news, and you've got sales. Yes, it will be slow. Yes, it will not kill off 1080p. No, not everyone and his grandmother will make the switch. Yes, I will be the first in line for a 4K display under ten grand.

See, there is a need for something to drive TV sales, and right now that's 3D. But what happens after that? Size? Hardly likely. So it will be back to resolution.
 
Back in 2003, people said the same thing about 480p vs. 720p and 1080i... they couldn't tell the difference.

I remember all that very well and going back even earlier than 2003.
 
Back in 2003, people said the same thing about 480p vs. 720p and 1080i... they couldn't tell the difference. There was no need for HD.

That doesn't mean they were wrong. It does mean that ultimately, the consumer electronics industry and the broadcast industry migrated to an HD only system, and that marketing was done to support that, to the point where essentially the only thing you can buy now is HD. That didn't necessarily happen through consumer demand.

I was involved in some of the early Fox comparative tests and I can tell you that at the screen sizes generally used at the time (this was around 1995 or so), the statement you just made was absolutely true. As the screen sizes increase, it becomes a bit less true. But regardless of the view inside RedUser, many, many consumers do not have 50 inch or larger monitors, and many of them are not even watching HD feeds. I, of course, support the move to bigger and better. But one should understand that Joe Consumer doesn't necessarily see it that way. Or see it at all.
 
I think you give too little credit to Joe Consumer. The image quality of a 50" 4k display with 4k source material at normal home viewing distances of 10-15 feet is blatantly obvious to even untrained viewers. It is not about seeing ever finer small details, it is about the detail quality at lower spatial frequencies in the 1k to 1.5k range. In that sense the distance vs resolution chart posted above is misleading and out of date. Now whether the average consumer sees value in this or can afford it for home entertainment is a different question. But I certainly could use affordable high resolution displays, display cards, and image processors in the commercial systems I install, especially command and control multi-screen display systems, scientific and medical imaging for education, and high end telepresence systems.

Graeme's MTF comparison of the new Leica lenses on Epic and Alexa shows the real reason this is true. The same overwhelming advantage in MTF that Epic shows in the 1k-2k spatial detail range is also equally true on the display side of the equation. Just because we can't visually resolve a high level of spatial detail doesn't mean that having a system with the bandwidth to display that detail doesn't affect the quality of what we can resolve.

[Image: Graeme's MTF comparison chart, Epic vs. Alexa]
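That bandwidth point can be illustrated with a toy model: the MTF of the whole chain is the product of the component MTFs, so doubling the Nyquist limit of both capture and display lifts contrast at frequencies well below anything the eye struggles to resolve. A minimal sketch (the Gaussian roll-off is my assumption, not Graeme's measured curves):

```python
import numpy as np

def toy_mtf(freq, nyquist):
    """Toy component MTF: contrast rolls off as frequency approaches Nyquist."""
    return np.exp(-(freq / nyquist) ** 2)

# Spatial frequencies of interest, in cycles across the frame width.
freq = np.array([500, 1000, 1500, 2000])

for label, nyq in (("2K chain", 1024), ("4K chain", 2048)):
    # System MTF = capture MTF x display MTF (product of the components).
    system = toy_mtf(freq, nyq) ** 2
    print(label, dict(zip(freq.tolist(), np.round(system, 2).tolist())))
```

In this toy model the 4K chain carries roughly four times the contrast of the 2K chain at 1000 cycles, even though neither chain shows much at 2000 cycles, which is exactly the shape of the Epic/Alexa comparison.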
 
Toshiba's 50-inch-plus autostereoscopic displays, out later this year, are based on a 4K panel. Even though Toshiba reps I've talked with could not comment on prices, the consensus between my ears says the 56-inch model is somewhere between $15,000 and $20,000. The question is, can the displays be used at 4K for 2D images?
 
Couple things to consider:

Tom made a great point already that media professionals will likely be the first adopters of 4k: that's pretty straightforward and almost a fact.
We'll call them "Group A". Their needs are simple: the highest quality available, for a single person to view. So, an expensive 30" monitor is a no-brainer.

Then there are the high-end home theatre enthusiasts. Will they likely invest in a 4K TV? NO WAY! More like a 4K PROJECTOR. That's where we'll see 4K in the home first and foremost, IMHO. This is "Group B". My guess is Epson will release an affordable 4K projector soon, and the home theater enthusiasts will gobble them up. But let's keep in mind this group is still very small.

Now the rest would be, of course, our Average Joe Consumers. Are they likely to invest in 4K TVs? Nope. Not unless you can convince them they have to sit an inch away from a 10" screen, or 3 feet from their 50", and so forth. This is "Group C". You get the point; until we see 80"-minimum screens for sub-$3k prices, don't expect to see 4K widely adopted at the consumer level. It's insanity. It'll be a tough sell to convince someone they need 4K on an overpriced TV that is tiny.

Then we have content: well, obviously as more and more films and TV shows are shot in 4K/5K, and RED RAY rears its face into reality, we'll see 4K content. So there will be more content eventually; that's a given.

I agree with those that say "its only a matter of time", but I think 5 years is very unlikely based on Group C's slow adoption rate.

It's not a question of "Is 4K good enough?". We all know it is. Of course it is. It's a question of whether or not it will be affordable to 80% (that's very generous) of the population at a screen size for a whole family.
 
Well, I agree that male geekiness and the "beat the Joneses" syndrome might make for an uptick in 4K adoption.

So ... don't make an assumption that women don't want to work in this business or that they aren't able to.

For myself, I prefer females when faced with choosing between two equally qualified candidates - one female one male. So far, this has served me well.

I'm not sure where you got that from the fact that I said that few if any women post on Red and that new technology is driven by geeky males. My argument was that you can't use logic to predict future buying patterns, as evidenced by the high sales of expensive DSLRs, which are absolutely unnecessary for the majority of consumers (who buy them anyway).

Saying that women "don't want to work in this business or that they aren't able to" is putting words into my mouth, thank you very much. I wish there were many more women posting here and working in camera departments.
 
I'll just say this: in visual effects we use 8K-and-above textures without a whole lot of consideration. That is obviously because there is a noticeable difference in perceived resolution to those using them. Whether the audience or anybody else recognizes that at the moment is irrelevant. It is possible for those with average eyesight to consciously and subconsciously discern the higher-resolution details, however subtle. I think this serves as a case in point.

If history, neuroscience, psychology, and specifically microevolution have taught us anything that we can apply here, it's that the human race consistently learns (for better or for worse). People are just now starting to learn the difference between standard definition and high definition. There have been a lot of factors in why it takes us a while to see things more clearly, the least of which (I'd argue) have to do with visual perception. If there is said to be any perceived difference by the average human, that human will learn to see the difference, and some will come to appreciate it over time. Those who learn to appreciate it, and can afford it, will prefer the more pleasing experience, however slight it might seem to most.

I'm not going to try to predict the timeline, but given the history and science of technological growth, eventually 4K (which is just a nice number for now) displays will become affordable to the general public, and full-quality 4K will be easy to receive. Keep in mind that people do easily recognize the draw of bigger things; that's a very simple thing to understand, and to explain in a sales pitch. Those bigger screens will need to have the same perceived quality as their high-definition screens do now, or people will see a difference, because everything will be blown up in physical size, including... pixels.

So to say that we're not sure whether regular people can explain in words what they are seeing at 24 frames per second (with a 180-degree shutter, I suspect) on a screen that's not filling up a good portion of their vision is quite narrow-sighted. Hell, I haven't even heard a good, widely accepted and understandable explanation of how and why 24 vs. 25 vs. 30 vs. 48 frames actually looks or feels different. Yet we make huge decisions based upon our ideas and feelings about that. If you've ever written or directed anything, you understand how difficult it is to describe in words what you are seeing... it's like explaining to a blind man what a tree or a river or a turtle looks like... it's like explaining the wind. These things are better felt than explained. Eventually, some people will notice that feeling, even if it's just us, staring at our own walls at home. But let's not be so arrogant as to think we're the only ones who can feel or perceive images... we just happen to spend a great deal of time with them... others will catch on, especially if we help them along with our vision(s).
 
Yeah, but in VFX that is usually because there are sub-pixel effects which have to be supersampled. The universe, with its trillions of photons, is already super-duper oversampled. :P Or the camera is going to get so close to the texture that less than 2K would otherwise be visible. Which is, again, something the real world doesn't really have a problem with, since most of the universe is fractal in nature.
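For what it's worth, the supersampling half of that is easy to demonstrate. A sketch with a hypothetical 1-D "texture" whose frequency sweeps past the target Nyquist rate: point-sampling at the target resolution invents strong false detail, while rendering at 4x and filtering down suppresses it:

```python
import numpy as np

def render(pattern, n_samples):
    """Point-sample a continuous 1-D pattern at n_samples positions across [0, 1)."""
    x = (np.arange(n_samples) + 0.5) / n_samples
    return pattern(x)

# A frequency sweep: fine "texture" detail that beats the target Nyquist rate.
pattern = lambda x: np.sin(2 * np.pi * 300 * x ** 2)  # instantaneous freq = 600x cycles

target = 256                                        # target Nyquist = 128 cycles
naive = render(pattern, target)                     # aliases where freq > 128
super4 = render(pattern, target * 4)                # 4x supersampled
filtered = super4.reshape(target, 4).mean(axis=1)   # box-filter and decimate

# In the unresolvable half of the sweep the ideal render has ~zero contrast;
# the naive render shows strong false (aliased) detail there instead.
hi = slice(target // 2, target)
print("naive RMS:       ", float(np.sqrt(np.mean(naive[hi] ** 2))))
print("supersampled RMS:", float(np.sqrt(np.mean(filtered[hi] ** 2))))
```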
 