Billy 0 #1 November 11, 2004 OK, I'm totally confused... I'm working on a project and looking at CCTV vs DV camera options. It seems the CCTV folks use "TV lines" as a resolution-type description, even when you don't get to see any specs on the total number of pixels. I've noticed bullet cams use this too. DV seems to mostly use pixels, and in editing video/JPEG images I usually use 720x480 as the image size. So WTF are TV lines and how do they relate to pixels or image size??? Natural Born FlyerZ.com
LouDiamond 1 #2 November 11, 2004 Take a look at http://www.audiovideo101.com/learn/articles/hdtv/hdtv07.asp and http://www.epanorama.net/wwwboard/messages/9276.html If you Google it you can find even more info on it. Here's one more: http://members.aol.com/ajaynejr/vidres.htm -- scroll down to "Digital vs analog - The extended Kell factor". "It's just skydiving..additional drama is not required" Some people dream about flying, I live my dream SKYMONKEY PUBLISHING
Billy 0 #3 November 11, 2004 Thanks for the quick response Lou! Now I'll have to read some shit and make sense of it all... Natural Born FlyerZ.com
quade 4 #4 November 11, 2004

Quote: So WTF are TV lines and how do they relate to pixels or image size???

The confusion is understandable. When TV was invented, a bit of thought went into how best to transmit the images. This was LONG before the concept of digital, and all of the information was transmitted basically the same way an AM radio signal is transmitted.

As an electron beam moved in a line across the picture tube, it reacted to the signal being broadcast. Take a felt tip pen and move it in a line across a piece of paper -- the harder you press, the darker the line. When the signal is stronger, the picture is brighter. When the signal is weaker, so is the brightness of the picture, down to a certain point; beyond that, it triggers the receiver to move on to a different part of the picture: the next line down (a scan line), or back up to the top of the picture to begin again (a new "field").

Additionally, each line and each field were offset by the space of one scan line. If you took two hair combs and fit the teeth of one into the teeth of the other, you'd see the relationship of two fields to each other. In analog video, each frame is made of a total of 525 scan lines from two fields. 525/2 = 262.5, so each field is made of 262.5 lines. (That's right, there -are- half lines.)

Now comes the confusing part. Resolution of analog video is defined by how many times the signal can go from black to white across the screen in 1 scan line. Go back to the felt tip pen and the piece of paper: as you go across the line, how many variations from dark to light can you get? -Excellent- and very high quality analog broadcast equipment can get up to about -900- lines of resolution. Most cheap home TV equipment will have about 400. If some CCD camera is claiming -525- lines . . . then they're preying on your gullibility by using the number of -scan- lines as opposed to lines of horizontal resolution.
So, remember that specific number, because it's a tip-off that the guys selling the cameras either don't know what they're talking about or are just bullshitting.

When we get into digital video . . . things get a little more "interesting". In the early days of digital video, most computers put out 640x480 pixels for their main (only) monitor signal, and that would then be converted into analog video for display on TV monitors. There'd be some artifacting of the edges, but everyone was just so happy to have a computer image they didn't give a crap.

As time went on, the computer wizards figured out that they had computers and could make the pixels any aspect ratio they wanted. So, to squeeze a little more resolution out of it, they invented the non-square pixel, which was 0.9 as wide as it was tall, giving us the familiar 720x480 non-square pixel size we're used to dealing with in DV25 (the signal that comes out of our miniDV cameras via FireWire).

Things get "better" . . . If you work in digital "broadcast" formats, they change things around a bit yet again. "Standard" definition digital video (aka D1) is 720x486. Yes, 6 stupid pixels more.

Now, in all of this, remember that we're ONLY concerned with the pixels and NOT the size of the screen itself. As the screen increases, the pixel count doesn't increase, just the size of the pixels does. So if you look at digital video on a HUGE monitor with a HUGE number of pixels, the resolution isn't ever going to be any better, which is why when you enlarge a nice-looking, small-sized QuickTime movie on your computer so that it fills the screen, it probably looks like crap.

quade - The World's Most Boring Skydiver
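A quick check of the non-square-pixel arithmetic in the post above. The post's 0.9 pixel aspect ratio is the usual shorthand; the exact NTSC DV value is 10/11 (about 0.909), but 0.9 is close enough to see why 720x480 still displays as roughly a 4:3 picture:

```python
# Non-square-pixel arithmetic, using the 0.9 pixel aspect ratio (PAR)
# quoted in the post (the exact NTSC DV value is 10/11, ~0.909).

def display_size(stored_w, stored_h, par):
    """Width/height the frame occupies on a square-pixel display."""
    return stored_w * par, stored_h

w, h = display_size(720, 480, 0.9)
print(w, h)               # 648.0 480
print(round(w / h, 3))    # 1.35 -- close to 4:3 (1.333)

# D1 "broadcast" standard definition stores 6 more lines than DV25:
d1_w, d1_h = 720, 486
print(d1_h - 480)         # 6
```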
AndyMan 7 #5 November 11, 2004 One more thing: Mini-DV manufacturers also confuse things terribly by branding their cams as having so many "megapixels". They're being a TAD dishonest, because they're implying that higher-megapixel cameras have higher quality video - which is utter horseshit. The "megapixel" in cameras being marketed to folks like us only refers to the size of the digital stills when you use the digital still function, something that's usually COMPLETELY unrelated to the quality of the video, and completely useless to the primary function of the camera. _Am__ You put the fun in "funnel" - craichead.