It's more accurate to talk about how much visual loss is caused by how much compression. For example, MP3s are very highly compressed (down to single-digit percentages of the original size), but the perceived sound quality stays a lot higher than the file sizes would suggest, because the encoders are very smart about what information they remove, and they've chosen to remove the things humans are least sensitive to. In visual terms there are equivalents to this, and certain changes do a lot more perceived 'damage' to an image than others.

For example, if you took a video signal and made every pixel 2% brighter, we basically wouldn't notice and no one would complain. But if you made every pixel 2% brighter only on every second frame, it would flicker like mad and would be a total shit-show, despite the video actually having half as much total error.

Obviously this is a ridiculous example, but when we compress video we are essentially doing two things: choosing how much total deviation there will be (all else being equal, more compression = more deviation from the source material), and choosing where and how that deviation is allocated, both within the frame and across frames. When you heavily compress an image you often find horrific blocking artefacts in plain shadow areas, and on flat surfaces next to edges. This comes down to how the image is compressed and which algorithms are used to do so. It's less obvious today, since the compression in JPG and MPG has kind of taken over, but those of us who recall comparing a GIF file, which only had a small number of colours but could do flat surfaces and edges perfectly, with a JPG image, which could do many more colours but rendered flat surfaces and edges abysmally, will know what I mean.
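The 2%-brighter thought experiment can be sketched numerically. Everything here is invented for illustration: a "video" is just a list of frames of pixel values, and `total_error` is a hypothetical helper summing absolute deviation from the source. Both altered versions below differ from the source by the same per-pixel amount, but the uniform one spreads the error evenly (unnoticeable) while the alternating one concentrates it on every second frame (visible flicker) despite having half the total error.

```python
# Hypothetical sketch: total error vs. perceived quality.
# A "video" is a list of frames; each frame is a list of pixel values.

def total_error(video, reference):
    """Sum of absolute per-pixel deviations across all frames."""
    return sum(
        abs(p - r)
        for frame, ref in zip(video, reference)
        for p, r in zip(frame, ref)
    )

source = [[100.0] * 4 for _ in range(8)]  # 8 flat grey frames, 4 pixels each

# Version A: every pixel 2% brighter (+2 on a 100-unit level) on every frame.
uniform = [[p + 2 for p in frame] for frame in source]

# Version B: the same +2, but only on every second frame.
flicker = [
    [p + 2 for p in frame] if i % 2 else list(frame)
    for i, frame in enumerate(source)
]

print(total_error(uniform, source))  # 64.0 -- spread evenly, unnoticeable
print(total_error(flicker, source))  # 32.0 -- half the error, yet it flickers
```

The point of the sketch is that a single "total error" number says nothing about where the error lands, which is exactly what a good encoder spends its bits controlling.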
The definition of raw isn't that it's uncompressed. It's that it's not yet processed, and is just a file with the code needed to create the image. So no, compressed raw, lossless or lossy, is not bullshit raw.

Sorry, you contradict yourself: you say the definition of raw is that it's "data not yet processed", and then you say "lossy raw is still raw", but lossy compression is a "processing" of the original data, and not a "minimal processing" either; it is a "heavy processing" involving very complex algorithms. (BTW, it's not "RAW", it's "raw", or "Raw" if it's at the beginning of a sentence.)

What I don't get is that Blackmagic could have minimized this BRAWgate by just adding a BRAW 1:1 consisting only of the uncompressed raw data, for users who want maximum quality no matter the file size, as I am sure there is no possible patent litigation involved in serializing data into a file; any coder can code that in 5 minutes. It is a step down from lossless CinemaDNG raw, because it is the exact same data in a bigger file, but it is better than having nothing if they really have a patent claim litigation to avoid. And you are correct, it was my mistake: "RAW" is not an acronym, so we should rather write "raw", or "Raw" if it's at the beginning of a sentence. They know perfectly well their users are pixel peepers; how could they think they could get away with just throwing some BRAW powder in their eyes?

Interesting discussion, but we need to be careful we don't deviate from what actually matters, which is how the image looks. You're correct that there is lossless and lossy, but there is no real "visually lossless" category, unless they saved space by removing the metadata or something, but that's basically pointless.
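The lossless/lossy distinction the thread argues about can be shown concretely. This is a generic sketch, not Blackmagic's or CinemaDNG's actual codec: zlib stands in for lossless compression (round trip is bit-for-bit identical), and a crude 4-bit quantization stands in for lossy compression (the round trip only approximates the original).

```python
import zlib

# Stand-in for raw sensor data: repeated runs of every byte value.
data = bytes(range(256)) * 16

# Lossless compression: a smaller file, but decompressing it restores
# the original bit-for-bit, so no image information is lost.
packed = zlib.compress(data, level=9)
restored = zlib.decompress(packed)
print(restored == data)  # True

# Lossy "compression" (crude 4-bit quantization): also shrinks the data,
# but the round trip only approximates the original -- the discarded
# low bits are gone for good.
quantized = bytes((b // 16) * 16 for b in data)
print(quantized == data)                                 # False
print(max(abs(a - b) for a, b in zip(data, quantized)))  # 15
```

Whether a lossy file still deserves the name "raw" is the debate above; what the code shows is only that the two categories are objectively distinct: one round-trips exactly, the other cannot.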