Thread: Redray codec accuracy.

  1. #1 Redray codec accuracy. 
    Senior Member, joined Jan 2008, 5,457 posts
    Let's talk business. Redray has been around for a while. Are there any comparison tests of its accuracy against other codecs?


    Thanks.
    An explorer explores new ideas and things, but somebody stuck in mud, complains.
    - But the explorer complains about how much better the territory he has discovered is instead.

    -Since when has denying that the possible is possible been reason? Such is the life of some skepticism.
    -Normalcy involves what is realistically possible, not just what you say.

  2. #2  
    The Redray player was out in the wild, so somebody must have done some testing?

  3. #3  
    From what I heard it was comparable to H.265, which made it kind of irrelevant, since H.265 is widely implemented in all new hardware whereas RedRay required a dedicated player.

  4. #4  
    You are joking. All those guys raving about how lossless it looked at 9 mb/s, and it's incredible they could get that wrong? H.265 can't do that. Couldn't they tell the difference? Gavin, are there links to any actual tests?

  5. #5  
    Oh, and thanks for the heads-up, Gavin. :)

  6. #6  
    Hmm, does that mean that if I put up a demo of cows in a windy field in 4K at 9 mb/s H.265, in a dimly lit room at midnight, after serving drinks for an hour, put strippers beside the screen bathed in red light, call it a revolutionary codec and claim to be Steve Jobs, people would say how ultra-realistic and lossless it looks, and how good it is for recording porn, and start asking me when the next Mac Pro is coming out? :)

  7. #7  
    I've compressed every image I have on disk, about 160k of them, 300 TB worth, using my own compression algorithm and also with JPEG 2000.
    The images I encoded were 8-bit, 24-bit and 48-bit colour depths, and my compression system is based on a simple Huffman coder (roughly the shape of the sketch below) that encodes images in about 1/10 of the time the OpenJPEG encoder requires, but has produced a folder of images that is 55.12906% smaller than the J2K one.
    I can also add complete error detection and correction data to the file, which increases file sizes by 2.5% (per megabyte of compressed data).
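    To give a feel for the core idea, here is a minimal byte-oriented Huffman coder sketched in Python. It is only an illustration of the general technique, not my actual implementation (which is in Free Pascal and does image-specific modelling on top); the sample data at the bottom is made up.

    Code:
    import heapq
    from collections import Counter

    def build_codes(data: bytes) -> dict:
        """Build a Huffman code table (byte value -> bit string) from byte frequencies."""
        freq = Counter(data)
        if len(freq) == 1:                      # degenerate case: only one distinct byte
            return {next(iter(freq)): "0"}
        # heap entries are (frequency, tie-breaker, node); a node is a byte or a (left, right) pair
        heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, counter, (left, right)))
            counter += 1
        codes = {}
        def walk(node, prefix=""):
            if isinstance(node, tuple):         # internal node: recurse into both branches
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:                               # leaf: assign the accumulated bit string
                codes[node] = prefix
        walk(heap[0][2])
        return codes

    def encode(data: bytes) -> str:
        """Return the encoded bit string (code table and bit packing left out for brevity)."""
        codes = build_codes(data)
        return "".join(codes[b] for b in data)

    if __name__ == "__main__":
        raw = bytes(range(256)) * 4 + b"\x00" * 2048   # synthetic data, heavily skewed towards zero
        bits = encode(raw)
        print(f"{len(raw)} bytes -> about {len(bits) // 8} bytes, ignoring the code table")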

    I get the same results compressing raw DNG files, but I've not optimised for raw or single-channel data yet.

    I've noticed that the 8K Weapon footage I've downloaded from reduser generally encodes very well, typically 165 MB TIFF -> 120 MB J2K -> 47 MB Huffman coder. I'm finding Weapon footage compresses much more easily than Dragon footage, and it's not necessarily caused by a reduction in noise; it's slightly stranger than that.

    Better compression algorithms are easy to design, and so is improving existing ones. The hard bit is implementing them in hardware, whether via HDL code (which is well beyond me) or other methods, which are also tricky, and, for new formats, getting them adopted, of course.

    I got fed up with J2K lossless compression some time ago. Quite simply, it's a lossy format for practical purposes; its lossless mode creates bloated files (most of the time).
    Replacing it as a backup format has been easy for me. Shame no one else is interested, although if anyone wants to be a Windows beta tester, I could arrange that; I've run out of test images!

    I think it was Graeme who said designing the .red codec was, I paraphrase, "remarkably easy", which suggests to me they adapted an existing codec. I doubt they knocked out a brand-new, ground-up, hardware-based codec; more like "pre-filtered an image and fed it into an existing codec", like they do with Redcode (pre-filter data -> JPEG 2000), which still costs an arm and a leg in research and development to implement in hardware, and headaches, but means you have off-the-shelf hardware support to use as a starting point.
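    Just to sketch the shape of that kind of pipeline: the real pre-filter obviously isn't public, so in the toy Python below a simple horizontal delta filter stands in for it and zlib stands in for the downstream codec. The synthetic image and the numbers it prints are purely illustrative.

    Code:
    import zlib
    import numpy as np

    def delta_prefilter(img: np.ndarray) -> np.ndarray:
        """Replace each pixel with its difference from the pixel to its left (mod 256)."""
        out = img.astype(np.int16)
        out[:, 1:] -= img[:, :-1].astype(np.int16)
        return (out % 256).astype(np.uint8)

    # a smooth synthetic 8-bit test image: a horizontal ramp plus a little noise
    rng = np.random.default_rng(0)
    ramp = np.tile(np.arange(1024) % 256, (512, 1))
    noise = rng.integers(0, 4, (512, 1024))
    img = ((ramp + noise) % 256).astype(np.uint8)

    plain = zlib.compress(img.tobytes(), 9)                        # "existing codec" on raw pixels
    filtered = zlib.compress(delta_prefilter(img).tobytes(), 9)    # pre-filter first, then the codec
    print(f"raw {img.nbytes} B, codec alone {len(plain)} B, pre-filter + codec {len(filtered)} B")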

  8. #8  
    If anybody wants to beta test for Konrad, this is a good place to say so.

    Now, Konrad. My head is a bit misty at the moment and I'm next to a busy road, so forgive the sloppiness of any reply. I can tell you some of the stuff about Redcode, but first: the JPEG 2000 part of your compression comparison is a bit 'how long is a piece of string' for calculating compression. How much compression on average do you need for true lossless, and how much do you need for a known form of visually lossless, or compared to the various visually lossless modern Redcode data rates? Those are the basic sorts of comparisons. An average loss in dB against an image is accurate but not so relevant, as it is where that loss occurs that is relevant. Preserving more significant parts of an image at the expense of less significant parts skews the perception of image quality. This is something I put forward nearly 13 years ago.
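    To make that concrete, here is a rough Python sketch of the point: two decodes can have nearly the same average error (PSNR), yet measure very differently once you weight the regions a viewer actually cares about. The weighting map here is purely hypothetical, and real perceptual metrics are far more involved.

    Code:
    import numpy as np

    def psnr(ref, test, peak=255.0):
        """Global average error expressed in dB; it says nothing about where the error sits."""
        mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

    def weighted_mse(ref, test, weight):
        """Same squared error, but emphasising regions assumed to matter more to the viewer."""
        err = (ref.astype(np.float64) - test.astype(np.float64)) ** 2
        return float(np.sum(err * weight) / np.sum(weight))

    rng = np.random.default_rng(1)
    ref = rng.integers(0, 256, (256, 256)).astype(np.uint8)

    # two hypothetical decodes with the same amount of error, placed in different halves
    a = ref.copy()
    a[:128, :] = np.clip(a[:128, :].astype(int) + 4, 0, 255).astype(np.uint8)
    b = ref.copy()
    b[128:, :] = np.clip(b[128:, :].astype(int) + 4, 0, 255).astype(np.uint8)

    weight = np.ones(ref.shape)
    weight[:128, :] = 4.0   # hypothetical: the top half is the "significant" part of the frame
    print(psnr(ref, a), psnr(ref, b))                                   # nearly identical
    print(weighted_mse(ref, a, weight), weighted_mse(ref, b, weight))   # clearly different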

    Now, Redcode. I know people, and Redcode was apparently originally based on JPEG 2000, which is bloated, slow, etc. Then Redcode gained Cineform-related technology and vastly improved.

    Now, Redray performs very high compression, by reports at least visually lossless (but I have not seen tests). So, does your codec do 4K at less than 9 mb/s visually lossless, etc.?

    Now, it is often difficult to get antiquated and fundamentally inefficient routines up to high efficiency metrics, but a high-efficiency routine can be different and simpler. So the second level of denial of interest in routines happens there (the first includes 'you can't x, y, z', 'another guy claiming something', 'yada yada yada', 'blah, blah, blah', etc.). These levels include people with inadequate or no idea, or ill-equipped to do something with it financially or practically. The third level is that it is an IP nightmare, where you can fall foul of patents; it's a lot of trouble, and companies that know things know it, and often may not be interested once they do know. A simple Huffman encoder by itself doesn't appeal as a potential Redray challenger.

    But I'm open-minded, so I'm interested in what it is; just protect your IP. If you really have something and it's all too difficult, maybe Google would be interested in paying and rolling it into their open-source codecs, but then you get the chicken-and-egg difficulty of getting them to sign a non-compete permanently, which only companies with the utmost morality (i.e. virtually none) would be interested in doing under legal advice. So, with a limited non-compete clause, they will likely still be reluctant to sign. Plus you have the issue of filing for a patent and the cost of completing the patent rollout over the next year or so, in which time you may have nobody taking up the technology to pay for the patents, and then the patent lapses and the companies you previously had under NDA and non-compete swoop in to sweep it up. One engineer I knew was part of a group that had a new leading hardware technology, and the big company they wanted to license it to, apparently, was intent on just waiting them out.

    We really do need a revision of patent law to allow free registration of IP, with a patent period that starts upon a contract for commercial use of the IP. Then you can present, and companies can window shop, but never have the ability to wait out IP unless it is during the active period of commercialization. This is the sort of thing I'm putting forward for IP reform. There also needs to be a more open market mechanism for these things after a first exclusive license period, which is another reform I'm putting forward. We might be looking at a 90% reduction in progress due to the way the current systems work, which means we are maybe a century or more behind in some areas (some newer areas may be more novel, harder, and have less legs to advance quickly with the aid of smaller competitors without many millions or billions of project investment).

  9. #9  
    I don't know, I might have before you, Else, lol!

    But the IP market gives me the creeps. It is heavily suited to less mindful, very large companies and their drones rather than to the masses with a lot of creative potential, creating a real drain and bottleneck even for companies. You should not have to go into hock for heaps just to establish and protect a patent worldwide before you go and do anything with it, or be required to go through very expensive NDA processes and further development in secret while hunting for investors, etc. A true marketplace is: you have it listed, people look at it, say "there's one", and contact you, rather than you hunting across the country or planet for somebody to NDA and license. It would make the rate of progress and change speed up rapidly and get rid of lesser solutions quicker. In a more open marketplace, multiple companies could license at once without negotiation, at set rates (the rate calculation is a little complex, but something I came up with maybe in the 1990s). You would never need to meet a licensee, but if they have any sense they would use you as a consultant.

  10. #10  
    Quote Originally Posted by Wayne Morellini View Post
    Firstly, where loss takes place is irrelevant: my compression system is lossless, and for archive. My files and the J2K output decode to bit-for-bit identical images, so the lengths of the strings can be compared, and measured in bytes ;)

    As for your lossy ideas 13 years ago, they caught on ;) Lossy is a p*ece of p*ss to pull off.

    Does my codec do 4K at less than 9 mb/s visually lossless? No, it's lossless, not visually lossless. But it would depend on the image, and whether I was recording raw or RGB; and you didn't specify the frame rate or bit depth, and is that megabits or megabytes? I'll assume you mean 1 fps and megabytes, and say yes ;) Of course 4096x2160 pixels is fewer pixels than 4480x1920 (R1 widescreen), so in that scenario it could do 4.5K, or a 28K image if its height is 1 pixel.
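    Just to spell out how much those unstated assumptions move the target, here is a back-of-the-envelope Python sketch. Every figure in it (raster, frame rate, bit depth) is an assumption for illustration, not a measurement of anything.

    Code:
    width, height = 4096, 2160       # one common "4K" raster (an assumption)
    fps = 24.0                       # assumed frame rate; it wasn't specified
    bits_per_pixel = 3 * 10          # assume 10-bit RGB; raw/Bayer or 4:2:0 would be very different

    raw_bits_per_second = width * height * fps * bits_per_pixel

    for label, budget_bits in (("9 megabits/s", 9e6), ("9 megabytes/s", 9e6 * 8)):
        ratio = raw_bits_per_second / budget_bits
        print(f"{label}: roughly {ratio:.0f}:1 compression needed")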

    But seriously, I think I could modify it to do visually lossy encoding without complex maths, but I don't expect I'd meet the Redcode data rate, nor do I think Redray would match my encoding quality ;p I imagine I could produce a pleasing image at that data rate, but I have little interest in doing so currently. Go figure.

    In my world, it's all about lossless, high-dynamic-range encoding and archiving. I'll leave playback to DVD ;)
    I'm not confined by a hardware implementation; I'm producing a 64-bit image pipeline for storing very large numbers of pixels as densely as I can.

    I'm going to launch a small Kickstarter soon, with one goal being to enable compression of extremely large HDR panoramic/stitched images using my system (and small images too, of course), along with some downsampling and upsampling tools so you can view them at HD or SD without issues. At the end of the project, everyone can see the compression code; it's going open source ;)
    Couldn't care less what Google do. The fuller their hard drives are, the better; I'm happy for them to choke on their own data.

    It's about 400 lines of code so far, written in Free Pascal, ready for translation to other languages, but I have a long list of possible improvements and tests to carry out first, including supporting floating-point colour accuracy, etc.

    Once the design is locked down (with some open hooks for adding features), then hopefully there will be a multi-platform Java decoder created. Keith Long from East End Studio in London has offered to do a translation; if anyone can, it's probably him. He writes code for the banks, untangles bad Russian multi-threading routines, and makes stuff work nicely, and my code is peanuts-easy. And he likes the fact that I use parentheses comprehensively in my equations; he says it's professional ;)

    The Adobe machine already issued me a private compression tag for the TIFF format too, so implementing it within TIFF is an option already, though it won't be supported by third parties unless I write a patch for libtiff etc., or they want to. Assimilate were not interested in making work for themselves, as they are a small team, but I shall keep them posted when I'm done.

    There are no patents involved. It's a simple system; it's just surprising that doing a few things that seemed obvious to me, that surely must have all been done before, would leave JPEG 2000 and every other variant of JPEG in the dust. Truly very strange, and surprising. I shall write a white paper regardless, called "Don't believe every research paper you read: a guide to lossless HDR image compression and archiving".

    If the Kickstarter gets funded, I shall offer a prototype encoding/decoding tool on day one, and then regular updates, with periodic function bonuses along the way. The aim is to under-promise and over-deliver, and I'm happy I can deliver on what I promise, based on what I have.

    4k would fund 12 months of development time currently, so that's what I'll be asking for. I'm trying to take a break from doing rentals for a while, but have a 4k bank loan to service, which is distracting me.

    Redcode compression is not based on JPEG 2000; it is JPEG 2000. To become Redcode, additional processing/filtering of the captured image is required, of course, which no doubt helps the compression engine, i.e. JPEG 2000.
    Last edited by konrad grant; 03-20-2017 at 10:43 PM.
