Thread: Redray codec accuracy.

  1. #1 Redray codec accuracy.
    Senior Member · Join Date: Jan 2008 · Posts: 6,156
    Let's talk business. RedRay has been around for a while. Are there comparison tests of its accuracy against other codecs?


    Thanks.
    An explorer explores new ideas and things, but somebody stuck in mud, complains.
    - But the explorer complains about how much better the territory he has discovered is instead.

    -Since when has denying that the possible is possible been reason? Such is the life of some skepticism.
    -Normalcy involves what is realistically possible, not just what you say.

  2. #2
    Senior Member Elsie N · Join Date: Oct 2009 · Posts: 6,658
    RedRay and the codec it used, .RED, seem to have just faded away, but I remember them showing some data rates years ago when it was first unveiled. This is probably something best addressed by Graeme, but as we all know he is up to his elbows in IPP2 work.

    I suspect someday we will see employment contracts stipulating that the employee will be required to take on cyborg properties, whether by implant, hardwired plug-in or Wi-Fi. Until then, we are stuck with Analog Graeme and his limitation of doing the work of just one person. '-)
    Last edited by Elsie N; 02-18-2017 at 09:32 AM.
    One camera is a shoot...but four (or more'-) Hydrogens is a prohhhh-duction... Elsie the Wraith

  3. #3
    Senior Member · Join Date: Jan 2008 · Posts: 6,156
    The RedRay player was out in the wild, so somebody must have done some testing?

  4. #4  
    From what I heard, it was comparable to H.265, which made it kind of irrelevant, since H.265 is widely implemented in all new hardware whereas RedRay required a dedicated player.

  5. #5
    Senior Member · Join Date: Jan 2008 · Posts: 6,156
    You are joking. All those guys raving about how lossless it looked at 9 Mb/s; it's incredible that they could get that wrong. H.265 can't do that. Couldn't they tell the difference? Gavin, are there links to any actual tests?

  6. #6
    Senior Member · Join Date: Jan 2008 · Posts: 6,156
    Oh, and thanks for the heads-up, Gavin. :)

  7. #7
    Senior Member · Join Date: Jan 2008 · Posts: 6,156
    Hmm, does that mean that if I put up a demo of cows in a windy field, in 4K at 9 Mb/s H.265, in a dimly lit room at midnight, after serving drinks for an hour, with strippers beside the screen bathed in red light, and called it a revolutionary codec while claiming to be Steve Jobs, people would say how ultra-realistic and lossless it looked, and how good it was for recording porn, and start asking me when the next Mac Pro was coming out? :)

  8. #8
    Banned · Join Date: Jul 2016 · Posts: 64
    I've compressed every image I have on disk, about 160k of them, 300 TB worth, using my own compression algorithm, and also with JPEG 2000. The images I encoded were 8-bit, 24-bit and 48-bit colour depths. My compression system is based on a simple Huffman coder that encodes images in about 1/10 of the time the OpenJPEG encoder requires, but it has produced a folder of images that is 55.12906% smaller than J2K. I can also add complete error detection and correction data to the file, which increases file sizes by 2.5% (per megabyte of compressed data).
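    To illustrate the general technique only: below is a toy byte-frequency Huffman coder in Python. This is a sketch of the textbook idea, not the codec described above, and the input data is synthetic.

    Code:
    import heapq
    from collections import Counter

    def huffman_code_table(data: bytes) -> dict:
        """Build a prefix-code table mapping each byte value to a bitstring."""
        freq = Counter(data)
        # Heap entries are (frequency, tiebreak, tree); the unique tiebreak
        # stops Python from ever comparing two trees directly.
        heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        if len(heap) == 1:                       # degenerate: one distinct symbol
            return {heap[0][2]: "0"}
        count = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, count, (left, right)))
            count += 1
        codes = {}
        def walk(node, prefix):
            if isinstance(node, tuple):          # internal node
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:                                # leaf: a byte value
                codes[node] = prefix
        walk(heap[0][2], "")
        return codes

    def encoded_size_bits(data: bytes) -> int:
        table = huffman_code_table(data)
        return sum(len(table[b]) for b in data)

    # Synthetic stand-in with a skewed byte histogram, as image data tends to have.
    data = bytes([0] * 3000 + [1] * 800 + [2] * 196 + list(range(100)))
    print(len(data) * 8, "bits raw ->", encoded_size_bits(data), "bits coded")

    Real image compressors get most of their gain from decorrelating the pixels first (prediction or a transform) before an entropy coder like this ever sees the data.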

    I get the same results compressing raw DNG files, but I've not optimised for raw or single-channel data yet.

    I've noticed that the 8K Weapon footage I've downloaded from REDUSER generally encodes very well: typically 165 MB TIFF -> 120 MB J2K -> 47 MB with my Huffman coder. I'm finding Weapon footage compresses much more easily than Dragon-originated footage, and it's not necessarily caused by a reduction in noise; it's slightly stranger than that.
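    Taking those figures at face value, the implied ratios for that one sample work out like this (a quick sanity check, nothing more):

    Code:
    tiff_mb, j2k_mb, huff_mb = 165, 120, 47  # figures quoted above
    print(f"J2K vs TIFF:     {tiff_mb / j2k_mb:.2f}:1")   # ~1.38:1
    print(f"Huffman vs TIFF: {tiff_mb / huff_mb:.2f}:1")  # ~3.51:1
    print(f"Huffman output {100 * (1 - huff_mb / j2k_mb):.1f}% smaller than J2K")  # ~60.8%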

    Better compression algorithms are easy to design, and so is improving existing ones. The hard bit is implementing them in hardware, whether via HDL code, which is well beyond me, or other methods, which are also tricky; and, for new formats, getting them adopted, of course.

    I got fed up with J2K lossless compression some time ago. Quite simply, it's a lossy format for practical purposes; its lossless mode creates bloated files (most of the time). Replacing it as a backup format has been easy for me. Shame no one else is interested, although if anyone wants to be a Windows beta tester, I could arrange that; I've run out of test images!

    I think it was Graeme who said designing the .RED codec was, I paraphrase, "remarkably easy", which suggests to me they adapted an existing codec. I doubt they knocked out a brand-new, ground-up, hardware-based codec; more likely they "pre-filtered an image and fed it into an existing codec", like they do with redcode (pre-filter data -> JPEG 2000). That still costs an arm and a leg, and headaches, in research and development to implement in hardware, but it means you have off-the-shelf hardware support as a starting point.
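    As a sketch of that "pre-filter, then reuse an existing codec" pattern (zlib stands in for the backend codec here; the actual redcode filters and backend are not public in this detail):

    Code:
    import zlib

    def delta_filter(samples: bytes) -> bytes:
        """Replace each byte with its difference from the previous one (mod 256).
        Smooth image data becomes long runs of near-zero values, which the
        backend codec compresses far better than the raw samples."""
        out = bytearray(len(samples))
        prev = 0
        for i, s in enumerate(samples):
            out[i] = (s - prev) & 0xFF
            prev = s
        return bytes(out)

    row = bytes((i // 4) % 256 for i in range(4096))   # synthetic smooth gradient
    print("backend alone:", len(zlib.compress(row)))
    print("pre-filtered: ", len(zlib.compress(delta_filter(row))))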

  9. #9
    Senior Member · Join Date: Jan 2008 · Posts: 6,156
    If anybody wants to beta test for Konrad, this is a good place for it.

    Now, Konrad. My head is a bit misty at the moment and I'm next to a busy road, so forgive any sloppiness in this reply. I can tell you some of the story on redcode, but first: the JPEG 2000 part of your compression comparison is a bit 'how long is a piece of string' as a way of measuring compression. The basic comparisons are: how much compression you get on average at true lossless, how much at a known standard of visually lossless, and how you compare against the various visually lossless modern redcode data rates. An average dB of loss across an image is accurate but not that relevant, because where the loss occurs is what matters: preserving the more significant parts of an image at the expense of the less significant parts skews the perception of image quality. This is something I put forward nearly 13 years ago.
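    To make the dB point concrete: PSNR-style measures average the squared error over the whole frame, so two very different-looking failures can score the same. A small NumPy sketch (standard PSNR definition; both test images are synthetic and built to have roughly equal mean squared error):

    Code:
    import numpy as np

    def psnr(ref, test, peak=255.0):
        """Peak signal-to-noise ratio in dB: one global average of squared error."""
        mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
        return 10 * np.log10(peak ** 2 / mse)

    ref = np.full((256, 256), 128, dtype=np.uint8)

    # Error budget A: mild noise spread evenly (visually near-invisible).
    rng = np.random.default_rng(0)
    noisy = (ref.astype(np.int16) + rng.integers(-2, 3, ref.shape)).clip(0, 255).astype(np.uint8)

    # Error budget B: roughly the same mean squared error concentrated in one
    # 20x20 patch, off by 18 levels: a clearly visible blotch.
    patchy = ref.copy()
    patchy[100:120, 100:120] = 110

    print(f"noisy:  {psnr(ref, noisy):.1f} dB")    # ~45 dB
    print(f"patchy: {psnr(ref, patchy):.1f} dB")   # also ~45 dB

    Same score, very different perceived quality, which is the point about where the loss lands.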

    Now, redcode. I know people, and redcode was apparently originally based on JPEG 2000, which is bloated, slow, etc. Redcode then took on Cineform-related technology and vastly improved.

    Now, RedRay performs very high compression, by reports at least visually lossless (though I have not seen tests). So: does your codec do 4K at less than 9 Mb/s, visually lossless, etc.?
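    For scale, here is the bit budget that claim implies (assuming megabits per second, 3840x2160, and 24 fps, none of which the thread pins down):

    Code:
    bitrate = 9e6                         # bits per second, assuming megabits
    pixels_per_second = 3840 * 2160 * 24  # ~199 million
    bpp = bitrate / pixels_per_second
    print(f"{bpp:.3f} bits per pixel")                      # ~0.045
    print(f"{12 / bpp:.0f}:1 vs 12 bpp (8-bit 4:2:0) raw")  # ~265:1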

    Now, it is often difficult to push antiquated, fundamentally inefficient routines to high efficiency, but a highly efficient routine can be different and simpler. That is where the second level of denial of interest in new routines comes in (the first level is 'you can't do x, y, z', 'another guy claiming something', 'yada yada yada', 'blah, blah, blah', etc.). These levels include people with inadequate or no understanding, or who are ill-equipped to do anything with it financially or practically. The third level is that it is an IP nightmare, where you can fall foul of patents and it's a lot of trouble; companies that know these things know it, and often lose interest once they do know. A simple Huffman encoder by itself doesn't appeal as a potential RedRay challenger.

    But I'm open-minded, so I'm interested in what it is; just protect your IP. If you really have something and it's all too difficult, maybe Google would be interested in paying for it and rolling it into their open-source codecs. But then you get the chicken-and-egg difficulty of getting them to sign a permanent non-compete, which companies without the utmost morality (that is, virtually none of them) would not be interested in doing under legal advice. With a limited non-compete clause they will likely still be reluctant to sign. Plus you have the issue of filing for a patent and covering the cost of the patent rollout over the next year or so, in which time you may have nobody taking up the technology to pay for the patents; then the patent lapses, and the companies you previously had under NDA and non-compete swoop in to sweep it up. One engineer I knew was part of a group with a new, leading hardware technology, and the big company they wanted to license it to was, apparently, intent on just waiting them out.

    We really do need a revision of patent law: free registration of IP, with the patent period starting upon a contract for commercial use of the IP. Then you could present, and companies could window-shop, but would never have the ability to wait out IP except during the active period of commercialisation. That is the sort of thing I'm putting forward for IP reform. There also needs to be a more open market mechanism for these things after a first exclusive licence period, which is another reform I'm putting forward. We might be looking at a 90% reduction in progress due to the way the current systems work, which means we are maybe a century or more behind in some areas (some newer areas may be more novel, harder, and have fewer legs to advance quickly with the aid of smaller competitors without millions or billions in project investment).

  10. #10
    Senior Member Elsie N · Join Date: Oct 2009 · Posts: 6,658
    Quote Originally Posted by Wayne Morellini
    If anybody wants to beta test for Konrad, this is a good place for it. [...]
    I invented that years ago. '-)
