(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2015/0245050 A1
     Tourapis et al.                   (43) Pub. Date: Aug. 27, 2015

(54) ADAPTIVE TRANSFER FUNCTION FOR VIDEO ENCODING AND DECODING
(71) Applicant: Apple Inc., Cupertino, CA (US)
(72) Inventors: Alexandros Tourapis, Sunnyvale, CA (US); David Singer, San Francisco, CA (US)
(73) Assignee: Apple Inc., Cupertino, CA (US)
(21) Appl. No.: 14/631,410
(22) Filed: Feb. 25, 2015

Related U.S. Application Data
(60) Provisional application No. 61/944,484, filed on Feb. 25, 2014; provisional application No. 61/946,638, filed on Feb. 28, 2014; provisional application No. 61/946,633, filed on Feb. 28, 2014.

Publication Classification
(51) Int. Cl.: H04N 19/52 (2006.01); H04N 19/182 (2006.01)
(52) U.S. Cl.: CPC H04N 19/52 (2014.11); H04N 19/182 (2014.11); H04N 19/14 (2014.11)

(57) ABSTRACT
A video encoding and decoding system implements an adaptive transfer function method internally within the codec for signal representation. A focus dynamic range representing an effective dynamic range of the human visual system may be dynamically determined for each scene, sequence, frame, or region of input video. The video data may be cropped and quantized into the bit depth of the codec according to a transfer function for encoding within the codec. The transfer function may be the same as the transfer function of the input video data or may be a transfer function internal to the codec. The encoded video data may be decoded and expanded into the dynamic range of display(s). The adaptive transfer function method enables the codec to use fewer bits for the internal representation of the signal while still representing the entire dynamic range of the signal in output.

[FIG. 1 (Sheet 1 of 14): example codec 100: N-bit input video, adaptive transfer function module, encoder, encoded stream, decoder, D-bit video out]
[FIG. 2 (Sheet 2 of 14): example encoder 200: adaptive transfer function module, intra-/inter-frame estimation, motion compensation and reconstruction modules, encoded output stream with format metadata]
[FIG. 3 (Sheet 3 of 14): example decoder 300: encoded stream in, decode module, inverse adaptive transfer function and quantization module, D-bit video output to display(s)]
[FIG. 4 (Sheet 4 of 14): plot of transfer function code values vs. luminance (cd/m²), 0 to 10,000 cd/m², marking the input video dynamic range]
[FIG. 5 (Sheet 5 of 14): plot of code values vs. luminance (cd/m²), 0 to 5,000 cd/m², showing a focus range mapped to code values]
[FIG. 6 (Sheet 6 of 14): plot of code values vs. luminance (cd/m²), 0 to 10,000 cd/m², marking the display dynamic range]
[FIGS. 7A-7C (Sheet 7 of 14): focus ranges 710 applied to the full range, to frames 700J-700L, and to regions 702A-702C of a frame]
[FIG. 8 (Sheet 8 of 14): encoder-side flowchart: 800 receive N-bit video data; 802 determine a focus range for the video data; 804 map the N-bit video data within the focus range to C-bit video data according to a transfer function; 806 more video data?; 810 process the C-bit video data to generate encoded output including format metadata]
[FIG. 9 (Sheet 9 of 14): decoder-side flowchart: 900 obtain encoded data; 902 decode the encoded data to generate C-bit video data and format metadata; 904 more encoded data?; 910 expand the focus range C-bit video data into full-range D-bit video data according to the format metadata and display information; output the D-bit video data]
[FIG. 10 (Sheet 10 of 14): SOC 8000 block diagram: CPU complex 8020 (CPUs 8024, L2 cache 8022), memory controller 8030, communication fabric 8010, peripherals 8040A-8040C, memory 8800, external interface(s) 8900]
[FIG. 11 (Sheet 11 of 14): system 9000: SOC 8000, PMU 9010, external memory 8800, peripherals 9020]
[FIG. 12 (Sheet 12 of 14): example computer system: processors, memory, I/O interface, network interface, and input/output devices (cursor control, keyboard, display(s), camera(s), sensor(s))]
[FIG. 13 (Sheet 13 of 14): portable multifunction device 2100 block diagram: memory 2102 holding operating system 2126, communication module 2128, graphics module 2132, text input module 2134, GPS module 2135, and applications 2136 (telephone, video conference, camera, browser 2147, video and music player 2152, etc.); peripherals interface 2118; I/O subsystem 2106; optical sensor(s)/camera 2164; proximity sensor 2166]
[FIG. 14 (Sheet 14 of 14): portable multifunction device 2100 with proximity sensor 2166 and external port(s) 2124]

ADAPTIVE TRANSFER FUNCTION FOR VIDEO ENCODING AND DECODING

PRIORITY INFORMATION

[0001] This application claims benefit of priority of U.S. Provisional Application Ser. No. 61/944,484 entitled "DISPLAY PROCESSING METHODS AND APPARATUS" filed Feb. 25, 2014, the content of which is incorporated by reference herein in its entirety; of U.S. Provisional Application Ser. No. 61/946,638 entitled "DISPLAY PROCESSING METHODS AND APPARATUS" filed Feb. 28, 2014, the content of which is incorporated by reference herein in its entirety; and of U.S. Provisional Application Ser. No. 61/946,633 entitled "ADAPTIVE METHODS AND APPARATUS" filed Feb. 28, 2014, the content of which is incorporated by reference herein in its entirety.

BACKGROUND

[0002] 1. Technical Field

[0003] This disclosure relates generally to digital video or image processing and display.

[0004] 2. Description of the Related Art
[0005] Various devices including but not limited to personal computer systems, desktop computer systems, laptop and notebook computers, tablet or pad devices, digital cameras, digital video recorders, and mobile phones or smart phones may include software and/or hardware that may implement video processing method(s). For example, a device may include an apparatus (e.g., an integrated circuit (IC), such as a system-on-a-chip (SOC), or a subsystem of an IC) that may receive and process digital video input from one or more sources and output the processed video frames according to one or more video processing methods. As another example, a software program may be implemented on a device that may receive and process digital video input from one or more sources according to one or more video processing methods and output the processed video frames to one or more destinations.

[0006] As an example, a video encoder may be implemented as an apparatus, or alternatively as a software program, in which digital video input is encoded or converted into another format according to a video encoding method, for example a compressed video format such as the H.264/Advanced Video Coding (AVC) format or the H.265 High Efficiency Video Coding (HEVC) format. As another example, a video decoder may be implemented as an apparatus, or alternatively as a software program, in which video in a compressed video format such as AVC or HEVC is received and decoded or converted into another (decompressed) format according to a video decoding method, for example a display format used by a display device. The H.264/AVC standard is published by ITU-T in a document titled "ITU-T Recommendation H.264: Advanced video coding for generic audiovisual services". The H.265/HEVC standard is published by ITU-T in a document titled "ITU-T Recommendation H.265: High Efficiency Video Coding".

[0007] In many systems, an apparatus or software program may implement both a video encoder component and a video decoder component; such an apparatus or program is commonly referred to as a codec. Note that a codec may encode/decode both visual/image data and audio/sound data in a video stream.

[0008] Generally defined, dynamic range is the ratio between the largest and smallest possible values of a changeable quantity, such as in signals like sound and light. In digital image and video processing, extended or high dynamic range (HDR) imaging refers to technology and techniques that produce a wider range of luminance in electronic images (e.g., as displayed on display screens or devices) than is obtained using standard digital imaging technology and techniques (referred to as standard dynamic range, or SDR, imaging).

[0009] An electro-optical transfer function (EOTF) may map digital code values to light values, for example to luminance values. An inverse process, commonly referred to as an opto-electrical transfer function (OETF), maps light values to electronic/digital values. EOTFs and OETFs may collectively be referred to as transfer functions. The SI unit for luminance is candela per square meter (cd/m²). A non-SI term for the same unit is "NIT". In standard dynamic range (SDR) imaging systems, to simplify the encoding and decoding processes in encoding/decoding systems or codecs, fixed transfer functions, for example fixed power-law gamma transfer functions, have generally been used for the internal representation of video image content. With the emergence of high dynamic range (HDR) imaging techniques, systems, and displays, a need for more flexible transfer functions has emerged.
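For illustration, the fixed power-law gamma transfer functions mentioned in [0009] can be written as a simple OETF/EOTF pair. The sketch below is not taken from this disclosure: the 2.2 exponent and the normalized [0, 1] signal range are assumptions chosen for the example.

```python
# Minimal sketch of a fixed power-law gamma transfer-function pair.
# The 2.2 exponent and normalized [0, 1] ranges are illustrative
# assumptions; deployed curves (e.g., BT.709) also add a linear
# segment near black.

def oetf_gamma(light, gamma=2.2):
    """Opto-electrical transfer function: linear light -> code value."""
    return light ** (1.0 / gamma)

def eotf_gamma(code, gamma=2.2):
    """Electro-optical transfer function: code value -> linear light."""
    return code ** gamma

# Round-tripping an 18% gray light level recovers the original value.
assert abs(eotf_gamma(oetf_gamma(0.18)) - 0.18) < 1e-9
```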
SUMMARY OF EMBODIMENTS

[0010] Embodiments of video encoding and decoding systems and methods are described that implement an adaptive transfer function for the representation of video image content internally within the video encoding and decoding system or codec. Embodiments may dynamically determine a focus dynamic range for a current scene, sequence, frame, or region of a frame in input video based on one or more characteristics of the image data (e.g., brightness, texture, etc.), crop the input video dynamic range to that focus range, and then appropriately map (e.g., quantize) the values in the cropped range from the bit depth of the input video to the bit depth of the codec according to a transfer function used to represent video data in the codec.

[0011] In embodiments, various transfer functions may be used to represent the input video data and the focus range video data in the codec. In some embodiments, the transfer function used to represent video data in the codec may be the same as the transfer function used to represent the input video data. In some embodiments, a different transfer function (referred to as an internal or secondary transfer function) may be used to represent the video data in the codec than the transfer function used to represent the input video data (referred to as the primary transfer function).

[0012] In embodiments, the focus range, transfer function, quantization parameters, and other format information used by the encoder for a scene, sequence, frame, or region may be signaled to a decoder component, for example by metadata embedded in the output bit stream. At the decoder, the encoded bit stream may be decoded and dynamically expanded to the full dynamic range of a target device such as a high dynamic range (HDR)-enabled display according to the signaled focus range(s) for scenes, sequences, frames, or regions of the video.

[0013] By dynamically adapting the transfer function to the input video data, embodiments may allow the video data to be represented with fewer bits in the codec than are used to represent the input video data, while also allowing the video data output by the codec to be expanded to fill the dynamic range of HDR devices such as HDR-enabled displays. Embodiments of an adaptive transfer function method may, for example, enable the video encoding and decoding systems to use 10 or fewer bits for the internal representation and processing of the video data within a codec, while representing the extended or high dynamic range of the video data, for example using 12 or more bits, when outputting the video to HDR devices such as HDR-enabled displays. Embodiments of an adaptive transfer function method for video encoding and decoding systems may thus simplify the implementation, and consequently the adoption, of HDR technology, especially in the consumer space.
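A minimal sketch of the crop-and-quantize flow summarized in [0010] follows. The percentile-based focus-range heuristic, the linear mapping, and the function names are assumptions for illustration; the disclosure leaves the focus-range derivation and the transfer function open.

```python
import numpy as np

def determine_focus_range(frame, lo_pct=1.0, hi_pct=99.0):
    """Derive a focus range from brightness statistics of a frame
    (a stand-in for the brightness/texture analysis described above)."""
    lo, hi = np.percentile(frame, [lo_pct, hi_pct])
    return float(lo), float(max(hi, lo + 1.0))  # guard against flat frames

def map_to_codec_bit_depth(frame_n_bit, focus_range, c_bits=10):
    """Crop values to the focus range, then quantize into C bits."""
    lo, hi = focus_range
    cropped = np.clip(frame_n_bit.astype(np.float64), lo, hi)
    c_max = (1 << c_bits) - 1
    return np.round((cropped - lo) / (hi - lo) * c_max).astype(np.uint16)

# Example: a 16-bit frame reduced to a 10-bit codec-internal representation.
frame = np.random.randint(0, 1 << 16, size=(4, 4), dtype=np.uint16)
c_bit_frame = map_to_codec_bit_depth(frame, determine_focus_range(frame))
```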
BRIEF DESCRIPTION OF THE DRAWINGS

[0014] FIG. 1 illustrates an example codec or video encoding and decoding system that implements an embodiment of an adaptive transfer function.

[0015] FIG. 2 illustrates an example encoder that applies an adaptive transfer function method to video input data and generates encoded video data, according to some embodiments.

[0016] FIG. 3 illustrates an example decoder that decodes encoded video data and expands the decoded video data according to an adaptive transfer function method to generate display format video data, according to some embodiments.

[0017] FIG. 4 illustrates an example full range for input video data and shows an example focus range for the video data, according to some embodiments.

[0018] FIG. 5 illustrates an example of mapping N-bit input video data within a focus range to generate C-bit video data, according to some embodiments.

[0019] FIG. 6 graphically illustrates an example of expanding C-bit decoded video into the full dynamic range of an HDR-enabled device to generate D-bit video data for the device, according to some embodiments.

[0020] FIGS. 7A through 7C graphically illustrate applying different focus ranges to different portions of a video sequence or video frame, according to embodiments of an adaptive transfer function method.

[0021] FIG. 8 is a flowchart of a video encoding method that applies an adaptive transfer function method to video input data and generates encoded video data, according to some embodiments.

[0022] FIG. 9 is a flowchart of a video decoding method that decodes encoded video data and expands the decoded video data according to an adaptive transfer function method to generate display format video data, according to some embodiments.

[0023] FIG. 10 is a block diagram of one embodiment of a system on a chip (SOC) that may be configured to implement aspects of the systems and methods described herein.

[0024] FIG. 11 is a block diagram of one embodiment of a system that may include one or more SOCs.

[0025] FIG. 12 illustrates an example computer system that may be configured to implement aspects of the systems and methods described herein, according to some embodiments.

[0026] FIG. 13 illustrates a block diagram of a portable multifunction device in accordance with some embodiments.

[0027] FIG. 14 depicts a portable multifunction device in accordance with some embodiments.

[0028] While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the intention is to cover all modifications, equivalents and alternatives falling within the spirit and scope of the present invention. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words "include," "including," and "includes" mean including, but not limited to.

[0029] Various units, circuits, or other components may be described as "configured to" perform a task or tasks. In such contexts, "configured to" is a broad recitation of structure generally meaning "having circuitry that" performs the task or tasks during operation. As such, the unit/circuit/component can be configured to perform the task even when the unit/circuit/component is not currently on. In general, the circuitry that forms the structure corresponding to "configured to" may include hardware circuits. Similarly, various units/circuits/components may be described as performing a task or tasks, for convenience in the description. Such descriptions should be interpreted as including the phrase "configured to." Reciting a unit/circuit/component that is configured to perform one or more tasks is expressly intended not to invoke 35 U.S.C. §112, paragraph six, interpretation for that unit/circuit/component.
DETAILED DESCRIPTION

[0030] Embodiments of video encoding and decoding systems and methods are described that implement an adaptive transfer function for the representation of video image content internally within the video encoding and decoding system or codec. Embodiments may allow the adaptation of the dynamic range of the input video data to the codec during the encoding and decoding processes. In embodiments, the transfer function may be dynamically adapted for each scene, sequence, or frame of input video to the codec. In some embodiments, the transfer function may be dynamically adapted within regions of a frame. Embodiments may dynamically adapt the transfer function to the input video data, keeping only the information within a dynamically determined effective dynamic range of the human visual system (referred to herein as focus dynamic range, or just focus range) and mapping the data within the focus range from the bit depth of the input video into the bit depth of the codec according to a transfer function for processing within the codec. The output of the codec may be expanded to fill the dynamic range of output or target devices, including but not limited to extended or high dynamic range (HDR) devices such as HDR-enabled displays. FIGS. 1 through 3 illustrate example video encoding and decoding systems or codecs in which embodiments of the adaptive transfer function method may be implemented.

[0031] The human visual system covers a significant dynamic range as a total. However, the human visual system tends to adapt to and limit the dynamic range based on the current scene or image being viewed, for example according to brightness (luminance) and texture characteristics of the scene or image. Thus, even though the total dynamic range of the human visual system is relatively large, the effective [...]

[...] adaptive transfer function module or component and passed to the encoder as C-bit video data according to blocks of pixels (e.g., macroblocks, CUs, PUs, or CTUs).

[0069] As indicated at 810 of FIG. 8, one or more components of an encoder may process the C-bit video data to generate encoded (and compressed) video data output (e.g., CAVLC or CABAC output). In some embodiments, the encoder may encode the C-bit video input data according to a compressed video format such as the H.264/AVC format or the H.265/HEVC format. However, other encoding formats may be used. In some embodiments, the focus range, transfer function, quantization parameters, and other format information used in encoding each scene, sequence, frame, or region may be embedded as metadata in the encoded output stream, or may be otherwise signaled to decoder(s). An example encoder for processing C-bit video data to generate an encoded output stream is illustrated by the encoder 200 in FIG. 2. In some implementations, the encoded output stream including the metadata may be written to a memory, for example via direct memory access (DMA). In some implementations, instead of or in addition to writing the output stream and metadata to a memory, the encoded output stream and metadata may be sent directly to at least one decoder. The decoder may be implemented on the same device or apparatus as the encoder, or on a different device or apparatus.
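As an illustration of the format metadata described in [0069], a per-unit record might carry fields such as those below. The field names and layout are assumptions; the disclosure does not define a concrete bitstream syntax for this information.

```python
from dataclasses import dataclass

@dataclass
class AdaptiveTFMetadata:
    """Hypothetical per-unit signaling record (names are illustrative)."""
    unit_type: str          # "scene", "sequence", "frame", or "region"
    focus_min: float        # lower bound of the focus range (e.g., cd/m^2)
    focus_max: float        # upper bound of the focus range
    transfer_function: str  # e.g., "gamma", "log", or "pq"
    tf_params: tuple        # control parameters for the transfer function
    quant_bits: int         # codec-internal bit depth C

meta = AdaptiveTFMetadata("frame", 0.05, 1500.0, "gamma", (2.2,), 10)
```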
[0070] FIG. 9 is a high-level flowchart of a video decoding method that decodes encoded video data and expands the decoded video data according to an adaptive transfer function method to generate display format video data, according to some embodiments. As indicated at 900 of FIG. 9, a decoder may obtain encoded data (e.g., CAVLC or CABAC encoded and compressed data). The encoded data may, for example, be read from a memory, received from an encoder, or otherwise obtained. As indicated at 902 of FIG. 9, the decoder may decode the encoded data to generate decoded C-bit video data and format metadata. In some embodiments, the decoder may decode the encoded data according to a compressed video format such as the H.264/AVC format or the H.265/HEVC format. However, other encoding/decoding formats may be used. The format metadata extracted from the encoded data may, for example, include the focus range, transfer function, quantization parameters, and other format information that was used in encoding each scene, sequence, frame, or region.

[0071] As shown by element 904 of FIG. 9, the decoding performed at 900 and 902 may continue as long as there is encoded data. Each unit of decoded data (e.g., each scene, sequence, frame, or region) that is decoded by the decoder may be output to and processed according to an inverse adaptive transfer function method, as indicated at 910 of FIG. 9. At least part of the format metadata extracted from the encoded data may also be output to the inverse adaptive transfer function method.

[0072] As indicated at 910 of FIG. 9, the inverse adaptive transfer function method may dynamically expand the decoded C-bit video data from the focus range to generate full dynamic range D-bit video data for output to a target device such as an HDR-enabled display device. For a given scene, sequence, frame, or region, the expansion may be dynamically performed according to respective format metadata extracted from the encoded data by the decoder. FIG. 6 graphically illustrates expanding decoded video data into the full dynamic range of an HDR device, according to some embodiments.
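A minimal sketch of the expansion at 910, mirroring the encoder-side sketch given earlier: the linear mapping and the 10,000 cd/m² display range are assumptions for illustration, not a prescribed method.

```python
import numpy as np

def expand_to_display(c_bit_frame, focus_range, c_bits=10, d_bits=12,
                      display_range=(0.0, 10000.0)):
    """Expand focus-range C-bit data into full-display-range D-bit data."""
    lo, hi = focus_range
    c_max = (1 << c_bits) - 1
    # Undo the encoder-side quantization: code values -> luminance.
    luminance = c_bit_frame.astype(np.float64) / c_max * (hi - lo) + lo
    # Requantize across the display's full dynamic range.
    disp_lo, disp_hi = display_range
    norm = (np.clip(luminance, disp_lo, disp_hi) - disp_lo) / (disp_hi - disp_lo)
    return np.round(norm * ((1 << d_bits) - 1)).astype(np.uint16)
```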
Primary and Secondary Transfer Functions

[0073] Examples of transfer functions that may be used to represent video data in embodiments of an adaptive transfer function method as described herein may include, but are not limited to, power-law gamma-based transfer functions, logarithmic (log-based) transfer functions, and transfer functions based on human visual perception such as the Perceptual Quantizer (PQ) transfer function proposed by Dolby Laboratories.

[0074] In some embodiments, the transfer function used to represent video data in the codec may be the same as the transfer function used to represent the input video data (referred to as the primary transfer function). In these embodiments, the input video data may be cropped according to the determined focus range for a scene, sequence, frame, or region, and then mapped (e.g., quantized) into the available bits of the codec according to the primary transfer function. The focus range, cropping, and mapping (e.g., quantization) parameters may be signaled to a decoder, for example as metadata in the output encoded stream, so that an inverse adaptive transfer function method may be performed at the decoder to generate full-range video data for a device such as an HDR-enabled display.
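For reference, the Perceptual Quantizer named in [0073] maps absolute luminance to code values; the sketch below uses the constants from the published SMPTE ST 2084 formulation of PQ. Treating PQ as the codec-internal transfer function here is an illustrative choice, not a requirement of this disclosure.

```python
# PQ constants per the SMPTE ST 2084 formulation.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_oetf(luminance_cd_m2):
    """Absolute luminance (cd/m^2) -> PQ code value in [0, 1]."""
    y = min(max(luminance_cd_m2 / 10000.0, 0.0), 1.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

# 100 cd/m^2, a typical SDR peak, lands near half of the code range.
print(round(pq_oetf(100.0), 3))  # ~0.508
```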
[0075] However, in some embodiments, a different transfer function (referred to as an internal or secondary transfer function) than the primary transfer function may be used to represent the video data within the codec. In these embodiments, the input video data may be cropped to the determined focus range for a scene, sequence, frame, or region, and then mapped, scaled, or quantized into the available bits of the codec according to the secondary transfer function. The secondary transfer function may, for example, allow the video signal to be represented with higher precision within the codec than could be done using the primary transfer function. In these embodiments, in addition to the focus range, cropping, and mapping (e.g., quantization) parameters, information on how to convert the video data from the secondary transfer function to the primary transfer function may also be signaled to the decoder, for example as metadata in the output stream. This information may include, but is not limited to, the type of the secondary transfer function (e.g., power-law gamma, log, PQ, etc.) and one or more control parameters for the transfer function. In some embodiments, information [...]

[...] to achieve better encoding efficiency and better rendering at the final transfer function target.

[0081] In some embodiments, determination of and adjustment to the internal transfer function (e.g., range, type, bit depth, etc.) may be based on characteristics of the video data being encoded. In some embodiments, compression capabilities or characteristics of particular transfer functions for the signal in question may also be considered in selecting and/or adjusting the internal transfer function. In some embodiments, determination of and adjustments to the internal transfer function may instead or in addition be based on one or more target displays and their characteristics or limitations. For example, if it is known that a display is being targeted that has a limited dynamic range compared to the dynamic range supported by the primary transfer function, then there may be no point in including values that are out of range for the target display in the encoded, compressed signal. Instead, an internal transfer function representation that provides a best fit of the signal to the display's dynamic range may be determined, which may allow for a better compressed representation of the signal for that particular display's capabilities. In some embodiments, if multiple displays are to be supported by an encoder, then a dynamic range may be selected for the displays, and an internal transfer function representation that provides a best fit to the selected dynamic range may be determined. For example, the display with the best capabilities (e.g., with the highest dynamic range) may be selected, and its dynamic range may be used in adjusting the internal transfer function to generate video output for all of the displays. As another example, the dynamic range selection may be based on a weighting model in which one or more characteristics (e.g., the dynamic range) of the displays may be weighted (e.g., based on a determined or indicated ranking of the displays in terms of importance or other factors) and the internal transfer function may be adjusted accordingly.
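The weighting model suggested in [0081] could, for example, reduce the displays' capabilities to a single target peak by a weighted average. The data layout and weighting scheme below are assumptions for illustration; the paragraph leaves the model unspecified.

```python
def select_target_peak(displays):
    """displays: list of (peak_luminance_cd_m2, importance_weight) pairs."""
    total = sum(weight for _, weight in displays)
    return sum(peak * weight for peak, weight in displays) / total

# Three target displays, with the HDR panel ranked most important.
displays = [(10000.0, 0.6), (1000.0, 0.3), (100.0, 0.1)]
print(select_target_peak(displays))  # 6310.0
```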
[0082] Embodiments of the adaptive transfer function methods for video encoding and decoding systems or codecs as described herein may provide the ability of signaling (from the encoder) and supporting (at the encoder and decoder) an extended or high dynamic range while keeping complexity of the codec (e.g., bit depth) within reasonable bounds. This may be achieved by dynamically determining focus ranges at the region, frame, scene, or sequence level, cropping the input data to the focus range, and mapping the cropped data from the bit depth of the input video into the bit depth of the codec, while signaling the appropriate parameters (e.g., focus range, quantization parameters, etc.) from the encoder to the decoder at the region, frame, scene, or sequence level so that the dynamic range can be extended to a full range of an HDR-enabled display. In addition, in some embodiments, appropriate parameters (e.g., weights, etc.) for performing weighted prediction may be signaled when different frames are encoded using different transfer function representations.

Example Devices and Apparatus

[0083] FIGS. 10 through 14 show non-limiting examples of devices and apparatus in or with which embodiments or components of the various digital video or image processing and display methods and apparatus as described herein may be implemented. FIG. 10 illustrates an example SOC, and FIG. 11 illustrates an example device implementing an SOC. FIG. 12 illustrates an example computer system that may implement the methods and apparatus described herein. FIGS. 13 and 14 illustrate example multifunction devices that may implement the methods and apparatus described herein.

Example System on a Chip (SOC)

[0084] Turning now to FIG. 10, a block diagram of one embodiment of a system-on-a-chip (SOC) 8000 that may be used in embodiments. SOC 8000 is shown coupled to a memory 8800. As implied by the name, the components of the SOC 8000 may be integrated onto a single semiconductor substrate as an integrated circuit "chip." In some embodiments, the components may be implemented on two or more discrete chips in a system. However, the SOC 8000 will be used as an example herein. In the illustrated embodiment, the components of the SOC 8000 include a central processing unit (CPU) complex 8020, on-chip peripheral components 8040A-8040C (more briefly, "peripherals"), a memory controller (MC) 8030, and a communication fabric 8010. The components 8020, 8030, and 8040A-8040C may all be coupled to the communication fabric 8010. The memory controller 8030 may be coupled to the memory 8800 during use, and the peripheral 8040B may be coupled to an external interface 8900 during use. In the illustrated embodiment, the CPU complex 8020 includes one or more processors (P) 8024 and a level two (L2) cache 8022.

[0085] The peripherals 8040A-8040B may be any set of additional hardware functionality included in the SOC 8000. For example, the peripherals 8040A-8040B may include video peripherals such as an image signal processor configured to process image capture data from a camera or other image sensor, display controllers configured to display video data on one or more display devices, graphics processing units (GPUs), video encoder/decoders or codecs, scalers, rotators, blenders, etc. The peripherals may include audio peripherals such as microphones, speakers, interfaces to microphones and speakers, audio processors, digital signal processors, mixers, etc. The peripherals may include peripheral interface controllers for various interfaces 8900 external to the SOC 8000 (e.g., the peripheral 8040B) including interfaces such as Universal Serial Bus (USB), peripheral component interconnect (PCI) including PCI Express (PCIe), serial and parallel ports, etc. The peripherals may include networking peripherals such as media access controllers (MACs). Any set of hardware may be included.

[0086] The CPU complex 8020 may include one or more CPU processors 8024 that serve as the CPU of the SOC 8000. The CPU of the system includes the processor(s) that execute the main control software of the system, such as an operating system. Generally, software executed by the CPU during use may control the other components of the system to realize the desired functionality of the system. The processors 8024 may also execute other software, such as application programs. The application programs may provide user functionality, and may rely on the operating system for lower level device control. Accordingly, the processors 8024 may also be referred to as application processors. The CPU complex 8020 may further include other hardware such as the L2 cache 8022 and/or an interface to the other components of the system (e.g., an interface to the communication fabric 8010). Generally, a processor may include any circuitry and/or microcode configured to execute instructions defined in an instruction set architecture implemented by the processor. The instructions and data operated on by the processors in response to executing the instructions may generally be stored in the memory 8800, although certain instructions may be defined for direct processor access to peripherals as well. Processors may encompass processor cores implemented on an integrated circuit with other components as a system on a chip (SOC 8000) or other levels of integration. Processors may further encompass discrete microprocessors, processor cores and/or microprocessors integrated into multichip module implementations, processors implemented as multiple integrated circuits, etc.

[0087] The memory controller 8030 may generally include the circuitry for receiving memory operations from the other components of the SOC 8000 and for accessing the memory 8800 to complete the memory operations. The memory controller 8030 may be configured to access any type of memory 8800. For example, the memory 8800 may be static random access memory (SRAM), or dynamic RAM (DRAM) such as synchronous DRAM (SDRAM) including double data rate (DDR, DDR2, DDR3, etc.) DRAM. Low power/mobile versions of the DDR DRAM may be supported (e.g., LPDDR, mDDR, etc.). The memory controller 8030 may include queues for memory operations, for ordering (and potentially reordering) the operations and presenting the operations to the memory 8800. The memory controller 8030 may further include data buffers to store write data awaiting write to memory and read data awaiting return to the source of the memory operation. In some embodiments, the memory controller 8030 may include a memory cache to store recently accessed memory data. In SOC implementations, for example, the memory cache may reduce power consumption in the SOC by avoiding re-access of data from the memory 8800 if it is expected to be accessed again soon. In some cases, the memory cache may also be referred to as a system cache, as opposed to private caches such as the L2 cache 8022 or caches in the processors 8024, which serve only certain components. Additionally, in some embodiments, a system cache need not be located within the memory controller 8030.

[0088] In an embodiment, the memory 8800 may be packaged with the SOC 8000 in a chip-on-chip or package-on-package configuration.
A multichip module configuration of the SOC 8000 and the memory 8800 may be used as well. Such configurations may be relatively more secure (in terms of data observability) than transmissions to other components in the system (e.g., to the end points 16A-16B). Accordingly, protected data may reside in the memory 8800 unencrypted, whereas the protected data may be encrypted for exchange between the SOC 8000 and external endpoints.

[0089] The communication fabric 8010 may be any communication interconnect and protocol for communicating among the components of the SOC 8000. The communication fabric 8010 may be bus-based, including shared bus configurations, crossbar configurations, and hierarchical buses with bridges. The communication fabric 8010 may also be packet-based, and may be hierarchical with bridges, crossbar, point-to-point, or other interconnect.

[0090] It is noted that the number of components of the SOC 8000 (and the number of subcomponents for those shown in FIG. 10, such as within the CPU complex 8020) may vary from embodiment to embodiment. There may be more or fewer of each component/subcomponent than the number shown in FIG. 10.

[0091] FIG. 11 is a block diagram of one embodiment of a system 9000 that includes at least one instance of an SOC 8000 coupled to one or more external peripherals 9020 and the external memory 8800. A power management unit (PMU) 9010 is provided which supplies the supply voltages to the SOC 8000 as well as one or more supply voltages to the memory 8800 and/or the peripherals 9020. In some embodiments, more than one instance of the SOC 8000 may be included (and more than one memory 8800 may be included as well).

[0092] The peripherals 9020 may include any desired circuitry, depending on the type of system 9000. For example, in one embodiment the system 9000 may be a mobile device (e.g., personal digital assistant (PDA), smart phone, etc.) and the peripherals 9020 may include devices for various types of wireless communication, such as WiFi, Bluetooth, cellular, global positioning system, etc. The peripherals 9020 may also include additional storage, including RAM storage, solid state storage, or disk storage. The peripherals 9020 may include user interface devices such as a display screen, including touch display screens or multitouch display screens, keyboard or other input devices, microphones, speakers, etc. In other embodiments, the system 9000 may be any type of computing system (e.g., desktop personal computer, laptop, workstation, net top, etc.).

[0093] The external memory 8800 may include any type of memory. For example, the external memory 8800 may be SRAM, dynamic RAM (DRAM) such as synchronous DRAM (SDRAM), double data rate (DDR, DDR2, DDR3, etc.) SDRAM, RAMBUS DRAM, low power versions of the DDR DRAM (e.g., LPDDR, mDDR, etc.), etc. The external memory 8800 may include one or more memory modules to which the memory devices are mounted, such as single inline memory modules (SIMMs), dual inline memory modules (DIMMs), etc. Alternatively, the external memory 8800 may include one or more memory devices that are mounted on the SOC 8000 in a chip-on-chip or package-on-package implementation.

Multifunction Device Examples

[0094] FIG. 13 illustrates a block diagram of a portable multifunction device in accordance with some embodiments. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA,
camera, video capture and/or playback, and/or music player functions. Example embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, Calif. Other portable electronic devices, such as laptops, cell phones, smartphones, pad or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads), may also be used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad). In some embodiments, the device is a gaming computer with orientation sensors (e.g., orientation sensors in a gaming controller). In other embodiments, the device is not a portable communications device, but is a camera and/or video camera.

[0095] In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device may include one or more other physical user interface devices, such as a physical keyboard, a mouse, and/or a joystick.

[0096] The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.

[0097] The various applications that may be executed on the device may use at least one common physical user interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device may support the variety of applications with user interfaces that are intuitive and transparent to the user.

[0098] Device 2100 may include memory 2102 (which may include one or more computer readable storage mediums), memory controller 2122, one or more processing units (CPUs) 2120, peripherals interface 2118, RF circuitry 2108, audio circuitry 2110, speaker 2111, touch-sensitive display system 2112, microphone 2113, input/output (I/O) subsystem 2106, other input control devices 2116, and external port 2124. Device 2100 may include one or more optical sensors or cameras 2164. These components may communicate over one or more communication buses or signal lines 2103.

[0099] It should be appreciated that device 2100 is only one example of a portable multifunction device, and that device 2100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in FIG. 13
may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.

[0100] Memory 2102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 2102 by other components of device 2100, such as CPU 2120 and the peripherals interface 2118, may be controlled by memory controller 2122.

[0101] Peripherals interface 2118 can be used to couple input and output peripherals of the device to CPU 2120 and memory 2102. The one or more processors 2120 run or execute various software programs and/or sets of instructions stored in memory 2102 to perform various functions for device 2100 and to process data.

[0102] In some embodiments, peripherals interface 2118, CPU 2120, and memory controller 2122 may be implemented on a single chip, such as chip 2104. In some other embodiments, they may be implemented on separate chips.

[0103] RF (radio frequency) circuitry 2108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 2108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 2108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a coder/decoder (codec) chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 2108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a variety of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.

[0104] Audio circuitry 2110, speaker 2111, and microphone 2113 provide an audio interface between a user and
device 2100. Audio circuitry 2110 receives audio data from peripherals interface 2118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 2111. Speaker 2111 converts the electrical signal to human-audible sound waves. Audio circuitry 2110 also receives electrical signals converted by microphone 2113 from sound waves. Audio circuitry 2110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 2118 for processing. Audio data may be retrieved from and/or transmitted to memory 2102 and/or RF circuitry 2108 by peripherals interface 2118. In some embodiments, audio circuitry 2110 also includes a headset jack. The headset jack provides an interface between audio circuitry 2110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).

[0105] I/O subsystem 2106 couples input/output peripherals on device 2100, such as touch screen 2112 and other input control devices 2116, to peripherals interface 2118. I/O subsystem 2106 may include display controller 2156 and one or more input controllers 2160 for other input control devices 2116. The one or more input controllers 2160 receive/send electrical signals from/to other input control devices 2116. The other input control devices 2116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 2160 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons may include an up/down button for volume control of speaker 2111 and/or microphone 2113. The one or more buttons may include a push button.

[0106] Touch-sensitive display 2112 provides an input interface and an output interface between the device and a user. Display controller 2156 receives and/or sends electrical signals from/to touch screen 2112. Touch screen 2112 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed "graphics"). In some embodiments, some or all of the visual output may correspond to user-interface objects.

[0107] Touch screen 2112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch screen 2112 and display controller 2156 (along with any associated modules and/or sets of instructions in memory 2102) detect contact (and any movement or breaking of the contact) on touch screen 2112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on touch screen 2112. In an example embodiment, a point of contact between touch screen 2112 and the user corresponds to a finger of the user.

[0108] Touch screen 2112 may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies may be used in other embodiments.
Touch screen 2112 and display controller 2156 may detect contact and any movement or breaking thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 2112. In an example embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone®, iPod Touch®, and iPad® from Apple Inc. of Cupertino, Calif.

[0109] Touch screen 2112 may have a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of approximately 160 dpi. The user may make contact with touch screen 2112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.

[0110] In some embodiments, in addition to the touch screen 2112, device 2100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from touch screen 2112 or an extension of the touch-sensitive surface formed by the touch screen.

[0111] Device 2100 also includes power system 2162 for powering the various components. Power system 2162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.

[0112] Device 2100 may also include one or more optical sensors or cameras 2164. FIG. 13 shows an optical sensor coupled to optical sensor controller 2158 in I/O subsystem 2106. Optical sensor 2164 may, for example, include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors or photosensors. Optical sensor 2164 receives light from the environment, projected through one or more lenses, and converts the light to data representing an image. In conjunction with imaging module 2143 (also called a camera module), optical sensor 2164 may capture still images and/or video sequences. In some embodiments, at least one optical sensor may be located on the back of device 2100, opposite touch screen display 2112 on the front of the device. In some embodiments, the touch screen display may be used as a viewfinder for still and/or video image acquisition. In some embodiments, at least one optical sensor may instead or also be located on the front of the device.

[0113] Device 2100 may also include one or more proximity sensors 2166. FIG. 13
shows proximity sensor 2166 coupled to peripherals interface 2118. Alternately, proximity sensor 2166 may be coupled to input controller 2160 in I/O subsystem 2106. In some embodiments, the proximity sensor turns off and disables touch screen 2112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).

[0114] Device 2100 may also include one or more orientation sensors 2168. In some embodiments, the one or more orientation sensors include one or more accelerometers (e.g., one or more linear accelerometers and/or one or more rotational accelerometers). In some embodiments, the one or more orientation sensors include one or more gyroscopes. In some embodiments, the one or more orientation sensors include one or more magnetometers. In some embodiments, the one or more orientation sensors include one or more of global positioning system (GPS), Global Navigation Satellite System (GLONASS), and/or other global navigation system receivers. The GPS, GLONASS, and/or other global navigation system receivers may be used for obtaining information concerning the location and orientation (e.g., portrait or landscape) of device 2100. In some embodiments, the one or more orientation sensors include any combination of orientation/rotation sensors. FIG. 13 shows the one or more orientation sensors 2168 coupled to peripherals interface 2118. Alternately, the one or more orientation sensors 2168 may be coupled to an input controller 2160 in I/O subsystem 2106. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more orientation sensors.

[0115] In some embodiments, device 2100 may also include one or more other sensors (not shown) including but not limited to ambient light sensors and motion detectors. These sensors may be coupled to peripherals interface 2118 or, alternately, may be coupled to an input controller 2160 in I/O subsystem 2106. For example, in some embodiments, device 2100 may include at least one forward-facing (away from the user) and at least one backward-facing (towards the user) light sensor that may be used to collect ambient lighting metrics from the environment of the device 2100 for use in video and image capture, processing, and display applications.

[0116] In some embodiments, the software components stored in memory 2102 include operating system 2126, communication module 2128, contact/motion module (or set of instructions) 2130, graphics module 2132, text input module 2134, Global Positioning System (GPS) module 2135, and applications 2136. Furthermore, in some embodiments memory 2102 stores device/global internal state 2157. Device/global internal state 2157 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views or other information occupy various regions of touch screen display 2112; sensor state, including information obtained from the device's various sensors and input control devices 2116; and location information concerning the device's location and/or attitude.

[0117] Operating system 2126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.)
and facilitates communication between various hardware and software components.

[0118] Communication module 2128 facilitates communication with other devices over one or more external ports 2124 and also includes various software components for handling data received by RF circuitry 2108 and/or external port 2124. External port 2124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod (trademark of Apple Inc.) devices.

[0119] Contact/motion module 2130 may detect contact with touch screen 2112 (in conjunction with display controller 2156) and other touch sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 2130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger
