Copy source to destination using C++'s placement new feature. This method is intended for types that are proxies for actual types in C++.

    new(dst) CType(*src)

The copy should be performed by constructing a new object in the memory pointed to by dst. The new object is initialised using the value of src.
This design allows underlying structures to be shared, depending on the implementation of CType.

Source. Destination.

Normalization type. Comparison type.

Compression (run length encoding). Binary. Quality [1..100], > 100 == lossless.

Wrapper for mutable values.

Types of which a value can be constructed from a pointer to the C equivalent of that value. Used to wrap values created in C.

Perform an IO action with a pointer to the C equivalent of a value.

Perform an action with a temporary pointer to the underlying representation of a value. The pointer is not guaranteed to be usable outside the scope of this function. The same warnings apply as for withForeignPtr.

Equivalent type in C. Actually a proxy type in Haskell that stands for the equivalent type in C.

Information about the storage requirements of values in C. This class assumes that the type a is merely a symbol that corresponds with a type in C. Computes the storage requirements (in bytes) of values of type a in C.

Callback function for trackbars.
Callback function for mouse events.

Haskell representations of OpenCV objects:

cv::CascadeClassifier
cv::VideoWriter
cv::VideoCapture
cv::Ptr<cv::BackgroundSubtractorKNN>
cv::Ptr<cv::BackgroundSubtractorMOG2>
cv::FlannBasedMatcher
cv::BFMatcher
cv::DescriptorMatcher
cv::Ptr<cv::SimpleBlobDetector>
cv::Ptr<cv::ORB>
Haskell representation of an OpenCV
cv::DMatch
cv::KeyPoint
cv::Mat
cv::Scalar_<double>
cv::Range
cv::TermCriteria
cv::RotatedRect

Haskell representation of an OpenCV exception.

Mutable types use the same underlying representation as immutable types.

Trackbar callback arguments: current position of the specified trackbar; optional pointer to user data.

Mouse callback arguments: one of the cv::MouseEventTypes constants; the x-coordinate of the mouse event; the y-coordinate of the mouse event; one of the cv::MouseEventFlags constants; optional pointer to user data.

Vec type name, for both Haskell and C. Vec dimension. Depth type name in Haskell. Depth type name in C.

Size type name, for both Haskell and C. Depth type name in Haskell. Depth type name in C.

Point type name, for both Haskell and C. Point dimension. Point template name in C. Depth type name in Haskell. Depth type name in C.

Matx type name, for both Haskell and C. Row dimension. Column dimension. Depth type name in Haskell. Depth type name in C.

Context useful to work with the OpenCV library. Converts OpenCV basic types to their counterparts in OpenCV.Internal.C.Inline.

Native Haskell representation of a rectangle.

Rectangle type name, for both Haskell and C. Depth type name in Haskell. Depth type name in C. Point type name in C. Size type name in C.

Initialize the state and the mask using the provided rectangle.
After that, run iterCount iterations of the algorithm. The rectangle represents a ROI containing a segmented object. The pixels outside of the ROI are marked as obvious background.

Initialize the state using the provided mask.

Combination of GCInitWithRect and GCInitWithMask. All the pixels outside of the ROI are automatically initialized with GC_BGD.

Just resume the algorithm.

A continuous subsequence (slice) of a sequence. The type is used to specify a row or a column span in a matrix (Mat) and for many other purposes. mkRange a b is basically the same as a:b in Matlab or a..b in Python. As in Python, start is an inclusive left boundary of the range and end is an exclusive right boundary of the range. Such a half-opened interval is usually denoted as [start, end).

http://docs.opencv.org/3.0-last-rst/modules/core/doc/basic_structures.html#range OpenCV Sphinx doc

Termination criteria for iterative algorithms.

http://docs.opencv.org/3.0-last-rst/modules/core/doc/basic_structures.html#termcriteria OpenCV Sphinx doc

Rotated (i.e. not up-right) rectangles on a plane. Each rectangle is specified by the center point (mass center), the length of each side and the rotation angle in degrees.

http://docs.opencv.org/3.0-last-rst/modules/core/doc/basic_structures.html#rotatedrect OpenCV Sphinx doc

A 4-element vector with 64 bit floating point elements. The type is widely used in OpenCV to pass pixel values.
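The half-open [start, end) convention of mkRange above can be modelled directly. This is an illustrative standalone sketch (a toy Range type, not the actual type from the library):

```haskell
-- Illustrative model of OpenCV's Range: an inclusive start and an
-- exclusive end, i.e. the half-open interval [start, end).
data Range = Range { rangeStart :: Int, rangeEnd :: Int }

-- All indices covered by the range, like a:b in Matlab or a..b in Python.
rangeElems :: Range -> [Int]
rangeElems (Range s e) = [s .. e - 1]

-- Number of elements in the range: end - start.
rangeSize :: Range -> Int
rangeSize (Range s e) = e - s
```

Because the right boundary is exclusive, adjacent ranges [a, b) and [b, c) tile a sequence without overlap.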
http://docs.opencv.org/3.0-last-rst/modules/core/doc/basic_structures.html#scalar OpenCV Sphinx doc

Special value which means "the whole sequence" or "the whole range".

Perform an action with a temporary pointer to an array of values. The input values are placed consecutively in memory. This function is intended for types which are not managed by the Haskell runtime, but by a foreign system (such as C). The pointer is not guaranteed to be usable outside the scope of this function. The same warnings apply as for withForeignPtr.

Rectangle mass center. Width and height of the rectangle. The rotation angle (in degrees). When the angle is 0, 90, 180, 270 etc., the rectangle becomes an up-right rectangle.

Optionally the maximum number of iterations/elements. Optionally the desired accuracy.

Inclusive start. Exclusive end.

Capture APIs:

Auto detect == 0
Video For Windows (platform native)
V4L/V4L2 capturing support via libv4l
Same as CAP_V4L
IEEE 1394 drivers
Same as CAP_FIREWIRE (several aliases)
QuickTime
Unicap drivers
DirectShow (via videoInput)
PvAPI, Prosilica GigE SDK
OpenNI (for Kinect)
OpenNI (for Asus Xtion)
Android - not used
XIMEA Camera API
AVFoundation framework for iOS (OS X Lion will have the same API)
Smartek Giganetix GigEVisionSDK
Microsoft Media Foundation (via videoInput)
Microsoft Windows Runtime using Media Foundation
Intel Perceptual Computing SDK
OpenNI2 (for Kinect)
OpenNI2 (for Asus Xtion and Occipital Structure sensors)
gPhoto2 connection
GStreamer
Open and record video file or stream using the FFMPEG library
Image Sequence (e.g.
img_%02d.jpg)

VideoCapture properties:

Current position of the video file in milliseconds.
0-based index of the frame to be decoded/captured next.
Relative position of the video file: 0=start of the film, 1=end of the film.
Width of the frames in the video stream.
Height of the frames in the video stream.
Frame rate.
4-character code of codec.
Number of frames in the video file.
Format of the Mat objects returned by VideoCapture::retrieve().
Backend-specific value indicating the current capture mode.
Brightness of the image (only for cameras).
Contrast of the image (only for cameras).
Saturation of the image (only for cameras).
Hue of the image (only for cameras).
Gain of the image (only for cameras).
Exposure (only for cameras).
Boolean flags indicating whether images should be converted to RGB.
Currently unsupported.
Rectification flag for stereo cameras (note: only supported by DC1394 v 2.x backend currently).
DC1394: exposure control done by camera, user can adjust reference level using this feature.
Pop up video/camera filter dialog (note: only supported by DSHOW backend currently. Property value is ignored).
Any property we need. Meaning of this property depends on the backend.

Type level to value level conversion of numbers that are either Dynamically or Statically known.

    toNatDS (Proxy ('S 42)) == S 42
    toNatDS (Proxy 'D)      == D

Heterogeneous lists. Implemented as nested 2-tuples.
    f :: Int ::: Bool ::: Char ::: Z
    f = 3 ::: False ::: 'X' ::: Z

End of list.

Dynamically or Statically known values. Mainly used as a promoted type. Operationally exactly the Maybe type. Something is dynamically known. Something is statically known, in particular: a.

Converts a DS value to the corresponding Maybe value.

type level: reify the known natural number n; value level: identity. Type level numbers are statically known; value level numbers are dynamically known.

Gives the number of channels associated with a particular color encoding.

Names of color encodings: Bayer patterns with BG, GB, GR or RG in the second row, second and third column; 24 bit RGB color space with channels (B8:G8:R8); 15 bit RGB color space; 16 bit RGB color space; 32 bit RGBA color space with channels (B8:G8:R8:A8); 8 bit single channel color space; 24 bit RGB color space with channels (R8:G8:B8); Edge-Aware variants; and others.

Valid color conversions described by the following graph: doc/color_conversions.png
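A minimal standalone model of the DS type described above, with its D and S constructors and the conversion to Maybe (illustrative only; the library's version also has a type-level role):

```haskell
-- Illustrative model: a value that is either Dynamically known
-- (only at runtime) or Statically known. Operationally this is
-- isomorphic to Maybe: D ~ Nothing, S ~ Just.
data DS a
  = D    -- something is dynamically known
  | S a  -- something is statically known, in particular: a
  deriving (Show, Eq)

-- Converts a DS value to the corresponding Maybe value.
dsToMaybe :: DS a -> Maybe a
dsToMaybe D     = Nothing
dsToMaybe (S a) = Just a
```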
Number of channels.

Tests whether a Mat is deserving of its type level attributes. Checks if the properties encoded in the type of a Mat correspond to the value level representation. For each property that does not hold this function will produce an error message. If everything checks out it will produce an empty list.

The following properties are checked: dimensionality, size of each dimension, number of channels, depth (data type of elements). If a property is explicitly encoded as statically unknown (Dynamic) it will not be checked.

Relaxes the type level constraints. Only identical or looser constraints are allowed; for tighter constraints use coercion. This allows you to 'forget' type level guarantees for zero cost, totally safely.

Identical: a to b with a ~ b. Looser: ('S a) to 'D, or ('S a) to ('S b) with a looser than b. Tighter: 'D to ('S a).

Keeps the ForeignPtr alive during the execution of the given action, but doesn't extract the Ptr from the ForeignPtr.

Deallocates the matrix data. Highly unsafe. Subsequent operations that need the data will generate exceptions (or segfaults).

All possible positions (indexes) for a given shape (list of sizes per dimension).

    dimPositions [3, 4]
    [ [0, 0], [0, 1], [0, 2], [0, 3]
    , [1, 0], [1, 1], [1, 2], [1, 3]
    , [2, 0], [2, 1], [2, 2], [2, 3]
    ]

The matrix to be checked. Error messages. The original Mat with relaxed constraints.
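The dimPositions enumeration described above can be sketched in one line: it is the cartesian product of the index ranges of each dimension (an illustrative standalone version, monomorphic over Int):

```haskell
-- All possible positions (indexes) for a given shape, i.e. the
-- cartesian product of [0 .. size - 1] for each dimension.
-- mapM in the list monad produces the positions in row-major
-- order: the leftmost dimension varies slowest.
dimPositions :: [Int] -> [[Int]]
dimPositions = mapM (\size -> [0 .. size - 1])
```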
Navier-Stokes based method.

Method by Alexandru Telea.

Restores the selected region in an image using the region neighborhood.

Example:

    inpaintImg
      :: forall h h2 w w2 c d
       . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Bikes_512x341
         , h2 ~ ((*) h 2)
         , w2 ~ ((*) w 2)
         )
      => Mat ('S ['S h2, 'S w2]) ('S c) ('S d)
    inpaintImg = exceptError $ do
        maskInv <- bitwiseNot mask
        maskBgr <- cvtColor gray bgr maskInv
        damaged <- bitwiseAnd bikes_512x341 maskBgr
        repairedNS <- inpaint 3 InpaintNavierStokes damaged mask
        repairedT  <- inpaint 3 InpaintTelea        damaged mask
        withMatM (Proxy :: Proxy [h2, w2]) (Proxy :: Proxy c) (Proxy :: Proxy d) black $ \imgM -> do
          matCopyToM imgM (V2 0 0) damaged    Nothing
          matCopyToM imgM (V2 w 0) maskBgr    Nothing
          matCopyToM imgM (V2 0 h) repairedNS Nothing
          matCopyToM imgM (V2 w h) repairedT  Nothing
      where
        mask = damageMask
        w = fromInteger $ natVal (Proxy :: Proxy w)
        h = fromInteger $ natVal (Proxy :: Proxy h)

doc/generated/examples/inpaintImg.png

Perform the fastNlMeansDenoising function for colored images. Denoising is not per channel but in a different colour space.

Example:

    fastNlMeansDenoisingColoredImg
      :: forall h w w2 c d
       . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Lenna_512x512
         , w2 ~ ((*) w 2)
         )
      => Mat ('S ['S h, 'S w2]) ('S c) ('S d)
    fastNlMeansDenoisingColoredImg = exceptError $ do
        denoised <- fastNlMeansDenoisingColored 3 10 7 21 lenna_512x512
        withMatM (Proxy :: Proxy [h, w2]) (Proxy :: Proxy c) (Proxy :: Proxy d) black $ \imgM -> do
          matCopyToM imgM (V2 0 0) lenna_512x512 Nothing
          matCopyToM imgM (V2 w 0) denoised      Nothing
      where
        w = fromInteger $ natVal (Proxy :: Proxy w)

doc/generated/examples/fastNlMeansDenoisingColoredImg.png

Perform the fastNlMeansDenoisingColoredMulti function for colored images. Denoising is not per channel but in a different colour space. This wrapper differs from the original OpenCV version by using all input images and denoising the middle one.
The original version would allow an arbitrary length vector and slide a window over it. As we have to copy the Haskell vector before we can use it as a `std::vector` on the C++ side, it is easier to trim the vector before sending and use all frames.

Example:

    fastNlMeansDenoisingColoredMultiImg
      :: forall h w w2 c d
       . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Lenna_512x512
         , w2 ~ ((*) w 2)
         )
      => Mat ('S ['S h, 'S w2]) ('S c) ('S d)
    fastNlMeansDenoisingColoredMultiImg = exceptError $ do
        denoised <- fastNlMeansDenoisingColoredMulti 3 10 7 21 (V.singleton lenna_512x512)
        withMatM (Proxy :: Proxy [h, w2]) (Proxy :: Proxy c) (Proxy :: Proxy d) black $ \imgM -> do
          matCopyToM imgM (V2 0 0) lenna_512x512 Nothing
          matCopyToM imgM (V2 w 0) denoised Nothing
      where
        w = fromInteger $ natVal (Proxy :: Proxy w)

doc/generated/examples/fastNlMeansDenoisingColoredMultiImg.png

Perform denoise_TVL1.

Example:

    denoise_TVL1Img
      :: forall h w w2 c d
       . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Lenna_512x512
         , w2 ~ ((*) w 2)
         )
      => Mat ('S ['S h, 'S w2]) ('S c) ('S d)
    denoise_TVL1Img = exceptError $ do
        denoised <- matChannelMapM (denoise_TVL1 2 50 . V.singleton) lenna_512x512
        withMatM (Proxy :: Proxy [h, w2]) (Proxy :: Proxy c) (Proxy :: Proxy d) black $ \imgM -> do
          matCopyToM imgM (V2 0 0) lenna_512x512 Nothing
          matCopyToM imgM (V2 w 0) denoised Nothing
      where
        w = fromInteger $ natVal (Proxy :: Proxy w)

doc/generated/examples/denoise_TVL1Img.png

Perform decolor. Decolor a color image to a grayscale (1 channel) and a color boosted image (3 channel).

Example:

    decolorImg :: forall h h2 w w2 c d .
       ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Bikes_512x341
         , h2 ~ ((*) h 2)
         , w2 ~ ((*) w 2)
         )
      => Mat ('S ['S h2, 'S w2]) ('S c) ('S d)
    decolorImg = exceptError $ do
        (bikesGray, boost) <- decolor bikes_512x341
        colorGray <- cvtColor gray bgr bikesGray
        withMatM (Proxy :: Proxy [h2, w2]) (Proxy :: Proxy c) (Proxy :: Proxy d) white $ \imgM -> do
          matCopyToM imgM (V2 0 0) bikes_512x341 Nothing
          matCopyToM imgM (V2 0 h) colorGray Nothing
          matCopyToM imgM (V2 w h) boost Nothing
      where
        w = fromInteger $ natVal (Proxy :: Proxy w)
        h = fromInteger $ natVal (Proxy :: Proxy h)

doc/generated/examples/decolorImg.png

inpaintRadius: radius of a circular neighborhood of each point inpainted that is considered by the algorithm.
Input image.
Inpainting mask.
Output image.
Parameter regulating filter strength for the luminance component. A bigger h value perfectly removes noise but also removes image details; a smaller h value preserves details but also preserves some noise.
The same as h but for color components. For most images a value of 10 will be enough to remove colored noise and not distort colors.
templateWindowSize: size in pixels of the template patch that is used to compute weights. Should be odd.
Recommended value 7 pixels.
searchWindowSize: size in pixels of the window that is used to compute the weighted average for a given pixel. Should be odd. Affects performance linearly: greater searchWindowSize means greater denoising time. Recommended value 21 pixels.
Input image: 8-bit 3-channel image.
Output image: same size and type as input.
Vector of an odd number of input 8-bit 3-channel images.
Number of iterations that the algorithm will run.
Input image. Output images.

Representation tag for Repa arrays for OpenCV Mats.

Converts an OpenCV matrix into a Repa array. This is a zero-copy operation.

Identity matrix.

http://docs.opencv.org/3.0-last-rst/modules/core/doc/basic_structures.html#mat-eye OpenCV Sphinx doc

Extract a sub region from a 2D-matrix (image).

Example:

    matSubRectImg :: Mat ('S ['D, 'D]) ('S 3) ('S Word8)
    matSubRectImg = exceptError $
        withMatM (h ::: 2 * w ::: Z) (Proxy :: Proxy 3) (Proxy :: Proxy Word8) white $ \imgM -> do
          matCopyToM imgM (V2 0 0) birds_512x341 Nothing
          matCopyToM imgM (V2 w 0) subImg Nothing
          lift $ rectangle imgM subRect blue 1 LineType_4 0
          lift $ rectangle imgM (toRect $ HRect (V2 w 0) (V2 w h) :: Rect2i) blue 1 LineType_4 0
      where
        subRect = toRect $ HRect (V2 96 131) (V2 90 60)
        subImg = exceptError $
            resize (ResizeAbs $ toSize $ V2 w h) InterCubic =<<
            matSubRect birds_512x341 subRect
        [h, w] = miShape $ matInfo birds_512x341

doc/generated/examples/matSubRectImg.png

Converts an array to another data type with optional scaling.

http://docs.opencv.org/3.0-last-rst/modules/core/doc/basic_structures.html?highlight=convertto#mat-convertto OpenCV Sphinx doc

Create a matrix whose elements are defined by a function.

Example:

    matFromFuncImg :: forall size.
    (size ~ 300)
      => Mat (ShapeT [size, size]) ('S 4) ('S Word8)
    matFromFuncImg = exceptError $
        matFromFunc (Proxy :: Proxy [size, size])
                    (Proxy :: Proxy 4)
                    (Proxy :: Proxy Word8)
                    example
      where
        example [y, x] 0 = 255 - normDist (V2 x y ^-^ bluePt )
        example [y, x] 1 = 255 - normDist (V2 x y ^-^ greenPt)
        example [y, x] 2 = 255 - normDist (V2 x y ^-^ redPt  )
        example [y, x] 3 =       normDist (V2 x y ^-^ alphaPt)
        example _pos _channel = error "impossible"

        normDist :: V2 Int -> Word8
        normDist v = floor $ min 255 $ 255 * Linear.norm (fromIntegral <$> v) / s'

        bluePt  = V2 0 0
        greenPt = V2 s s
        redPt   = V2 s 0
        alphaPt = V2 0 s

        s = fromInteger $ natVal (Proxy :: Proxy size) :: Int
        s' = fromIntegral s :: Double

doc/generated/examples/matFromFuncImg.png

Transforms a given list of matrices of equal shape, channels, and depth, by folding the given function over all matrix elements at each position.

Optional scale factor. Optional delta added to the scaled values.

Reads an image from a buffer in memory. The function reads an image from the specified buffer in the memory. If the buffer is too short or contains invalid data, the empty matrix/image is returned.

http://docs.opencv.org/3.0-last-rst/modules/imgcodecs/doc/reading_and_writing_images.html#imdecode OpenCV Sphinx doc

Encodes an image into a memory buffer. WARNING: This function is not thread safe!
http://docs.opencv.org/3.0-last-rst/modules/imgcodecs/doc/reading_and_writing_images.html#imencode OpenCV Sphinx doc

Encodes an image into a memory buffer. See above.

Callback function for trackbars.
Callback function for mouse events.
Context for a mouse event.
Information about which buttons and modifier keys were pressed during the event.

Create a window with the specified title. Make sure to free the window when you're done with it, or better yet: use withWindow.

Close the window and free up all resources associated with the window.

withWindow title act makes a window with the specified title and passes the resulting Window to the computation act. The window will be destroyed on exit from withWindow, whether by normal termination or by raising an exception. Make sure not to use the Window outside the act computation!

Resize a window to the specified size.

Current position of the specified trackbar.
What happened to cause the callback to be fired.
The x-coordinate of the mouse event.
The y-coordinate of the mouse event.
Context for the event, such as buttons and modifier keys pressed during the event.
Trackbar name. Initial value. Maximum value.

Data structure for salient point detectors.

http://docs.opencv.org/3.0-last-rst/modules/core/doc/basic_structures.html#keypoint OpenCV Sphinx doc

Rectangle mass center. Width and height of the rectangle. The rotation angle (in degrees). When the angle is 0, 90, 180, 270 etc., the rectangle becomes an up-right rectangle. The minimal up-right rectangle containing the rotated rectangle.

Class for matching keypoint descriptors: query descriptor index, train descriptor index, train image index, and distance between descriptors.
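The withWindow contract described above (destroy on normal exit and on exception) is an instance of the standard bracket idiom. A minimal self-contained model, with a stand-in Window type rather than the actual GUI binding:

```haskell
import Control.Exception (bracket)
import Data.IORef (IORef, newIORef, readIORef, writeIORef)

-- Stand-in for a window handle; the real binding wraps a C++ object.
-- The IORef is True while the window is "open".
newtype Window = Window (IORef Bool)

makeWindow :: String -> IO Window
makeWindow _title = Window <$> newIORef True

destroyWindow :: Window -> IO ()
destroyWindow (Window ref) = writeIORef ref False

-- The window is destroyed on exit from withWindow, whether by normal
-- termination of act or by an exception raised inside it: bracket
-- guarantees the release action runs in both cases.
withWindow :: String -> (Window -> IO a) -> IO a
withWindow title act = bracket (makeWindow title) destroyWindow act
```

This also explains the warning not to use the Window outside act: by then the release action has already run.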
http://docs.opencv.org/3.0-last-rst/modules/core/doc/basic_structures.html#dmatch OpenCV Sphinx Doc

Coordinates of the keypoints.
Diameter of the meaningful keypoint neighborhood.
Computed orientation of the keypoint (-1 if not applicable); it's in [0,360) degrees and measured relative to the image coordinate system, i.e. clockwise.
The response by which the strongest keypoints have been selected. Can be used for further sorting or subsampling.
Octave (pyramid layer) from which the keypoint has been extracted.
Object class (if the keypoints need to be clustered by an object they belong to).
Query descriptor index.
Train descriptor index.
Train image index.

Rectangle mass center. Width and height of the rectangle. The rotation angle (in degrees). When the angle is 0, 90, 180, 270 etc., the rectangle becomes an up-right rectangle.

Optionally the maximum number of iterations/elements. Optionally the desired accuracy.

Border modes:

1D example: iiiiii|abcdefgh|iiiiiii with some specified i
1D example: aaaaaa|abcdefgh|hhhhhhh
1D example: fedcba|abcdefgh|hgfedcb
1D example: cdefgh|abcdefgh|abcdefg
1D example: gfedcb|abcdefgh|gfedcba
1D example: uvwxyz|abcdefgh|ijklmno
do not look outside of ROI

Interpolation methods:

Nearest neighbor interpolation.
Bilinear interpolation.
Bicubic interpolation.
Resampling using pixel area relation. It may be a preferred method for image decimation, as it gives moire'-free results. But when the image is zoomed, it is similar to the nearest neighbor method.
Lanczos interpolation over 8x8 neighborhood.

Contour approximation modes:

Stores absolutely all the contour points.
That is, any 2 subsequent points (x1,y1) and (x2,y2) of the contour will be either horizontal, vertical or diagonal neighbors, that is, max(abs(x1-x2),abs(y2-y1)) == 1.

Compresses horizontal, vertical, and diagonal segments and leaves only their end points. For example, an up-right rectangular contour is encoded with 4 points.

Retrieves only the extreme outer contours.

Retrieves all of the contours without establishing any hierarchical relationships.

Retrieves all of the contours and organizes them into a two-level hierarchy. At the top level, there are external boundaries of the components. At the second level, there are boundaries of the holes. If there is another contour inside a hole of a connected component, it is still put at the top level.

Retrieves all of the contours and reconstructs a full hierarchy of nested contours.

Oriented area flag. Return a signed area value, depending on the contour orientation (clockwise or counter-clockwise). Using this feature you can determine the orientation of a contour by taking the sign of an area. Return the area as an absolute value.

Calculates a contour area. The function computes a contour area. Similarly to moments, the area is computed using the Green formula (https://en.wikipedia.org/wiki/Green%27s_theorem). Thus, the returned area and the number of non-zero pixels, if you draw the contour using drawContours or fillPoly, can be different. Also, the function will most certainly give wrong results for contours with self-intersections.

http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/structural_analysis_and_shape_descriptors.html?highlight=contourarea#cv2.contourArea OpenCV Sphinx doc

Performs a point-in-contour test. The function determines whether the point is inside a contour, outside, or lies on an edge (or coincides with a vertex). It returns a positive (inside), negative (outside), or zero (on an edge) value, correspondingly. When measureDist=false, the return value is +1, -1, and 0, respectively.
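The Green-formula area computation mentioned above reduces, for a polygonal contour, to the shoelace formula. An illustrative standalone sketch (not the library's contourArea binding):

```haskell
-- Signed area of a simple polygon via the shoelace formula, a
-- discrete instance of Green's theorem. The sign reflects the
-- orientation of the contour (positive for counter-clockwise in a
-- y-up coordinate system); take abs for the unoriented area.
signedArea :: [(Double, Double)] -> Double
signedArea pts =
    0.5 * sum [ x1 * y2 - x2 * y1
              | ((x1, y1), (x2, y2)) <- zip pts (tail pts ++ [head pts]) ]
```

This mirrors the "oriented area flag" distinction: the signed result reveals the contour orientation, while its absolute value is the plain area.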
Otherwise, the return value is a signed distance between the point and the nearest contour edge.

http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/structural_analysis_and_shape_descriptors.html#pointpolygontest OpenCV Sphinx doc

Approximates a polygonal curve(s) with the specified precision. The function approxPolyDP approximates a curve or a polygon with another curve/polygon with fewer vertices so that the distance between them is less than or equal to the specified precision. It uses the Douglas-Peucker algorithm (http://en.wikipedia.org/wiki/Ramer-Douglas-Peucker_algorithm).

http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/structural_analysis_and_shape_descriptors.html?highlight=contourarea#approxpolydp

Input vector of 2D points (contour vertices).
Signed or unsigned area.
Contour.
Point tested against the contour.
If true, the function estimates the signed distance from the point to the nearest contour edge. Otherwise, the function only checks if the point is inside a contour or not.
epsilon
is closed

Whether to use normalisation.
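The Douglas-Peucker simplification behind approxPolyDP can be sketched as follows. This is an illustrative standalone version, not the binding: keep a point only if it lies further than eps from the segment spanned by the current endpoints, and recurse on both halves.

```haskell
type Pt = (Double, Double)

-- Distance from p to the infinite line through a and b;
-- falls back to the point distance when a == b.
distToLine :: Pt -> Pt -> Pt -> Double
distToLine (ax, ay) (bx, by) (px, py)
    | len == 0  = sqrt ((px - ax) ^ 2 + (py - ay) ^ 2)
    | otherwise = abs ((bx - ax) * (py - ay) - (by - ay) * (px - ax)) / len
  where
    len = sqrt ((bx - ax) ^ 2 + (by - ay) ^ 2)

-- Ramer-Douglas-Peucker: approximate a polyline with fewer vertices
-- so that the deviation stays within eps.
rdp :: Double -> [Pt] -> [Pt]
rdp eps pts
    | length pts < 3 = pts
    | dmax > eps     = init (rdp eps (take (i + 1) pts)) ++ rdp eps (drop i pts)
    | otherwise      = [head pts, last pts]
  where
    a = head pts
    b = last pts
    -- index of the point furthest from the segment a-b
    (dmax, i) = maximum [ (distToLine a b p, k)
                        | (k, p) <- zip [0 :: Int ..] pts ]
```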
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/object_detection.html#matchtemplate OpenCV Sphinx doc

The comparison methods come in not normed and normed variants, given by the formulae at:

not normed: http://docs.opencv.org/3.0-last-rst/_images/math/f096a706cb9499736423f10d901c7fe13a1e6926.png
normed: http://docs.opencv.org/3.0-last-rst/_images/math/6d6a720237b3a4c1365c8e86a9cfcf0895d5e265.png
not normed: http://docs.opencv.org/3.0-last-rst/_images/math/93f1747a86a3c5095a0e6a187442c6e2a0ae0968.png
normed: http://docs.opencv.org/3.0-last-rst/_images/math/6a72ad9ae17c4dad88e33ed16308fc1cfba549b8.png
not normed: http://docs.opencv.org/3.0-last-rst/_images/math/c9b62df96d0692d90cc1d8a5912a68a44461910c.png
where http://docs.opencv.org/3.0-last-rst/_images/math/ffb6954b6020b02e13b73c79bd852c1627cfb79c.png
normed: http://docs.opencv.org/3.0-last-rst/_images/math/235e42ec68d2d773899efcf0a4a9d35a7afedb64.png

http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/object_detection.html#matchtemplate OpenCV Sphinx doc

Compares a template against overlapped image regions. The function slides through image, compares the overlapped patches of size w times h against templ using the specified method and stores the comparison results in result. In the formulae for the available comparison methods, I denotes the image, T the template and R the result. The summation is done over the template and/or the image patch: x' = 0...w-1, y' = 0...h-1.

Image where the search is running. It must be 8-bit or 32-bit floating-point.

Searched template.
It must be not greater than the source image and have the same data type.

Parameter specifying the comparison method.

Normalise.

Map of comparison results. It must be single-channel 32-bit floating-point. If image is W x H and templ is w x h, then result is (W-w+1) x (H-h+1).

Connectivity value. The default value of 4 means that only the four nearest neighbor pixels (those that share an edge) are considered. A connectivity value of 8 means that the eight nearest neighbor pixels (those that share a corner) will be considered.

Value between 1 and 255 with which to fill the mask (the default value is 1).

If set, the difference between the current pixel and seed pixel is considered. Otherwise, the difference between neighbor pixels is considered (that is, the range is floating).

If set, the function does not change the image (newVal is ignored), and only fills the mask with the value specified in bits 8-16 of flags as described above. This option only makes sense in function variants that have the mask parameter.

Converts an image from one color space to another. The function converts an input image from one color space to another. In case of a transformation to-from RGB color space, the order of the channels should be specified explicitly (RGB or BGR). Note that the default color format in OpenCV is often referred to as RGB but it is actually BGR (the bytes are reversed). So the first byte in a standard (24-bit) color image will be an 8-bit Blue component, the second byte will be Green, and the third byte will be Red.
The fourth, fifth, and sixth bytes would then be the second pixel (Blue, then Green, then Red), and so on.

The conventional ranges for R, G, and B channel values are:

- 0 to 255 for 8-bit (Word8) images
- 0 to 65535 for 16-bit (Word16) images
- 0 to 1 for floating-point (Float) images

In case of linear transformations, the range does not matter. But in case of a non-linear transformation, an input RGB image should be normalized to the proper value range to get the correct results, for example, for an RGB to L*u*v* transformation. For example, if you have a 32-bit floating-point image directly converted from an 8-bit image without any scaling, then it will have the 0..255 value range instead of the 0..1 assumed by the function. So, before calling cvtColor, you first need to scale the image down:

  cvtColor (img * 1/255) 'ColorConvBGR2Luv'

If you use cvtColor with 8-bit images, the conversion will have some information lost. For many applications, this will not be noticeable, but it is recommended to use 32-bit images in applications that need the full range of colors or that convert an image before an operation and then convert back.

If the conversion adds the alpha channel, its value will be set to the maximum of the corresponding channel range: 255 for Word8, 65535 for Word16, 1 for Float.

Example:

cvtColorImg
  :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) .
     ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Birds_512x341
     , width2 ~ (width + width)
     ) => Mat (ShapeT [height, width2]) ('S channels) ('S depth)
cvtColorImg = exceptError $
    withMatM ((Proxy :: Proxy height) ::: (Proxy :: Proxy width2) ::: Z)
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      birds_gray <- pureExcept $ cvtColor gray bgr =<< cvtColor bgr gray birds_512x341
      matCopyToM imgM (V2 0 0) birds_512x341 Nothing
      matCopyToM imgM (V2 w 0) birds_gray Nothing
      lift $ arrowedLine imgM (V2 startX midY) (V2 pointX midY) red 4 LineType_8 0 0.15
  where
    h, w :: Int32
    h = fromInteger $ natVal (Proxy :: Proxy height)
    w = fromInteger $ natVal (Proxy :: Proxy width)
    startX, pointX :: Int32
    startX = round $ fromIntegral w * (0.95 :: Double)
    pointX = round $ fromIntegral w * (1.05 :: Double)
    midY = h `div` 2

Image: doc/generated/examples/cvtColorImg.png (cvtColorImg)

See the OpenCV Sphinx doc: http://goo.gl/3rfrhu

floodFill: Fills a connected component starting from the seed point with the specified color.

The connectivity is determined by the color/brightness closeness of the neighbor pixels. See the OpenCV documentation for details on the algorithm.

Example:

floodFillImg
  :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) .
     ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Sailboat_768x512
     , width2 ~ (width + width)
     ) => Mat (ShapeT [height, width2]) ('S channels) ('S depth)
floodFillImg = exceptError $
    withMatM ((Proxy :: Proxy height) ::: (Proxy :: Proxy width2) ::: Z)
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      sailboatEvening_768x512 <- thaw sailboat_768x512
      mask <- mkMatM (Proxy :: Proxy [height + 2, width + 2])
                     (Proxy :: Proxy 1)
                     (Proxy :: Proxy Word8)
                     black
      circle mask (V2 450 120 :: V2 Int32) 45 white (-1) LineType_AA 0
      rect <- floodFill sailboatEvening_768x512 (Just mask) seedPoint eveningRed
                        (Just tolerance) (Just tolerance) defaultFloodFillOperationFlags
      rectangle sailboatEvening_768x512 rect blue 2 LineType_8 0
      frozenSailboatEvening_768x512 <- freeze sailboatEvening_768x512
      matCopyToM imgM (V2 0 0) sailboat_768x512 Nothing
      matCopyToM imgM (V2 w 0) frozenSailboatEvening_768x512 Nothing
      lift $ arrowedLine imgM (V2 startX midY) (V2 pointX midY) red 4 LineType_8 0 0.15
  where
    h, w :: Int32
    h = fromInteger $ natVal (Proxy :: Proxy height)
    w = fromInteger $ natVal (Proxy :: Proxy width)
    startX, pointX :: Int32
    startX = round $ fromIntegral w * (0.95 :: Double)
    pointX = round $ fromIntegral w * (1.05 :: Double)
    midY = h `div` 2
    seedPoint :: V2 Int32
    seedPoint = V2 100 50
    eveningRed :: V4 Double
    eveningRed = V4 0 100 200 255
    tolerance :: V4 Double
    tolerance = pure 7

Image: doc/generated/examples/floodFillImg.png (floodFillImg)

See the OpenCV Sphinx doc: http://goo.gl/9XIIne

threshold: Applies a fixed-level threshold to each array element.

The function applies fixed-level thresholding to a single-channel array. The function is typically used to get a bi-level (binary) image out of a grayscale image or for removing noise, that is, filtering out pixels with too small or too large values.
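The per-pixel effect of fixed-level thresholding can be sketched in pure Haskell; the constructor names below merely mirror the threshold types and are not the library's own:

```haskell
-- Per-pixel sketch of fixed-level thresholding, assuming Double pixels.
-- The value carried by the binary variants plays the role of maxVal.
data ThreshType
  = ThreshBinary Double    -- pixel > thresh ? maxVal : 0
  | ThreshBinaryInv Double -- pixel > thresh ? 0 : maxVal
  | ThreshTruncate         -- pixel > thresh ? thresh : pixel
  | ThreshToZero           -- pixel > thresh ? pixel : 0
  | ThreshToZeroInv        -- pixel > thresh ? 0 : pixel

applyThresh :: Double -> ThreshType -> Double -> Double
applyThresh thresh ty pixel = case ty of
  ThreshBinary maxVal    -> if pixel > thresh then maxVal else 0
  ThreshBinaryInv maxVal -> if pixel > thresh then 0 else maxVal
  ThreshTruncate         -> if pixel > thresh then thresh else pixel
  ThreshToZero           -> if pixel > thresh then pixel else 0
  ThreshToZeroInv        -> if pixel > thresh then 0 else pixel
```

Mapping applyThresh over every pixel of a grayscale image yields the bi-level (or clipped) result the function describes.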
There are several types of thresholding supported by the function.

Example:

grayBirds :: Mat (ShapeT [341, 512]) ('S 1) ('S Word8)
grayBirds = exceptError $ cvtColor bgr gray birds_512x341

threshBinaryBirds :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8)
threshBinaryBirds = exceptError $ cvtColor gray bgr $ fst $ exceptError $
    threshold (ThreshVal_Abs 100) (Thresh_Binary 150) grayBirds

threshBinaryInvBirds :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8)
threshBinaryInvBirds = exceptError $ cvtColor gray bgr $ fst $ exceptError $
    threshold (ThreshVal_Abs 100) (Thresh_BinaryInv 150) grayBirds

threshTruncateBirds :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8)
threshTruncateBirds = exceptError $ cvtColor gray bgr $ fst $ exceptError $
    threshold (ThreshVal_Abs 100) Thresh_Truncate grayBirds

threshToZeroBirds :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8)
threshToZeroBirds = exceptError $ cvtColor gray bgr $ fst $ exceptError $
    threshold (ThreshVal_Abs 100) Thresh_ToZero grayBirds

threshToZeroInvBirds :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8)
threshToZeroInvBirds = exceptError $ cvtColor gray bgr $ fst $ exceptError $
    threshold (ThreshVal_Abs 100) Thresh_ToZeroInv grayBirds

Images:
  doc/generated/examples/threshBinaryBirds.png (threshBinaryBirds)
  doc/generated/examples/threshBinaryInvBirds.png (threshBinaryInvBirds)
  doc/generated/examples/threshTruncateBirds.png (threshTruncateBirds)
  doc/generated/examples/threshToZeroBirds.png (threshToZeroBirds)
  doc/generated/examples/threshToZeroInvBirds.png (threshToZeroInvBirds)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/miscellaneous_transformations.html#threshold

watershed: Performs a marker-based image segmentation using the watershed algorithm.

The function implements one of the variants of the watershed, non-parametric marker-based segmentation algorithm, described in [Meyer, F.
Color Image Segmentation, ICIP92, 1992].

Before passing the image to the function, you have to roughly outline the desired regions in the image markers with positive (>0) indices. So, every region is represented as one or more connected components with the pixel values 1, 2, 3, and so on. Such markers can be retrieved from a binary mask using findContours and drawContours. The markers are "seeds" of the future image regions. All the other pixels in markers, whose relation to the outlined regions is not known and should be defined by the algorithm, should be set to 0's. In the function output, each pixel in markers is set to a value of the "seed" components or to -1 at boundaries between the regions.

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/miscellaneous_transformations.html#watershed

grabCut: Runs the GrabCut algorithm
(http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/miscellaneous_transformations.html#grabcut).

Example:

grabCutBird :: Birds_512x341
grabCutBird = exceptError $ do
    mask <- withMatM (Proxy :: Proxy [341, 512])
                     (Proxy :: Proxy 1)
                     (Proxy :: Proxy Word8)
                     black $ \mask -> do
      fgTmp <- mkMatM (Proxy :: Proxy [1, 65]) (Proxy :: Proxy 1) (Proxy :: Proxy Double) black
      bgTmp <- mkMatM (Proxy :: Proxy [1, 65]) (Proxy :: Proxy 1) (Proxy :: Proxy Double) black
      grabCut birds_512x341 mask fgTmp bgTmp 5 (GrabCut_InitWithRect rect)
    mask' <- matScalarCompare mask 3 Cmp_Ge
    withMatM (Proxy :: Proxy [341, 512])
             (Proxy :: Proxy 3)
             (Proxy :: Proxy Word8)
             transparent $ \imgM -> do
      matCopyToM imgM (V2 0 0) birds_512x341 (Just mask')
  where
    rect :: Rect Int32
    rect = toRect $ HRect { hRectTopLeft = V2 264 60, hRectSize = V2 248 281 }

Image: doc/generated/examples/grabCutBird.png (grabCutBird)

inRange: Returns 0 if the pixels are not in the range, 255 otherwise.

Parameters of cvtColor:
- Color space to convert from. Make sure the source image has this color space.
- Color space to convert to.
- Source image.

Parameters of floodFill:
- Input/output 1- or 3-channel, 8-bit, or floating-point image.
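The connectivity and tolerance rules that flood filling relies on can be sketched as a pure breadth-first region growth over a grid of Doubles. This is a simplified illustration with a fixed range relative to the seed value, not the library's floodFill:

```haskell
import qualified Data.Set as Set

-- Grow a region from the seed: neighbours (4- or 8-connected) whose value
-- is within `tol` of the seed value are added. Illustrates only the
-- connectivity/tolerance rules, not mask handling or repainting.
floodRegion :: Int -> Double -> [[Double]] -> (Int, Int) -> [(Int, Int)]
floodRegion conn tol img seed = Set.toList (go Set.empty [seed])
  where
    h = length img
    w = length (head img)
    seedVal = img !! snd seed !! fst seed
    deltas
      | conn == 8 = [ (dx, dy) | dx <- [-1, 0, 1], dy <- [-1, 0, 1], (dx, dy) /= (0, 0) ]
      | otherwise = [ (1, 0), (-1, 0), (0, 1), (0, -1) ]  -- share an edge only
    go seen [] = seen
    go seen ((x, y) : rest)
      | x < 0 || y < 0 || x >= w || y >= h  = go seen rest
      | (x, y) `Set.member` seen            = go seen rest
      | abs (img !! y !! x - seedVal) > tol = go seen rest
      | otherwise =
          go (Set.insert (x, y) seen)
             ([ (x + dx, y + dy) | (dx, dy) <- deltas ] ++ rest)
```

With connectivity 4 a diagonal-only neighbour is unreachable; with connectivity 8 it joins the component, which is exactly the difference the connectivity flag describes.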
It is modified by the function unless the FLOODFILL_MASK_ONLY flag is set.
- Operation mask that should be a single-channel 8-bit image, 2 pixels wider and 2 pixels taller than image. Since this is both an input and output parameter, you must take responsibility for initializing it. Flood-filling cannot go across non-zero pixels in the input mask. For example, an edge detector output can be used as a mask to stop filling at edges. On output, pixels in the mask corresponding to filled pixels in the image are set to 1 or to the value specified in flags as described above. It is therefore possible to use the same mask in multiple calls to the function to make sure the filled areas do not overlap. Note: since the mask is larger than the filled image, a pixel (x, y) in image corresponds to the pixel (x+1, y+1) in the mask.
- Starting point.
- New value of the repainted domain pixels.
- Maximal lower brightness/color difference between the currently observed pixel and one of its neighbors belonging to the component, or a seed pixel being added to the component. Zero by default.
- Maximal upper brightness/color difference between the currently observed pixel and one of its neighbors belonging to the component, or a seed pixel being added to the component. Zero by default.

Parameters of watershed:
- Input 8-bit 3-channel image.
- Input/output 32-bit single-channel image (map) of markers.

Parameters of grabCut:
- Input 8-bit 3-channel image.
- Input/output 8-bit single-channel mask. The mask is initialized by the function when mode is set to GC_INIT_WITH_RECT. Its elements may have one of the following values:
    GC_BGD defines an obvious background pixel.
    GC_FGD defines an obvious foreground (object) pixel.
    GC_PR_BGD defines a possible background pixel.
    GC_PR_FGD defines a possible foreground pixel.
- Temporary array for the background model. Do not modify it while you are processing the same image.
- Temporary array for the foreground model.
Do not modify it while you are processing the same image.
- Number of iterations the algorithm should make before returning the result. Note that the result can be refined with further calls with mode==GC_INIT_WITH_MASK or mode==GC_EVAL.
- Operation mode.

Parameters of inRange:
- Lower bound.
- Upper bound.

Morphological operations:
- An opening operation: dilate . erode
- A closing operation: erode . dilate
- A morphological gradient: dilate - erode
- "Top hat": src - open
- "Black hat": close - src

Structuring element shapes:
- MorphRect: a rectangular structuring element.
- MorphEllipse: an elliptic structuring element, that is, a filled ellipse inscribed into the rectangle Rect(0, 0, esize.width, esize.height).
- MorphCross: a cross-shaped structuring element.

bilateralFilter: Calculates the bilateral filter of an image.

The function applies bilateral filtering to the input image, as described in
http://www.dai.ed.ac.uk/CVonline/LOCAL_COPIES/MANDUCHI1/Bilateral_Filtering.html
bilateralFilter can reduce unwanted noise very well while keeping edges fairly sharp. However, it is very slow compared to most filters.

Example:

bilateralFilterImg
  :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) .
     ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Birds_512x341
     , width2 ~ ((*) width 2) -- TODO (RvD): HSE parse error with infix type operator
     ) => Mat (ShapeT [height, width2]) ('S channels) ('S depth)
bilateralFilterImg = exceptError $
    withMatM (Proxy :: Proxy [height, width2])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      birdsFiltered <- pureExcept $ bilateralFilter (Just 9) Nothing Nothing Nothing birds_512x341
      matCopyToM imgM (V2 0 0) birds_512x341 Nothing
      matCopyToM imgM (V2 w 0) birdsFiltered Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy width)

Image: doc/generated/examples/bilateralFilterImg.png (bilateralFilterImg)

See the OpenCV Sphinx doc:
https://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/filtering.html#bilateralfilter

laplacian: Calculates the Laplacian of an image.

The function calculates the Laplacian of the source image by adding up the second x and y derivatives calculated using the Sobel operator.

Example:

laplacianImg
  :: forall shape channels depth
   . (Mat shape channels depth ~ Birds_512x341)
  => Mat shape ('S 1) ('S Double)
laplacianImg = exceptError $ do
    imgG <- cvtColor bgr gray birds_512x341
    laplacian Nothing Nothing Nothing Nothing imgG

Image: doc/generated/examples/laplacianImg.png (laplacianImg)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/filtering.html#laplacian

medianBlur: Blurs an image using the median filter.

Example:

medianBlurImg
  :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) .
     ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Birds_512x341
     , width2 ~ ((*) width 2) -- TODO (RvD): HSE parse error with infix type operator
     ) => Mat (ShapeT [height, width2]) ('S channels) ('S depth)
medianBlurImg = exceptError $
    withMatM (Proxy :: Proxy [height, width2])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      birdsBlurred <- pureExcept $ medianBlur birds_512x341 13
      matCopyToM imgM (V2 0 0) birds_512x341 Nothing
      matCopyToM imgM (V2 w 0) birdsBlurred Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy width)

Image: doc/generated/examples/medianBlurImg.png (medianBlurImg)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/filtering.html#medianblur

blur: Blurs an image using a box filter.

Example:

boxBlurImg
  :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
   . ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Birds_512x341
     , width2 ~ ((*) width 2) -- TODO (RvD): HSE parse error with infix type operator
     ) => Mat (ShapeT [height, width2]) ('S channels) ('S depth)
boxBlurImg = exceptError $
    withMatM (Proxy :: Proxy [height, width2])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      birdsBlurred <- pureExcept $ blur (V2 13 13 :: V2 Int32) birds_512x341
      matCopyToM imgM (V2 0 0) birds_512x341 Nothing
      matCopyToM imgM (V2 w 0) birdsBlurred Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy width)

Image: doc/generated/examples/boxBlurImg.png (boxBlurImg)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/filtering.html#blur

erode: Erodes an image by using a specific structuring element.

Example:

erodeImg
  :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) .
     ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Lambda
     , width2 ~ ((*) width 2) -- TODO (RvD): HSE parse error with infix type operator
     ) => Mat (ShapeT [height, width2]) ('S channels) ('S depth)
erodeImg = exceptError $
    withMatM (Proxy :: Proxy [height, width2])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      erodedLambda <- pureExcept $ erode lambda Nothing (Nothing :: Maybe Point2i) 5 BorderReplicate
      matCopyToM imgM (V2 0 0) lambda Nothing
      matCopyToM imgM (V2 w 0) erodedLambda Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy width)

Image: doc/generated/examples/erodeImg.png (erodeImg)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/filtering.html#erode

filter2D: Convolves an image with the kernel.

Example:

filter2DImg
  :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
   . ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Birds_512x341
     , width2 ~ ((*) width 2) -- TODO (RvD): HSE parse error with infix type operator
     ) => Mat (ShapeT [height, width2]) ('S channels) ('S depth)
filter2DImg = exceptError $
    withMatM (Proxy :: Proxy [height, width2])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      filteredBird <- pureExcept $ filter2D birds_512x341 kernel (Nothing :: Maybe Point2i) 0 BorderReplicate
      matCopyToM imgM (V2 0 0) birds_512x341 Nothing
      matCopyToM imgM (V2 w 0) filteredBird Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy width)
    kernel = exceptError $
      withMatM (Proxy :: Proxy [3, 3])
               (Proxy :: Proxy 1)
               (Proxy :: Proxy Double)
               black $ \imgM -> do
        lift $ line imgM (V2 0 0 :: V2 Int32) (V2 0 0 :: V2 Int32) (V4 (-2) (-2) (-2) 1 :: V4 Double) 0 LineType_8 0
        lift $ line imgM (V2 1 0 :: V2 Int32) (V2 0 1 :: V2 Int32) (V4 (-1) (-1) (-1) 1 :: V4 Double) 0 LineType_8 0
        lift $ line imgM (V2 1 1 :: V2 Int32) (V2 1 1 :: V2 Int32) (V4 1 1 1 1 :: V4 Double) 0 LineType_8 0
        lift $ line imgM (V2 1 2 :: V2 Int32) (V2 2 1 :: V2 Int32) (V4 1 1 1 1 :: V4 Double) 0
          LineType_8 0
        lift $ line imgM (V2 2 2 :: V2 Int32) (V2 2 2 :: V2 Int32) (V4 2 2 2 1 :: V4 Double) 0 LineType_8 0

Image: doc/generated/examples/filter2DImg.png (filter2DImg)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/filtering.html#filter2d

dilate: Dilates an image by using a specific structuring element.

Example:

dilateImg
  :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
   . ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Lambda
     , width2 ~ ((*) width 2) -- TODO (RvD): HSE parse error with infix type operator
     ) => Mat (ShapeT [height, width2]) ('S channels) ('S depth)
dilateImg = exceptError $
    withMatM (Proxy :: Proxy [height, width2])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      dilatedLambda <- pureExcept $ dilate lambda Nothing (Nothing :: Maybe Point2i) 3 BorderReplicate
      matCopyToM imgM (V2 0 0) lambda Nothing
      matCopyToM imgM (V2 w 0) dilatedLambda Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy width)

Image: doc/generated/examples/dilateImg.png (dilateImg)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/filtering.html#dilate

morphologyEx: Performs advanced morphological transformations.

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/filtering.html#morphologyex

getStructuringElement: Returns a structuring element of the specified size and shape for morphological operations.

Example:

type StructureImg = Mat (ShapeT [128, 128]) ('S 1) ('S Word8)

structureImg :: MorphShape -> StructureImg
structureImg shape = exceptError $ do
    mat <- getStructuringElement shape (Proxy :: Proxy 128) (Proxy :: Proxy 128)
    img <- matConvertTo (Just 255) Nothing mat
    bitwiseNot img

morphRectImg :: StructureImg
morphRectImg = structureImg MorphRect

morphEllipseImg :: StructureImg
morphEllipseImg = structureImg MorphEllipse

morphCrossImg :: StructureImg
morphCrossImg = structureImg $ MorphCross $ toPoint (pure (-1) :: V2 Int32)

Image: doc/generated/examples/morphRectImg.png (morphRectImg)
Image: doc/generated/examples/morphEllipseImg.png (morphEllipseImg)
Image: doc/generated/examples/morphCrossImg.png (morphCrossImg)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/filtering.html#getstructuringelement

Parameters of bilateralFilter:
- Diameter of each pixel neighborhood that is used during filtering. If it is non-positive, it is computed from sigmaSpace. Default value is 5.
- Filter sigma in the color space. A larger value of the parameter means that farther colors within the pixel neighborhood (see sigmaSpace) will be mixed together, resulting in larger areas of semi-equal color. Default value is 50.
- Filter sigma in the coordinate space. A larger value of the parameter means that farther pixels will influence each other as long as their colors are close enough (see sigmaColor). When d>0, it specifies the neighborhood size regardless of sigmaSpace. Otherwise, d is proportional to sigmaSpace. Default value is 50.
- Pixel extrapolation method. Default value is BorderReflect101.

Parameters of laplacian:
- Aperture size used to compute the second-derivative filters. The size must be positive and odd. Default value is 1.
- Optional scale factor for the computed Laplacian values. Default value is 1.
- Optional delta value that is added to the results. Default value is 0.
- Pixel extrapolation method.

Parameters of medianBlur:
- Input 1-, 3-, or 4-channel image; when ksize is 3 or 5, the image depth should be Word8, Word16, or Float; for larger aperture sizes, it can only be Word8.
- Aperture linear size; it must be odd and greater than 1, for example: 3, 5, 7...

Parameters of blur:
- Blurring kernel size.

Parameters of the Gaussian blur:
- Blurring kernel size.
- sigmaX
- sigmaY

Parameters of erode:
- Input image.
- Structuring element used for erosion. If Nothing is used, a 3x3 rectangular structuring element is used.
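Erosion and dilation with a 3x3 rectangular structuring element can be sketched on a binary image in pure Haskell. This is an illustration only; border pixels are simply clamped here, unlike the configurable border modes the library offers:

```haskell
-- Binary-image erosion/dilation with a 3x3 rectangular structuring
-- element. Erosion keeps a pixel only if its whole neighborhood is set;
-- dilation sets a pixel if any neighbor is set.
erode3x3, dilate3x3 :: [[Bool]] -> [[Bool]]
erode3x3  = morph and
dilate3x3 = morph or

morph :: ([Bool] -> Bool) -> [[Bool]] -> [[Bool]]
morph combine img =
    [ [ combine [ at (x + dx) (y + dy) | dy <- [-1, 0, 1], dx <- [-1, 0, 1] ]
      | x <- [0 .. w - 1]
      ]
    | y <- [0 .. h - 1]
    ]
  where
    h = length img
    w = length (head img)
    -- Clamp out-of-range coordinates to the border (a simplification).
    at x y = img !! max 0 (min (h - 1) y) !! max 0 (min (w - 1) x)
```

An opening is then dilate3x3 . erode3x3 and a closing is erode3x3 . dilate3x3, matching the morphological operations listed earlier.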
Kernel can be created using getStructuringElement.
- Anchor position within the element.
- Number of iterations.

Parameters of filter2D:
- Input image.
- Convolution kernel (or rather a correlation kernel), a single-channel floating-point matrix; if you want to apply different kernels to different channels, split the image into separate color planes using split and process them individually.
- Anchor.
- Delta.

Parameters of dilate:
- Input image.
- Structuring element used for dilation. If Nothing is used, a 3x3 rectangular structuring element is used. Kernel can be created using getStructuringElement.
- Anchor position within the element.
- Number of iterations.

Parameters of morphologyEx:
- Source image.
- Type of a morphological operation.
- Structuring element.
- Anchor position with the kernel.
- Number of times erosion and dilation are applied.

ResizeAbs: Resize to an absolute size.
ResizeRel: Resize with relative factors for both the width and the height.

resize: Resizes an image.

To shrink an image, it will generally look best with InterArea interpolation, whereas to enlarge an image, it will generally look best with InterCubic (slow) or InterLinear (faster but still looks OK).

Example:

resizeInterAreaImg :: Mat ('S ['D, 'D]) ('S 3) ('S Word8)
resizeInterAreaImg = exceptError $
    withMatM (h ::: w + (w `div` 2) ::: Z)
             (Proxy :: Proxy 3)
             (Proxy :: Proxy Word8)
             transparent $ \imgM -> do
      birds_resized <- pureExcept $ resize (ResizeRel $ pure 0.5) InterArea birds_768x512
      matCopyToM imgM (V2 0 0) birds_768x512 Nothing
      matCopyToM imgM (V2 w 0) birds_resized Nothing
      lift $ arrowedLine imgM (V2 startX y) (V2 pointX y) red 4 LineType_8 0 0.15
  where
    [h, w] = miShape $ matInfo birds_768x512
    startX = round $ fromIntegral w * (0.95 :: Double)
    pointX = round $ fromIntegral w * (1.05 :: Double)
    y = h `div` 4

Image: doc/generated/examples/resizeInterAreaImg.png (resizeInterAreaImg)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/geometric_transformations.html#resize

warpAffine: Applies an affine transformation to an image.

Example:

rotateBirds :: Mat (ShapeT [2, 3]) ('S 1) ('S Double)
rotateBirds = getRotationMatrix2D (V2 256 170 :: V2 CFloat) 45 0.75

warpAffineImg :: Birds_512x341
warpAffineImg = exceptError $
    warpAffine birds_512x341 rotateBirds
      InterArea False False (BorderConstant black)

warpAffineInvImg :: Birds_512x341
warpAffineInvImg = exceptError $
    warpAffine warpAffineImg rotateBirds InterCubic True False (BorderConstant black)

Images:
  doc/generated/birds_512x341.png (original)
  doc/generated/examples/warpAffineImg.png (warpAffineImg)
  doc/generated/examples/warpAffineInvImg.png (warpAffineInvImg)

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/geometric_transformations.html#warpaffine

warpPerspective: Applies a perspective transformation to an image.

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/geometric_transformations.html#warpperspective

invertAffineTransform: Inverts an affine transformation.

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/geometric_transformations.html#invertaffinetransform

getPerspectiveTransform: Calculates a perspective transformation matrix for a 2D perspective transform.

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/geometric_transformations.html#getperspectivetransform

getRotationMatrix2D: Calculates an affine matrix of 2D rotation.

See the OpenCV Sphinx doc:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/geometric_transformations.html#getrotationmatrix2d

remap: Applies a generic geometrical transformation to an image.

The function remap transforms the source image using the specified map:

  dst(x, y) = src(map(x, y))

Example:

remapImg
  :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) .
     (Mat ('S ['S height, 'S width]) ('S channels) ('S depth) ~ Birds_512x341)
  => Mat ('S ['S height, 'S width]) ('S channels) ('S depth)
remapImg = exceptError $
    remap birds_512x341 transform InterLinear (BorderConstant black)
  where
    transform = exceptError $
      matFromFunc (Proxy :: Proxy [height, width])
                  (Proxy :: Proxy 2)
                  (Proxy :: Proxy Float)
                  exampleFunc

    exampleFunc [_y,  x] 0 = wobble x w
    exampleFunc [ y, _x] 1 = wobble y h
    exampleFunc _pos _channel = error "impossible"

    wobble :: Int -> Float -> Float
    wobble v s =
      let v' = fromIntegral v
          n = v' / s
      in v' + (s * 0.05 * sin (n * 2 * pi * 5))

    w = fromInteger $ natVal (Proxy :: Proxy width)
    h = fromInteger $ natVal (Proxy :: Proxy height)

Images:
  doc/generated/birds_512x341.png (original)
  doc/generated/examples/remapImg.png (remapImg)

See the OpenCV documentation:
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/geometric_transformations.html#remap

undistort: Transforms an image to compensate for radial and tangential lens distortion.

Those pixels in the destination image for which there are no corresponding pixels in the source image are filled with zeros (black color).

The camera matrix and the distortion parameters can be determined using calibrateCamera. If the resolution of images is different from the resolution used at the calibration stage, f_x, f_y, c_x and c_y need to be scaled accordingly, while the distortion coefficients remain the same.

Example:

undistortImg
  :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) .
     (Mat ('S ['S height, 'S width]) ('S channels) ('S depth) ~ Birds_512x341)
  => Mat ('S ['S height, 'S width]) ('S channels) ('S depth)
undistortImg = undistort birds_512x341 intrinsics coefficients
  where
    intrinsics :: M33 Float
    intrinsics =
      V3 (V3 15840.8 0       2049)
         (V3 0       15830.3 1097)
         (V3 0       0       1)

    coefficients :: Matx51d
    coefficients = unsafePerformIO $
      newMatx51d
        (-2.239145913492247)
        13.674526561736648
        3.650187848850095e-2
        (-2.0042015752853796e-2)
        (-0.44790921357620456)

Images:
  doc/generated/birds_512x341.png (original)
  doc/generated/examples/undistortImg.png (undistortImg)

Parameters of warpAffine:
- Source image.
- Affine transformation matrix.
- Perform the inverse transformation.
- Fill outliers.
- Pixel extrapolation method.
- Result: transformed source image.

Parameters of warpPerspective:
- Source image.
- Perspective transformation matrix.
- Perform the inverse transformation.
- Fill outliers.
- Pixel extrapolation method.
- Result: transformed source image.

Parameters of getPerspectiveTransform:
- Array of 4 floating-point points representing 4 vertices in the source image.
- Array of 4 floating-point points representing 4 vertices in the destination image.
- Result: the output perspective transformation, a 3x3 floating-point matrix.

Parameters of getRotationMatrix2D:
- Center of the rotation in the source image.
- Rotation angle in degrees. Positive values mean counter-clockwise rotation (the coordinate origin is assumed to be the top-left corner).
- Isotropic scale factor.
- Result: the output affine transformation, a 2x3 floating-point matrix.

Parameters of remap:
- Source image.
- A map of (x, y) points.
- Interpolation method to use. Note that InterArea is not supported by this function.

Parameters of undistort:
- The source image to undistort.
- The 3x3 matrix of intrinsic parameters.
- The distortion coefficients (k1,k2,p1,p2[,k3[,k4,k5,k6[,s1,s2,s3,s4[,x,y]]]]) of 4, 5, 8, 12 or 14 elements.

- Harris detector and its free k parameter.
- A flag indicating whether to use the more accurate L2 norm or the default L1 norm.

canny: Finds edges in an image using the Canny86 algorithm
(http://docs.opencv.org/2.4/modules/imgproc/doc/feature_detection.html#canny86).

Example:

cannyImg
  :: forall shape channels depth .
     (Mat shape channels depth ~ Lambda)
  => Mat shape ('S 1) depth
cannyImg = exceptError $ canny 30 200 Nothing CannyNormL1 lambda

Image: doc/generated/examples/cannyImg.png (cannyImg)

goodFeaturesToTrack: Determines strong corners on an image.

The function finds the most prominent corners in the image or in the specified image region:

- The function calculates the corner quality measure at every source image pixel using cornerMinEigenVal or cornerHarris.
- The function performs non-maximum suppression (the local maximums in a 3 x 3 neighborhood are retained).
- The corners with the minimal eigenvalue less than qualityLevel * max_{x,y} qualityMeasureMap(x,y) are rejected.
- The remaining corners are sorted by the quality measure in descending order.
- The function throws away each corner for which there is a stronger corner at a distance less than maxDistance.

Example:

goodFeaturesToTrackTraces
  :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
   . (Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Frog)
  => Mat (ShapeT [height, width]) ('S channels) ('S depth)
goodFeaturesToTrackTraces = exceptError $ do
    imgG <- cvtColor bgr gray frog
    let features = goodFeaturesToTrack imgG 20 0.01 0.5 Nothing Nothing CornerMinEigenVal
    withMatM (Proxy :: Proxy [height, width])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      void $ matCopyToM imgM (V2 0 0) frog Nothing
      forM_ features $ \f -> do
        circle imgM (round <$> f :: V2 Int32) 2 blue 5 LineType_AA 0

Image: doc/generated/examples/goodFeaturesToTrackTraces.png (goodFeaturesToTrackTraces)

houghCircles: Finds circles in a grayscale image using a modification of the Hough transformation.

Example:

houghCircleTraces
  :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) .
     (Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Circles_1000x625)
  => Mat (ShapeT [height, width]) ('S channels) ('S depth)
houghCircleTraces = exceptError $ do
    imgG <- cvtColor bgr gray circles_1000x625
    let circles = houghCircles 1 10 Nothing Nothing Nothing Nothing imgG
    withMatM (Proxy :: Proxy [height, width])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      void $ matCopyToM imgM (V2 0 0) circles_1000x625 Nothing
      forM_ circles $ \c -> do
        circle imgM (round <$> circleCenter c :: V2 Int32) (round (circleRadius c)) blue 1 LineType_AA 0

Image: doc/generated/examples/houghCircleTraces.png (houghCircleTraces)

houghLinesP:

Example:

houghLinesPTraces
  :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
   . (Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Building_868x600)
  => Mat (ShapeT [height, width]) ('S channels) ('S depth)
houghLinesPTraces = exceptError $ do
    edgeImg <- canny 50 200 Nothing CannyNormL1 building_868x600
    edgeImgBgr <- cvtColor gray bgr edgeImg
    withMatM (Proxy :: Proxy [height, width])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      edgeImgM <- thaw edgeImg
      lineSegments <- houghLinesP 1 (pi / 180) 80 (Just 30) (Just 10) edgeImgM
      void $ matCopyToM imgM (V2 0 0) edgeImgBgr Nothing
      forM_ lineSegments $ \lineSegment -> do
        line imgM
             (lineSegmentStart lineSegment)
             (lineSegmentStop lineSegment)
             red 2 LineType_8 0

Image: doc/generated/examples/houghLinesPTraces.png (houghLinesPTraces)

Parameters of canny:
- First threshold for the hysteresis procedure.
- Second threshold for the hysteresis procedure.
- Aperture size for the Sobel() operator. If not specified, defaults to 3. Must be 3, 5 or 7.
- A flag indicating whether to use the more accurate L2 norm or the default L1 norm.
- 8-bit input image.

Parameters of goodFeaturesToTrack:
- Input 8-bit or floating-point 32-bit, single-channel image.
- Maximum number of corners to return. If there are more corners than are found, the strongest of them are returned.
- Parameter characterizing the minimal accepted quality of image corners.
The parameter value is multiplied by the best corner quality measure, which is the minimal eigenvalue (see cornerMinEigenVal) or the Harris function response (see cornerHarris). The corners with the quality measure less than the product are rejected. For example, if the best corner has the quality measure = 1500, and the qualityLevel = 0.01, then all the corners with the quality measure less than 15 are rejected.
- Minimum possible Euclidean distance between the returned corners.
- Optional region of interest. If the image is not empty (it needs to have the type CV_8UC1 and the same size as image), it specifies the region in which the corners are detected.
- Size of an average block for computing a derivative covariation matrix over each pixel neighborhood. See cornerEigenValsAndVecs.
- Parameter indicating whether to use a Harris detector (see cornerHarris) or cornerMinEigenVal.

Parameters of houghCircles:
- Inverse ratio of the accumulator resolution to the image resolution. For example, if dp=1, the accumulator has the same resolution as the input image. If dp=2, the accumulator has half as big width and height.
- Minimum distance between the centers of the detected circles. If the parameter is too small, multiple neighbor circles may be falsely detected in addition to a true one. If it is too large, some circles may be missed.
- The higher threshold of the two passed to the Canny edge detector (the lower one is twice smaller). Default is 100.
- The accumulator threshold for the circle centers at the detection stage. The smaller it is, the more false circles may be detected. Circles corresponding to the larger accumulator values will be returned first. Default is 100.
- Minimum circle radius.
- Maximum circle radius.

Parameters of houghLinesP:
- Distance resolution of the accumulator in pixels.
- Angle resolution of the accumulator in radians.
- Accumulator threshold parameter. Only those lines are returned that get enough votes (> threshold).
- Minimum line length.
Line segments shorter than that are rejected.

Maximum allowed gap between points on the same line to link them.

Source image. May be modified by the function.

Thickness of lines the contours are drawn with.

Draw the contour, filling in the area.

Normal size sans-serif font. Does not have a slanted variant. doc/generated/FontHersheySimplex.png FontHersheySimplex

Small size sans-serif font. doc/generated/FontHersheyPlain.png FontHersheyPlain doc/generated/FontHersheyPlain_slanted.png FontHersheyPlain

Normal size sans-serif font (more complex than FontHersheySimplex). Does not have a slanted variant. doc/generated/FontHersheyDuplex.png FontHersheyDuplex

Normal size serif font. doc/generated/FontHersheyComplex.png FontHersheyComplex doc/generated/FontHersheyComplex_slanted.png FontHersheyComplex

Normal size serif font (more complex than FontHersheyComplex). doc/generated/FontHersheyTriplex.png FontHersheyTriplex doc/generated/FontHersheyTriplex_slanted.png FontHersheyTriplex

Smaller version of FontHersheyComplex. doc/generated/FontHersheyComplexSmall.png FontHersheyComplexSmall doc/generated/FontHersheyComplexSmall_slanted.png FontHersheyComplexSmall

Hand-writing style font. Does not have a slanted variant. doc/generated/FontHersheyScriptSimplex.png FontHersheyScriptSimplex

More complex variant of FontHersheyScriptSimplex. Does not have a slanted variant. doc/generated/FontHersheyScriptComplex.png FontHersheyScriptComplex

8-connected line. doc/generated/LineType_8.png

4-connected line. doc/generated/LineType_4.png

Antialiased line.
doc/generated/LineType_AA.png Antialiased line

Draws an arrow segment pointing from the first point to the second one.

Example:

arrowedLineImg :: Mat (ShapeT [200, 300]) ('S 4) ('S Word8)
arrowedLineImg = exceptError $
    withMatM (Proxy :: Proxy [200, 300])
             (Proxy :: Proxy 4)
             (Proxy :: Proxy Word8)
             transparent $ \imgM -> do
      arrowedLine imgM (V2 10 130 :: V2 Int32) (V2 190 40 :: V2 Int32) blue 5 LineType_AA 0 0.15
      arrowedLine imgM (V2 210 50 :: V2 Int32) (V2 250 180 :: V2 Int32) red 8 LineType_AA 0 0.4

doc/generated/examples/arrowedLineImg.png arrowedLineImg

http://docs.opencv.org/3.0.0/d6/d6e/group__imgproc__draw.html#ga0a165a3ca093fd488ac709fdf10c05b2 OpenCV Doxygen doc
http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#arrowedline OpenCV Sphinx doc

Draws a circle.

Example:

circleImg :: Mat (ShapeT [200, 400]) ('S 4) ('S Word8)
circleImg = exceptError $
    withMatM (Proxy :: Proxy [200, 400])
             (Proxy :: Proxy 4)
             (Proxy :: Proxy Word8)
             transparent $ \imgM -> do
      lift $ circle imgM (V2 100 100 :: V2 Int32) 90 blue 5 LineType_AA 0
      lift $ circle imgM (V2 300 100 :: V2 Int32) 45 red (-1) LineType_AA 0

doc/generated/examples/circleImg.png circleImg

http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#circle OpenCV Sphinx doc

Draws a simple or thick elliptic arc or fills an ellipse sector.

Example:

ellipseImg :: Mat (ShapeT [200, 400]) ('S 4) ('S Word8)
ellipseImg = exceptError $
    withMatM (Proxy :: Proxy [200, 400])
             (Proxy :: Proxy 4)
             (Proxy :: Proxy Word8)
             transparent $ \imgM -> do
      lift $ ellipse imgM (V2 100 100 :: V2 Int32) (V2 90 60 :: V2 Int32) 30 0 360 blue 5 LineType_AA 0
      lift $ ellipse imgM (V2 300 100 :: V2 Int32) (V2 80 40 :: V2 Int32) 160 40 290 red (-1) LineType_AA 0

doc/generated/examples/ellipseImg.png ellipseImg

http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#ellipse OpenCV Sphinx doc

Fills a convex polygon. The function fillConvexPoly draws a filled convex polygon.
This function is much faster than the function E . It can fill not only convex polygons but any monotonic polygon without self-intersections, that is, a polygon whose contour intersects every horizontal line (scan line) twice at the most (though, its top-most and/or the bottom edge could be horizontal).Example: OfillConvexPolyImg :: forall (h :: Nat) (w :: Nat) . (h ~ 300, w ~ 300) => Mat (ShapeT [h, w]) ('S 4) ('S Word8) fillConvexPolyImg = exceptError $ withMatM (Proxy :: Proxy [h, w]) (Proxy :: Proxy 4) (Proxy :: Proxy Word8) transparent $ \imgM -> do lift $ fillConvexPoly imgM pentagon blue LineType_AA 0 where pentagon :: V.Vector (V2 Int32) pentagon = V.fromList [ V2 150 0 , V2 7 104 , V2 62 271 , V2 238 271 , V2 293 104 ]  ,doc/generated/examples/fillConvexPolyImg.pngfillConvexPolyImg ]http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#fillconvexpolyOpenCV Sphinx docE/Fills the area bounded by one or more polygons.Example: rookPts :: Int32 -> Int32 -> V.Vector (V.Vector (V2 Int32)) rookPts w h = V.singleton $ V.fromList [ V2 ( w `div` 4) ( 7*h `div` 8) , V2 ( 3*w `div` 4) ( 7*h `div` 8) , V2 ( 3*w `div` 4) (13*h `div` 16) , V2 ( 11*w `div` 16) (13*h `div` 16) , V2 ( 19*w `div` 32) ( 3*h `div` 8) , V2 ( 3*w `div` 4) ( 3*h `div` 8) , V2 ( 3*w `div` 4) ( h `div` 8) , V2 ( 26*w `div` 40) ( h `div` 8) , V2 ( 26*w `div` 40) ( h `div` 4) , V2 ( 22*w `div` 40) ( h `div` 4) , V2 ( 22*w `div` 40) ( h `div` 8) , V2 ( 18*w `div` 40) ( h `div` 8) , V2 ( 18*w `div` 40) ( h `div` 4) , V2 ( 14*w `div` 40) ( h `div` 4) , V2 ( 14*w `div` 40) ( h `div` 8) , V2 ( w `div` 4) ( h `div` 8) , V2 ( w `div` 4) ( 3*h `div` 8) , V2 ( 13*w `div` 32) ( 3*h `div` 8) , V2 ( 5*w `div` 16) (13*h `div` 16) , V2 ( w `div` 4) (13*h `div` 16) ] fillPolyImg :: forall (h :: Nat) (w :: Nat) . 
(h ~ 300, w ~ 300) => Mat (ShapeT [h, w]) ('S 4) ('S Word8) fillPolyImg = exceptError $ withMatM (Proxy :: Proxy [h, w]) (Proxy :: Proxy 4) (Proxy :: Proxy Word8) transparent $ \imgM -> do lift $ fillPoly imgM (rookPts w h) blue LineType_AA 0 where h = fromInteger $ natVal (Proxy :: Proxy h) w = fromInteger $ natVal (Proxy :: Proxy w)  &doc/generated/examples/fillPolyImg.png fillPolyImg Whttp://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#fillpolyOpenCV Sphinx docFDraws several polygonal curvesExample: polylinesImg :: forall (h :: Nat) (w :: Nat) . (h ~ 300, w ~ 300) => Mat (ShapeT [h, w]) ('S 4) ('S Word8) polylinesImg = exceptError $ withMatM (Proxy :: Proxy [h, w]) (Proxy :: Proxy 4) (Proxy :: Proxy Word8) transparent $ \imgM -> do lift $ polylines imgM (rookPts w h) True blue 2 LineType_AA 0 where h = fromInteger $ natVal (Proxy :: Proxy h) w = fromInteger $ natVal (Proxy :: Proxy w)  'doc/generated/examples/polylinesImg.png polylinesImg Xhttp://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#polylinesOpenCV Sphinx docG+Draws a line segment connecting two points.Example: lineImg :: Mat (ShapeT [200, 300]) ('S 4) ('S Word8) lineImg = exceptError $ withMatM (Proxy :: Proxy [200, 300]) (Proxy :: Proxy 4) (Proxy :: Proxy Word8) transparent $ \imgM -> do lift $ line imgM (V2 10 130 :: V2 Int32) (V2 190 40 :: V2 Int32) blue 5 LineType_AA 0 lift $ line imgM (V2 210 50 :: V2 Int32) (V2 250 180 :: V2 Int32) red 8 LineType_AA 0  "doc/generated/examples/lineImg.pnglineImg Shttp://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#lineOpenCV Sphinx docH=Calculates the size of a box that contains the specified text Zhttp://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#gettextsizeOpenCV Sphinx docIDraws a text string.The function putText renders the specified text string in the image. 
Symbols that cannot be rendered using the specified font are replaced by question marks.Example: putTextImg :: Mat ('S ['D, 'S 400]) ('S 4) ('S Word8) putTextImg = exceptError $ withMatM (height ::: (Proxy :: Proxy 400) ::: Z) (Proxy :: Proxy 4) (Proxy :: Proxy Word8) transparent $ \imgM -> do forM_ (zip [0..] [minBound .. maxBound]) $ \(n, fontFace) -> lift $ putText imgM (T.pack $ show fontFace) (V2 10 (35 + n * 30) :: V2 Int32) (Font fontFace NotSlanted 1.0) black 1 LineType_AA False where height :: Int32 height = 50 + fromIntegral (30 * fromEnum (maxBound :: FontFace))  %doc/generated/examples/putTextImg.png putTextImg Vhttp://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#puttextOpenCV Sphinx docJ3Draws a simple, thick, or filled up-right rectangleExample: rectangleImg :: Mat (ShapeT [200, 400]) ('S 4) ('S Word8) rectangleImg = exceptError $ withMatM (Proxy :: Proxy [200, 400]) (Proxy :: Proxy 4) (Proxy :: Proxy Word8) transparent $ \imgM -> do lift $ rectangle imgM (toRect $ HRect (V2 10 10) (V2 180 180)) blue 5 LineType_8 0 lift $ rectangle imgM (toRect $ HRect (V2 260 30) (V2 80 140)) red (-1) LineType_8 0  'doc/generated/examples/rectangleImg.png rectangleImg Xhttp://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/drawing_functions.html#rectangleOpenCV Sphinx docK!Draw contours onto a black image.Example: zflowerContours :: Mat ('S ['S 512, 'S 768]) ('S 3) ('S Word8) flowerContours = exceptError $ withMatM (Proxy :: Proxy [512,768]) (Proxy :: Proxy 3) (Proxy :: Proxy Word8) black $ \imgM -> do edges <- thaw $ exceptError $ cvtColor bgr gray flower_768x512 >>= canny 30 20 Nothing CannyNormL1 contours <- findContours ContourRetrievalList ContourApproximationSimple edges lift $ drawContours (V.map contourPoints contours) red (OutlineContour LineType_AA 1) imgM  )doc/generated/examples/flowerContours.pngflowerContoursL4Draws a marker on a predefined position in an image.0The marker will be drawn as as a 20-pixel cross.Example: 
markerImg :: Mat (ShapeT [100, 100]) ('S 4) ('S Word8)
markerImg = exceptError $
    withMatM (Proxy :: Proxy [100, 100])
             (Proxy :: Proxy 4)
             (Proxy :: Proxy Word8)
             transparent $ \imgM -> do
      lift $ marker imgM (50 :: V2 Int32) blue

doc/generated/examples/markerImg.png markerImg

arrowedLine parameters:
Image.
The point the arrow starts from.
The point the arrow points to.
Line color.
Line thickness.
Number of fractional bits in the point coordinates.
The length of the arrow tip in relation to the arrow length.

circle parameters:
Image where the circle is drawn.
Center of the circle.
Radius of the circle.
Circle color.
Thickness of the circle outline, if positive. Negative thickness means that a filled circle is to be drawn.
Type of the circle boundary.
Number of fractional bits in the coordinates of the center and in the radius value.

ellipse parameters:
Image.
Center of the ellipse.
Half of the size of the ellipse main axes.
Ellipse rotation angle in degrees.
Starting angle of the elliptic arc in degrees.
Ending angle of the elliptic arc in degrees.
Ellipse color.
Thickness of the ellipse arc outline, if positive. Otherwise, this indicates that a filled ellipse sector is to be drawn.
Type of the ellipse boundary.
Number of fractional bits in the coordinates of the center and values of axes.

fillConvexPoly parameters:
Image.
Polygon vertices.
Polygon color.
Number of fractional bits in the vertex coordinates.

fillPoly parameters:
Image.
Polygons.
Polygon color.
Number of fractional bits in the vertex coordinates.

polylines parameters:
Image.
Vertices.
Flag indicating whether the drawn polylines are closed or not. If they are closed, the function draws a line from the last vertex of each curve to its first vertex.
Thickness of the polyline edges.
Number of fractional bits in the vertex coordinates.

line parameters:
Image.
First point of the line segment.
Second point of the line segment.
Line color.
Line thickness.
Number of fractional bits in the point coordinates.

getTextSize parameters:
Thickness of lines used to render the text.
(size, baseLine) = (The size of a box that contains the specified text.
, y-coordinate of the baseline relative to the bottom-most text point)

putText parameters:
Image.
Text string to be drawn.
Bottom-left corner of the text string in the image.
Text color.
Thickness of the lines used to draw a text.
When True, the image data origin is at the bottom-left corner. Otherwise, it is at the top-left corner.

rectangle parameters:
Image.
Rectangle color or brightness (grayscale image).
Line thickness.
Number of fractional bits in the point coordinates.

drawContours parameters:
Color of the contours.
Image.

marker parameters:
The image to draw the marker on.
The point where the crosshair is positioned.
Line color.

doc/generated/examples/colorMapAutumImg.png colorMapAutumImg
doc/generated/examples/colorMapBoneImg.png colorMapBoneImg
doc/generated/examples/colorMapJetImg.png colorMapJetImg
doc/generated/examples/colorMapWinterImg.png colorMapWinterImg
doc/generated/examples/colorMapRainbowImg.png colorMapRainbowImg
doc/generated/examples/colorMapOceanImg.png colorMapOceanImg
doc/generated/examples/colorMapSummerImg.png colorMapSummerImg
doc/generated/examples/colorMapSpringImg.png colorMapSpringImg
doc/generated/examples/colorMapCoolImg.png colorMapCoolImg
doc/generated/examples/colorMapHsvImg.png colorMapHsvImg
doc/generated/examples/colorMapPinkImg.png colorMapPinkImg
doc/generated/examples/colorMapHotImg.png colorMapHotImg
doc/generated/examples/colorMapParulaImg.png colorMapParulaImg

Applies a GNU Octave/MATLAB equivalent colormap on a given image.

Human perception isn't built for observing fine changes in grayscale images. Human eyes are more sensitive to observing changes between colors, so you often need to recolor your grayscale images to get a clue about them. OpenCV now comes with various colormaps to enhance the visualization in your computer vision application.

Example:

grayscaleImg :: forall (height :: Nat) (width :: Nat) depth .
(height ~ 30, width ~ 256, depth ~ Word8)
    => Mat (ShapeT [height, width]) ('S 1) ('S depth)
grayscaleImg = exceptError $
    matFromFunc (Proxy :: Proxy [height, width])
                (Proxy :: Proxy 1)
                (Proxy :: Proxy depth)
                grayscale
  where
    grayscale :: [Int] -> Int -> Word8
    grayscale [_y, x] 0 = fromIntegral x
    grayscale _pos _channel = error "impossible"

type ColorMapImg = Mat (ShapeT [30, 256]) ('S 3) ('S Word8)

mkColorMapImg :: ColorMap -> ColorMapImg
mkColorMapImg cmap = exceptError $ applyColorMap cmap grayscaleImg

colorMapAutumImg   :: ColorMapImg
colorMapBoneImg    :: ColorMapImg
colorMapJetImg     :: ColorMapImg
colorMapWinterImg  :: ColorMapImg
colorMapRainbowImg :: ColorMapImg
colorMapOceanImg   :: ColorMapImg
colorMapSummerImg  :: ColorMapImg
colorMapSpringImg  :: ColorMapImg
colorMapCoolImg    :: ColorMapImg
colorMapHsvImg     :: ColorMapImg
colorMapPinkImg    :: ColorMapImg
colorMapHotImg     :: ColorMapImg
colorMapParulaImg  :: ColorMapImg

colorMapAutumImg   = mkColorMapImg ColorMapAutumn
colorMapBoneImg    = mkColorMapImg ColorMapBone
colorMapJetImg     = mkColorMapImg ColorMapJet
colorMapWinterImg  = mkColorMapImg ColorMapWinter
colorMapRainbowImg = mkColorMapImg ColorMapRainbow
colorMapOceanImg   = mkColorMapImg ColorMapOcean
colorMapSummerImg  = mkColorMapImg ColorMapSummer
colorMapSpringImg  = mkColorMapImg ColorMapSpring
colorMapCoolImg    = mkColorMapImg ColorMapCool
colorMapHsvImg     = mkColorMapImg ColorMapHsv
colorMapPinkImg    = mkColorMapImg ColorMapPink
colorMapHotImg     = mkColorMapImg ColorMapHot
colorMapParulaImg  = mkColorMapImg ColorMapParula

doc/generated/examples/grayscaleImg.png grayscaleImg

http://docs.opencv.org/3.0-last-rst/modules/imgproc/doc/colormaps.html#applycolormap OpenCV Sphinx doc

Create a new cascade classifier. Returns Nothing if the classifier is empty after initialization. This usually means that the file could not be loaded (e.g.
it doesn't exist, is corrupt, etc.)fExample: LcascadeClassifierArnold :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: * ) . (Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Arnold_small) => IO (Mat (ShapeT [height, width]) ('S channels) ('S depth)) cascadeClassifierArnold = do -- Create two classifiers from data files. Just ccFrontal <- newCascadeClassifier "data/haarcascade_frontalface_default.xml" Just ccEyes <- newCascadeClassifier "data/haarcascade_eye.xml" -- Detect some features. let eyes = ccDetectMultiscale ccEyes arnoldGray faces = ccDetectMultiscale ccFrontal arnoldGray -- Draw the result. pure $ exceptError $ withMatM (Proxy :: Proxy [height, width]) (Proxy :: Proxy channels) (Proxy :: Proxy depth) white $ \imgM -> do void $ matCopyToM imgM (V2 0 0) arnold_small Nothing forM_ eyes $ \eyeRect -> lift $ rectangle imgM eyeRect blue 2 LineType_8 0 forM_ faces $ \faceRect -> lift $ rectangle imgM faceRect green 2 LineType_8 0 where arnoldGray = exceptError $ cvtColor bgr gray arnold_small ccDetectMultiscale cc = cascadeClassifierDetectMultiScale cc Nothing Nothing minSize maxSize minSize = Nothing :: Maybe (V2 Int32) maxSize = Nothing :: Maybe (V2 Int32)  2doc/generated/examples/cascadeClassifierArnold.pngcascadeClassifierArnoldgPSpecial version which returns bounding rectangle, rejectLevels, and levelWeightsfScale factor, default is 1.1Min neighbours, default 3"Minimum size. Default: no minimum."Maximum size. Default: no maximum.gScale factor, default is 1.1Min neighbours, default 3"Minimum size. Default: no minimum."Maximum size. Default: no maximum.defgdefgd^_None"#,-;<=FNSTUV]zFlann-based descriptor matcher.This matcher trains  flann::Index_ on a train descriptor collection and calls it nearest search methods to find the best matches. So, this matcher may be faster when matching a large train collection than the brute force matcher. 
FlannBasedMatcherl does not support masking permissible matches of descriptor sets because flann::Index does not support this.Example: NfbMatcherImg :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) . ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Frog , width2 ~ (*) width 2 ) => IO (Mat (ShapeT [height, width2]) ('S channels) ('S depth)) fbMatcherImg = do let (kpts1, descs1) = exceptError $ orbDetectAndCompute orb frog Nothing (kpts2, descs2) = exceptError $ orbDetectAndCompute orb rotatedFrog Nothing fbmatcher <- newFlannBasedMatcher (def { indexParams = FlannLshIndexParams 20 10 2 }) matches <- match fbmatcher descs1 -- Query descriptors descs2 -- Train descriptors Nothing exceptErrorIO $ pureExcept $ withMatM (Proxy :: Proxy [height, width2]) (Proxy :: Proxy channels) (Proxy :: Proxy depth) white $ \imgM -> do matCopyToM imgM (V2 0 0) frog Nothing matCopyToM imgM (V2 width 0) rotatedFrog Nothing -- Draw the matches as lines from the query image to the train image. forM_ matches $ \dmatch -> do let matchRec = dmatchAsRec dmatch queryPt = kpts1 V.! fromIntegral (dmatchQueryIdx matchRec) trainPt = kpts2 V.! fromIntegral (dmatchTrainIdx matchRec) queryPtRec = keyPointAsRec queryPt trainPtRec = keyPointAsRec trainPt -- We translate the train point one width to the right in order to -- match the position of rotatedFrog in imgM. 
line imgM (round <$> kptPoint queryPtRec :: V2 Int32) ((round <$> kptPoint trainPtRec :: V2 Int32) ^+^ V2 width 0) blue 1 LineType_AA 0 where orb = mkOrb defaultOrbParams {orb_nfeatures = 50} width = fromInteger $ natVal (Proxy :: Proxy width) rotatedFrog = exceptError $ warpAffine frog rotMat InterArea False False (BorderConstant black) rotMat = getRotationMatrix2D (V2 250 195 :: V2 CFloat) 45 0.8  'doc/generated/examples/fbMatcherImg.png fbMatcherImg zhttp://docs.opencv.org/3.0-last-rst/modules/features2d/doc/common_interfaces_of_descriptor_matchers.html#flannbasedmatcherOpenCV Sphinx doc{Brute-force descriptor matcherFor each descriptor in the first set, this matcher finds the closest descriptor in the second set by trying each one. This descriptor matcher supports masking permissible matches of descriptor sets.Example: $bfMatcherImg :: forall (width :: Nat) (width2 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) . ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Frog , width2 ~ (*) width 2 ) => IO (Mat (ShapeT [height, width2]) ('S channels) ('S depth)) bfMatcherImg = do let (kpts1, descs1) = exceptError $ orbDetectAndCompute orb frog Nothing (kpts2, descs2) = exceptError $ orbDetectAndCompute orb rotatedFrog Nothing bfmatcher <- newBFMatcher Norm_Hamming True matches <- match bfmatcher descs1 -- Query descriptors descs2 -- Train descriptors Nothing exceptErrorIO $ pureExcept $ withMatM (Proxy :: Proxy [height, width2]) (Proxy :: Proxy channels) (Proxy :: Proxy depth) white $ \imgM -> do matCopyToM imgM (V2 0 0) frog Nothing matCopyToM imgM (V2 width 0) rotatedFrog Nothing -- Draw the matches as lines from the query image to the train image. forM_ matches $ \dmatch -> do let matchRec = dmatchAsRec dmatch queryPt = kpts1 V.! fromIntegral (dmatchQueryIdx matchRec) trainPt = kpts2 V.! 
fromIntegral (dmatchTrainIdx matchRec) queryPtRec = keyPointAsRec queryPt trainPtRec = keyPointAsRec trainPt -- We translate the train point one width to the right in order to -- match the position of rotatedFrog in imgM. line imgM (round <$> kptPoint queryPtRec :: V2 Int32) ((round <$> kptPoint trainPtRec :: V2 Int32) ^+^ V2 width 0) blue 1 LineType_AA 0 where orb = mkOrb defaultOrbParams {orb_nfeatures = 50} width = fromInteger $ natVal (Proxy :: Proxy width) rotatedFrog = exceptError $ warpAffine frog rotMat InterArea False False (BorderConstant black) rotMat = getRotationMatrix2D (V2 250 195 :: V2 CFloat) 45 0.8  'doc/generated/examples/bfMatcherImg.png bfMatcherImg rhttp://docs.opencv.org/3.0-last-rst/modules/features2d/doc/common_interfaces_of_descriptor_matchers.html#bfmatcherOpenCV Sphinx docMatch in pre-trained matcher%Extracted blobs have an area between minArea (inclusive) and maxArea (exclusive).$Extracted blobs have circularity '(4 * pi * Area)/(perimeter * perimeter) between minCircularity (inclusive) and maxCircularity (exclusive).SThis filter compares the intensity of a binary image at the center of a blob to  blobColor3. If they differ, the blob is filtered out. Use  blobColor = 0 to extract dark blobs and blobColor = 255 to extract light blobs.LExtracted blobs have convexity (area / area of blob convex hull) between  minConvexity (inclusive) and  maxConvexity (exclusive).(Extracted blobs have this ratio between minInertiaRatio (inclusive) and maxInertiaRatio (exclusive).)The maximum number of features to retain.*Pyramid decimation ratio, greater than 1. M == 2 means the classical pyramid, where each next level has 4x less pixels than the previous, but such a big scale factor will degrade feature matching scores dramatically. On the other hand, too close to 1 scale factor will mean that to cover certain scale range you will need more pyramid levels and so the speed will suffer.kThe number of pyramid levels. 
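The circularity filter described above compares each blob's (4 * pi * Area)/(perimeter * perimeter) against the configured bounds. The formula can be sketched as a pure function (a hypothetical helper for illustration, not part of the opencv bindings):

```haskell
-- Circularity as used by SimpleBlobDetector's filter:
--   circularity = 4 * pi * area / perimeter^2
-- A perfect circle scores 1; elongated shapes score lower.
-- (Hypothetical helper, not part of the opencv bindings.)
circularity :: Double -> Double -> Double
circularity area perimeter = 4 * pi * area / (perimeter * perimeter)
```

For a disk of radius r (area pi*r^2, perimeter 2*pi*r) this yields exactly 1; a square yields pi/4.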
The smallest level will have linear size equal to input_image_linear_size / scaleFactor^(nlevels - firstLevel).

This is size of the border where the features are not detected. It should roughly match the patchSize parameter.

It should be 0 in the current implementation.

The number of points that produce each element of the oriented BRIEF descriptor. The default value 2 means the BRIEF where we take a random point pair and compare their brightnesses, so we get a 0/1 response. Other possible values are 3 and 4. For example, 3 means that we take 3 random points (of course, those point coordinates are random, but they are generated from the pre-defined seed, so each element of the BRIEF descriptor is computed deterministically from the pixel rectangle), find the point of maximum brightness and output the index of the winner (0, 1 or 2). Such output will occupy 2 bits, and therefore it will need a special variant of Hamming distance, denoted as Norm_Hamming2 (2 bits per bin). When 4, we take 4 random points to compute each bin (that will also occupy 2 bits with possible values 0, 1, 2 or 3).

The default HarrisScore means that the Harris algorithm is used to rank features (the score is written to KeyPoint::score and is used to retain the best nfeatures features); FastScore is an alternative value of the parameter that produces slightly less stable keypoints, but is a little faster to compute.

Size of the patch used by the oriented BRIEF descriptor. Of course, on smaller pyramid layers the perceived image area covered by a feature will be larger.

Detect keypoints and compute descriptors.

Example:

orbDetectAndComputeImg :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) .
(Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Frog)
    => Mat (ShapeT [height, width]) ('S channels) ('S depth)
orbDetectAndComputeImg = exceptError $ do
    (kpts, _descs) <- orbDetectAndCompute orb frog Nothing
    withMatM (Proxy :: Proxy [height, width])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      void $ matCopyToM imgM (V2 0 0) frog Nothing
      forM_ kpts $ \kpt -> do
        let kptRec = keyPointAsRec kpt
        circle imgM (round <$> kptPoint kptRec :: V2 Int32) 5 blue 1 LineType_AA 0
  where
    orb = mkOrb defaultOrbParams

doc/generated/examples/orbDetectAndComputeImg.png orbDetectAndComputeImg

Detect keypoints and compute descriptors.

Train set of descriptors.
Query set of descriptors.
Train set of descriptors.
Mask specifying permissible matches between an input query and train matrices of descriptors.
Query set of descriptors.
Mask specifying permissible matches between an input query and train matrices of descriptors.
Image.
Mask.
Image.
Mask.

Norm_L1 and Norm_L2 norms are preferable choices for SIFT and SURF descriptors. Norm_Hamming should be used with ORB, BRISK and BRIEF. Norm_Hamming2 should be used with ORB when WTA_K is 3 or 4 (see the WTA_K description).

If it is false, this will be the default BFMatcher behaviour: it finds the k nearest neighbors for each query descriptor. If crossCheck == True, then the knnMatch() method with k=1 will only return pairs (i,j) such that for the i-th query descriptor the j-th descriptor in the matcher's collection is the nearest, and vice versa, i.e. the BFMatcher will only return consistent pairs. Such a technique usually produces the best results with a minimal number of outliers when there are enough matches. This is an alternative to the ratio test, used by D.
Lowe in the SIFT paper.

A regular method using all the points.
RANSAC-based robust method.
Least-Median robust method.
PROSAC-based robust method.

Calculates a fundamental matrix from the corresponding points in two images.

The minimum number of points required depends on the method:

7-point: N == 7
8-point: N >= 8
RANSAC:  N >= 15
LMedS:   N >= 8

With 7 points the 7-point method is used, despite the given method. With more than 7 points the 7-point method will be replaced by the 8-point method. Between 7 and 15 points the RANSAC method will be replaced by the LMedS method. With the 7-point method and with 7 points the result can contain up to 3 matrices, resulting in either 3, 6 or 9 rows. This is why the number of resulting rows is tagged as dynamic. For all other methods the result always contains 3 rows.

http://docs.opencv.org/3.0-last-rst/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#findfundamentalmat OpenCV Sphinx doc

For points in an image of a stereo pair, computes the corresponding epilines in the other image.

http://docs.opencv.org/3.0-last-rst/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#computecorrespondepilines OpenCV Sphinx doc

Points from the first image.
Points from the second image.
Points from the first image.
Points from the second image.
Points.
Image which contains the points.
Fundamental matrix.

Flip around the x-axis.
Flip around the y-axis.
Flip around both x and y-axis.

Calculates an absolute value of each matrix element.
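The three flip modes listed above can be sketched on a plain list-of-lists "matrix" (hypothetical helpers for illustration; matFlip itself operates on Mat values):

```haskell
-- Sketch of the three flip modes on a list-of-lists matrix
-- (hypothetical helpers, not part of the opencv bindings).
flipAroundX, flipAroundY, flipAroundBoth :: [[a]] -> [[a]]
flipAroundX    = reverse                 -- flip around the x-axis: reverse the row order
flipAroundY    = map reverse             -- flip around the y-axis: reverse each row
flipAroundBoth = reverse . map reverse   -- both axes: do both
```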
Rhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#absOpenCV Sphinx docBCalculates the per-element absolute difference between two arrays.Example: vmatAbsDiffImg :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8) matAbsDiffImg = matAbsDiff flower_512x341 sailboat_512x341  (doc/generated/examples/matAbsDiffImg.png matAbsDiffImg Vhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#absdiffOpenCV Sphinx doc-Calculates the per-element sum of two arrays.Example: jmatAddImg :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8) matAddImg = matAdd flower_512x341 sailboat_512x341  $doc/generated/examples/matAddImg.png matAddImg Rhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#addOpenCV Sphinx doc8Calculates the per-element difference between two arraysExample: ymatSubtractImg :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8) matSubtractImg = matSubtract flower_512x341 sailboat_512x341  )doc/generated/examples/matSubtractImg.pngmatSubtractImg Whttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#subtractOpenCV Sphinx doc)Calculates the weighted sum of two arraysExample: matAddWeightedImg :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8) matAddWeightedImg = exceptError $ matAddWeighted flower_512x341 0.5 sailboat_512x341 0.5 0.0  ,doc/generated/examples/matAddWeightedImg.pngmatAddWeightedImg Zhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#addweightedOpenCV Sphinx doc7Calculates the sum of a scaled array and another array.The function scaleAdd is one of the classical primitive linear algebra operations, known as DAXPY or SAXPY in BLAS. It calculates the sum of a scaled array and another array. 
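The scaleAdd (DAXPY/SAXPY) operation above computes dst(i) = scale * src1(i) + src2(i) element-wise. A minimal pure sketch on lists (hypothetical helper, not part of the opencv bindings):

```haskell
-- DAXPY/SAXPY as computed by scaleAdd, element-wise on lists:
--   dst(i) = scale * src1(i) + src2(i)
-- (Hypothetical helper, not part of the opencv bindings.)
scaleAddList :: Num a => a -> [a] -> [a] -> [a]
scaleAddList scale = zipWith (\x y -> scale * x + y)
```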
Whttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#scaleaddOpenCV Sphinx docExample:  bitwiseNotImg :: Mat (ShapeT VennShape) ('S 3) ('S Word8) bitwiseNotImg = exceptError $ do img <- bitwiseNot vennCircleAImg imgBgr <- cvtColor gray bgr img createMat $ do imgM <- lift $ thaw imgBgr lift $ vennCircleA imgM blue 2 pure imgM  (doc/generated/examples/bitwiseNotImg.png bitwiseNotImg Zhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#bitwise-notOpenCV Sphinx docExample: AbitwiseAndImg :: Mat (ShapeT VennShape) ('S 3) ('S Word8) bitwiseAndImg = exceptError $ do img <- bitwiseAnd vennCircleAImg vennCircleBImg imgBgr <- cvtColor gray bgr img createMat $ do imgM <- lift $ thaw imgBgr lift $ vennCircleA imgM blue 2 lift $ vennCircleB imgM red 2 pure imgM  (doc/generated/examples/bitwiseAndImg.png bitwiseAndImg Zhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#bitwise-andOpenCV Sphinx docExample: >bitwiseOrImg :: Mat (ShapeT VennShape) ('S 3) ('S Word8) bitwiseOrImg = exceptError $ do img <- bitwiseOr vennCircleAImg vennCircleBImg imgBgr <- cvtColor gray bgr img createMat $ do imgM <- lift $ thaw imgBgr lift $ vennCircleA imgM blue 2 lift $ vennCircleB imgM red 2 pure imgM  'doc/generated/examples/bitwiseOrImg.png bitwiseOrImg Yhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#bitwise-orOpenCV Sphinx docExample: AbitwiseXorImg :: Mat (ShapeT VennShape) ('S 3) ('S Word8) bitwiseXorImg = exceptError $ do img <- bitwiseXor vennCircleAImg vennCircleBImg imgBgr <- cvtColor gray bgr img createMat $ do imgM <- lift $ thaw imgBgr lift $ vennCircleA imgM blue 2 lift $ vennCircleB imgM red 2 pure imgM  (doc/generated/examples/bitwiseXorImg.png bitwiseXorImg Zhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#bitwise-xorOpenCV Sphinx docBCreates one multichannel array out of several single-channel ones. 
Thttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#mergeOpenCV Sphinx docADivides a multi-channel array into several single-channel arrays.Example: matSplitImg :: forall (width :: Nat) (width3 :: Nat) (height :: Nat) (channels :: Nat) (depth :: *) . ( Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Birds_512x341 , width3 ~ ((*) width 3) ) => Mat (ShapeT [height, width3]) ('S channels) ('S depth) matSplitImg = exceptError $ do zeroImg <- mkMat (Proxy :: Proxy [height, width]) (Proxy :: Proxy 1) (Proxy :: Proxy depth) black let blueImg = matMerge $ V.fromList [channelImgs V.! 0, zeroImg, zeroImg] greenImg = matMerge $ V.fromList [zeroImg, channelImgs V.! 1, zeroImg] redImg = matMerge $ V.fromList [zeroImg, zeroImg, channelImgs V.! 2] withMatM (Proxy :: Proxy [height, width3]) (Proxy :: Proxy channels) (Proxy :: Proxy depth) white $ \imgM -> do matCopyToM imgM (V2 (w*0) 0) (unsafeCoerceMat blueImg) Nothing matCopyToM imgM (V2 (w*1) 0) (unsafeCoerceMat greenImg) Nothing matCopyToM imgM (V2 (w*2) 0) (unsafeCoerceMat redImg) Nothing where channelImgs = matSplit birds_512x341 w :: Int32 w = fromInteger $ natVal (Proxy :: Proxy width)  &doc/generated/examples/matSplitImg.png matSplitImg Thttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#splitOpenCV Sphinx doc4Apply the same 1 dimensional action to every channel0Finds the global minimum and maximum in an array Xhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#minmaxlocOpenCV Sphinx doc!Calculates an absolute array norm Shttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#normOpenCV Sphinx docECalculates an absolute difference norm, or a relative difference norm Shttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#normOpenCV Sphinx doc.Normalizes the norm or value range of an array Xhttp://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#normalizeOpenCV 
matSum

  Calculates the sum of array elements.

  Example:

    matSumImg :: Mat (ShapeT [201, 201]) ('S 3) ('S Word8)
    matSumImg = exceptError $
        withMatM (Proxy :: Proxy [201, 201])
                 (Proxy :: Proxy 3)
                 (Proxy :: Proxy Word8)
                 black $ \imgM -> do
          -- Draw a filled circle. Each pixel has a value of (255,255,255).
          lift $ circle imgM (pure radius :: V2 Int32) radius white (-1) LineType_8 0
          -- Calculate the sum of all pixels.
          scalar <- matSumM imgM
          let V4 area _y _z _w = fromScalar scalar :: V4 Double
          -- Circle area = pi * radius * radius
          let approxPi = area / 255 / (radius * radius)
          lift $ putText imgM
                         (T.pack $ show approxPi)
                         (V2 40 110 :: V2 Int32)
                         (Font FontHersheyDuplex NotSlanted 1)
                         blue
                         1
                         LineType_AA
                         False
      where
        radius :: forall a. Num a => a
        radius = 100

  <<doc/generated/examples/matSumImg.png matSumImg>>

  <http://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#sum OpenCV Sphinx doc>

meanStdDev

  Calculates a mean and standard deviation of array elements.

  <http://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#meanstddev OpenCV Sphinx doc>

matFlip

  Flips a 2D matrix around vertical, horizontal, or both axes.

  Example scenarios for using the function are the following:

  * Vertical flipping of the image to switch between top-left and
    bottom-left image origin. This is a typical operation in video
    processing on Microsoft Windows.
  * Horizontal flipping of the image with the subsequent horizontal shift
    and absolute difference calculation to check for a vertical-axis
    symmetry.
  * Simultaneous horizontal and vertical flipping of the image with the
    subsequent shift and absolute difference calculation to check for a
    central symmetry.
  * Reversing the order of point arrays.

  Example:

    matFlipImg :: Mat (ShapeT [341, 512]) ('S 3) ('S Word8)
    matFlipImg = matFlip sailboat_512x341 FlipBoth

  <<doc/generated/examples/matFlipImg.png matFlipImg>>

matTranspose

  Transposes a matrix.

  Example:

    matTransposeImg :: Mat (ShapeT [512, 341]) ('S 3) ('S Word8)
    matTransposeImg = matTranspose sailboat_512x341

  <<doc/generated/examples/matTransposeImg.png matTransposeImg>>

hconcat

  Applies horizontal concatenation to given matrices.

  Example:

    hconcatImg :: Mat ('S '[ 'D, 'D ]) ('S 3) ('S Word8)
    hconcatImg = exceptError $
        hconcat $ V.fromList
          [ halfSize birds_768x512
          , halfSize flower_768x512
          , halfSize sailboat_768x512
          ]
      where
        halfSize = exceptError . resize (ResizeRel 0.5) InterArea

  <<doc/generated/examples/hconcatImg.png hconcatImg>>

vconcat

  Applies vertical concatenation to given matrices.

  Example:

    vconcatImg :: Mat ('S '[ 'D, 'D ]) ('S 3) ('S Word8)
    vconcatImg = exceptError $
        vconcat $ V.fromList
          [ halfSize birds_768x512
          , halfSize flower_768x512
          , halfSize sailboat_768x512
          ]
      where
        halfSize = exceptError . resize (ResizeRel 0.5) InterArea

  <<doc/generated/examples/vconcatImg.png vconcatImg>>

perspectiveTransform

  Performs the perspective matrix transformation of vectors.

  TODO: Modify this function to accept 3D points.
  TODO: Generalize the return type to V.Vector (point2 CDouble).

  <http://docs.opencv.org/3.0-last-rst/modules/core/doc/operations_on_arrays.html#perspectivetransform OpenCV Sphinx doc>

Parameter documentation (matAddWeighted): src1 — First input array. alpha — Scale factor for the first array. src2 — Second input array. beta, gamma.

mask — Optional operation mask; it must have the same size as the input array, depth and 1 channel.
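The three `FlipDirection` cases of `matFlip` can be pictured with a pure-list analogue, treating a matrix as a list of rows. This is an illustration only, not the library's implementation:

```haskell
-- Pure-list picture of matFlip's three FlipDirection cases,
-- where a matrix is represented as a list of rows.
flipVertically', flipHorizontally', flipBoth' :: [[a]] -> [[a]]
flipVertically'   = reverse               -- FlipVertically: reverse the row order
flipHorizontally' = map reverse           -- FlipHorizontally: reverse each row
flipBoth'         = reverse . map reverse -- FlipBoth: both at once
```

For example, `flipVertically' [[1,2],[3,4]]` yields `[[3,4],[1,2]]`, while `flipBoth'` sends the top-left element to the bottom-right corner.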
Parameter documentation (norm): src — Input array. result — Calculated norm. normType — Absolute or relative norm. mask — Optional operation mask; it must have the same size as the input array, depth and 1 channel.

Parameter documentation (normDiff): src1 — First input array. src2 — Second input array of the same size and type as the first. result — Calculated norm.

Parameter documentation (normalize): alpha — Norm value to normalize to, or the lower range boundary in case of the range normalization. beta — Upper range boundary in case of the range normalization; it is not used for the norm normalization. mask — Optional operation mask. src — Input array.

Parameter documentation (matSum): src — Input array that must have from 1 to 4 channels.

Parameter documentation (meanStdDev): src — Input array that must have from 1 to 4 channels. mask — Optional operation mask.

Parameter documentation (matFlip): how — How to flip.

estimateRigidTransform

  Computes an optimal affine transformation between two 2D point sets.

  <http://docs.opencv.org/3.0-last-rst/modules/video/doc/motion_analysis_and_object_tracking.html#estimaterigidtransform OpenCV Sphinx doc>

  Parameters: Source. Destination. Full affine.

OpenCV.Video.MotionAnalysis

  Example:

    carAnim :: Animation (ShapeT [240, 320]) ('S 3) ('S Word8)
    carAnim = carOverhead

    mog2Anim :: IO (Animation (ShapeT [240, 320]) ('S 3) ('S Word8))
    mog2Anim = do
        mog2 <- newBackgroundSubtractorMOG2 Nothing Nothing Nothing
        forM carOverhead $ \(delay, img) -> do
            fg    <- bgSubApply mog2 0.1 img
            fgBgr <- exceptErrorIO $ pureExcept $ cvtColor gray bgr fg
            pure (delay, fgBgr)

  Original:   <<doc/generated/examples/car.gif carAnim>>
  Foreground: <<doc/generated/examples/mog2.gif mog2Anim>>

bgSubApply

  learningRate — The value between 0 and 1 that indicates how fast the
  background model is learnt. A negative parameter value makes the
  algorithm use some automatically chosen learning rate. 0 means that the
  background model is not updated at all; 1 means that the background
  model is completely reinitialized from the last frame.

  frame — Next video frame.

  fgMask — The output foreground mask as an 8-bit binary image.

getBackgroundImage

  The output background image.

newBackgroundSubtractorKNN

  history — Length of the history.

  dist2Threshold — Threshold on the squared distance between the pixel
  and the sample to decide whether a pixel is close to that sample. This
  parameter does not affect the background update.

  detectShadows — If True, the algorithm will detect shadows and mark
  them. It decreases the speed a bit, so if you do not need this feature,
  set the parameter to False.

newBackgroundSubtractorMOG2

  history — Length of the history.

  varThreshold — Threshold on the squared Mahalanobis distance between
  the pixel and the model to decide whether a pixel is well described by
  the background model. This parameter does not affect the background
  update.

  detectShadows — If True, the algorithm will detect shadows and mark
  them. It decreases the speed a bit, so if you do not need this feature,
  set the parameter to False.

VideoCaptureSource

  VideoFileSource — video file and backend.
  VideoDeviceSource — video device and backend.

OpenCV.Juicy

  Filter — An OpenCV 2D-filter preserving the matrix type.
  Mat2D — An OpenCV bidimensional matrix.
  PixelChannels — Maps Pixel types to a number of channels.
  PixelDepth — Maps Pixel types to a depth.

fromImage

  Compute an OpenCV 2D-matrix from a JuicyPixels image.

  Example:

    fromImageImg :: IO (Mat ('S '[ 'D, 'D]) ('S 3) ('S Word8))
    fromImageImg = do
        r <- Codec.Picture.readImage "data/Lenna.png"
        case r of
          Left err -> error err
          Right (Codec.Picture.ImageRGB8 img) -> pure $ OpenCV.Juicy.fromImage img
          Right _ -> error "Unhandled JuicyPixels format!"

  <<doc/generated/examples/fromImageImg.png fromImageImg>>

toImage

  Compute a JuicyPixels image from an OpenCV 2D-matrix.

  FIXME: There's a bug in the colour conversions in the example:

  Example:

    toImageImg :: IO (Mat ('S '[ 'D, 'D]) ('S 3) ('S Word8))
    toImageImg =
        exceptError . cvtColor rgb bgr . from . to . exceptError . cvtColor bgr rgb
          <$> fromImageImg
      where
        to :: OpenCV.Juicy.Mat2D 'D 'D ('S 3) ('S Word8)
           -> Codec.Picture.Image Codec.Picture.PixelRGB8
        to = OpenCV.Juicy.toImage

        from :: Codec.Picture.Image Codec.Picture.PixelRGB8
             -> OpenCV.Juicy.Mat2D 'D 'D ('S 3) ('S Word8)
        from = OpenCV.Juicy.fromImage

  <<doc/generated/examples/toImageImg.png toImageImg>>

isoJuicy

  Apply an OpenCV 2D-filter to a JuicyPixels dynamic matrix, preserving
  the Juicy pixel encoding.

  Parameters: JuicyPixels image. OpenCV 2D-matrix. OpenCV 2D-filter.
  JuicyPixels dynamic image.

OpenCV.VideoIO.VideoWriter

  The API might change in the future, but currently we can:

  Open/create a new file:

    wr <- videoWriterOpen $ VideoFileSink' (VideoFileSink "tst.MOV" "avc1" 30 (3840, 2160))

  Now, we can write some frames, but they need to have exactly the same
  size as the one we have opened with:

    exceptErrorIO $ videoWriterWrite wr img

  We need to close at the end or it will not finalize the file:

    exceptErrorIO $ videoWriterRelease wr
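The "avc1" string passed to the video writer above is a FourCC codec tag. As a rough sketch of how four characters pack into the 32-bit integer such APIs work with (the `fourCC'` helper here is hypothetical, not the library's own function):

```haskell
import Data.Bits (shiftL, (.|.))
import Data.Char (ord)
import Data.Int (Int32)

-- Hypothetical sketch: pack a four-character codec tag such as "avc1"
-- into a 32-bit integer, first character in the lowest byte.
fourCC' :: Char -> Char -> Char -> Char -> Int32
fourCC' a b c d =
        fromIntegral (ord a)
    .|. (fromIntegral (ord b) `shiftL` 8)
    .|. (fromIntegral (ord c) `shiftL` 16)
    .|. (fromIntegral (ord d) `shiftL` 24)
```

This makes the byte layout explicit: each character occupies one byte, least-significant first, so two tags differ whenever any of their four characters differ.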
" # $ %&'()*+,-./0123456789:;<=>?@ABCDEFFGHIJKLLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnoopqrstuvwxyz{|}~      !"#$%&'()*+,-./0123456789:;<= >!?!@!A!B!C!D!E!F!G!H!I!J!K!L!M"N"O"P"Q"R"S"T"U"V"W"X"Y"Z"["\"]#^#_#`#a#b#c#d#e#f#g#h#i#j#k$l$l$m$n$o$p$q$r$s$t$u$v$w$x$y%z&{&|'}(~(((((())***************+++++,,------------------------------------------------------------------------------------------../0012234456    7 7 8999999999999999999 9!9"9#9$9%9&9'9(9)9*9+9,9-9.9/909192939495969798999:9;9<9=9>9?9@9A9B9C9D9E9F9G9H9I9J9K9L9M9N9O9P9Q9R9S9T9U9V9W9X9Y9Z9[9\9]9^9_9`9a9b9c9d9e9f9g9h9i9j9k9l9m9n9o9p9q9r9s9t9u9v9w9x9y9z9{9|9}9~99999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999::::::0::1::2::3:::;;;;;;;<<<<<<<<<<<<<<<<<< < < < < <<<<<<<<<<<<<<<<<<< <!<"<#<$<%<&<'<(<)<*<+<,<-<.</<0<1<2<3<4<5<6<7<8<9<:=;=<=>>>??@@AABACADAEAFAGAAHBIBJBKBLCMNOPCQCRCSCTUVWCXCYCZC[C\C]C^C4C_C`CabcdedrfghiijklmEnnoFpFqFrFsFtFuFvFwFxFyFzF{F|F}~~!?!!@!"Q"$s$%opencv-0.0.2.1-1sNBzT2neCr4aFZkpoyVffOpenCV.Core.TypesOpenCV.Core.ArrayOpsOpenCV.ImgCodecsOpenCV.Core.Types.Mat OpenCV.UnsafeOpenCV.Core.Types.VecOpenCV.Core.Types.SizeOpenCV.Core.Types.PointOpenCV.Core.Types.MatxOpenCV.Core.Types.RectOpenCV.ImgProc.MiscImgTransformOpenCV.VideoIO.TypesOpenCV.TypeLevel*OpenCV.ImgProc.MiscImgTransform.ColorCodesOpenCV.Exception OpenCV.PhotoOpenCV.Core.Types.Mat.RepaOpenCV.HighGuiOpenCV.Core.Types.Mat.HMat OpenCV.JSONOpenCV.ImgProc.Types!OpenCV.ImgProc.StructuralAnalysisOpenCV.ImgProc.ObjectDetectionOpenCV.ImgProc.ImgFiltering$OpenCV.ImgProc.GeometricImgTransformOpenCV.ImgProc.FeatureDetectionOpenCV.ImgProc.DrawingOpenCV.ImgProc.ColorMaps OpenCV.ImgProc.CascadeClassifierOpenCV.Features2dOpenCV.Calib3d OpenCV.VideoOpenCV.Video.MotionAnalysisOpenCV.VideoIO.VideoCapture 
OpenCV.JuicyOpenCV.VideoIO.VideoWriterOpenCV.InternalOpenCV.Internal.C.PlacementNew!OpenCV.Internal.C.PlacementNew.TH!OpenCV.Internal.Calib3d.ConstantsOpenCV.Internal.Core.ArrayOps$OpenCV.Internal.Core.Types.ConstantsOpenCV.Internal.ImgCodecsOpenCV.Internal.MutableOpenCV.Internal.C.TypesOpenCV.Internal.Core.Types.Vec!OpenCV.Internal.Core.Types.Vec.THOpenCV.Internal.Core.Types.Size"OpenCV.Internal.Core.Types.Size.TH OpenCV.Internal.Core.Types.Point#OpenCV.Internal.Core.Types.Point.THOpenCV.Internal.Core.Types.Matx"OpenCV.Internal.Core.Types.Matx.THOpenCV.Internal.C.InlineOpenCV.Internal.Core.Types.Rect"OpenCV.Internal.Core.Types.Rect.TH(OpenCV.Internal.ImgProc.MiscImgTransformOpenCV.Internal.Core.TypesOpenCV.Internal.Photo.Constants!OpenCV.Internal.VideoIO.ConstantsOpenCV.Internal.VideoIO.Types2OpenCV.Internal.ImgProc.MiscImgTransform.TypeLevel3OpenCV.Internal.ImgProc.MiscImgTransform.ColorCodes$OpenCV.Internal.Core.Types.Mat.DepthOpenCV.Internal.Exception&OpenCV.Internal.Core.Types.Mat.MarshalOpenCV.Internal.Core.Types.Mat%OpenCV.Internal.Core.Types.Mat.ToFrom#OpenCV.Internal.Core.Types.Mat.HMatOpenCV.Internal.ImgProc.TypesOpenCV PlacementNew NormAbsRel NormRelative NormAbsoluteNormTypeNorm_InfNorm_L1Norm_L2 Norm_L2SQR Norm_Hamming Norm_Hamming2 Norm_MinMaxCmpTypeCmp_EqCmp_GtCmp_GeCmp_LtCmp_LeCmp_Ne OutputFormat OutputBmp OutputExr OutputHdr OutputJpegOutputJpeg2000 OutputPng OutputPxm OutputSunras OutputTiff OutputWebP PngParamspngParamCompressionpngParamStrategypngParamBinaryLevel PngStrategyPngStrategyDefaultPngStrategyFilteredPngStrategyHuffmanOnlyPngStrategyRLEPngStrategyFixed JpegParamsjpegParamQualityjpegParamProgressivejpegParamOptimizejpegParamRestartIntervaljpegParamLumaQualityjpegParamChromaQuality ImreadModeImreadUnchangedImreadGrayscale ImreadColorImreadAnyDepthImreadAnyColorImreadLoadGdaldefaultJpegParamsdefaultPngParams FreezeThawfreezethaw unsafeFreeze 
unsafeThawMutableMutFromPtrWithPtrCSizeOfIsVectoVecfromVectoVecIOVecDimVecIsSizetoSizefromSizetoSizeIOSizeIsPoint3IsPoint2IsPointtoPoint fromPoint toPointIOPointDimPointIsMatxtoMatxfromMatxtoMatxIOMatxDimCMatxDimRMatxVec2i$fPlacementNewC'Vec$fIsVecV2Int32$fIsVecVecInt32 $fFromPtrVecVec2f$fPlacementNewC'Vec0$fIsVecV2CFloat$fIsVecVecCFloat $fFromPtrVec0Vec2d$fPlacementNewC'Vec1$fIsVecV2CDouble$fIsVecVecCDouble $fFromPtrVec1Vec3i$fPlacementNewC'Vec2$fIsVecV3Int32$fIsVecVecInt320 $fFromPtrVec2Vec3f$fPlacementNewC'Vec3$fIsVecV3CFloat$fIsVecVecCFloat0 $fFromPtrVec3Vec3d$fPlacementNewC'Vec4$fIsVecV3CDouble$fIsVecVecCDouble0 $fFromPtrVec4Vec4i$fPlacementNewC'Vec5$fIsVecV4Int32$fIsVecVecInt321 $fFromPtrVec5Vec4f$fPlacementNewC'Vec6$fIsVecV4CFloat$fIsVecVecCFloat1 $fFromPtrVec6Vec4d$fPlacementNewC'Vec7$fIsVecV4CDouble$fIsVecVecCDouble1 $fFromPtrVec7Size2i$fPlacementNewC'Size$fIsSizeV2Int32$fIsSizeSizeInt32 $fFromPtrSizeSize2f$fPlacementNewC'Size0$fIsSizeV2CFloat$fIsSizeSizeCFloat$fFromPtrSize0Size2d$fPlacementNewC'Size1$fIsSizeV2CDouble$fIsSizeSizeCDouble$fFromPtrSize1Point2i$fPlacementNewC'Point$fIsPointV2Int32$fIsPointPointInt32$fFromPtrPointPoint2f$fPlacementNewC'Point0$fIsPointV2CFloat$fIsPointPointCFloat$fFromPtrPoint0Point2d$fPlacementNewC'Point1$fIsPointV2CDouble$fIsPointPointCDouble$fFromPtrPoint1Point3i$fPlacementNewC'Point2$fIsPointV3Int32$fIsPointPointInt320$fFromPtrPoint2Point3f$fPlacementNewC'Point3$fIsPointV3CFloat$fIsPointPointCFloat0$fFromPtrPoint3Point3d$fPlacementNewC'Point4$fIsPointV3CDouble$fIsPointPointCDouble0$fFromPtrPoint4IsRecttoRectfromRecttoRectIO rectTopLeftrectBottomRightrectSizerectArea rectContainsRectSize RectPointHRect hRectTopLeft hRectSizeRectRect2i$fPlacementNewC'Rect$fIsRectHRectInt32$fIsRectRectInt32 
$fFromPtrRectRect2f$fPlacementNewC'Rect0$fIsRectHRectCFloat$fIsRectRectCFloat$fFromPtrRect0Rect2dfmapRect$fPlacementNewC'Rect1$fIsRectHRectCDouble$fIsRectRectCDouble$fFromPtrRect1GrabCutOperationModeGrabCut_InitWithRectGrabCut_InitWithMaskGrabCut_InitWithRectAndMask GrabCut_Eval ThreshValue ThreshVal_AbsThreshVal_OtsuThreshVal_Triangle ThreshType Thresh_BinaryThresh_BinaryInvThresh_Truncate Thresh_ToZeroThresh_ToZeroInv FromScalar fromScalarToScalartoScalarRange TermCriteria RotatedRectScalarMatx12f newMatx12f$fPlacementNewC'Matx$fIsMatxMatxCFloat $fFromPtrMatxMatx12d newMatx12d$fPlacementNewC'Matx0$fIsMatxMatxCDouble$fFromPtrMatx0Matx13f newMatx13f$fPlacementNewC'Matx1$fIsMatxMatxCFloat0$fFromPtrMatx1Matx13d newMatx13d$fPlacementNewC'Matx2$fIsMatxMatxCDouble0$fFromPtrMatx2Matx14f newMatx14f$fPlacementNewC'Matx3$fIsMatxMatxCFloat1$fFromPtrMatx3Matx14d newMatx14d$fPlacementNewC'Matx4$fIsMatxMatxCDouble1$fFromPtrMatx4Matx16f newMatx16f$fPlacementNewC'Matx5$fIsMatxMatxCFloat2$fFromPtrMatx5Matx16d newMatx16d$fPlacementNewC'Matx6$fIsMatxMatxCDouble2$fFromPtrMatx6Matx21f newMatx21f$fPlacementNewC'Matx7$fIsMatxMatxCFloat3$fFromPtrMatx7Matx21d newMatx21d$fPlacementNewC'Matx8$fIsMatxMatxCDouble3$fFromPtrMatx8Matx22f newMatx22f$fPlacementNewC'Matx9$fIsMatxMatxCFloat4$fFromPtrMatx9Matx22d newMatx22d$fPlacementNewC'Matx10$fIsMatxMatxCDouble4$fFromPtrMatx10Matx23f newMatx23f$fPlacementNewC'Matx11$fIsMatxMatxCFloat5$fFromPtrMatx11Matx23d newMatx23d$fPlacementNewC'Matx12$fIsMatxMatxCDouble5$fFromPtrMatx12Matx31f newMatx31f$fPlacementNewC'Matx13$fIsMatxMatxCFloat6$fFromPtrMatx13Matx31d newMatx31d$fPlacementNewC'Matx14$fIsMatxMatxCDouble6$fFromPtrMatx14Matx32f newMatx32f$fPlacementNewC'Matx15$fIsMatxMatxCFloat7$fFromPtrMatx15Matx32d newMatx32d$fPlacementNewC'Matx16$fIsMatxMatxCDouble7$fFromPtrMatx16Matx33f newMatx33f$fPlacementNewC'Matx17$fIsMatxMatxCFloat8$fFromPtrMatx17Matx33d newMatx33d$fPlacementNewC'Matx18$fIsMatxMatxCDouble8$fFromPtrMatx18Matx34f 
newMatx34f$fPlacementNewC'Matx19$fIsMatxMatxCFloat9$fFromPtrMatx19Matx34d newMatx34d$fPlacementNewC'Matx20$fIsMatxMatxCDouble9$fFromPtrMatx20Matx41f newMatx41f$fPlacementNewC'Matx21$fIsMatxMatxCFloat10$fFromPtrMatx21Matx41d newMatx41d$fPlacementNewC'Matx22$fIsMatxMatxCDouble10$fFromPtrMatx22Matx43f newMatx43f$fPlacementNewC'Matx23$fIsMatxMatxCFloat11$fFromPtrMatx23Matx43d newMatx43d$fPlacementNewC'Matx24$fIsMatxMatxCDouble11$fFromPtrMatx24Matx44f newMatx44f$fPlacementNewC'Matx25$fIsMatxMatxCFloat12$fFromPtrMatx25Matx44d newMatx44d$fPlacementNewC'Matx26$fIsMatxMatxCDouble12$fFromPtrMatx26Matx51f newMatx51f$fPlacementNewC'Matx27$fIsMatxMatxCFloat13$fFromPtrMatx27Matx51d newMatx51d$fPlacementNewC'Matx28$fIsMatxMatxCDouble13$fFromPtrMatx28Matx61f newMatx61f$fPlacementNewC'Matx29$fIsMatxMatxCFloat14$fFromPtrMatx29Matx61d newMatx61d$fPlacementNewC'Matx30$fIsMatxMatxCDouble14$fFromPtrMatx30Matx66f$fPlacementNewC'Matx31$fIsMatxMatxCFloat15$fFromPtrMatx31Matx66d$fPlacementNewC'Matx32$fIsMatxMatxCDouble15$fFromPtrMatx32VideoCaptureAPI VideoCapAny VideoCapVfw VideoCapV4l VideoCapV4l2VideoCapFirewireVideoCapFirewareVideoCapIeee1394VideoCapDc1394VideoCapCmu1394 VideoCapQtVideoCapUnicap VideoCapDshow VideoCapPvapiVideoCapOpenniVideoCapOpenniAsusVideoCapAndroid VideoCapXiapiVideoCapAvfoundationVideoCapGiganetix VideoCapMsmf 
VideoCapWinrtVideoCapIntelpercVideoCapOpenni2VideoCapOpenni2AsusVideoCapGphoto2VideoCapGstreamerVideoCapFfmpegVideoCapImagesVideoCapturePropertiesVideoCapPropPosMsecVideoCapPropPosFramesVideoCapPropPosAviRatioVideoCapPropFrameWidthVideoCapPropFrameHeightVideoCapPropFpsVideoCapPropFourCcVideoCapPropFrameCountVideoCapPropFormatVideoCapPropModeVideoCapPropBrightnessVideoCapPropContrastVideoCapPropSaturationVideoCapPropHueVideoCapPropGainVideoCapPropExposureVideoCapPropConvertRgbVideoCapPropWhiteBalanceBlueUVideoCapPropRectificationVideoCapPropMonochromeVideoCapPropSharpnessVideoCapPropAutoExposureVideoCapPropGammaVideoCapPropTemperatureVideoCapPropTriggerVideoCapPropTriggerDelayVideoCapPropWhiteBalanceRedVVideoCapPropZoomVideoCapPropFocusVideoCapPropGuidVideoCapPropIsoSpeedVideoCapPropBacklightVideoCapPropPanVideoCapPropTiltVideoCapPropRollVideoCapPropIrisVideoCapPropSettingsVideoCapPropBuffersizeVideoCapPropAutofocusVideoCapPropIntFourCCunFourCCIsStaticAllMayRelaxRelaxDSNatsDSNatInElemLength ToNatListDS toNatListDSToNatDStoNatDSToInt32toInt32:::ZDSDS dsToMaybe$fToInt32proxy$fToInt32Int32$fToNatDSProxy$fToNatDSproxy$fToNatListDSProxy$fToNatListDSproxy$fPrivateIsStaticaS$fAllap: $fAllkp[] $fIsStaticaS$fShowDS$fEqDS $fFunctorDSColorCodeDepthColorCodeMatchesChannelsColorCodeChannels ColorCodeBayerBGBayerGBBayerGRBayerRGBGRBGR555BGR565BGRA BGRA_I420 BGRA_IYUV BGRA_NV12 BGRA_NV21 BGRA_UYNV BGRA_UYVY BGRA_Y422 BGRA_YUNV BGRA_YUY2 BGRA_YUYV BGRA_YV12 BGRA_YVYUBGR_EABGR_FULLBGR_I420BGR_IYUVBGR_NV12BGR_NV21BGR_UYNVBGR_UYVYBGR_VNGBGR_Y422BGR_YUNVBGR_YUY2BGR_YUYVBGR_YV12BGR_YVYUGRAYGRAY_420 GRAY_I420 GRAY_IYUV GRAY_NV12 GRAY_NV21 GRAY_UYNV GRAY_UYVY GRAY_Y422 GRAY_YUNV GRAY_YUY2 GRAY_YUYV GRAY_YV12 GRAY_YVYUHLSHLS_FULLHSVHSV_FULLLabLBGRLRGBLuvMRGBARGBRGBA RGBA_I420 RGBA_IYUV RGBA_NV12 RGBA_NV21 RGBA_UYNV RGBA_UYVY RGBA_Y422 RGBA_YUNV RGBA_YUY2 RGBA_YUYV RGBA_YV12 
RGBA_YVYURGB_EARGB_FULLRGB_I420RGB_IYUVRGB_NV12RGB_NV21RGB_UYNVRGB_UYVYRGB_VNGRGB_Y422RGB_YUNVRGB_YUY2RGB_YUYVRGB_YV12RGB_YVYUXYZYCrCbYUVYUV420pYUV420spYUV_I420YUV_IYUVYUV_YV12ColorConversionbayerBGbayerGBbayerGRbayerRGbgrbgr555bgr565bgra bgra_I420 bgra_IYUV bgra_NV12 bgra_NV21 bgra_UYNV bgra_UYVY bgra_Y422 bgra_YUNV bgra_YUY2 bgra_YUYV bgra_YV12 bgra_YVYUbgr_EAbgr_FULLbgr_I420bgr_IYUVbgr_NV12bgr_NV21bgr_UYNVbgr_UYVYbgr_VNGbgr_Y422bgr_YUNVbgr_YUY2bgr_YUYVbgr_YV12bgr_YVYUgraygray_420 gray_I420 gray_IYUV gray_NV12 gray_NV21 gray_UYNV gray_UYVY gray_Y422 gray_YUNV gray_YUY2 gray_YUYV gray_YV12 gray_YVYUhlshls_FULLhsvhsv_FULLlablbgrlrgbluvmrgbargbrgba rgba_I420 rgba_IYUV rgba_NV12 rgba_NV21 rgba_UYNV rgba_UYVY rgba_Y422 rgba_YUNV rgba_YUY2 rgba_YUYV rgba_YV12 rgba_YVYUrgb_EArgb_FULLrgb_I420rgb_IYUVrgb_NV12rgb_NV21rgb_UYNVrgb_UYVYrgb_VNGrgb_Y422rgb_YUNVrgb_YUY2rgb_YUYVrgb_YV12rgb_YVYUxyzyCrCbyuvyuv420pyuv420spyuv_I420yuv_IYUVyuv_YV12DepthT ToDepthDS toDepthDSToDepthtoDepthDepthDepth_8UDepth_8S Depth_16U Depth_16S Depth_32S Depth_32F Depth_64FDepth_USRTYPE1 CvExceptTCvExceptCvCppExceptionExpectationError expectedValue actualValueCoerceMatError ShapeError SizeError ChannelError DepthError CvExceptionBindingException pureExcept exceptError exceptErrorIO exceptErrorM ToChannelsDS ToChannels ToShapeDS toShapeDSToShapetoShape ChannelsTShapeTMatInfomiShapemiDepth miChannelsMat typeCheckMatrelaxMat coerceMatunsafeCoerceMatmkMatcloneMat typeCheckMatM relaxMatM coerceMatMunsafeCoerceMatMmkMatM createMatwithMatM cloneMatMmatInfo toChannels toChannelsDSInpaintingMethodInpaintNavierStokes InpaintTeleainpaintfastNlMeansDenoisingColored fastNlMeansDenoisingColoredMulti denoise_TVL1decolor$fShowInpaintingMethodDIMMtoRepa$fSourceMdepth $fNFDataArray unsafeRead unsafeWriteFromMatfromMatToMattoMatMatDepth MatChannelsMatShapeemptyMateyeMat matSubRect matCopyTo matConvertTo matFromFunc matCopyToMfoldMatimdecode imdecodeMimencode imencodeMTrackbarCallback MouseCallback EventFlagsRec 
flagsLButton flagsRButton flagsMButton flagsCtrlKey flagsShiftKey flagsAltKey EventFlagsEventEventMouseMoveEventLButtonDownEventRButtonDownEventMButtonDownEventLButtonUpEventRButtonUpEventMButtonUpEventLButtonDbClickEventRButtonDbClickEventMButtonDbClickEventMouseWheelEventMouseHWheelWindow makeWindow destroyWindow withWindow resizeWindowwaitKey hasLButton hasRButton hasMButton hasCtrlKey hasShiftKey hasAltKey flagsToRecsetMouseCallbackcreateTrackbarimshowimshowM $fShowEvent$fShowEventFlagsRecKeyPoint mkRotatedRectrotatedRectCenterrotatedRectSizerotatedRectAnglerotatedRectBoundingRectrotatedRectPointsmkTermCriteriamkRange wholeRangeDMatch KeyPointReckptPointkptSizekptAngle kptResponse kptOctave kptClassId mkKeyPoint keyPointAsRec$fCSizeOfTYPEC'KeyPoint$fFromPtrKeyPoint$fWithPtrKeyPoint$fPlacementNewC'KeyPoint$fEqKeyPointRec$fShowKeyPointRec AlgorithmalgorithmClearStatealgorithmIsEmpty DMatchRecdmatchQueryIdxdmatchTrainIdx dmatchImgIdxdmatchDistancemkDMatch dmatchAsRec$fCSizeOfTYPEC'DMatch$fFromPtrDMatch$fWithPtrDMatch$fPlacementNewC'DMatch $fEqDMatchRec$fShowDMatchRecToHElemstoHElemsHElems HElems_8U HElems_8S HElems_16U HElems_16S HElems_32S HElems_32F HElems_64FHElems_USRTYPE1HMathmShape hmChannelshmElems hElemsDepth hElemsLength matToHMat hMatToMat$fFromJSONHElems$fToJSONHElems $fFromJSONMat $fToJSONMat$fFromJSONSize $fToJSONSize$fFromJSONSize0 $fToJSONSize0$fFromJSONPoint $fToJSONPoint$fFromJSONPoint0$fToJSONPoint0$fFromJSONPoint1$fToJSONPoint1$fFromJSONPoint2$fToJSONPoint2$fFromJSONPoint3$fToJSONPoint3$fFromJSONPoint4$fToJSONPoint4 $fFromJSONJ $fFromJSONJ0 $fToJSONJ $fToJSONJ0$fFromJSONHMat $fToJSONHMat BorderModeBorderConstantBorderReplicate BorderReflect BorderWrapBorderReflect101BorderTransparentBorderIsolatedInterpolationMethod InterNearest InterLinear InterCubic InterArea InterLanczos4$fShowInterpolationMethodContour 
contourPointscontourChildrenContourApproximationMethodContourApproximationNoneContourApproximationSimpleContourApproximationTC89L1ContourApproximationTC89KCOSContourRetrievalModeContourRetrievalExternalContourRetrievalListContourRetrievalCCompContourRetrievalTreeContourAreaOrientedContourAreaAbsoluteValue contourAreapointPolygonTest findContours approxPolyDP arcLength minAreaRect $fShowContourMatchTemplateNormalisationMatchTemplateNotNormedMatchTemplateNormedMatchTemplateMethodMatchTemplateSqDiffMatchTemplateCCorrMatchTemplateCCoeff matchTemplate$fShowMatchTemplateMethod $fShowMatchTemplateNormalisation$fEqMatchTemplateNormalisationFloodFillOperationFlagsfloodFillConnectivityfloodFillMaskFillColorfloodFillFixedRangefloodFillMaskOnlycvtColor floodFilldefaultFloodFillOperationFlags threshold watershedgrabCutinRangeMorphOperation MorphOpen MorphClose MorphGradient MorphTopHat MorphBlackHat MorphShape MorphRect MorphEllipse MorphCrossbilateralFilter laplacian medianBlurblur gaussianBlurerodefilter2Ddilate morphologyExgetStructuringElement ResizeAbsRel ResizeAbs ResizeRelresize warpAffinewarpPerspectiveinvertAffineTransformgetPerspectiveTransformgetRotationMatrix2Dremap undistort$fShowResizeAbsRel LineSegmentlineSegmentStartlineSegmentStop"GoodFeaturesToTrackDetectionMethodHarrisDetectorCornerMinEigenValCircle circleCenter circleRadius CannyNorm CannyNormL1 CannyNormL2cannygoodFeaturesToTrack houghCircles houghLinesP$fIsVecLineSegmentdepth$fShowCannyNorm $fEqCannyNorm $fShowCircle($fShowGoodFeaturesToTrackDetectionMethod&$fEqGoodFeaturesToTrackDetectionMethod$fFoldableLineSegment$fFunctorLineSegment$fTraversableLineSegment$fShowLineSegmentContourDrawModeOutlineContour FillContours FontSlant NotSlantedSlantedFontFaceFontHersheySimplexFontHersheyPlainFontHersheyDuplexFontHersheyComplexFontHersheyTriplexFontHersheyComplexSmallFontHersheyScriptSimplexFontHersheyScriptComplexFont _fontFace _fontSlant _fontScaleLineType LineType_8 LineType_4 LineType_AA 
arrowedLinecircleellipsefillConvexPolyfillPoly polylinesline getTextSizeputText rectangle drawContoursmarker$fShowLineType$fEnumLineType$fBoundedLineType$fShowFontFace$fEnumFontFace$fBoundedFontFace$fShowFontSlant $fShowFontColorMapColorMapAutumn ColorMapBone ColorMapJetColorMapWinterColorMapRainbow ColorMapOceanColorMapSummerColorMapSpring ColorMapCool ColorMapHsv ColorMapPink ColorMapHotColorMapParula applyColorMapCascadeClassifiernewCascadeClassifier!cascadeClassifierDetectMultiScale#cascadeClassifierDetectMultiScaleNC$fFromPtrCascadeClassifier$fWithPtrCascadeClassifierFlannBasedMatcherParams indexParams searchParamsFlannSearchParamschecksepssortedFlannIndexParamsFlannKDTreeIndexParamsFlannLshIndexParamstrees tableNumberkeySizemultiProbeLevelFlannBasedMatcher BFMatcherDescriptorMatcherupcastaddtrainmatchmatch'SimpleBlobDetectorParamsblob_minThresholdblob_maxThresholdblob_thresholdStepblob_minRepeatabilityblob_minDistBetweenBlobsblob_filterByAreablob_filterByCircularityblob_filterByColorblob_filterByConvexityblob_filterByInertiaBlobFilterByInertiablob_minInertiaRatioblob_maxInertiaRatioBlobFilterByConvexityblob_minConvexityblob_maxConvexityBlobFilterByColorblob_blobColorBlobFilterByCircularityblob_minCircularityblob_maxCircularityBlobFilterByArea blob_minArea blob_maxAreaSimpleBlobDetector OrbParams orb_nfeaturesorb_scaleFactor orb_nlevelsorb_edgeThresholdorb_firstLevel orb_WTA_K orb_scoreType orb_patchSizeorb_fastThreshold OrbScoreType HarrisScore FastScoreWTA_KWTA_K_2WTA_K_3WTA_K_4OrbdefaultOrbParamsmkOrborbDetectAndComputedefaultSimpleBlobDetectorParamsmkSimpleBlobDetector blobDetect newBFMatchernewFlannBasedMatcher drawMatches $fFromPtrOrb $fWithPtrOrb$fFromPtrSimpleBlobDetector$fWithPtrSimpleBlobDetector$fWithPtrBaseMatcher$fDescriptorMatcherBFMatcher$fFromPtrBFMatcher$fWithPtrBFMatcher$$fDescriptorMatcherFlannBasedMatcher$fFromPtrFlannBasedMatcher$fWithPtrFlannBasedMatcher$fDefaultFlannIndexParams$fDefaultFlannSearchParams 
$fDefaultFlannBasedMatcherParams$fDefaultDrawMatchesParams$fEqBlobFilterByArea$fEqBlobFilterByCircularity$fEqBlobFilterByColor$fEqBlobFilterByConvexity$fEqBlobFilterByInertiaFindHomographyParams fhpMethodfhpRansacReprojThreshold fhpMaxIters fhpConfidenceFindHomographyMethodFindHomographyMethod_0FindHomographyMethod_RANSACFindHomographyMethod_LMEDSFindHomographyMethod_RHO WhichImageImage1Image2FundamentalMatMethod FM_7Point FM_8Point FM_RansacFM_LmedsfindFundamentalMatfindHomographycomputeCorrespondEpilines$fDefaultFindHomographyParams$fShowFundamentalMatMethod$fEqFundamentalMatMethod$fShowWhichImage$fEqWhichImage$fShowFindHomographyMethod$fShowFindHomographyParams FlipDirectionFlipVerticallyFlipHorizontallyFlipBoth matScalarAdd matScalarMultmatAbs matAbsDiffmatAdd matSubtractmatAddWeighted matScaleAddmatMaxmatScalarCompare bitwiseNot bitwiseAnd bitwiseOr bitwiseXormatMergematSplitmatChannelMapM minMaxLocnormnormDiff normalizematSummatSumM meanStdDevmatFlip matTransposehconcatvconcatperspectiveTransform$fShowFlipDirection$fEqFlipDirectionestimateRigidTransformBackgroundSubtractorMOG2BackgroundSubtractorKNNBackgroundSubtractor bgSubApplygetBackgroundImagenewBackgroundSubtractorKNNnewBackgroundSubtractorMOG2-$fBackgroundSubtractorBackgroundSubtractorKNN"$fAlgorithmBackgroundSubtractorKNN $fFromPtrBackgroundSubtractorKNN $fWithPtrBackgroundSubtractorKNN.$fBackgroundSubtractorBackgroundSubtractorMOG2#$fAlgorithmBackgroundSubtractorMOG2!$fFromPtrBackgroundSubtractorMOG2!$fWithPtrBackgroundSubtractorMOG2VideoCaptureSourceVideoFileSourceVideoDeviceSource VideoCapturenewVideoCapturevideoCaptureOpenvideoCaptureReleasevideoCaptureIsOpenedvideoCaptureGrabvideoCaptureRetrievevideoCaptureGetDvideoCaptureGetIvideoCaptureSetDvideoCaptureSetI$fFromPtrVideoCapture$fWithPtrVideoCaptureFilterMat2D PixelChannels PixelDepth fromImagetoImageisoJuicy$fStorablePixelYA16$fStorablePixelYA8$fStorablePixelRGBA16$fStorablePixelRGBA8$fStorablePixelRGBF$fStorablePixelRGB16$fStorablePixelRGB8 
VideoFileSink vfsFilePath vfsFourCCvfsFps vfsFrameDimsVideoWriterSinkVideoFileSink' VideoWritervideoWriterOpenvideoWriterReleasevideoWriterIsOpenedvideoWriterWrite$fFromPtrVideoWriter$fWithPtrVideoWriter objFromPtr placementNewplacementDeletemkPlacementNewInstancec'CV_FM_7POINTc'CV_FM_8POINTc'CV_FM_RANSAC c'CV_FM_LMEDSc'LMEDSc'RANSACc'RHOmarshalCmpTypemarshalNormTypec'sizeof_Point2ic'sizeof_Point2fc'sizeof_Point2dc'sizeof_Point3ic'sizeof_Point3fc'sizeof_Point3dc'sizeof_Size2ic'sizeof_Size2fc'sizeof_Scalarc'sizeof_Rangec'sizeof_KeyPointc'sizeof_DMatch c'sizeof_Matc'TERMCRITERIA_COUNTc'TERMCRITERIA_EPSmarshalImreadModemarshalJpegParamsmarshalPngStrategymarshalPngParamsmarshalOutputFormatunMutwithPtrCcSizeOfC'TrackbarCallbackC'MouseCallbackC'CascadeClassifier C'VideoWriterC'VideoCaptureC'Ptr_BackgroundSubtractorMOG2C'Ptr_BackgroundSubtractorKNNC'FlannBasedMatcher C'BFMatcherC'DescriptorMatcherC'Ptr_SimpleBlobDetector C'Ptr_ORBC'DMatch C'KeyPointC'MatC'ScalarC'RangeC'TermCriteria C'RotatedRectC'CvCppException $fWithPtrMut$fWithPtrMaybebaseGHC.BaseNothingGHC.PtrnullPtrfromPtrC'Rect2dC'Rect2fC'Rect2iC'Size2dC'Size2fC'Size2i C'Point3d C'Point3f C'Point3i C'Point2d C'Point2f C'Point2iC'Vec4dC'Vec4fC'Vec4iC'Vec3dC'Vec3fC'Vec3iC'Vec2dC'Vec2fC'Vec2i C'Matx66d C'Matx66f C'Matx61d C'Matx61f C'Matx51d C'Matx51f C'Matx44d C'Matx44f C'Matx43d C'Matx43f C'Matx41d C'Matx41f C'Matx34d C'Matx34f C'Matx33d C'Matx33f C'Matx32d C'Matx32f C'Matx31d C'Matx31f C'Matx23d C'Matx23f C'Matx22d C'Matx22f C'Matx21d C'Matx21f C'Matx16d C'Matx16f C'Matx14d C'Matx14f C'Matx13d C'Matx13f C'Matx12d C'Matx12fC'RectC'SizeC'PointC'VecC'MatxtoCFloat fromCFloat toCDouble fromCDoubleunVec mkVecTypeunSize mkSizeTypeunPoint mkPointTypeunMatx mkMatxType openCvCtx+inline-c-cpp-0.2.1.0-5XYddFIBGhf7AI10yh7sGfLanguage.C.Inline.CppcppCtx'inline-c-0.6.0.5-10pmocMfjnz3vvXf3ri6nzLanguage.C.Inline.ContextbsCtxvecCtx ctxTypesTablectxAntiQuotersunRect 
Rectangles:
mkRect

Threshold types:
c'THRESH_BINARY, c'THRESH_BINARY_INV, c'THRESH_TRUNC, c'THRESH_TOZERO, c'THRESH_TOZERO_INV, c'THRESH_OTSU, c'THRESH_TRIANGLE, marshalThreshType, marshalThreshValue

Flood fill flags:
c'FLOODFILL_FIXED_RANGE, c'FLOODFILL_MASK_ONLY

GrabCut operation modes:
c'GC_INIT_WITH_RECT, c'GC_INIT_WITH_MASK, c'GC_EVAL, marshalGrabCutOperationMode, marshalGrabCutOperationModeRect

Color conversion codes:
c'COLOR_BGR2BGRA, c'COLOR_RGB2RGBA, c'COLOR_BGRA2BGR, c'COLOR_RGBA2RGB, c'COLOR_BGR2RGBA, c'COLOR_RGB2BGRA, c'COLOR_RGBA2BGR, c'COLOR_BGRA2RGB, c'COLOR_BGR2RGB, c'COLOR_RGB2BGR, c'COLOR_BGRA2RGBA, c'COLOR_RGBA2BGRA, c'COLOR_BGR2GRAY, c'COLOR_RGB2GRAY, c'COLOR_GRAY2BGR, c'COLOR_GRAY2RGB, c'COLOR_GRAY2BGRA, c'COLOR_GRAY2RGBA, c'COLOR_BGRA2GRAY, c'COLOR_RGBA2GRAY, c'COLOR_BGR2BGR565, c'COLOR_RGB2BGR565, c'COLOR_BGR5652BGR, c'COLOR_BGR5652RGB, c'COLOR_BGRA2BGR565, c'COLOR_RGBA2BGR565, c'COLOR_BGR5652BGRA, c'COLOR_BGR5652RGBA, c'COLOR_GRAY2BGR565, c'COLOR_BGR5652GRAY, c'COLOR_BGR2BGR555, c'COLOR_RGB2BGR555, c'COLOR_BGR5552BGR, c'COLOR_BGR5552RGB, c'COLOR_BGRA2BGR555, c'COLOR_RGBA2BGR555, c'COLOR_BGR5552BGRA, c'COLOR_BGR5552RGBA, c'COLOR_GRAY2BGR555, c'COLOR_BGR5552GRAY, c'COLOR_BGR2XYZ, c'COLOR_RGB2XYZ, c'COLOR_XYZ2BGR, c'COLOR_XYZ2RGB, c'COLOR_BGR2YCrCb, c'COLOR_RGB2YCrCb, c'COLOR_YCrCb2BGR, c'COLOR_YCrCb2RGB, c'COLOR_BGR2HSV, c'COLOR_RGB2HSV, c'COLOR_BGR2Lab, c'COLOR_RGB2Lab, c'COLOR_BGR2Luv, c'COLOR_RGB2Luv, c'COLOR_BGR2HLS, c'COLOR_RGB2HLS, c'COLOR_HSV2BGR, c'COLOR_HSV2RGB, c'COLOR_Lab2BGR, c'COLOR_Lab2RGB, c'COLOR_Luv2BGR, c'COLOR_Luv2RGB, c'COLOR_HLS2BGR, c'COLOR_HLS2RGB, c'COLOR_BGR2HSV_FULL, c'COLOR_RGB2HSV_FULL, c'COLOR_BGR2HLS_FULL, c'COLOR_RGB2HLS_FULL, c'COLOR_HSV2BGR_FULL, c'COLOR_HSV2RGB_FULL, c'COLOR_HLS2BGR_FULL, c'COLOR_HLS2RGB_FULL, c'COLOR_LBGR2Lab, c'COLOR_LRGB2Lab, c'COLOR_LBGR2Luv, c'COLOR_LRGB2Luv, c'COLOR_Lab2LBGR, c'COLOR_Lab2LRGB, c'COLOR_Luv2LBGR, c'COLOR_Luv2LRGB, c'COLOR_BGR2YUV, c'COLOR_RGB2YUV, c'COLOR_YUV2BGR, c'COLOR_YUV2RGB, c'COLOR_YUV2RGB_NV12, c'COLOR_YUV2BGR_NV12, c'COLOR_YUV2RGB_NV21, c'COLOR_YUV2BGR_NV21, c'COLOR_YUV420sp2RGB, c'COLOR_YUV420sp2BGR, c'COLOR_YUV2RGBA_NV12, c'COLOR_YUV2BGRA_NV12, c'COLOR_YUV2RGBA_NV21, c'COLOR_YUV2BGRA_NV21, c'COLOR_YUV420sp2RGBA, c'COLOR_YUV420sp2BGRA, c'COLOR_YUV2RGB_YV12, c'COLOR_YUV2BGR_YV12, c'COLOR_YUV2RGB_IYUV, c'COLOR_YUV2BGR_IYUV, c'COLOR_YUV2RGB_I420, c'COLOR_YUV2BGR_I420, c'COLOR_YUV420p2RGB, c'COLOR_YUV420p2BGR, c'COLOR_YUV2RGBA_YV12, c'COLOR_YUV2BGRA_YV12, c'COLOR_YUV2RGBA_IYUV, c'COLOR_YUV2BGRA_IYUV, c'COLOR_YUV2RGBA_I420, c'COLOR_YUV2BGRA_I420, c'COLOR_YUV420p2RGBA, c'COLOR_YUV420p2BGRA, c'COLOR_YUV2GRAY_420, c'COLOR_YUV2GRAY_NV21, c'COLOR_YUV2GRAY_NV12, c'COLOR_YUV2GRAY_YV12, c'COLOR_YUV2GRAY_IYUV, c'COLOR_YUV2GRAY_I420, c'COLOR_YUV420sp2GRAY, c'COLOR_YUV420p2GRAY, c'COLOR_YUV2RGB_UYVY, c'COLOR_YUV2BGR_UYVY, c'COLOR_YUV2RGB_Y422, c'COLOR_YUV2BGR_Y422, c'COLOR_YUV2RGB_UYNV, c'COLOR_YUV2BGR_UYNV, c'COLOR_YUV2RGBA_UYVY, c'COLOR_YUV2BGRA_UYVY, c'COLOR_YUV2RGBA_Y422, c'COLOR_YUV2BGRA_Y422, c'COLOR_YUV2RGBA_UYNV, c'COLOR_YUV2BGRA_UYNV, c'COLOR_YUV2RGB_YUY2, c'COLOR_YUV2BGR_YUY2, c'COLOR_YUV2RGB_YVYU, c'COLOR_YUV2BGR_YVYU, c'COLOR_YUV2RGB_YUYV, c'COLOR_YUV2BGR_YUYV, c'COLOR_YUV2RGB_YUNV, c'COLOR_YUV2BGR_YUNV, c'COLOR_YUV2RGBA_YUY2, c'COLOR_YUV2BGRA_YUY2, c'COLOR_YUV2RGBA_YVYU, c'COLOR_YUV2BGRA_YVYU, c'COLOR_YUV2RGBA_YUYV, c'COLOR_YUV2BGRA_YUYV, c'COLOR_YUV2RGBA_YUNV, c'COLOR_YUV2BGRA_YUNV, c'COLOR_YUV2GRAY_UYVY, c'COLOR_YUV2GRAY_YUY2, c'COLOR_YUV2GRAY_Y422, c'COLOR_YUV2GRAY_UYNV, c'COLOR_YUV2GRAY_YVYU, c'COLOR_YUV2GRAY_YUYV, c'COLOR_YUV2GRAY_YUNV, c'COLOR_RGBA2mRGBA, c'COLOR_mRGBA2RGBA, c'COLOR_RGB2YUV_I420, c'COLOR_BGR2YUV_I420, c'COLOR_RGB2YUV_IYUV, c'COLOR_BGR2YUV_IYUV, c'COLOR_RGBA2YUV_I420, c'COLOR_BGRA2YUV_I420, c'COLOR_RGBA2YUV_IYUV, c'COLOR_BGRA2YUV_IYUV, c'COLOR_RGB2YUV_YV12, c'COLOR_BGR2YUV_YV12, c'COLOR_RGBA2YUV_YV12, c'COLOR_BGRA2YUV_YV12, c'COLOR_BayerBG2BGR, c'COLOR_BayerGB2BGR, c'COLOR_BayerRG2BGR, c'COLOR_BayerGR2BGR, c'COLOR_BayerBG2RGB, c'COLOR_BayerGB2RGB, c'COLOR_BayerRG2RGB, c'COLOR_BayerGR2RGB, c'COLOR_BayerBG2GRAY, c'COLOR_BayerGB2GRAY, c'COLOR_BayerRG2GRAY, c'COLOR_BayerGR2GRAY, c'COLOR_BayerBG2BGR_VNG, c'COLOR_BayerGB2BGR_VNG, c'COLOR_BayerRG2BGR_VNG, c'COLOR_BayerGR2BGR_VNG, c'COLOR_BayerBG2RGB_VNG, c'COLOR_BayerGB2RGB_VNG, c'COLOR_BayerRG2RGB_VNG, c'COLOR_BayerGR2RGB_VNG, c'COLOR_BayerBG2BGR_EA, c'COLOR_BayerGB2BGR_EA, c'COLOR_BayerRG2BGR_EA, c'COLOR_BayerGR2BGR_EA, c'COLOR_BayerBG2RGB_EA, c'COLOR_BayerGB2RGB_EA, c'COLOR_BayerRG2RGB_EA, c'COLOR_BayerGR2RGB_EA

Ranges, criteria, and geometric types:
newWholeRange, withArrayPtr, Foreign.ForeignPtr.Imp, withForeignPtr, newRotatedRect, newTermCriteria, newRange, unRange, unTermCriteria, unRotatedRect, unScalar, newScalar, withPolygons

Inpainting and seamless cloning:
c'INPAINT_NS, c'INPAINT_TELEA, c'NORMAL_CLONE, c'MIXED_CLONE, c'MONOCHROME_TRANSFER, c'RECURS_FILTER, c'NORMCONV_FILTER

Video capture properties:
c'CAP_PROP_POS_MSEC, c'CAP_PROP_POS_FRAMES, c'CAP_PROP_POS_AVI_RATIO, c'CAP_PROP_FRAME_WIDTH, c'CAP_PROP_FRAME_HEIGHT, c'CAP_PROP_FPS, c'CAP_PROP_FOURCC, c'CAP_PROP_FRAME_COUNT, c'CAP_PROP_FORMAT, c'CAP_PROP_MODE, c'CAP_PROP_BRIGHTNESS, c'CAP_PROP_CONTRAST, c'CAP_PROP_SATURATION, c'CAP_PROP_HUE, c'CAP_PROP_GAIN, c'CAP_PROP_EXPOSURE, c'CAP_PROP_CONVERT_RGB, c'CAP_PROP_WHITE_BALANCE_BLUE_U, c'CAP_PROP_RECTIFICATION, c'CAP_PROP_MONOCHROME, c'CAP_PROP_SHARPNESS, c'CAP_PROP_AUTO_EXPOSURE, c'CAP_PROP_GAMMA, c'CAP_PROP_TEMPERATURE, c'CAP_PROP_TRIGGER, c'CAP_PROP_TRIGGER_DELAY, c'CAP_PROP_WHITE_BALANCE_RED_V, c'CAP_PROP_ZOOM, c'CAP_PROP_FOCUS, c'CAP_PROP_GUID, c'CAP_PROP_ISO_SPEED, c'CAP_PROP_BACKLIGHT, c'CAP_PROP_PAN, c'CAP_PROP_TILT, c'CAP_PROP_ROLL, c'CAP_PROP_IRIS, c'CAP_PROP_SETTINGS, c'CAP_PROP_BUFFERSIZE, c'CAP_PROP_AUTOFOCUS, marshalCaptureProperties

Video capture APIs:
c'CAP_ANY, c'CAP_VFW, c'CAP_V4L, c'CAP_V4L2, c'CAP_FIREWIRE, c'CAP_FIREWARE, c'CAP_IEEE1394, c'CAP_DC1394, c'CAP_CMU1394, c'CAP_QT, c'CAP_UNICAP, c'CAP_DSHOW, c'CAP_PVAPI, c'CAP_OPENNI, c'CAP_OPENNI_ASUS, c'CAP_ANDROID, c'CAP_XIAPI, c'CAP_AVFOUNDATION, c'CAP_GIGANETIX, c'CAP_MSMF, c'CAP_WINRT, c'CAP_INTELPERC, c'CAP_OPENNI2, c'CAP_OPENNI2_ASUS, c'CAP_GPHOTO2, c'CAP_GSTREAMER, c'CAP_FFMPEG, c'CAP_IMAGES, marshalVideoCaptureAPI

Type-level helpers and exception handling:
Maybe, PlusTwo, WidthAndHeight, colorConversionCode, StaticDepthT, handleCvException, cvExcept, cvExceptU, runCvExceptST, unsafeCvExcept, unsafeWrapException, unCvCppException

Mat internals:
marshalFlags, marshalDepth, unmarshalDepth, unmarshalFlags, keepMatAliveDuring, GHC.ForeignPtr, ForeignPtr, Ptr, deallocateMatM, dimPositions, $fToShape:::, $fToShapeZ, vector-0.12.0.1-IUGn3M9mkBh8CyXcBnfTR4, Data.Vector, Vector, $fToShapeProxy, $fToShapeProxy0, $fToShape[], $fToShapeVector, newMat, withVector, withMatData, unMat, newEmptyMat, matElemAddress

Repa interop:
repa-3.4.1.3-GEnIKEZpCV61MI3ixuOgDr, Data.Array.Repa.Base, Array, D:R:ArrayM, sh, depth0

Windows and trackbars:
windowName, windowMouseCallback, windowTrackbars, TrackbarState, trackbarCallback, trackbarValuePtr

Features:
unKeyPoint, unDMatch, J, unJ

Interpolation methods:
c'INTER_NEAREST, c'INTER_LINEAR, c'INTER_CUBIC, c'INTER_AREA, c'INTER_LANCZOS4, marshalInterpolationMethod

Border modes:
c'BORDER_CONSTANT, c'BORDER_REPLICATE, c'BORDER_REFLECT, c'BORDER_WRAP, c'BORDER_REFLECT_101, c'BORDER_TRANSPARENT, c'BORDER_ISOLATED, marshalBorderMode

Primitive types:
GHC.Word, Word8, Word16, ghc-prim, GHC.Types, Float, True, False

Feature matching and video wrappers:
unCascadeClassifier, DrawMatchesParams, matchColor, singlePointColor, flags, unFlannBasedMatcher, unBFMatcher, BaseMatcher, unBaseMatcher, unSimpleBlobDetector, unOrb, unBackgroundSubtractorMOG2, unBackgroundSubtractorKNN, unVideoCapture, unVideoWriter