OpenCV.Extra.XPhoto

dctDenoising: Performs the DCT-based denoising function for colored images.

Example:

```haskell
dctDenoisingImg
    :: forall h w w2 c d
     . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Lenna_512x512
       , w2 ~ ((*) w 2)
       )
    => Mat ('S ['S h, 'S w2]) ('S c) ('S d)
dctDenoisingImg = exceptError $ do
    denoised <- dctDenoising 10 Nothing lenna_512x512
    withMatM (Proxy :: Proxy [h, w2])
             (Proxy :: Proxy c)
             (Proxy :: Proxy d)
             black $ \imgM -> do
      matCopyToM imgM (V2 0 0) lenna_512x512 Nothing
      matCopyToM imgM (V2 w 0) denoised Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy w)
```

(Image: doc/generated/examples/dctDenoisingImg.png)

Parameters:
- Expected noise standard deviation.
- Size of the block side on which the DCT is computed (Nothing uses the default of 16).
- Input image: 8-bit, 3-channel image.
- Output image: same size and type as the input.

OpenCV.Extra.Internal.C.Types

- C'Ptr_Tracker: Haskell representation of an OpenCV cv::Ptr<cv::Tracker> object.
- C'Ptr_SimpleWB: Haskell representation of an OpenCV cv::Ptr<cv::xphoto::SimpleWB> object.
- C'Ptr_LearningBasedWB: Haskell representation of an OpenCV cv::Ptr<cv::xphoto::LearningBasedWB> object.
- C'Ptr_GrayworldWB: Haskell representation of an OpenCV cv::Ptr<cv::xphoto::GrayworldWB> object.
- C'Ptr_SURF: Haskell representation of an OpenCV cv::Ptr<cv::xfeatures2d::SURF> object.
- C'Ptr_BackgroundSubtractorMOG: Haskell representation of an OpenCV cv::Ptr<cv::bgsegm::BackgroundSubtractorMOG> object.
- C'Ptr_BackgroundSubtractorGMG: Haskell representation of an OpenCV cv::Ptr<cv::bgsegm::BackgroundSubtractorGMG> object.

OpenCV.Extra.Internal.C.Inline

openCvExtraCtx: Context useful for working with the OpenCV library's extra modules. Based on cppCtx, bsCtx, vecCtx and, most importantly, openCvCtx. Its types table converts OpenCV basic types to their counterparts in OpenCV.Internal.C.Inline.

OpenCV.Extra.Tracking

TrackerFeatureType constructors:
- HAAR: Haar Feature-based.
- HOG: Histogram of Oriented Gradients features (soon).
- LBP: Local Binary Pattern features (soon).
- FEATURE2D: All types of Feature2D (soon).

OpenCV.Extra.XFeatures2d

SurfParams fields:
- surf_hessianThreshold: Threshold for the hessian keypoint detector used in SURF.
- surf_nOctaves: Number of pyramid octaves the keypoint detector will use.
- surf_nOctaveLayers: Number of octave layers within each octave.
- surf_extended: Extended descriptor flag (True: use extended 128-element descriptors; False: use 64-element descriptors).
- surf_upright: Up-right or rotated features flag (True: do not compute the orientation of features; False: compute orientation).

surfDetectAndCompute: Detect keypoints and compute descriptors.

Parameters:
- Image.
- Mask.

Example:

```haskell
surfDetectAndComputeImg
    :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
     . (Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Frog)
    => Mat (ShapeT [height, width]) ('S channels) ('S depth)
surfDetectAndComputeImg = exceptError $ do
    (kpts, _descs) <- surfDetectAndCompute surf frog Nothing
    withMatM (Proxy :: Proxy [height, width])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      void $ matCopyToM imgM (V2 0 0) frog Nothing
      forM_ kpts $ \kpt -> do
        let kptRec = keyPointAsRec kpt
        circle imgM (round <$> kptPoint kptRec :: V2 Int32) 5 blue 1 LineType_AA 0
  where
    surf = mkSurf defaultSurfParams
```

(Image: doc/generated/examples/surfDetectAndComputeImg.png)
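The example above builds its detector with mkSurf defaultSurfParams. A minimal sketch (not part of the original documentation) of overriding individual SurfParams fields via record update; the concrete field values are hypothetical, and the field types (numeric thresholds/counts and Bool flags) are assumptions based on the descriptions above:

```haskell
import OpenCV.Extra.XFeatures2d

-- Hypothetical parameter values, not from the original docs.
sensitiveSurf :: Surf
sensitiveSurf = mkSurf defaultSurfParams
    { surf_hessianThreshold = 300   -- lower threshold, more keypoints
    , surf_nOctaveLayers    = 4     -- one extra layer per octave
    , surf_extended         = True  -- 128-element descriptors
    }
```

sensitiveSurf could then replace mkSurf defaultSurfParams in the surfDetectAndComputeImg example above.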
OpenCV.Extra.XPhoto.WhiteBalancer

GrayworldWB performs a simple grayworld white balance algorithm.

Example:

```haskell
grayworldWBImg
    :: forall h w h2 w2 c d
     . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Sailboat_768x512
       , w2 ~ ((*) w 2)
       , h2 ~ ((*) h 2)
       )
    => IO (Mat ('S ['S h2, 'S w2]) ('S c) ('S d))
grayworldWBImg = do
    let bw :: (WhiteBalancer a)
           => a (PrimState IO) -> IO (Mat (ShapeT [h, w]) ('S c) ('S d))
        bw = flip balanceWhite sailboat_768x512
    balancedGrayworldWB     <- bw =<< newGrayworldWB Nothing
    balancedLearningBasedWB <- bw =<< newLearningBasedWB Nothing Nothing Nothing
    balancedSimpleWB        <- bw =<< newSimpleWB Nothing Nothing Nothing Nothing Nothing
    pure $ exceptError $
      withMatM (Proxy :: Proxy [h2, w2])
               (Proxy :: Proxy c)
               (Proxy :: Proxy d)
               black $ \imgM -> do
        matCopyToM imgM (V2 0 0) sailboat_768x512 Nothing
        matCopyToM imgM (V2 w 0) balancedGrayworldWB Nothing
        matCopyToM imgM (V2 0 h) balancedLearningBasedWB Nothing
        matCopyToM imgM (V2 w h) balancedSimpleWB Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy w)
    h = fromInteger $ natVal (Proxy :: Proxy h)
```

(Image: doc/generated/examples/grayworldWBImg.png)

newGrayworldWB parameter:
- Threshold (default: 0.9). A threshold of 1 means that all pixels are used to white-balance, while a threshold of 0 means no pixels are used. Lower thresholds are useful in white-balancing saturated images.

newLearningBasedWB parameters:
- Default: 64. Defines the size of one dimension of a three-dimensional RGB histogram that is used internally by the algorithm. It often makes sense to increase the number of bins for images with higher bit depth (e.g. 256 bins for a 12-bit image).
- Default: 255. Maximum possible value of the input image (e.g. 255 for 8-bit images, 4095 for 12-bit images).
- Default: 0.98. Threshold that is used to determine saturated pixels, i.e. pixels where at least one of the channels exceeds the threshold.

newSimpleWB parameters:
- Input min (default: 0).
- Input max (default: 255).
- Output min (default: 0).
- Output max (default: 255).
- Percent of top/bottom values to ignore (default: 2).

OpenCV.Extra.Bgsegm

newBackgroundSubtractorGMG parameters:
- Number of frames used to initialize the background models.
- Threshold value above which a pixel is marked as foreground, otherwise background.

newBackgroundSubtractorMOG parameters:
- Length of the history.
- Number of Gaussian mixtures.
- Background ratio.
- Noise strength (standard deviation of the brightness of each color channel). 0 means some automatic value.
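The module only documents the constructor parameters above and ships no usage example. The following is a minimal sketch (not from the original documentation) of applying a MOG subtractor to a single frame. It assumes the constructors take Maybe-wrapped parameters (Nothing selects the defaults), mirroring the white-balance constructors above, and that BackgroundSubtractorMOG has a BackgroundSubtractor instance whose apply function is bgSubApply from the main opencv package; check the real signatures before relying on this.

```haskell
{-# LANGUAGE DataKinds #-}

import Data.Word ( Word8 )
import OpenCV                -- assumed to provide the BackgroundSubtractor class
import OpenCV.Extra.Bgsegm

-- Hypothetical helper: feed one frame to a freshly created MOG subtractor
-- and return the foreground mask. A real application would create the
-- subtractor once and feed it successive frames.
foregroundMask
    :: Mat ('S '[h, w]) ('S 3) ('S Word8)       -- input frame
    -> IO (Mat ('S '[h, w]) ('S 1) ('S Word8))  -- foreground mask (assumed result type)
foregroundMask frame = do
    mog <- newBackgroundSubtractorMOG Nothing Nothing Nothing Nothing
    bgSubApply mog 0.1 frame  -- 0.1 is an assumed learning-rate argument
```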
OpenCV.Extra.ArUco

Types:
- PredefinedDictionaryName: The set of predefined ArUco dictionaries known to OpenCV.
- ArUcoMarkers: The result of calling detectMarkers on an image.
- ChArUcoMarkers: An encoding of the result of interpolateChArUcoMarkers.
- ChArUcoBoard: A ChArUco board is used to perform camera calibration from ArUco markers overlaid on a chess board of known size. Use createChArUcoBoard to create values of this type.
- Dictionary: A Dictionary describes the possible QR codes used for ArUco markers. Use getPredefinedDictionary to look up known dictionaries.

Functions:
- interpolateChArUcoMarkers: Given an image and the detected ArUco markers in that image, attempt to perform ChArUco calibration.
  - The ChArUco board to interpolate markers for.
  - A view of a ChArUco board.
  - The ArUco markers detected in the same image.
- estimatePoseChArUcoBoard: Given an image, the ChArUco markers in that image, and the camera calibration, estimate the pose of the board.
  - The ChArUco board parameters.
  - Detected ChArUco markers.
  - A pair of the camera intrinsic parameters and a 5-dimensional vector of distortion coefficients.
- drawEstimatedPose: Given an estimated pose for a board, draw the axis over an image.
  - The matrix of intrinsic parameters of a camera.
  - A 5-dimensional vector of distortion coefficients.
  - The translation and rotation matrices from local to camera space, respectively.
  - An image to draw the axis onto.
- calibrateCameraFromFrames: Given a list of ChArUco calibration results, combine all results into a camera calibration.
- detectMarkers: Perform ArUco marker detection.
  - A dictionary describing ArUco markers.
  - The matrix to detect markers from.
- drawDetectedMarkers: Given a frame, overlay the result of ArUco marker detection.
  - The image to draw detected markers onto. Usually the same image you detected markers from.
  - The ArUco markers to draw.
- drawDetectedCornersCharuco: Given a frame, overlay the result of ChArUco marker detection.
  - The image to draw detected corners onto.
  - The ChArUco marker corners to draw.
- createChArUcoBoard: Create a new ChArUco board configuration.
  - The amount of squares along the X-axis.
  - The amount of squares along the Y-axis.
  - The length of a side of a chess-board square.
  - The length of a marker's side within a chess-board square.
  - The dictionary of ArUco markers.
- getPredefinedDictionary: Turn a predefined dictionary name into an ArUco dictionary.
- drawChArUcoBoard: Draw a ChArUco board, ready to be printed and used for calibration/marker detection.
  - Width.
  - Height.

Example:

```haskell
drawChArUcoBoardImg
    :: forall (w :: Nat) (h :: Nat)
     . (w ~ 500, h ~ 500)
    => Mat ('S '[ 'S h, 'S w]) ('S 1) ('S Word8)
drawChArUcoBoardImg =
    drawChArUcoBoard charucoBoard (Proxy :: Proxy w) (Proxy :: Proxy h)
  where
    charucoBoard :: ChArUcoBoard
    charucoBoard = createChArUcoBoard 10 10 20 5 dictionary

    dictionary :: Dictionary
    dictionary = getPredefinedDictionary DICT_7X7_1000
```

(Image: doc/generated/examples/drawChArUcoBoardImg.png)
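Apart from the board-drawing example above, the detection functions are described only through their parameters. The sketch below (not from the original documentation) shows the intended data flow for detecting ArUco markers in a frame and overlaying them on a copy of it, reusing the withMatM pattern and the black color helper from the examples above. The fixed 640x480 frame size is hypothetical, and the monad in which detectMarkers runs and the exact shape of drawDetectedMarkers are assumptions; consult the real signatures.

```haskell
{-# LANGUAGE DataKinds           #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeFamilies        #-}

import Control.Monad ( void )
import Data.Proxy ( Proxy(..) )
import Data.Word ( Word8 )
import Linear.V2 ( V2(..) )
import OpenCV
import OpenCV.Extra.ArUco

-- Hypothetical: detect ArUco markers in a frame of an assumed, fixed
-- 640x480 size and draw them on a copy of that frame.
overlayMarkers
    :: forall h w
     . (h ~ 480, w ~ 640)  -- hypothetical fixed frame size
    => Dictionary
    -> Mat (ShapeT [h, w]) ('S 3) ('S Word8)
    -> Mat (ShapeT [h, w]) ('S 3) ('S Word8)
overlayMarkers dict frame = exceptError $ do
    markers <- detectMarkers dict frame            -- assumed: detection runs in CvExcept
    withMatM (Proxy :: Proxy [h, w])
             (Proxy :: Proxy 3)
             (Proxy :: Proxy Word8)
             black $ \imgM -> do                   -- black: same helper as the examples above
      void $ matCopyToM imgM (V2 0 0) frame Nothing
      void $ drawDetectedMarkers imgM markers      -- assumed: draws into the mutable copy
```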
Package index: opencv-extra-0.1.0.0

Modules:
- OpenCV.Extra
- OpenCV.Extra.ArUco
- OpenCV.Extra.Bgsegm
- OpenCV.Extra.Tracking
- OpenCV.Extra.XFeatures2d
- OpenCV.Extra.XPhoto
- OpenCV.Extra.XPhoto.WhiteBalancer
- OpenCV.Extra.Internal.C.Inline
- OpenCV.Extra.Internal.C.Types

Exported names:
- OpenCV.Extra.XPhoto: dctDenoising
- OpenCV.Extra.Tracking: Tracker, MultiTracker, MultiTrackerAlt, TrackerFeature, TrackerFeatureType (HAAR, HOG, LBP, FEATURE2D), TrackerType (BOOSTING, MIL, KCF, MEDIANFLOW, TLD), newTracker, initTracker, updateTracker, newMultiTracker, newTrackerFeature
- OpenCV.Extra.XFeatures2d: Surf, SurfParams (surf_hessianThreshold, surf_nOctaves, surf_nOctaveLayers, surf_extended, surf_upright), defaultSurfParams, mkSurf, surfDetectAndCompute
- OpenCV.Extra.XPhoto.WhiteBalancer: WhiteBalancer (balanceWhite), SimpleWB, LearningBasedWB, GrayworldWB, newGrayworldWB, newLearningBasedWB, newSimpleWB
- OpenCV.Extra.Bgsegm: BackgroundSubtractorMOG, BackgroundSubtractorGMG, newBackgroundSubtractorMOG, newBackgroundSubtractorGMG
- OpenCV.Extra.ArUco: PredefinedDictionaryName (DICT_4X4_50, DICT_4X4_100, DICT_4X4_250, DICT_4X4_1000, DICT_5X5_50, DICT_5X5_100, DICT_5X5_250, DICT_5X5_1000, DICT_6X6_50, DICT_6X6_100, DICT_6X6_250, DICT_6X6_1000, DICT_7X7_50, DICT_7X7_100, DICT_7X7_250, DICT_7X7_1000, DICT_ARUCO_ORIGINAL), ArUcoMarkers, ChArUcoBoard, Dictionary, interpolateChArUcoMarkers, estimatePoseChArUcoBoard, drawEstimatedPose, calibrateCameraFromFrames, detectMarkers, drawDetectedMarkers, drawDetectedCornersCharuco, createChArUcoBoard, getPredefinedDictionary, drawChArUcoBoard
- OpenCV.Extra.Internal.C.Inline: openCvExtraCtx, openCvExtraTypesTable
- OpenCV.Extra.Internal.C.Types: C'Ptr_Tracker, C'Ptr_SimpleWB, C'Ptr_LearningBasedWB, C'Ptr_GrayworldWB, C'Ptr_SURF, C'Ptr_BackgroundSubtractorMOG, C'Ptr_BackgroundSubtractorGMG, C'Vector'Vector'Point2f, C'Vector'Int, C'Ptr'CharucoBoard, C'Ptr'Dictionary, C'Ptr_TrackerSamplerAlgorithm, C'TrackerFeatureSet, C'Ptr_MultiTrackerAlt, C'Ptr_MultiTracker, C'Ptr_TrackerFeature, withPtrs

Re-used from dependencies: cppCtx, bsCtx, vecCtx (inline-c-0.5.6.1, Language.C.Inline.Context); openCvCtx, ctxTypesTable, ctxAntiQuoters (opencv-0.0.1.1, OpenCV.Internal.C.Inline).
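The index above lists the inline-c machinery the package builds on (openCvExtraCtx, openCvExtraTypesTable, and the contexts borrowed from inline-c and opencv). A minimal sketch, not the package's actual module header, of how a module would typically install this context using inline-c's standard top-level splices; the chosen headers and the sample block are purely illustrative:

```haskell
{-# LANGUAGE QuasiQuotes     #-}
{-# LANGUAGE TemplateHaskell #-}

module InlineSketch where  -- hypothetical module name

import Foreign.C.Types ( CInt )
import qualified Language.C.Inline as C
import OpenCV.Extra.Internal.C.Inline ( openCvExtraCtx )

-- Install the opencv-extra context so that inline C/C++ blocks in this
-- module understand the OpenCV and opencv-extra type mappings.
C.context openCvExtraCtx

-- Headers the inline code below might rely on (illustrative choice).
C.include "<opencv2/core.hpp>"
C.include "<opencv2/xphoto.hpp>"

-- With the context installed, foreign code can be embedded via quasiquotes.
answer :: IO CInt
answer = [C.block| int { return 21 + 21; } |]
```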