opencv-extra-0.0.0.1: API documentation

Modules: OpenCV.Extra, OpenCV.Extra.ArUco, OpenCV.Extra.Bgsegm, OpenCV.Extra.Tracking, OpenCV.Extra.XFeatures2d, OpenCV.Extra.XPhoto, OpenCV.Extra.XPhoto.WhiteBalancer, OpenCV.Extra.Internal.C.Inline, OpenCV.Extra.Internal.C.Types.


OpenCV.Extra.XPhoto

dctDenoising: Perform the dctDenoising function for colored images. The first argument is the expected noise standard deviation; the second is the size of the block side on which the DCT is computed (Nothing uses the default of 16). The input must be an 8-bit, 3-channel image; the output is an image of the same size and type as the input.

Example:

@
dctDenoisingImg
    :: forall h w w2 c d
     . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Lenna_512x512
       , w2 ~ ((*) w 2)
       )
    => Mat ('S ['S h, 'S w2]) ('S c) ('S d)
dctDenoisingImg = exceptError $ do
    denoised <- dctDenoising 10 Nothing lenna_512x512
    withMatM (Proxy :: Proxy [h, w2])
             (Proxy :: Proxy c)
             (Proxy :: Proxy d)
             black $ \imgM -> do
      matCopyToM imgM (V2 0 0) lenna_512x512 Nothing
      matCopyToM imgM (V2 w 0) denoised      Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy w)
@

<<doc/generated/examples/dctDenoisingImg.png dctDenoisingImg>>


OpenCV.Extra.Internal.C.Types

Haskell representations of OpenCV @cv::Ptr@ objects:

- C'Ptr_Tracker: cv::Ptr<cv::Tracker>
- C'Ptr_SimpleWB: cv::Ptr<cv::xphoto::SimpleWB>
- C'Ptr_LearningBasedWB: cv::Ptr<cv::xphoto::LearningBasedWB>
- C'Ptr_GrayworldWB: cv::Ptr<cv::xphoto::GrayworldWB>
- C'Ptr_SURF: cv::Ptr<cv::xfeatures2d::SURF>
- C'Ptr_BackgroundSubtractorMOG: cv::Ptr<cv::bgsegm::BackgroundSubtractorMOG>
- C'Ptr_BackgroundSubtractorGMG: cv::Ptr<cv::bgsegm::BackgroundSubtractorGMG>


OpenCV.Extra.Internal.C.Inline

openCvExtraCtx: Context useful for working with the OpenCV library's extra modules. Based on cppCtx, bsCtx, vecCtx and, most importantly, openCvCtx. Its type table, openCvExtraTypesTable, converts OpenCV basic types to their counterparts (cf. OpenCV.Internal.C.Inline).


OpenCV.Extra.Tracking

TrackerFeatureType selects the features a tracker uses:

- HAAR: Haar feature-based.
- HOG: Histogram of Oriented Gradients features (soon).
- LBP: Local Binary Pattern features (soon).
- FEATURE2D: All types of Feature2D (soon).

TrackerType selects the tracking algorithm: BOOSTING, MIL, KCF, MEDIANFLOW or TLD.

The module also exports newTracker, initTracker, updateTracker, newMultiTracker and newTrackerFeature, together with the Tracker, MultiTracker, MultiTrackerAlt and TrackerFeature types.


OpenCV.Extra.XFeatures2d

SurfParams configures the SURF detector; defaultSurfParams holds the defaults and mkSurf builds a Surf from a SurfParams value:

- surf_hessianThreshold: Threshold for the Hessian keypoint detector used in SURF.
- surf_nOctaves: Number of pyramid octaves the keypoint detector will use.
- surf_nOctaveLayers: Number of octave layers within each octave.
- surf_extended: Extended descriptor flag (True: use extended 128-element descriptors; False: use 64-element descriptors).
- surf_upright: Up-right or rotated features flag (True: do not compute orientation of features; False: compute orientation).

surfDetectAndCompute: Detect keypoints and compute descriptors. It takes the Surf detector, the image, and an optional mask.

Example:

@
surfDetectAndComputeImg
    :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
     . (Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Frog)
    => Mat (ShapeT [height, width]) ('S channels) ('S depth)
surfDetectAndComputeImg = exceptError $ do
    (kpts, _descs) <- surfDetectAndCompute surf frog Nothing
    withMatM (Proxy :: Proxy [height, width])
             (Proxy :: Proxy channels)
             (Proxy :: Proxy depth)
             white $ \imgM -> do
      void $ matCopyToM imgM (V2 0 0) frog Nothing
      forM_ kpts $ \kpt -> do
        let kptRec = keyPointAsRec kpt
        circle imgM (round <$> kptPoint kptRec :: V2 Int32) 5 blue 1 LineType_AA 0
  where
    surf = mkSurf defaultSurfParams
@

<<doc/generated/examples/surfDetectAndComputeImg.png surfDetectAndComputeImg>>
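The SurfParams fields can be overridden with ordinary record update on defaultSurfParams before calling mkSurf. A minimal sketch, assuming only the field names documented above (the numeric values are illustrative and the concrete field types are not verified against the package):

@
import OpenCV.Extra.XFeatures2d

-- Sketch: a SURF detector tuned for fewer, stronger keypoints and
-- 128-element extended descriptors. Values are illustrative only.
strongSurf :: Surf
strongSurf = mkSurf defaultSurfParams
    { surf_hessianThreshold = 400   -- keep only keypoints with a strong Hessian response
    , surf_extended         = True  -- use the extended 128-element descriptors
    }
@

Such a detector can then be passed to surfDetectAndCompute in place of @mkSurf defaultSurfParams@ in the example above.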
OpenCV.Extra.XPhoto.WhiteBalancer

GrayworldWB performs a simple grayworld white balance algorithm. GrayworldWB, LearningBasedWB and SimpleWB are all instances of the WhiteBalancer class, whose balanceWhite method applies the white balance to an image.

Example:

@
grayworldWBImg
    :: forall h w h2 w2 c d
     . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Sailboat_768x512
       , w2 ~ ((*) w 2)
       , h2 ~ ((*) h 2)
       )
    => IO (Mat ('S ['S h2, 'S w2]) ('S c) ('S d))
grayworldWBImg = do
    let bw :: (WhiteBalancer a) => a (PrimState IO) -> IO (Mat (ShapeT [h, w]) ('S c) ('S d))
        bw = flip balanceWhite sailboat_768x512
    balancedGrayworldWB     <- bw =<< newGrayworldWB Nothing
    balancedLearningBasedWB <- bw =<< newLearningBasedWB Nothing Nothing Nothing
    balancedSimpleWB        <- bw =<< newSimpleWB Nothing Nothing Nothing Nothing Nothing
    pure $ exceptError $
      withMatM (Proxy :: Proxy [h2, w2])
               (Proxy :: Proxy c)
               (Proxy :: Proxy d)
               black $ \imgM -> do
        matCopyToM imgM (V2 0 0) sailboat_768x512        Nothing
        matCopyToM imgM (V2 w 0) balancedGrayworldWB     Nothing
        matCopyToM imgM (V2 0 h) balancedLearningBasedWB Nothing
        matCopyToM imgM (V2 w h) balancedSimpleWB        Nothing
  where
    w = fromInteger $ natVal (Proxy :: Proxy w)
    h = fromInteger $ natVal (Proxy :: Proxy h)
@

<<doc/generated/examples/grayworldWBImg.png grayworldWBImg>>

newGrayworldWB takes a single optional threshold (default: 0.9). A threshold of 1 means that all pixels are used to white-balance, while a threshold of 0 means no pixels are used. Lower thresholds are useful in white-balancing saturated images.

newLearningBasedWB takes three optional parameters:

- Histogram bin number (default: 64): Defines the size of one dimension of the three-dimensional RGB histogram that is used internally by the algorithm. It often makes sense to increase the number of bins for images with higher bit depth (e.g. 256 bins for a 12-bit image).
- Range max value (default: 255): Maximum possible value of the input image (e.g. 255 for 8-bit images, 4095 for 12-bit images).
- Saturation threshold (default: 0.98): Threshold that is used to determine saturated pixels, i.e. pixels where at least one of the channels exceeds this fraction of the maximum range value.

newSimpleWB takes five optional parameters:

- Input minimum (default: 0)
- Input maximum (default: 255)
- Output minimum (default: 0)
- Output maximum (default: 255)
- Percent of top/bottom values to ignore (default: 2)
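A short sketch of the constructor arguments listed above, passing the SimpleWB parameters explicitly instead of the all-default Nothing arguments used in the example (the helper name and the numeric values are illustrative, and the exact monadic constraints of balanceWhite are assumed rather than verified against the package):

@
import OpenCV.Extra.XPhoto.WhiteBalancer

-- Sketch: white-balance an image with an explicitly configured SimpleWB.
-- No type signature is given; it is whatever 'balanceWhite' dictates.
simpleWhiteBalance img = do
    wb <- newSimpleWB
            (Just 0)    -- input minimum
            (Just 255)  -- input maximum
            (Just 0)    -- output minimum
            (Just 255)  -- output maximum
            (Just 5)    -- percent of top/bottom values to ignore
    balanceWhite wb img
@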
OpenCV.Extra.Bgsegm

newBackgroundSubtractorGMG parameters:

- Number of frames used to initialize the background models.
- Threshold value above which a pixel is marked foreground, else background.

newBackgroundSubtractorMOG parameters:

- Length of the history.
- Number of Gaussian mixtures.
- Background ratio.
- Noise strength (standard deviation of the brightness or each color channel); 0 means some automatic value.

The constructors return BackgroundSubtractorGMG and BackgroundSubtractorMOG values respectively, both instances of OpenCV's BackgroundSubtractor class.


OpenCV.Extra.ArUco

Types:

- PredefinedDictionaryName: The set of predefined ArUco dictionaries known to OpenCV (e.g. DICT_7X7_1000).
- Dictionary: Describes the possible QR codes used for ArUco markers. Use getPredefinedDictionary to look up known dictionaries.
- ArUcoMarkers: The result of calling detectMarkers on an image.
- ChArUcoMarkers: An encoding of the result of interpolateChArUcoMarkers.
- ChArUcoBoard: A ChArUco board is used to perform camera calibration from ArUco markers overlaid on a chess board of known size. Use createChArUcoBoard to create values of this type.

detectMarkers: Perform ArUco marker detection. Parameters: a dictionary describing ArUco markers; the matrix to detect markers from.

interpolateChArUcoMarkers: Given an image and the detected ArUco markers in that image, attempt to perform ChArUco calibration. Parameters: the ChArUco board to interpolate markers for; a view of a ChArUco board; the ArUco markers detected in the same image.

estimatePoseChArUcoBoard: Given an image, the ChArUco markers in that image, and the camera calibration, estimate the pose of the board. Parameters: the ChArUco board parameters; the detected ChArUco markers; a pair of the camera intrinsic parameters and a 5-dimensional vector of distortion coefficients.

drawEstimatedPose: Given an estimated pose for a board, draw the axis over an image. Parameters: the matrix of intrinsic parameters of a camera; a 5-dimensional vector of distortion coefficients; the translation and rotation matrices from local to camera space, respectively; an image to draw the axis onto.

calibrateCameraFromFrames: Given a list of ChArUco calibration results, combine all results into a camera calibration.

drawDetectedMarkers: Given a frame, overlay the result of ArUco marker detection. Parameters: the image to draw detected markers onto (usually the same image you detected markers from); the ArUco markers to draw.

drawDetectedCornersCharuco: Given a frame, overlay the result of ChArUco marker detection. Parameters: the image to draw detected corners onto; the ChArUco marker corners to draw.

createChArUcoBoard: Create a new ChArUco board configuration. Parameters: the number of squares along the X-axis; the number of squares along the Y-axis; the length of a side of a chess-board square; the length of a marker's side within a chess-board square; the dictionary of ArUco markers.

getPredefinedDictionary: Turn a predefined dictionary name into an ArUco dictionary.

drawChArUcoBoard: Draw a ChArUco board, ready to be printed and used for calibration/marker detection. Parameters: width; height.

Example:

@
drawChArUcoBoardImg
    :: forall (w :: Nat) (h :: Nat)
     . (w ~ 500, h ~ 500)
    => Mat ('S '[ 'S h, 'S w]) ('S 1) ('S Word8)
drawChArUcoBoardImg =
    drawChArUcoBoard charucoBoard (Proxy :: Proxy w) (Proxy :: Proxy h)
  where
    charucoBoard :: ChArUcoBoard
    charucoBoard = createChArUcoBoard 10 10 20 5 dictionary

    dictionary :: Dictionary
    dictionary = getPredefinedDictionary DICT_7X7_1000
@

<<doc/generated/examples/drawChArUcoBoardImg.png drawChArUcoBoardImg>>
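The example above fixes the board image to 500x500 pixels. As a sketch, the same calls can be recombined with different type-level dimensions, e.g. to render the board at a higher resolution for printing; the imports and the DataKinds extension are the usual ones assumed by the examples, and nothing beyond the calls shown above is relied on:

@
{-# LANGUAGE DataKinds #-}

import Data.Proxy ( Proxy(..) )
import Data.Word ( Word8 )
import OpenCV
import OpenCV.Extra.ArUco

-- Sketch: the same 10x10 ChArUco board as above, rendered at 1000x1000 pixels.
largeChArUcoBoardImg :: Mat ('S '[ 'S 1000, 'S 1000]) ('S 1) ('S Word8)
largeChArUcoBoardImg =
    drawChArUcoBoard board (Proxy :: Proxy 1000) (Proxy :: Proxy 1000)
  where
    board :: ChArUcoBoard
    board = createChArUcoBoard 10 10 20 5 (getPredefinedDictionary DICT_7X7_1000)
@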