opencv-extra-0.0.0.0

OpenCV.Extra.XPhoto

  dctDenoising
    Performs the dctDenoising function for colored images.

    Parameters:
      * Expected noise standard deviation.
      * Size of the block side where the DCT is computed (Nothing uses the default of 16).
      * Input image: 8-bit, 3-channel image.
    The output image has the same size and type as the input.

    Example:

      dctDenoisingImg
          :: forall h w w2 c d
           . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Lenna_512x512
             , w2 ~ ((*) w 2)
             )
          => Mat ('S ['S h, 'S w2]) ('S c) ('S d)
      dctDenoisingImg = exceptError $ do
          denoised <- dctDenoising 10 Nothing lenna_512x512
          withMatM (Proxy :: Proxy [h, w2])
                   (Proxy :: Proxy c)
                   (Proxy :: Proxy d)
                   black $ \imgM -> do
            matCopyToM imgM (V2 0 0) lenna_512x512 Nothing
            matCopyToM imgM (V2 w 0) denoised Nothing
        where
          w = fromInteger $ natVal (Proxy :: Proxy w)

    Result: doc/generated/examples/dctDenoisingImg.png

OpenCV.Extra.Internal.C.Types

  * C'Ptr_Tracker                 - Haskell representation of an OpenCV cv::Ptr cv::Tracker object.
  * C'Ptr_SimpleWB                - Haskell representation of an OpenCV cv::Ptr cv::xphoto::SimpleWB object.
  * C'Ptr_LearningBasedWB         - Haskell representation of an OpenCV cv::Ptr cv::xphoto::LearningBasedWB object.
  * C'Ptr_GrayworldWB             - Haskell representation of an OpenCV cv::Ptr cv::xphoto::GrayworldWB object.
  * C'Ptr_SURF                    - Haskell representation of an OpenCV cv::Ptr cv::xfeatures2d::SURF object.
  * C'Ptr_BackgroundSubtractorMOG - Haskell representation of an OpenCV cv::Ptr cv::bgsegm::BackgroundSubtractorMOG object.
  * C'Ptr_BackgroundSubtractorGMG - Haskell representation of an OpenCV cv::Ptr cv::bgsegm::BackgroundSubtractorGMG object.

OpenCV.Extra.Internal.C.Inline

  openCvExtraCtx
    Context useful for working with the OpenCV library's extra modules. Based on cppCtx, bsCtx, vecCtx and, most importantly, openCvCtx. Its types table converts OpenCV basic types to their counterparts in OpenCV.Internal.C.Inline.

OpenCV.Extra.Tracking

  TrackerFeatureType:
    * HAAR      - Haar feature-based.
    * HOG       - (soon) Histogram of Oriented Gradients features.
    * LBP       - (soon) Local Binary Pattern features.
    * FEATURE2D - (soon) All types of Feature2D.

OpenCV.Extra.XFeatures2d

  SurfParams:
    * surf_hessianThreshold - Threshold for the hessian keypoint detector used in SURF.
    * surf_nOctaves         - Number of pyramid octaves the keypoint detector will use.
    * surf_nOctaveLayers    - Number of octave layers within each octave.
    * surf_extended         - Extended descriptor flag (True: use extended 128-element descriptors; False: use 64-element descriptors).
    * surf_upright          - Up-right or rotated features flag (True: do not compute the orientation of features; False: compute orientation).

  surfDetectAndCompute
    Detect keypoints and compute descriptors.

    Parameters:
      * Image.
      * Mask.

    Example:

      surfDetectAndComputeImg
          :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
           . (Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Frog)
          => Mat (ShapeT [height, width]) ('S channels) ('S depth)
      surfDetectAndComputeImg = exceptError $ do
          (kpts, _descs) <- surfDetectAndCompute surf frog Nothing
          withMatM (Proxy :: Proxy [height, width])
                   (Proxy :: Proxy channels)
                   (Proxy :: Proxy depth)
                   white $ \imgM -> do
            void $ matCopyToM imgM (V2 0 0) frog Nothing
            forM_ kpts $ \kpt -> do
              let kptRec = keyPointAsRec kpt
              circle imgM (round <$> kptPoint kptRec :: V2 Int32) 5 blue 1 LineType_AA 0
        where
          surf = mkSurf defaultSurfParams

    Result: doc/generated/examples/surfDetectAndComputeImg.png
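  The surfDetectAndComputeImg example above uses defaultSurfParams unchanged. A minimal sketch of customising SurfParams with ordinary record-update syntax before handing it to mkSurf follows; it assumes the field selectors are exported and that surf_hessianThreshold is numeric and surf_extended is a Bool, as the descriptions above suggest, rather than relying on checked signatures.

    -- Hedged sketch: a SURF detector that keeps only stronger keypoints
    -- and emits extended 128-element descriptors.
    import OpenCV.Extra.XFeatures2d ( Surf, SurfParams(..), defaultSurfParams, mkSurf )

    surfStrict :: Surf
    surfStrict = mkSurf defaultSurfParams
        { surf_hessianThreshold = 400   -- raise the hessian threshold
        , surf_extended         = True  -- use extended 128-element descriptors
        }

  Such a value can stand in for the surf binding in the surfDetectAndComputeImg example above.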
OpenCV.Extra.XPhoto.WhiteBalancer

  GrayworldWB
    Performs a simple grayworld white balance algorithm.

    Example:

      grayworldWBImg
          :: forall h w h2 w2 c d
           . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Sailboat_768x512
             , w2 ~ ((*) w 2)
             , h2 ~ ((*) h 2)
             )
          => IO (Mat ('S ['S h2, 'S w2]) ('S c) ('S d))
      grayworldWBImg = do
          let bw :: (WhiteBalancer a) => a (PrimState IO) -> IO (Mat (ShapeT [h, w]) ('S c) ('S d))
              bw = flip balanceWhite sailboat_768x512
          balancedGrayworldWB     <- bw =<< newGrayworldWB Nothing
          balancedLearningBasedWB <- bw =<< newLearningBasedWB Nothing Nothing Nothing
          balancedSimpleWB        <- bw =<< newSimpleWB Nothing Nothing Nothing Nothing Nothing
          pure $ exceptError $
            withMatM (Proxy :: Proxy [h2, w2])
                     (Proxy :: Proxy c)
                     (Proxy :: Proxy d)
                     black $ \imgM -> do
              matCopyToM imgM (V2 0 0) sailboat_768x512        Nothing
              matCopyToM imgM (V2 w 0) balancedGrayworldWB     Nothing
              matCopyToM imgM (V2 0 h) balancedLearningBasedWB Nothing
              matCopyToM imgM (V2 w h) balancedSimpleWB        Nothing
        where
          w = fromInteger $ natVal (Proxy :: Proxy w)
          h = fromInteger $ natVal (Proxy :: Proxy h)

    Result: doc/generated/examples/grayworldWBImg.png

  newGrayworldWB parameter:
    * Saturation threshold. A threshold of 1 means that all pixels are used to white-balance, while a threshold of 0 means no pixels are used. Lower thresholds are useful in white-balancing saturated images. Default: 0.9.

  newLearningBasedWB parameters:
    * Histogram bins (default: 64). Defines the size of one dimension of a three-dimensional RGB histogram that is used internally by the algorithm. It often makes sense to increase the number of bins for images with higher bit depth (e.g. 256 bins for a 12-bit image).
    * Range max value (default: 255). Maximum possible value of the input image (e.g. 255 for 8-bit images, 4095 for 12-bit images).
    * Saturation threshold (default: 0.98). Threshold that is used to determine saturated pixels, i.e. pixels where at least one of the channels exceeds it.

  newSimpleWB parameters:
    * Input min (default: 0).
    * Input max (default: 255).
    * Output min (default: 0).
    * Output max (default: 255).
    * Percent of top/bottom values to ignore (default: 2).

OpenCV.Extra.Bgsegm

  newBackgroundSubtractorGMG parameters:
    * Number of frames used to initialize the background models.
    * Threshold value above which a pixel is marked foreground, else background.

  newBackgroundSubtractorMOG parameters:
    * Length of the history.
    * Number of Gaussian mixtures.
    * Background ratio.
    * Noise strength (standard deviation of the brightness of each color channel). 0 means some automatic value.
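  No example is given for the background subtractors above. The sketch below shows how the MOG subtractor might be driven frame by frame; it assumes the constructor takes Maybe-wrapped parameters in the order listed above (Nothing picking the defaults, as with the white-balance constructors) and that a bgSubApply class method from the base opencv package's OpenCV.Video.MotionAnalysis accepts a learning rate and a frame and returns a single-channel foreground mask. Treat the exact signatures as assumptions.

    -- Hedged sketch: run a MOG background subtractor over a list of frames
    -- and collect the per-frame foreground masks.
    import OpenCV.Extra.Bgsegm         ( newBackgroundSubtractorMOG )
    import OpenCV.Video.MotionAnalysis ( bgSubApply )  -- assumed to come from the base opencv package

    foregroundMasks frames = do
        -- Nothing for history length, number of mixtures, background ratio
        -- and noise strength selects the OpenCV defaults.
        mog <- newBackgroundSubtractorMOG Nothing Nothing Nothing Nothing
        -- 0.05 is an assumed per-frame learning rate argument.
        mapM (bgSubApply mog 0.05) frames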
OpenCV.Extra.ArUco

  PredefinedDictionaryName
    The set of predefined ArUco dictionaries known to OpenCV.

  ArUcoMarkers
    The result of calling detectMarkers on an image.

  ChArUcoMarkers
    An encoding of the result of interpolateChArUcoMarkers.

  ChArUcoBoard
    A ChArUco board is used to perform camera calibration from ArUco markers overlaid on a chess board of known size. Use createChArUcoBoard to create values of this type.

  Dictionary
    A Dictionary describes the possible QR codes used for ArUco markers. Use getPredefinedDictionary to look up known dictionaries.

  interpolateChArUcoMarkers
    Given an image and the detected ArUco markers in that image, attempt to perform ChArUco calibration.
    Parameters:
      * The ChArUco board to interpolate markers for.
      * A view of a ChArUco board.
      * The ArUco markers detected in the same image.

  estimatePoseChArUcoBoard
    Given an image, the ChArUco markers in that image, and the camera calibration, estimate the pose of the board.
    Parameters:
      * The ChArUco board parameters.
      * Detected ChArUco markers.
      * A pair of the camera intrinsic parameters and a 5-dimensional vector of distortion coefficients.

  drawEstimatedPose
    Given an estimated pose for a board, draw the axis over an image.
    Parameters:
      * The matrix of intrinsic parameters of a camera.
      * A 5-dimensional vector of distortion coefficients.
      * The translation and rotation matrices from local to camera space, respectively.
      * An image to draw the axis onto.

  calibrateCameraFromFrames
    Given a list of ChArUco calibration results, combine all results into a camera calibration.

  detectMarkers
    Perform ArUco marker detection.
    Parameters:
      * A dictionary describing ArUco markers.
      * The matrix to detect markers from.

  drawDetectedMarkers
    Given a frame, overlay the result of ArUco marker detection.
    Parameters:
      * The image to draw detected markers onto. Usually the same image you detected markers from.
      * The ArUco markers to draw.

  drawDetectedCornersCharuco
    Given a frame, overlay the result of ChArUco marker detection.
    Parameters:
      * The image to draw detected corners onto.
      * The ChArUco marker corners to draw.

  createChArUcoBoard
    Create a new ChArUco board configuration.
    Parameters:
      * The amount of squares along the X-axis.
      * The amount of squares along the Y-axis.
      * The length of a side of a chess-board square.
      * The length of a marker's side within a chess-board square.
      * The dictionary of ArUco markers.

  getPredefinedDictionary
    Turn a predefined dictionary name into an ArUco dictionary.

  drawChArUcoBoard
    Draw a ChArUco board, ready to be printed and used for calibration/marker detection.
    Parameters:
      * Width.
      * Height.

    Example:

      drawChArUcoBoardImg
          :: forall (w :: Nat) (h :: Nat)
           . (w ~ 500, h ~ 500)
          => Mat ('S '[ 'S h, 'S w]) ('S 1) ('S Word8)
      drawChArUcoBoardImg =
          drawChArUcoBoard charucoBoard (Proxy :: Proxy w) (Proxy :: Proxy h)
        where
          charucoBoard :: ChArUcoBoard
          charucoBoard = createChArUcoBoard 10 10 20 5 dictionary

          dictionary :: Dictionary
          dictionary = getPredefinedDictionary DICT_7X7_1000

    Result: doc/generated/examples/drawChArUcoBoardImg.png
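  The descriptions above outline a multi-step workflow (detect markers, interpolate ChArUco corners, fold per-frame results into a camera calibration, then estimate and draw the board pose), but no example is given for it. The following is only a hedged sketch of how those functions might compose: argument order follows the parameter lists above, while the concrete types, the monad each step runs in, and the shape of the calibration result are assumptions rather than documented signatures.

    -- Hedged sketch of the ChArUco calibration workflow; every signature
    -- used below is an assumption based on the parameter documentation above.
    calibrateAndDrawPose dictionary board calibrationFrames frame = do
        -- Detect ArUco markers in every calibration frame and interpolate
        -- ChArUco corners from them.
        charucoPerFrame <- mapM
            (\f -> interpolateChArUcoMarkers board f (detectMarkers dictionary f))
            calibrationFrames
        -- Combine the per-frame results into one camera calibration:
        -- an intrinsic matrix and a 5-element distortion vector.
        -- (calibrateCameraFromFrames may also need the board or the frame
        -- size; it is assumed here to take just the list of results.)
        calibration@(cameraMatrix, distCoeffs) <-
            calibrateCameraFromFrames charucoPerFrame
        -- Estimate the board pose in the frame of interest and draw its axes.
        charuco      <- interpolateChArUcoMarkers board frame (detectMarkers dictionary frame)
        (tvec, rvec) <- estimatePoseChArUcoBoard board charuco calibration
        drawEstimatedPose cameraMatrix distCoeffs (tvec, rvec) frame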
Index of exported names (package opencv-extra-0.0.0.0)

  OpenCV.Extra
  OpenCV.Extra.XPhoto               - dctDenoising.
  OpenCV.Extra.Tracking             - TrackerType (BOOSTING, MIL, KCF, MEDIANFLOW, TLD), TrackerFeatureType (HAAR, HOG, LBP, FEATURE2D), Tracker, MultiTracker, MultiTrackerAlt, TrackerFeature, newTracker, initTracker, updateTracker, newMultiTracker, newTrackerFeature.
  OpenCV.Extra.XFeatures2d          - Surf, SurfParams (surf_hessianThreshold, surf_nOctaves, surf_nOctaveLayers, surf_extended, surf_upright), defaultSurfParams, mkSurf, surfDetectAndCompute.
  OpenCV.Extra.XPhoto.WhiteBalancer - WhiteBalancer (balanceWhite), SimpleWB, LearningBasedWB, GrayworldWB, newGrayworldWB, newLearningBasedWB, newSimpleWB.
  OpenCV.Extra.Bgsegm               - BackgroundSubtractorMOG, BackgroundSubtractorGMG, newBackgroundSubtractorMOG, newBackgroundSubtractorGMG.
  OpenCV.Extra.ArUco                - PredefinedDictionaryName (DICT_7X7_1000), ArUcoMarkers, ChArUcoBoard, Dictionary, interpolateChArUcoMarkers, estimatePoseChArUcoBoard, drawEstimatedPose, calibrateCameraFromFrames, detectMarkers, drawDetectedMarkers, drawDetectedCornersCharuco, createChArUcoBoard, getPredefinedDictionary, drawChArUcoBoard.
  OpenCV.Extra.Internal.C.Types     - C'Ptr_Tracker, C'Ptr_SimpleWB, C'Ptr_LearningBasedWB, C'Ptr_GrayworldWB, C'Ptr_SURF, C'Ptr_BackgroundSubtractorMOG, C'Ptr_BackgroundSubtractorGMG, C'Vector'Vector'Point2f, C'Vector'Int, C'Ptr'CharucoBoard, C'Ptr'Dictionary, C'Ptr_TrackerSamplerAlgorithm, C'TrackerFeatureSet, C'Ptr_MultiTrackerAlt, C'Ptr_MultiTracker, C'Ptr_TrackerFeature.
  OpenCV.Extra.Internal.C.Inline    - openCvExtraCtx, openCvExtraTypesTable.