opencv-extra-0.2.0.1 API documentation

OpenCV.Extra.Internal.C.Types

  C'Ptr_Tracker                  Haskell representation of an OpenCV cv::Ptr<cv::Tracker> object.
  C'Ptr_SimpleWB                 Haskell representation of an OpenCV cv::Ptr<cv::xphoto::SimpleWB> object.
  C'Ptr_LearningBasedWB          Haskell representation of an OpenCV cv::Ptr<cv::xphoto::LearningBasedWB> object.
  C'Ptr_GrayworldWB              Haskell representation of an OpenCV cv::Ptr<cv::xphoto::GrayworldWB> object.
  C'Ptr_SURF                     Haskell representation of an OpenCV cv::Ptr<cv::xfeatures2d::SURF> object.
  C'Ptr_BackgroundSubtractorMOG  Haskell representation of an OpenCV cv::Ptr<cv::bgsegm::BackgroundSubtractorMOG> object.
  C'Ptr_BackgroundSubtractorGMG  Haskell representation of an OpenCV cv::Ptr<cv::bgsegm::BackgroundSubtractorGMG> object.

OpenCV.Extra.Internal.C.Inline

  openCvExtraCtx  Context useful to work with the OpenCV library's extra modules. Based on cppCtx,
                  bsCtx, vecCtx and, most importantly, openCvCtx. Its ctxTypesTable converts OpenCV
                  basic types to their counterparts in OpenCV.Internal.C.Inline.

OpenCV.Extra.Bgsegm

  newBackgroundSubtractorGMG
    Parameters (in order):
      - Number of frames used to initialize the background models.
      - Threshold value above which a pixel is marked as foreground, otherwise background.

  newBackgroundSubtractorMOG
    Parameters (in order):
      - Length of the history.
      - Number of Gaussian mixtures.
      - Background ratio.
      - Noise strength (standard deviation of the brightness, or of each color channel).
        0 means some automatic value.

OpenCV.Extra.ArUco

  Types

    PredefinedDictionaryName  The set of predefined ArUco dictionaries known to OpenCV.
    ArUcoMarkers              The result of calling detectMarkers on an image.
    ChArUcoMarkers            An encoding of the result of interpolateChArUcoMarkers.
    ChArUcoBoard              A ChArUco board is used to perform camera calibration from ArUco
                              markers overlaid on a chess board of known size. Use
                              createChArUcoBoard to create values of this type.
    Dictionary                A Dictionary describes the possible QR codes used for ArUco markers.
                              Use getPredefinedDictionary to look up known dictionaries.

  Functions

    interpolateChArUcoMarkers   Given an image and the detected ArUco markers in that image,
                                attempt to perform ChArUco calibration.
    estimatePoseChArUcoBoard    Given an image, the ChArUco markers in that image, and the camera
                                calibration, estimate the pose of the board.
    drawEstimatedPose           Given an estimated pose for a board, draw the axis over an image.
    calibrateCameraFromFrames   Given a list of ChArUco calibration results, combine all results
                                into a camera calibration.
    detectMarkers               Perform ArUco marker detection.
    drawDetectedMarkers         Given a frame, overlay the result of ArUco marker detection.
    drawDetectedCornersCharuco  Given a frame, overlay the result of ChArUco marker detection.
    createChArUcoBoard          Create a new ChArUco board configuration.
    getPredefinedDictionary     Turn a predefined dictionary name into an ArUco dictionary.
    drawChArUcoBoard            Draw a ChArUco board, ready to be printed and used for
                                calibration/marker detection.

    Example:

      drawChArUcoBoardImg
          :: forall (w :: Nat) (h :: Nat)
           . (w ~ 500, h ~ 500)
          => Mat ('S '[ 'S h, 'S w]) ('S 1) ('S Word8)
      drawChArUcoBoardImg =
          drawChArUcoBoard charucoBoard (Proxy :: Proxy w) (Proxy :: Proxy h)
        where
          charucoBoard :: ChArUcoBoard
          charucoBoard = createChArUcoBoard 10 10 20 5 dictionary

          dictionary :: Dictionary
          dictionary = getPredefinedDictionary DICT_7X7_1000

      (rendered output: doc/generated/examples/drawChArUcoBoardImg.png)

  Parameters

    interpolateChArUcoMarkers:
      - The ChArUco board to interpolate markers for.
      - A view of a ChArUco board.
      - The ArUco markers detected in the same image.

    estimatePoseChArUcoBoard:
      - The ChArUco board parameters.
      - Detected ChArUco markers.
      - A pair of the camera intrinsic parameters and a 5-dimensional vector of distortion
        coefficients.

    drawEstimatedPose:
      - The matrix of intrinsic parameters of a camera.
      - A 5-dimensional vector of distortion coefficients.
      - The translation and rotation matrices from local to camera space, respectively.
      - An image to draw the axis onto.

    detectMarkers:
      - A dictionary describing ArUco markers.
      - The matrix to detect markers from.

    drawDetectedMarkers:
      - The image to draw detected markers onto; usually the same image you detected markers from.
      - The ArUco markers to draw.

    drawDetectedCornersCharuco:
      - The image to draw detected corners onto.
      - The ChArUco marker corners to draw.

    createChArUcoBoard:
      - The number of squares along the X axis.
      - The number of squares along the Y axis.
      - The length of a side of a chess-board square.
      - The length of a marker's side within a chess-board square.
      - The dictionary of ArUco markers.

    drawChArUcoBoard:
      - width
      - height
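  The detectMarkers and drawDetectedMarkers parameter lists above describe a detect-then-overlay
  workflow, but there is no worked example for it in this documentation. The sketch below shows how
  the pieces could fit together, written in the same style as the other examples here (frog and
  white come from the opencv package's example setup). The Haskell signatures assumed in the
  comments are assumptions made for illustration, not the library's confirmed API.

      -- A minimal sketch only. ASSUMED signatures:
      --   detectMarkers       :: Dictionary -> Mat shape channels depth -> ArUcoMarkers
      --   drawDetectedMarkers :: PrimMonad m
      --                       => Mut (Mat shape channels depth) (PrimState m)
      --                       -> ArUcoMarkers
      --                       -> m ()
      markerOverlayImg
          :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
           . (Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Frog)
          => Mat (ShapeT [height, width]) ('S channels) ('S depth)
      markerOverlayImg = exceptError $ do
          -- Detect markers on the input frame using a predefined dictionary.
          let markers = detectMarkers dictionary frog
          withMatM (Proxy :: Proxy [height, width])
                   (Proxy :: Proxy channels)
                   (Proxy :: Proxy depth)
                   white $ \imgM -> do
              -- Copy the frame and overlay the detection onto the copy.
              void $ matCopyToM imgM (V2 0 0) frog Nothing
              drawDetectedMarkers imgM markers
        where
          dictionary :: Dictionary
          dictionary = getPredefinedDictionary DICT_6X6_250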
OpenCV.Extra.Tracking

  Types

    TrackerFeatureType
      HAAR       Haar Feature-based.
      HOG        Histogram of Oriented Gradients features (coming soon).
      LBP        Local Binary Pattern features (coming soon).
      FEATURE2D  All types of Feature2D (coming soon).

    TrackerType
      BOOSTING, MIL, KCF, MEDIANFLOW, TLD, GOTURN

    Tracker, MultiTracker, MultiTrackerAlt, TrackerFeature

  Functions

    newTracker, initTracker, updateTracker, newMultiTracker, newTrackerFeature

  Test data for the trackers is available at
  https://github.com/opencv/opencv_extra/tree/c4219d5eb3105ed8e634278fad312a1a8d2c182d/testdata/tracking
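  None of the tracker functions above comes with a worked example in this documentation, so here is
  a rough sketch of the create/initialize/update lifecycle. The argument and result types (a
  bounding-box Rect2d, a Bool success flag, a Maybe Rect2d per updated frame) are assumptions chosen
  only to illustrate the flow, not the library's confirmed signatures.

      -- A lifecycle sketch only. ASSUMED signatures:
      --   newTracker    :: PrimMonad m => TrackerType -> m (Tracker (PrimState m))
      --   initTracker   :: PrimMonad m
      --                 => Tracker (PrimState m) -> Mat shape channels depth -> Rect2d -> m Bool
      --   updateTracker :: PrimMonad m
      --                 => Tracker (PrimState m) -> Mat shape channels depth -> m (Maybe Rect2d)
      trackAcrossFrames
          :: Mat ('S '[ 'S h, 'S w]) ('S 3) ('S Word8)    -- First frame.
          -> Rect2d                                        -- Initial bounding box of the object.
          -> [Mat ('S '[ 'S h, 'S w]) ('S 3) ('S Word8)]   -- Subsequent frames.
          -> IO [Maybe Rect2d]                             -- Estimated bounding box per frame.
      trackAcrossFrames firstFrame initialBox frames = do
          tracker <- newTracker KCF                        -- Any TrackerType works here.
          _ok     <- initTracker tracker firstFrame initialBox
          mapM (updateTracker tracker) frames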
OpenCV.Extra.XFeatures2d

  SurfParams

    surf_hessianThreshold  Threshold for the hessian keypoint detector used in SURF.
    surf_nOctaves          Number of pyramid octaves the keypoint detector will use.
    surf_nOctaveLayers     Number of octave layers within each octave.
    surf_extended          Extended descriptor flag (True: use extended 128-element descriptors;
                           False: use 64-element descriptors).
    surf_upright           Up-right or rotated features flag (True: do not compute orientation of
                           features; False: compute orientation).

  surfDetectAndCompute  Detect keypoints and compute descriptors.

    Parameters:
      - Image.
      - Mask.

    Example:

      surfDetectAndComputeImg
          :: forall (width :: Nat) (height :: Nat) (channels :: Nat) (depth :: *)
           . (Mat (ShapeT [height, width]) ('S channels) ('S depth) ~ Frog)
          => Mat (ShapeT [height, width]) ('S channels) ('S depth)
      surfDetectAndComputeImg = exceptError $ do
          (kpts, _descs) <- surfDetectAndCompute surf frog Nothing
          withMatM (Proxy :: Proxy [height, width])
                   (Proxy :: Proxy channels)
                   (Proxy :: Proxy depth)
                   white $ \imgM -> do
              void $ matCopyToM imgM (V2 0 0) frog Nothing
              forM_ kpts $ \kpt -> do
                  let kptRec = keyPointAsRec kpt
                  circle imgM (round <$> kptPoint kptRec :: V2 Int32) 5 blue 1 LineType_AA 0
        where
          surf = mkSurf defaultSurfParams

      (rendered output: doc/generated/examples/surfDetectAndComputeImg.png)

OpenCV.Extra.XPhoto

  dctDenoising  Perform the dctDenoising function for colored images.

    Parameters:
      - Expected noise standard deviation.
      - Size of the block side on which the DCT is computed (default: 16).
      - Input image: 8-bit, 3-channel image.
    The output image has the same size and type as the input.

    Example:

      dctDenoisingImg
          :: forall h w w2 c d
           . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Lenna_512x512
             , w2 ~ ((*) w 2)
             )
          => Mat ('S ['S h, 'S w2]) ('S c) ('S d)
      dctDenoisingImg = exceptError $ do
          denoised <- dctDenoising 10 Nothing lenna_512x512
          withMatM (Proxy :: Proxy [h, w2])
                   (Proxy :: Proxy c)
                   (Proxy :: Proxy d)
                   black $ \imgM -> do
              matCopyToM imgM (V2 0 0) lenna_512x512 Nothing
              matCopyToM imgM (V2 w 0) denoised Nothing
        where
          w = fromInteger $ natVal (Proxy :: Proxy w)

      (rendered output: doc/generated/examples/dctDenoisingImg.png)

OpenCV.Extra.XPhoto.WhiteBalancer

  newGrayworldWB  Perform GrayworldWB, a simple grayworld white balance algorithm.

    Parameter:
      - A threshold of 1 means that all pixels are used to white-balance, while a threshold of 0
        means no pixels are used. Lower thresholds are useful in white-balancing saturated images.
        Default: 0.9.

    Example:

      grayworldWBImg
          :: forall h w h2 w2 c d
           . ( Mat (ShapeT [h, w]) ('S c) ('S d) ~ Sailboat_768x512
             , w2 ~ ((*) w 2)
             , h2 ~ ((*) h 2)
             )
          => IO (Mat ('S ['S h2, 'S w2]) ('S c) ('S d))
      grayworldWBImg = do
          let bw :: (WhiteBalancer a)
                 => a (PrimState IO)
                 -> IO (Mat (ShapeT [h, w]) ('S c) ('S d))
              bw = flip balanceWhite sailboat_768x512
          balancedGrayworldWB     <- bw =<< newGrayworldWB Nothing
          balancedLearningBasedWB <- bw =<< newLearningBasedWB Nothing Nothing Nothing
          balancedSimpleWB        <- bw =<< newSimpleWB Nothing Nothing Nothing Nothing Nothing
          pure $ exceptError $
              withMatM (Proxy :: Proxy [h2, w2])
                       (Proxy :: Proxy c)
                       (Proxy :: Proxy d)
                       black $ \imgM -> do
                  matCopyToM imgM (V2 0 0) sailboat_768x512 Nothing
                  matCopyToM imgM (V2 w 0) balancedGrayworldWB Nothing
                  matCopyToM imgM (V2 0 h) balancedLearningBasedWB Nothing
                  matCopyToM imgM (V2 w h) balancedSimpleWB Nothing
        where
          w = fromInteger $ natVal (Proxy :: Proxy w)
          h = fromInteger $ natVal (Proxy :: Proxy h)

      (rendered output: doc/generated/examples/grayworldWBImg.png)

  balanceWhite
    Parameters:
      - The input image.
      - The output image.

  newLearningBasedWB
    Parameters:
      - Default: 64. Defines the size of one dimension of a three-dimensional RGB histogram that is
        used internally by the algorithm. It often makes sense to increase the number of bins for
        images with higher bit depth (e.g. 256 bins for a 12-bit image).
      - Default: 255. Maximum possible value of the input image (e.g. 255 for 8-bit images, 4095
        for 12-bit images).
      - Default: 0.98. Threshold that is used to determine saturated pixels, i.e. pixels where at
        least one of the channels exceeds this threshold (as a fraction of the maximum range value)
        are treated as saturated.

  newSimpleWB
    Parameters:
      - Input minimum (default: 0).
      - Input maximum (default: 255).
      - Output minimum (default: 0).
      - Output maximum (default: 255).
      - Percent of top/bottom values to ignore (default: 2).
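  The grayworldWBImg example above passes Nothing for every constructor argument, which selects the
  defaults listed above. The sketch below shows the same constructors with explicit values instead;
  only the argument order and the default values come from the parameter lists above, while the
  concrete numeric types of the Maybe payloads are assumptions made for illustration.

      -- Overriding the documented defaults instead of passing Nothing.
      customWhiteBalance
          :: forall h w c d
           . Mat (ShapeT [h, w]) ('S c) ('S d)
          -> IO (Mat (ShapeT [h, w]) ('S c) ('S d))
      customWhiteBalance img = do
          -- GrayworldWB: lower the 0.9 default threshold, which the docs above
          -- recommend for saturated images.
          gw  <- newGrayworldWB (Just 0.5)
          -- LearningBasedWB: 256 histogram bins and a 4095 range for 12-bit
          -- input, keeping the 0.98 default saturation threshold.
          _lb <- newLearningBasedWB (Just 256) (Just 4095) (Just 0.98)
          -- SimpleWB: default input/output ranges, but ignore 5% of the
          -- top/bottom values.
          _sw <- newSimpleWB (Just 0) (Just 255) (Just 0) (Just 255) (Just 5)
          balanceWhite gw img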
Index of exported names (opencv-extra-0.2.0.1)

  OpenCV.Extra
    Re-exports the modules below.

  OpenCV.Extra.Bgsegm
    BackgroundSubtractorMOG, BackgroundSubtractorGMG, newBackgroundSubtractorGMG,
    newBackgroundSubtractorMOG.
    Instances: BackgroundSubtractor, Algorithm, FromPtr and WithPtr for both subtractor types.

  OpenCV.Extra.ArUco
    PredefinedDictionaryName (DICT_4X4_50, DICT_4X4_100, DICT_4X4_250, DICT_4X4_1000, DICT_5X5_50,
    DICT_5X5_100, DICT_5X5_250, DICT_5X5_1000, DICT_6X6_50, DICT_6X6_100, DICT_6X6_250,
    DICT_6X6_1000, DICT_7X7_50, DICT_7X7_100, DICT_7X7_250, DICT_7X7_1000, DICT_ARUCO_ORIGINAL),
    ArUcoMarkers, ChArUcoBoard, Dictionary, interpolateChArUcoMarkers, estimatePoseChArUcoBoard,
    drawEstimatedPose, calibrateCameraFromFrames, detectMarkers, drawDetectedMarkers,
    drawDetectedCornersCharuco, createChArUcoBoard, getPredefinedDictionary, drawChArUcoBoard.
    Instances: WithPtr and FromPtr for Dictionary, ChArUcoBoard, Vector'Int and
    Vector'Vector'Point2f; Show and Eq for PredefinedDictionaryName.

  OpenCV.Extra.Tracking
    MultiTrackerAlt (unMultiTrackerAlt), MultiTracker (unMultiTracker), TrackerFeature
    (unTrackerFeature), Tracker (unTracker), TrackerFeatureType (HAAR, HOG, LBP, FEATURE2D),
    TrackerType (BOOSTING, MIL, KCF, MEDIANFLOW, TLD, GOTURN), newTracker, initTracker,
    updateTracker, newMultiTracker, newTrackerFeature.
    Instances: FromPtr and WithPtr for Tracker, TrackerFeature, MultiTracker and MultiTrackerAlt;
    Eq, Show, Enum and Bounded for TrackerType and TrackerFeatureType.

  OpenCV.Extra.XFeatures2d
    SurfParams (surf_hessianThreshold, surf_nOctaves, surf_nOctaveLayers, surf_extended,
    surf_upright), Surf, defaultSurfParams, mkSurf, surfDetectAndCompute.
    Instances: FromPtr and WithPtr for Surf.

  OpenCV.Extra.XPhoto
    dctDenoising.

  OpenCV.Extra.XPhoto.WhiteBalancer
    SimpleWB, LearningBasedWB, GrayworldWB, WhiteBalancer (balanceWhite), newGrayworldWB,
    newLearningBasedWB, newSimpleWB.
    Instances: WhiteBalancer, Algorithm, FromPtr and WithPtr for GrayworldWB, LearningBasedWB and
    SimpleWB.

  OpenCV.Extra.Internal.C.Types
    C'Ptr_Tracker, C'Ptr_SimpleWB, C'Ptr_LearningBasedWB, C'Ptr_GrayworldWB, C'Ptr_SURF,
    C'Ptr_BackgroundSubtractorMOG, C'Ptr_BackgroundSubtractorGMG, C'Vector'Vector'Point2f,
    C'Vector'Int, C'Ptr'CharucoBoard, C'Ptr'Dictionary, C'Ptr_TrackerSamplerAlgorithm,
    C'TrackerFeatureSet, C'Ptr_MultiTrackerAlt, C'Ptr_MultiTracker, C'Ptr_TrackerFeature.

  OpenCV.Extra.Internal.C.Inline
    openCvExtraCtx.

  External references
    inline-c-0.6.0.5 (Language.C.Inline.Context and friends): cppCtx, bsCtx, vecCtx.
    opencv-0.0.2.1 (OpenCV.Internal.C.Inline): openCvCtx, ctxTypesTable, ctxAntiQuoters.

  Other documented internals
    unBackgroundSubtractorMOG, unBackgroundSubtractorGMG, ChArUcoMarkers (arucoCorners, arucoIds,
    charucoIds, charucoCorners), Vector'Vector'Point2f (unVectorVectorPoint2f), Vector'Int
    (unVectorInt), unChArUcoBoard, unDictionary, unSurf, unSimpleWB, unLearningBasedWB,
    unGrayworldWB.