Tensor element types (DType) provided by hasktorch: Bool, Byte, Char, Short, Int, Long, Half, Float, Double, ComplexHalf, ComplexFloat, ComplexDouble, QInt8, QUInt8, QInt32, and BFloat16.

The sample ordering determines the order in which samples are streamed out of a dataset: you can either stream sequentially, or supply a random generator to shuffle the samples.

Dataset options are used when loading datasets. They specify the shuffling behavior, the number of threads used to retrieve samples, and the size of the buffer that stores retrieved samples in each thread.

A CSV datastream is configured by whether the file has a header, the column delimiter, and the CSV file path. One helper produces a CsvDatastream' from the given file with default options and tab-separated columns; another does the same with comma-separated columns.
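The concrete hasktorch type and field names for these options are elided in this extract; the following is only a rough sketch, with hypothetical names, of the configuration described above.

    -- Hypothetical sketch only: these type and field names are illustrative,
    -- not the library's actual Torch.Data API.
    data SampleOrder g
      = Sequential   -- stream samples in their original order
      | Shuffled g   -- supply a random generator to shuffle samples

    data DatasetOptions g = DatasetOptions
      { ordering   :: SampleOrder g  -- ordering of streamed samples
      , numThreads :: Int            -- number of threads retrieving samples
      , bufferSize :: Int            -- buffered samples per thread
      }

    data CsvOptions = CsvOptions
      { hasHeader :: Bool      -- does the file have a header?
      , delimiter :: Char      -- column delimiter (',' or '\t')
      , csvPath   :: FilePath  -- CSV file path
      }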
An in-memory cached dataset; see the caching function described below for how to create one.

Stream-processing helpers:

Run a map function in parallel over the given stream.
Run a pipe in parallel over the given stream.
Map a ListT transform over the given stream in parallel. This is useful for functions which group elements of a stream and yield them downstream.
Enumerate the given stream, zipping each element with an index.
Run a given batching function in parallel; samples are batched as described for the sequential batching function below.
Run a batching function with an integer batch size over the given stream. The elements of the stream are split into lists of the given batch size and collated with the given function. Only Just values are yielded downstream. If the last chunk of samples is smaller than the given batch size, the batching function is passed a list shorter than the batch size.
Enumerate a given stream and store it as an in-memory cached dataset. This should be done after a time-consuming preprocessing pipeline, and the cache reused in subsequent epochs to avoid repeating the preprocessing.

Also provided are an alternative version with better type inference, a stronger version that allows better inference of the return type, and a type synonym for lenses.

Tensor accessors and basic manipulation:

Returns the total number of elements in the input tensor.
Returns the size of a given dimension of the input tensor.
Returns the shape of the tensor.
Returns the dimensions of the input tensor.
Returns the device on which the tensor is currently allocated.
Returns the data type of the input tensor.
Casts the input tensor to the given data type.
Casts the input tensor to the given device.
Slices the input tensor along the selected dimension at the given index.
Returns a new tensor which indexes the input tensor along dimension dim using the entries in index, which is a LongTensor.
Slices the input tensor along the selected dimension at the given range.
Returns a tensor with the same data and number of elements as input, but with the specified shape.
The internal function of withTensor; it does not check for a contiguous memory layout.
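As a brief sketch of these accessors, assuming the commonly used hasktorch names asTensor, shape, numel, dtype, device, toDType, and reshape (re-exported by the umbrella Torch module; verify against your version):

    import Torch

    main :: IO ()
    main = do
      let t = asTensor ([[1, 2, 3], [4, 5, 6]] :: [[Float]])
      print (shape t)             -- [2,3]
      print (numel t)             -- 6
      print (dtype t)             -- element data type (Float)
      print (device t)            -- device the tensor is allocated on
      let t' = toDType Double t   -- cast to another element type
          r  = reshape [3, 2] t   -- same data, new shape
      print (dtype t', shape r)   -- (Double,[3,2])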
Tensor creation functions:

Returns a tensor filled with the scalar value 1, with the shape defined by the size argument.
Returns a tensor filled with the scalar value 1, with the same size as the input tensor.
Returns a tensor filled with the scalar value 0, with the shape defined by the size argument.
Returns a tensor filled with the scalar value 0, with the same size as the input tensor.
Returns a tensor filled with random numbers from a uniform distribution on the interval [0, 1).
Returns a tensor filled with random numbers from a standard normal distribution.
Returns a tensor filled with random integers generated uniformly between low (inclusive) and high (exclusive).
Returns a tensor with the same size as the input that is filled with random numbers from a standard normal distribution.
Returns a tensor with the same size as the input that is filled with random numbers from a uniform distribution on the interval [0, 1).
Returns a one-dimensional tensor of steps equally spaced points between start and end.
Returns a 2-D tensor with ones on the diagonal and zeros elsewhere.
Returns a tensor of the given size filled with fill_value.
Constructs a sparse tensor in COO(rdinate) format with non-zero elements at the given indices with the given values.
Returns a 1-D tensor with values from the interval [start, end) taken with common difference step, beginning from start.

Most of these factories also take an options argument that configures the data type, device, layout, and other properties of the resulting tensor.
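A minimal usage sketch, assuming the conventional factory names ones', zeros', and randnIO' (the primed variants use default tensor options; the IO variant draws fresh random values):

    import Torch

    main :: IO ()
    main = do
      let o = ones'  [2, 3]   -- 2x3 tensor of ones
          z = zeros' [2, 3]   -- 2x3 tensor of zeros
      r <- randnIO' [2, 3]    -- 2x3 tensor of standard-normal samples
      print (shape o, shape z, shape r)   -- ([2,3],[2,3],[2,3])
      print (o + z)                       -- still all ones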
Quasiquoters for tensor indexing:

Generate a slice from a Python-compatible indexing expression (see https://pytorch.org/cppdocs/notes/tensor_indexing.html). Where you would take the odd-numbered elements of a tensor with `tensor[1::2]` in Python, you can write `tensor ! [slice|1::2|]` in hasktorch.

Generate a lens from a Python-compatible indexing expression. Where you would take the odd-numbered elements of a tensor with `tensor[1::2]` in Python, you can write `tensor ^. [lslice|1::2|]` in hasktorch; and where you would set the odd-numbered elements of the tensor to 2, you can write `tensor & [lslice|1::2|] ~. 2`.
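A small example of the slice quasiquoter, based on the expressions quoted above (the QuasiQuotes extension is required; the quasiquoter is assumed here to be importable from Torch.Index, which may differ by version):

    {-# LANGUAGE QuasiQuotes #-}
    import Torch
    import Torch.Index (slice)  -- assumed home module of the quasiquoter

    main :: IO ()
    main = do
      let t = asTensor ([0, 1, 2, 3, 4, 5] :: [Float])
      -- Python: t[1::2]  ->  hasktorch: t ! [slice|1::2|]
      print (t ! [slice|1::2|])   -- elements at odd indices: 1, 3, 5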
Reductions and element-wise arithmetic:

Returns the mean value of all elements in the input tensor.
Returns the standard deviation of all elements in the input tensor.
Returns the variance of all elements in the input tensor.
Returns the sum of all elements in the input tensor.
Computes the element-wise absolute value of the given input tensor.
Computes the fractional portion of each element in input: out_i = input_i - floor(|input_i|) * sign(input_i).
Returns the indices of the maximum value of all elements in the input tensor.
Adds each element of the tensor other to each element of the tensor input and returns a new resulting tensor.
Multiplies each element of the tensor other with each element of the input tensor and returns a new resulting tensor.
Element-wise subtraction of the other tensor from the input tensor, returning a new resulting tensor.
Element-wise division of the input tensor by the other tensor, returning a new resulting tensor.
Element-wise ceiling, floor, minimum, maximum, and median are also provided.
Adds a scalar to each element of the input and returns a new resulting tensor.
Subtracts a scalar from each element of the input and returns a new resulting tensor.
Multiplies each element of the input by a scalar and returns a new resulting tensor.
Divides each element of the input by a scalar and returns a new resulting tensor.
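A short sketch of these reductions and arithmetic operations, assuming the usual names mean, sumAll, and mulScalar, plus the Num instance on Tensor for element-wise (+) and (*):

    import Torch

    main :: IO ()
    main = do
      let t = asTensor ([1, 2, 3, 4] :: [Float])
          u = asTensor ([10, 20, 30, 40] :: [Float])
      print (mean t)                    -- 2.5
      print (sumAll t)                  -- 10.0
      print (t + u)                     -- element-wise addition
      print (t * u)                     -- element-wise multiplication
      print (mulScalar (2 :: Float) t)  -- scalar first, tensor last (verify argument order for your version)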
If the first argument is 1-dimensional, a 1 is prepended to its dimension for the purpose of the batched matrix multiply and removed after. If the second argument is 1-dimensional, a 1 is appended to its dimension for the purpose of the batched matrix multiple and removed after. The non-matrix (i.e. batch) dimensions are broadcasted (and thus must be broadcastable). For example, if input is a (j times 1 times n times m)(j1nm) tensor and other is a (k times m times p)(kmp) tensor, out will be an (j times k times n times p)(jknp) tensor.  hasktorchA simple lookup table that looks up embeddings in a fixed dictionary and size. This module is often used to retrieve word embeddings using indices. The input to the module is a list of indices, and the embedding matrix, and the output is the corresponding word embeddings.  hasktorchA one hot encoding of the given input. The encoding is based on the given number of classes.  hasktorch+Computes the error function of each element  hasktorchComputes the complementary error function of each element of input  hasktorchComputes the inverse error function of each element of input. The inverse error function is defined in the range (-1, 1)(D1,1) as: erfinv(erf(x)) = x  hasktorch6Computes the logarithm of the gamma function on input.  hasktorchComputes the logarithmic derivative of the gamma function on input.  hasktorchComputes the nth derivative of the digamma function on input. n geq 0nD0 is called the order of the polygamma function.  hasktorchComputes the multivariate log-gamma function with dimension pp element-wise. All elements must be greater than (p-1)/2, otherwise an error would be thrown.  hasktorchReturns a new tensor with the exponential of the elements of the input tensor input.  hasktorch?Returns a new tensor with the natural logarithm of (1 + input).  hasktorchReturns a new tensor with the logarithm to the base 2 of the elements of input.  hasktorchReturns a new tensor with the natural logarithm of the elements of input.  hasktorchReturns a new tensor with the logarithm to the base 10 of the elements of input.  hasktorchTakes the power of each element in input with exponent and returns a tensor with the result.  hasktorchTakes the power of each element in input with exponent and returns a tensor with the result. Exponent is a tensor with the same number of elements as input.  hasktorch8Applies the rectified linear unit function element-wise.  hasktorchApplies Exponential linear unit function element-wise, with alpha input, ;\text{ELU}(x) = \max(0,x) + \min(0, \alpha * (\exp(x) - 1))  hasktorchApplies exponential linear unit function element wise with default alpha value = 1  hasktorchApplies element-wise, \text{SELU}(x) = scale * (\max(0,x) + \min(0, \alpha * (\exp(x) - 1)) , with =1.6732632423543772848170429916717 and scale=1.0507009873554804934193349852946.  hasktorchApplies element-wise, \text{CELU}(x) = \max(0,x) + \min(0, \alpha * (\exp(x/\alpha) - 1)).  hasktorch*Applies the element-wise function sigmoid.  hasktorchApplies a softmax function. It is applied to all slices along dim, and will re-scale them so that the elements lie in the range [0, 1] and sum to 1.  hasktorchApplies a softmax followed by a logarithm. While mathematically equivalent to log(softmax(x)), doing these two operations separately is slower, and numerically unstable. This function uses an alternative formulation to compute the output and gradient correctly.  hasktorch,Thresholds each element of the input Tensor.  hasktorch other element-wise. 
The second argument can be a number or a tensor whose shape is broadcastable with the first argument.  hasktorchComputes input < other element-wise. The second argument can be a number or a tensor whose shape is broadcastable with the first argument.  hasktorchComputes input >= other element-wise. The second argument can be a number or a tensor whose shape is broadcastable with the first argument.  hasktorchComputes input <= other element-wise. The second argument can be a number or a tensor whose shape is broadcastable with the first argument.  hasktorchComputes input == other element-wise. The second argument can be a number or a tensor whose shape is broadcastable with the first argument.  hasktorchReturns a new tensor with the elements of input at the given indices. The input tensor is treated as if it were viewed as a 1-D tensor. The result takes the same shape as the indices.  hasktorchReturns a new 1-D tensor which indexes the input tensor according to the boolean mask mask which is a BoolTensor. The shapes of the mask tensor and the input tensor don@t need to match, but they must be broadcastable.  hasktorchReturns a tuple of 1-D tensors, one for each dimension in input, each containing the indices (in that dimension) of all non-zero elements of input .  hasktorchComputes input /= other element-wise. The second argument can be a number or a tensor whose shape is broadcastable with the first argument.  hasktorchCasting to given Dtype, where Dtype is an object that represents the data type of a tensor in hasktorch.  hasktorch squeezeAll  hasktorch squeezeDim  hasktorchReturns a tuple (values, indices) where values is the cumulative maximum of elements of input in the dimension dim. And indices is the index location of each maximum value found in the dimension dim.  hasktorchReturns a tuple (values, indices) where values is the cumulative minimum of elements of input in the dimension dim. And indices is the index location of each maximum value found in the dimension dim.  hasktorchReturns the cumulative product of elements of input in the dimension dim. For example, if input is a vector of size N, the result will also be a vector of size N, with elements.  hasktorchReturns the cumulative sum of elements of input in the dimension dim. For example, if input is a vector of size N, the result will also be a vector of size N, with elements.  hasktorchFunction that measures the Binary Cross Entropy between the target and the output.  hasktorchBinary Cross Entropy with weights defaulted to 1.0 & reduction defaulted to ReduceMean  hasktorchThis loss combines a Sigmoid layer and the BCELoss in one single class. This version is more numerically stable than using a plain Sigmoid followed by a BCELoss as, by combining the operations into one layer, we take advantage of the log-sum-exp trick for numerical stability.  hasktorchCreates a criterion that measures the mean squared error (squared L2 norm) between each element in the input and target.  hasktorch!The negative log likelihood loss.  hasktorchReturns cosine similarity between x1 and x2, computed along dim.  hasktorch1Returns cosine similarity with defaulted options.  hasktorchThe Connectionist Temporal Classification loss. Calculates loss between a continuous (unsegmented) time series and a target sequence. CTCLoss sums over the probability of possible alignments of input to target, producing a loss value which is differentiable with respect to each input node. 
The alignment of input to target is assumed to be @many-to-one@, which limits the length of the target sequence such that it must be leqD the input length.  hasktorch(Returns CTC loss with defaulted options.  hasktorchReturns the p-norm of (input - other) The shapes of input and other must be broadcastable.  hasktorchMeasures the loss given an input tensor xx and a labels tensor yy (containing 1 or -1). This is usually used for measuring whether two inputs are similar or dissimilar, e.g. using the L1 pairwise distance as xx, and is typically used for learning nonlinear embeddings or semi-supervised learning.  hasktorch#The 2D negative log likelihood loss  hasktorchCreates a criterion that optimizes a multi-class classification hinge loss (margin-based loss) between input x% (a 2D mini-batch Tensor) and output y/ (which is a 1D tensor of target class indices)  hasktorchCreates a criterion that optimizes a multi-label one-versus-all loss based on max-entropy, between input x and target y of size (N,C) .  hasktorchThe Kullback-Leibler divergence Loss KL divergence is a useful distance measure for continuous distributions and is often useful when performing direct regression over the space of (discretely sampled) continuous output distributions. As with NLLLoss, the input given is expected to contain log-probabilities and is not restricted to a 2D Tensor. The targets are interpreted as probabilities by default, but could be considered as log-probabilities with log_target set to True. This criterion expects a target Tensor of the same size as the input Tensor.  hasktorchCreates a criterion that uses a squared term if the absolute element-wise error falls below 1 and an L1 term otherwise. It is less sensitive to outliers than the MSELoss and in some cases prevents exploding gradients (e.g. see Fast R-CNN paper by Ross Girshick). Also known as the Huber loss.  hasktorchCreates a criterion that optimizes a two-class classification logistic loss between input tensor x and target tensor y (containing 1 or -1).  hasktorchApplies a 1D adaptive max pooling over an input signal composed of several input planes.  hasktorchApplies a 2D adaptive max pooling over an input signal composed of several input planes.  hasktorchApplies a 3D adaptive max pooling over an input signal composed of several input planes  hasktorchmaxPool1dWithIndices  hasktorchApplies a 1D max pooling over an input signal composed of several input planes.  hasktorchApplies a 2D max pooling over an input signal composed of several input planes.  hasktorchApplies a 3D max pooling over an input signal composed of several input planes.  hasktorchCalculates resulting dimensions from a 2d maxpool operation see https://pytorch.org/docs/master/generated/torch.nn.MaxPool2d.html#torch.nn.MaxPool2d  hasktorchApplies a 1D average pooling over an input signal composed of several input planes.  hasktorchApplies a 1D adaptive average pooling over an input signal composed of several input planes.  hasktorchApplies a 2D adaptive average pooling over an input signal composed of several input planes.  hasktorchApplies a 3D adaptive average pooling over an input signal composed of several input planes.  hasktorch.Takes the inverse of the square matrix input. input can be batches of 2D square tensors, in which case this function would return a tensor composed of individual inverses.  
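The max-pooling size rule referenced above (see the MaxPool2d link) can be checked with a small standalone helper. The following is a minimal sketch of the standard floor-mode formula, H_out = floor((H_in + 2*padding - dilation*(kernel - 1) - 1) / stride) + 1; the function name and tuple layout here are hypothetical and are not part of the hasktorch API.

-- Illustrative only: recomputes the spatial output size of a 2D max pool
-- in floor (non-ceil) mode. Not the library's own dimension helper.
maxPool2dOutputShape ::
  (Int, Int) -> -- kernel size (kH, kW)
  (Int, Int) -> -- stride (sH, sW)
  (Int, Int) -> -- padding (pH, pW)
  (Int, Int) -> -- dilation (dH, dW)
  (Int, Int) -> -- input spatial size (H, W)
  (Int, Int)    -- output spatial size
maxPool2dOutputShape (kH, kW) (sH, sW) (pH, pW) (dH, dW) (h, w) =
  (out h kH sH pH dH, out w kW sW pW dW)
  where
    out n k s p d = (n + 2 * p - d * (k - 1) - 1) `div` s + 1

-- e.g. maxPool2dOutputShape (3,3) (2,2) (1,1) (1,1) (224,224) == (112,112)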
hasktorchSolves a system of equations with a triangular coefficient matrix AA and multiple right-hand sides bb  hasktorchThis function returns eigenvalues and eigenvectors of a real symmetric matrix input or a batch of real symmetric matrices, represented by a namedtuple (eigenvalues, eigenvectors).  hasktorchComputes the eigenvalues and eigenvectors of a real square matrix  hasktorchThis function returns a namedtuple (U, S, V) which is the singular value decomposition of a input real matrix or batches of real matrices input such that input = U * diag(S) * V^T  hasktorchComputes the Cholesky decomposition of a symmetric positive-definite matrix AA or for batches of symmetric positive-definite matrices.  hasktorchSolves a linear system of equations with a positive semidefinite matrix to be inverted given its Cholesky factor matrix uu .  hasktorchThis function returns the solution to the system of linear equations represented by AX = BAX=B and the LU factorization of A, in order as a namedtuple solution.  hasktorchSolves a linear system of equations with a positive semidefinite matrix to be inverted given its Cholesky factor matrix uu .  hasktorchThis is a low-level function for calling LAPACK directly. This function returns a namedtuple (a, tau) as defined in LAPACK documentation for geqrf.  hasktorchComputes the orthogonal matrix Q of a QR factorization, from the (input, input2) tuple returned by  ? function. This directly calls the underlying LAPACK function ?orgqr. See LAPACK documentation for orgqr for further details.  hasktorchMultiplies mat (given by input3) by the orthogonal Q matrix of the QR factorization formed by torch.geqrf() that is represented by (a, tau) (given by (input, input2)). This directly calls the underlying LAPACK function ?ormqr. See LAPACK documentation for ormqr for further details.  hasktorchReturns the LU solve of the linear system Ax = bAx=b using the partially pivoted LU factorization of A from torch.lu().  hasktorchDuring training, randomly zeroes some of the elements of the input tensor with probability p using samples from a Bernoulli distribution.  hasktorch#Applies alpha dropout to the input.  hasktorchComputes the bitwise NOT of the given input tensor. The input tensor must be of integral or Boolean types. For bool tensors, it computes the logical NOT.  hasktorchComputes the element-wise logical NOT of the given input tensor. If not specified, the output tensor will have the bool dtype. If the input tensor is not a bool tensor, zeros are treated as False and non-zeros are treated as True.  hasktorchConcatenates the given sequence of seq tensors in the given dimension. All tensors must either have the same shape (except in the concatenating dimension) or be empty.  hasktorchPuts values from the tensor value into the input tensor (out-of-place) using the indices specified in indices (which is a tuple of Tensors). The expression tensor.index_put_(indices, value) is equivalent to tensor[indices] = value. If accumulate is True, the elements in value are added to self. If accumulate is False, the behavior is undefined if indices contain duplicate elements.  hasktorchSplits a tensor into a specific number of chunks. Last chunk will be smaller if the tensor size along the given dimension dim is not divisible by chunks.  hasktorchClamp all elements in input into the range [ min, max ] and return a resulting tensor.  hasktorch8Clamps all elements in input to be smaller or equal max.  hasktorch7Clamps all elements in input to be larger or equal min.  
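The concatenation, chunking and clamping operations described above compose naturally. The sketch below assumes the usual top-level Torch re-exports (asTensor, cat, chunk, clamp, shape, Dim) and that clamp takes the minimum, then the maximum, then the input tensor; consult Torch.Functional for the exact signatures.

-- Sketch only: argument order and module layout are assumptions.
import Torch

main :: IO ()
main = do
  let a = asTensor ([1, -2, 3, -4] :: [Float])
      b = asTensor ([5, 6, 7, 8] :: [Float])
      ab = cat (Dim 0) [a, b]          -- concatenate along dimension 0
      halves = chunk 2 (Dim 0) ab      -- split back into two chunks
      clamped = clamp (-1) 1 a         -- clamp every element into [-1, 1]
  print (shape ab)                     -- [8]
  print (map shape halves)             -- [[4],[4]]
  print clamped                        -- values outside [-1, 1] are saturated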
hasktorch7Pads the input tensor boundaries with a constant value.  hasktorchApplies a 1D convolution over an input signal composed of several input planes.  hasktorchApplies a 2D convolution over an input signal composed of several input planes.  hasktorchApplies a 3D convolution over an input signal composed of several input planes.  hasktorchApplies a 1D transposed convolution over an input signal composed of several input planes  hasktorchApplies a 2D transposed convolution over an input signal composed of several input planes  hasktorchApplies a 3D transposed convolution over an input signal composed of several input planes  hasktorch7Returns a new tensor with the signs of the elements of input  hasktorch1Returns a tensor that is a transposed version of input. The given dimensions dim0 and dim1 are swapped.  hasktorch&transpose special case for a 2D tensor  hasktorchReturns a tensor with the elements of input as the diagonal. The second argument controls which diagonal to consider: If Int = 0, it is the main diagonal. If Int > 0, it is above the main diagonal. If Int < 0, it is below the main diagonal.  hasktorchIf input is a vector (1-D tensor), then returns a 2-D square tensor with the elements of input as the diagonal. If input is a tensor with more than one dimension, then returns a 2-D tensor with diagonal elements equal to a flattened input. The argument offset controls which diagonal to consider: If offset = 0, it is the main diagonal. If offset > 0, it is above the main diagonal. If offset < 0, it is below the main diagonal.  hasktorchReturns a partial view of input with the its diagonal elements with respect to dim1 and dim2 appended as a dimension at the end of the shape. Applying diagEmbed to the output of this function with the same arguments yields a diagonal matrix with the diagonal entries of the input. However, diagEmbed has different default dimensions, so those need to be explicitly specified.  hasktorchReturns True if all elements in the tensor are True, False otherwise.  hasktorchReturns True if any elements in the tensor are True, False otherwise.  hasktorchReturns True if all elements in each row of the tensor in the given dimension dim are True, False otherwise. If keepdim is True, the output tensor is of the same size as input except in the dimension dim where it is of size 1. Otherwise, dim is squeezed, resulting in the output tensor having 1 fewer dimension than input.  hasktorchReturns True if any elements in each row of the tensor in the given dimension dim are True, False otherwise. If keepdim is True, the output tensor is of the same size as input except in the dimension dim where it is of size 1. Otherwise, dim is squeezed, resulting in the output tensor having 1 fewer dimension than input.  hasktorch&Permute the dimensions of this tensor.  hasktorch"expand TODO: figure out what the implicit boolean value does  hasktorchflatten  hasktorch flattenAll  hasktorch%A long short-term memory (LSTM) cell.  hasktorch!A gated recurrent unit (GRU) cell  hasktorch)An Elman RNN cell with tanh non-linearity  hasktorch)An Elman RNN cell with ReLU non-linearity  hasktorch/A quantized long short-term memory (LSTM) cell.  hasktorch1A quantized long gated recurrent unit (GRU) cell.  hasktorch2A quantized Elman RNN cell with relu non-linearity  hasktorch2A quantized Elman RNN cell with tanh non-linearity  hasktorch/Applies the soft shrinkage function elementwise  hasktorchConcatenates sequence of tensors along a new dimension. All tensors need to be of the same size.  
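The stacking and transposition entries above are easiest to follow by watching shapes. A minimal sketch follows, assuming ones', zeros', stack, transpose, flattenAll and shape are re-exported from Torch with the argument order shown; the exact signatures should be checked against Torch.Functional.

-- Sketch only: shape bookkeeping for stack, transpose and flattenAll.
import Torch

main :: IO ()
main = do
  let a = ones' [2, 3]                          -- 2x3 tensor of ones (default options)
      b = zeros' [2, 3]                         -- 2x3 tensor of zeros
      stacked = stack (Dim 0) [a, b]            -- new leading dimension: shape [2,2,3]
      swapped = transpose (Dim 1) (Dim 2) stacked  -- dims 1 and 2 swapped: [2,3,2]
  print (shape stacked)
  print (shape swapped)
  print (shape (flattenAll swapped))            -- [12]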
hasktorchReturns the sum of each row of the input tensor in the given dimension dim. If keepdim is True, the output tensor is of the same size as input except in the dimension(s) dim where it is of size 1. Otherwise, dim is squeezed, resulting in the output tensor having 1 (or len(dim)) fewer dimension(s).  hasktorchReturns the k largest elements of the given input tensor along a given dimension. If largest is False then the k smallest elements are returned. The boolean option sorted if True, will make sure that the returned k elements are themselves sorted A tuple of (values, indices) is returned, where the indices are the indices of the elements in the original input tensor.  hasktorchReturns the log of summed exponentials of each row of the input tensor in the given dimension dim. The computation is numerically stabilized.  hasktorchReturns the upper triangular part of a matrix (2-D tensor) or batch of matrices input, the other elements of the result tensor out are set to 0. The upper triangular part of the matrix is defined as the elements on and above the diagonal. The argument diagonal controls which diagonal to consider. If diagonal = 0, all elements on and above the main diagonal are retained. A positive value excludes just as many diagonals above the main diagonal, and similarly a negative value includes just as many diagonals below the main diagonal. The main diagonal are the set of indices (i,i) for i \in [0,\min(d_1,d_2)-1] where d_1 and d_2 " are the dimensions of the matrix.  hasktorchReturns the lower triangular part of the matrix (2-D tensor) or batch of matrices input, the other elements of the result tensor out are set to 0. The lower triangular part of the matrix is defined as the elements on and below the diagonal. The argument diagonal controls which diagonal to consider. If diagonal = 0, all elements on and below the main diagonal are retained. A positive value includes just as many diagonals above the main diagonal, and similarly a negative value excludes just as many diagonals below the main diagonal. The main diagonals are the set of indices (i,i) for i \in [0,\min(d_1,d_2)-1] where d_1 and d_2 " are the dimensions of the matrix.  hasktorchReturns a new tensor with the truncated integer values of the elements of input.  hasktorchReturns the unique elements of the input tensor along a dimension.  hasktorchEliminates all but the first element from every consecutive group of equivalent elements. This function is different from uniqueDim in the sense that this function only eliminates consecutive duplicate values.  hasktorchEliminates all but the first element from every consecutive group of equivalent elements along a dimension. This function is different from uniqueDim in the sense that this function only eliminates consecutive duplicate values.  hasktorchReturns a new tensor with a dimension of size one inserted at the specified position. The returned tensor shares the same underlying data with this tensor. A dim value within the range [(dim input) - 1, (dim input) + 1)] can be used. Negative dim will correspond to unsqueeze applied at dim = dim + (dim input) + 1  hasktorchUpsamples the input, using bilinear upsampling. Expected inputs are spatial (4 dimensional).  hasktorchApplies a 2D nearest neighbor upsampling to an input signal composed of several input channels.  hasktorch8Splits the tensor into chunks of given size if possible.  hasktorchCreates a criterion that measures the mean absolute error (MAE) between each element in the input x and target y .  
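The logsumexp entry above notes that the computation is numerically stabilized. The sketch below shows the standard shift-by-the-maximum trick that such a stabilization refers to, written as plain Haskell over a non-empty list; it illustrates the formula and is not hasktorch's implementation.

-- log (sum (exp x_i)) computed without overflowing exp, by factoring out
-- the maximum element m: logsumexp xs = m + log (sum (exp (x_i - m))).
logSumExpStable :: [Double] -> Double
logSumExpStable xs = m + log (sum [exp (x - m) | x <- xs])
  where
    m = maximum xs  -- requires a non-empty list

-- e.g. logSumExpStable [1000, 1000] == 1000 + log 2, whereas the naive
-- log (sum (map exp [1000, 1000])) overflows to Infinity in Double.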
hasktorch$Applies the element-wise function: \text{LeakyReLU}(x) = \max(0,x) + \text{negative_slope} D \min(0,x)  hasktorch$Applies the element-wise function: 6\text{LogSigmoid}(x) = \log(\frac{ 1 }{ 1 + \exp(-x)})  hasktorchReturns a namedtuple (values, indices) where values is the maximum value of each row of the input tensor in the given dimension dim. And indices is the index location of each maximum value found (argmax). If keepdim is True, the output tensors are of the same size as input except in the dimension dim where they are of size 1. Otherwise, dim is squeezed , resulting in the output tensors having 1 fewer dimension than input.  hasktorchReturns a namedtuple (values, indices) where values is the minimum value of each row of the input tensor in the given dimension dim. And indices is the index location of each minimum value found (argmin). If keepdim is True, the output tensors are of the same size as input except in the dimension dim where they are of size 1. Otherwise, dim is squeezed, resulting in the output tensors having 1 fewer dimension than input.  hasktorchReturns the mean value of each row of the input tensor in the given dimension dim. If dim is a list of dimensions, reduce over all of them. If keepdim is True, the output tensor is of the same size as input except in the dimension(s) dim where it is of size 1. Otherwise, dim is squeezed (see torch.squeeze()), resulting in the output tensor having 1 (or len(dim)) fewer dimension(s).  hasktorchReturns a namedtuple (values, indices) where values is the median value of each row of the input tensor in the given dimension dim. And indices is the index location of each median value found. By default, dim is the last dimension of the input tensor. If keepdim is True, the output tensors are of the same size as input except in the dimension dim where they are of size 1. Otherwise, dim is squeezed (see torch.squeeze()), resulting in the outputs tensor having 1 fewer dimension than input.  hasktorchReturns the matrix product of the NN 2-D tensors. This product is efficiently computed using the matrix chain order algorithm which selects the order in which incurs the lowest cost in terms of arithmetic operations. Note that since this is a function to compute the product, NN needs to be greater than or equal to 2; if equal to 2 then a trivial matrix-matrix product is returned. If NN is 1, then this is a no-op - the original matrix is returned as is.  hasktorch"Applies element-wise the function \text{GELU}(x) = x * \Phi(x) where \Phi(x) is the Cumulative Distribution Function for Gaussian Distribution.  hasktorch"The gated linear unit. Computes: &\text{GLU}(a, b) = a \otimes \sigma(b): where input is split in half along dim to form a and b, \sigma is the sigmoid function and \otimes. is the element-wise product between matrices.  hasktorchReturns the standard-deviation and mean of all elements in the input tensor. If unbiased is False, then the standard-deviation will be calculated via the biased estimator. Otherwise, Bessel@s correction will be used.  hasktorchReturns the standard-deviation and mean of each row of the input tensor in the dimension dim. If dim is a list of dimensions, reduce over all of them. If keepdim is True, the output tensor is of the same size as input except in the dimension(s) dim where it is of size 1. Otherwise, dim is squeezed, resulting in the output tensor having 1 (or len(dim)) fewer dimension(s). If unbiased is False, then the standard-deviation will be calculated via the biased estimator. 
Otherwise, Bessel's correction will be used.  hasktorchReturns a copy of input. The output tensor keeps the computational graph and the requires_grad value of the input tensor. https://discuss.pytorch.org/t/clone-and-detach-in-v0-4-0/16861/41  hasktorchReturns a copy of input. The output tensor does not keep the computational graph or the requires_grad value of the input tensor.  hasktorchReturns a new tensor with the same data as the input tensor but of a different shape.  hasktorchRepeats this tensor along the specified dimensions.
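A short end-to-end sketch tying several of the documented operations together: tensor creation, reshape, matmul (including the batched broadcasting example given in the matmul entry), an activation and whole-tensor reductions. Function names follow the docs above, but the module layout and exact signatures are assumptions to be checked against Torch.Functional.

-- Sketch only: assumes reshape, matmul, relu, mean, sumAll, ones', shape
-- and asTensor are re-exported from the top-level Torch module.
import Torch

main :: IO ()
main = do
  let x = reshape [2, 3] (asTensor ([1 .. 6] :: [Float]))  -- 2x3 matrix
      w = ones' [3, 4]                                     -- 3x4 weights (default options)
      y = relu (x `matmul` w)                              -- 2x4 matrix-matrix product
      -- Batched broadcasting example from the matmul documentation:
      -- (j=2, 1, n=3, m=4) matmul (k=5, m=4, p=2) gives (j, k, n, p).
      lhs = ones' [2, 1, 3, 4]
      rhs = ones' [5, 4, 2]
  print (shape y)                    -- [2,4]
  print (mean y)                     -- mean over all elements
  print (sumAll y)                   -- sum over all elements
  print (shape (lhs `matmul` rhs))   -- [2,5,3,2]: batch dimensions broadcast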