This is the final report for the GSoC 2021 project, Gradually Typed Hasktorch. Gradually Typed Hasktorch is a new tensor API for Hasktorch. It was initiated by my mentor, Torsten Scholak, and as part of GSoC I helped streamline this new API.
Student: Julius Marozas
Mentor: Torsten Scholak
Hasktorch has two distinct APIs for tensors: `Torch.Tensor` and `Torch.Typed.Tensor`. While the untyped version can initially be easier to use and experiment with, the typed version offers static analysis of a tensor's shape, layout, precision, and compute device. The typed version not only helps with debugging and maintainability but also offers better support for type-driven development via GHC features like typed holes. However, it is currently difficult to mix the two approaches: for example, statically specifying a tensor's embedding size while keeping the other dimensions unchecked is not possible. Gradual typing is the proposed solution that fuses the two APIs by adding the `Torch.GraduallyTyped` module. The new API allows for more granular control by letting the user choose which properties of the tensor should be given a static type. The goal of the project is to bring maturity to the gradually typed tensor API, add missing features, and experiment with new ideas.
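As an illustration of that granular control, here is a sketch built from the type constructors that appear in the doctest output later in this report; the dimension names and the embedding size of 512 are made up for this example:

```haskell
-- Illustrative only: the embedding dimension is statically checked
-- ('Size 512) while the batch dimension is left unchecked. The names
-- "batch" and "embed" and the size 512 are hypothetical.
type EmbeddedBatch = Tensor
  ('Gradient 'WithoutGradient)
  ('Layout 'Dense)
  ('Device 'CPU)
  ('DataType 'Float)
  ('Shape '[ 'Dim ('Name "batch") 'UncheckedSize
           , 'Dim ('Name "embed") ('Size 512) ])
```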
Moved Torch.GraduallyTyped to /experimental #563
In preparation for merging the gradually-typed branch into master, I moved the gradually typed modules into a new cabal project.
Fixed doctests #567
- Fixed the doctest driver on Linux; the tests still do not seem to work on macOS, though. The tests can be run like this:

```shell
cabal test hasktorch-gradually-typed:doctests
```
- Also fixed failing doctest test cases.
Implemented `select` function for slicing and added gradually typed `Index` type #568
- Added a new function `select` for slicing a tensor along a given dimension.
- Implemented the `eye` and `eyeSquare` creation functions.
This week we decided to get rid of type-class-based gradual types and use singletons instead.
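To sketch what the singleton approach looks like in miniature (self-contained toy code, not the actual Hasktorch definitions): a singleton is a unique runtime witness of a type-level value, so one function can accept both statically checked and unchecked indices:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE GADTs #-}
{-# LANGUAGE KindSignatures #-}

import Data.Proxy (Proxy (..))
import GHC.TypeLits (KnownNat, Nat, natVal)

-- A type-level index that is either statically known or unchecked.
data Index = UncheckedIndex | KnownIndex Nat

-- The singleton: each type-level 'Index' has exactly one runtime
-- witness, so pattern matching recovers the type-level information.
data SIndex (ix :: Index) where
  SUncheckedIndex :: Int -> SIndex 'UncheckedIndex
  SIndexAt        :: KnownNat n => Proxy n -> SIndex ('KnownIndex n)

-- One function serves both the checked and the unchecked case.
indexVal :: SIndex ix -> Int
indexVal (SUncheckedIndex i) = i
indexVal (SIndexAt p)        = fromIntegral (natVal p)

main :: IO ()
main = do
  print (indexVal (SUncheckedIndex 3))           -- index known only at runtime
  print (indexVal (SIndexAt (Proxy :: Proxy 1))) -- index known at compile time
```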
- Added the `SIndex` singleton type.
- Added `sFull`, `sArangeNaturals`, `sEye`, and `sEyeSquare` functions that work with singletons.
Implemented `TensorLike` type class #574
The `sToTensor` method of the `TensorLike` type class converts lists, tuples, vectors, and sized vectors to a gradually typed `Tensor`:
```haskell
>>> t <- sToTensor (SGradient SWithoutGradient) (SLayout SDense) (SDevice SCPU) ([(1, 2), (3, 4), (5, 6)] :: [(Int, Int)])
>>> t
Tensor Int64 [3,2] [[ 1,  2],
                    [ 3,  4],
                    [ 5,  6]]
>>> :type t
t :: Tensor
       ('Gradient 'WithoutGradient)
       ('Layout 'Dense)
       ('Device 'CPU)
       ('DataType 'Int64)
       ('Shape
          '[ 'Dim ('Name "*") 'UncheckedSize, 'Dim ('Name "*") ('Size 2)])
```
Notice that the first dimension of the shape of `t` is unchecked because it was converted from a Haskell list, whose length we don't know at compile time. The second dimension, however, has size 2 because we know that `(Int, Int)` always contains 2 elements.
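The distinction can be sketched with plain type-level Haskell (a toy, not the real `TensorLike` machinery): a tuple's arity is part of its type, so a type family can read it off, whereas a list's length is only observable at runtime:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE TypeFamilies #-}

import GHC.TypeLits (Nat)

-- Toy type family: the compile-time "width" contributed by one element.
-- A pair always contributes 2, a triple 3 -- the arity is in the type.
type family ElemWidth a :: Nat where
  ElemWidth (x, y)    = 2
  ElemWidth (x, y, z) = 3

-- A list's length, by contrast, can only be measured at runtime, which
-- is why the outer dimension of the converted tensor is 'UncheckedSize'.
outerLength :: [a] -> Int
outerLength = length
```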
`fromTensor` converts back to a non-tensor type:

```haskell
>>> fromTensor @[(Int, Int)] t
[(1, 2), (3, 4), (5, 6)]
```
Updated existing code to use `toTensor` #579
Fixed property tests #580
Add all, any reduction functions #581
Started working on indexing/slicing
Vacation (started July 15)
Vacation (ended July 22)
Add indexing/slicing operators #592
Added support for slicing tensors with multiple indices:
```haskell
>>> t
Tensor Int64 [2,2,3] [[[ 0,  1,  2],
                       [ 3,  4,  5]],
                      [[ 6,  7,  8],
                       [ 9, 10, 11]]]
>>> x <- t ! SIndices (SEllipsis :|: SSliceAt (SIndex @1) :|: SNil)
>>> x
Tensor Int64 [2,2] [[ 1,  4],
                    [ 7, 10]]
```
Also added an additional constructor, `SNegativeIndex`, to `SIndex` for taking negative indices.
Template Haskell quasiquoter for slicing #602
The `slice` quasiquoter lets you use slice syntax similar to Python's. The previous example from week 8 can be rewritten like this:
```haskell
>>> x <- t ! [slice|...,1|]
>>> x
Tensor Int64 [2,2] [[ 1,  4],
                    [ 7, 10]]
```
This is inspired by Junji Hashimoto's earlier work #579.
Lenses can be used to view, set, or modify part of a tensor. For example, `t & (toLens [slice|2:4|]) %~ (+2)` increments the values of `t` (at indices 2 and 3) by 2. The only missing part is figuring out how to deal with failure in view operators such as `^.`.
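One possible direction for the failure question is an affine-traversal-style accessor that returns `Maybe` on view, so an out-of-range slice yields `Nothing` instead of a runtime error. Below is a hand-rolled, self-contained sketch over plain lists (the names `Affine`, `preview'`, `over'`, and `ix'` are hypothetical, not Hasktorch code):

```haskell
-- Toy affine accessor: viewing may fail; modifying leaves the value
-- unchanged when the target is absent.
data Affine s a = Affine
  { preview' :: s -> Maybe a        -- like (^?) rather than (^.)
  , over'    :: (a -> a) -> s -> s  -- like (%~)
  }

-- Accessor for a single list index, as a stand-in for tensor slicing.
ix' :: Int -> Affine [a] a
ix' i = Affine
  { preview' = \xs -> if i >= 0 && i < length xs then Just (xs !! i) else Nothing
  , over'    = \f xs -> [if j == i then f x else x | (j, x) <- zip [0 ..] xs]
  }

main :: IO ()
main = do
  print (preview' (ix' 2) [10, 20, 30 :: Int])     -- Just 30
  print (preview' (ix' 5) [10, 20, 30 :: Int])     -- Nothing: fails safely
  print (over' (ix' 1) (+ 2) [10, 20, 30 :: Int])  -- [10,22,30]
```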
- Add a gradually typed version of the MNIST example.
- Move `Torch.GraduallyTyped` out of experimental.
- Add support for mutable network parameters to save memory while training.
- Add a more general `Checked` type for gradual types. The `Checked` type could replace gradual types like `Gradient`, `Layout`, `Device`, etc.
I am extremely grateful to my mentor Torsten Scholak who helped me throughout the summer. I would also like to express my gratitude to other Hasktorch contributors and to Google for organising Google Summer of Code.