# mytorch

Easily extensible autograd implemented in Python with a pytorch-like API. Uses numpy to do the heavy lifting. The implementation is very similar to pytorch's (graph-based reverse-mode autodiff; a toy sketch of the idea is at the end of this README). It wouldn't be too tough to extend the autograd, implement `torch.nn`, and possibly run on GPU (presumably with CuPy or Numba). It would be an interesting (but useless) endeavor to rewrite mytorch in a low-level language using BLAS library calls instead of numpy, just like pytorch.

## Examples

mytorch supports the computation of arbitrarily high derivatives for both scalars and non-scalars. Both `torch.autograd.backward` and `torch.autograd.grad` are supported.

```python
import mytorch as torch

a = torch.tensor(3., dtype=torch.float32, requires_grad=True)
b = torch.tensor(10., dtype=torch.float32, requires_grad=True)
c = 2 + (a + b ** 2) / (a + b + a * b)

print("a =", a)
print("b =", b)
print("c = 2 + (a + b ** 2) / (a + b + a * b) =", c)

# NOTE: You could also use c.backward() to accumulate the gradients in a.grad and b.grad
dc_da, dc_db = torch.autograd.grad(c, [a, b])

# NOTE: To get higher-order derivatives like below, pytorch would require ∂c/∂a and
# ∂c/∂b to be calculated with create_graph=True; mytorch does not require it
d2c_da2 = torch.autograd.grad(dc_da, [a])[0]
d2c_db2 = torch.autograd.grad(dc_db, [b])[0]

print(f"∂c/∂a = {dc_da}")
print(f"∂c/∂b = {dc_db}")
print(f"∂²c/∂a² = {d2c_da2}")
print(f"∂²c/∂b² = {d2c_db2}")
```

Output:

```
a = tensor(3.0, requires_grad=True)
b = tensor(10.0, requires_grad=True)
c = 2 + (a + b ** 2) / (a + b + a * b) = tensor(4.395348787307739, requires_grad=True)
∂c/∂a = tensor(-0.5895078420767982, requires_grad=True)
∂c/∂b = tensor(0.24229313142239048, requires_grad=True)
∂²c/∂a² = tensor(0.3016086633881293, requires_grad=True)
∂²c/∂b² = tensor(0.0014338360144389717, requires_grad=True)
```

Here is a non-scalar example.
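The sketch below assumes mytorch mirrors PyTorch's `grad_outputs` argument to `torch.autograd.grad` for non-scalar outputs; that argument, and the list-of-floats constructor, are assumptions here, not confirmed parts of mytorch's API:

```python
import mytorch as torch

# Elementwise function of a 2-element tensor (list input is assumed to work,
# since mytorch wraps numpy arrays)
x = torch.tensor([1.0, 2.0], dtype=torch.float32, requires_grad=True)
y = x ** 3 + 2 * x  # non-scalar output

# For non-scalar outputs, PyTorch's torch.autograd.grad takes grad_outputs
# (the vector in the vector-Jacobian product); assumed to hold for mytorch too
ones = torch.tensor([1.0, 1.0], dtype=torch.float32)
dy_dx = torch.autograd.grad(y, [x], grad_outputs=ones)[0]

print(dy_dx)  # elementwise derivative 3*x**2 + 2, i.e. [5.0, 14.0]
```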
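To give a sense of what "graph-based reverse-mode autodiff" means here, below is a self-contained toy version (illustrative only, not mytorch's actual code): each node stores its value, its parents in the graph, and one closure per parent that maps the node's gradient to that parent's gradient contribution.

```python
import numpy as np

class Var:
    """Toy reverse-mode autodiff node: a value, its parents in the graph,
    and one vector-Jacobian closure per parent."""
    def __init__(self, value, parents=(), vjps=()):
        self.value = np.asarray(value, dtype=np.float64)
        self.grad = np.zeros_like(self.value)
        self.parents = parents  # upstream nodes this node was computed from
        self.vjps = vjps        # maps this node's grad to each parent's grad

    def __add__(self, other):
        return Var(self.value + other.value, (self, other),
                   (lambda g: g, lambda g: g))

    def __mul__(self, other):
        return Var(self.value * other.value, (self, other),
                   (lambda g: g * other.value, lambda g: g * self.value))

    def backward(self):
        # Topologically sort the graph, then push gradients from output to inputs
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v.parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = np.ones_like(self.value)  # d(output)/d(output) = 1
        for v in reversed(order):
            for parent, vjp in zip(v.parents, v.vjps):
                parent.grad = parent.grad + vjp(v.grad)

a, b = Var(3.0), Var(10.0)
c = a * b + a     # dc/da = b + 1 = 11, dc/db = a = 3
c.backward()
print(a.grad, b.grad)  # 11.0 3.0
```

mytorch builds this kind of graph on every operation, which is presumably why the higher-order derivatives above work without `create_graph=True`: as the printed output shows, the result of `torch.autograd.grad` is itself a tensor with `requires_grad=True`.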