type inference changing dimensionality
Because of return type inference, my mat gets bumped to a cube. How do I "downcast" that cube again in order to pass it to a function accepting an input of type mat?
For example:
foo = randn(2, 2)
print size(foo) --> [ 2, 2]
print type(foo) --> mat
% this is some generic function,
% working for 1D, 2D or 3D data
function [B:cube] = generic_function(A:cube)
B = A
end
bar = generic_function(foo)
print size(bar) --> [ 2, 2]
print type(bar) --> cube!
This breaks calling the following function:
% this is a more specific function,
% only working for 1D or 2D data
function [B:mat] = specific_function(A:mat)
B = A
end
_ = specific_function(bar) --> error
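Of course I could work around it with an explicit copy into a freshly allocated matrix. A sketch of such a hypothetical helper (I am assuming MATLAB-style zeros/size/slicing semantics here, and I am not sure the compiler would actually infer the output as mat):

% hypothetical helper: force a cube with a singleton third
% dimension back into a mat via an explicit copy
function [y:mat] = to_mat(x:cube)
    y = zeros(size(x, 0), size(x, 1))
    y[:, :] = x[:, :, 0]
end
_ = specific_function(to_mat(bar)) --> should no longer error

But that feels like working around the type system rather than with it.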
The context is a set of image processing functions, some of which I know are compatible with arbitrary dimensions, while others are not.
I know I could get rid of the :cube type specifier on the input and output parameters of generic_function, but that does not solve the underlying problem.
Having read the quick reference, I thought an assert(type(bar, "mat")) should fix this, but it doesn't.
I guess part of the work could be done at compile-time. For example, the compiler could check (regardless of specialization) whether an N-dimensional type is indexed with M > N indices. If so, it could assert that those superfluous indices are known (through range inference) to be zero, or at least issue a warning if the existence of those indices is not caused by specialization (catching the case where one forgets to remove an index, as you mentioned).
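Concretely, what I have in mind is something like this (a hypothetical example of what the check would flag, not current compiler behaviour):

x = randn(2, 2)  % 2D data: inferred type mat
y = x[0, 1, 0]   % 3 indices on 2D data: accept only if the
                 % compiler can prove the last index is zero,
                 % otherwise issue a compile-time warning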
But I don’t have any knowledge (yet) of the compiler internals, so I don’t know how feasible such a proposal is 🙂
Yes, that’s exactly what I was also thinking.
Checking superfluous indices at compile-time is easy. Run-time implementation of handling too many indices is more work (different cases that need to be implemented and checked for different back-ends/computation engines), but it is certainly feasible.