Multiple dispatch can be thought of as a generalization of object-oriented (OO) programming.
In a typical OO language like Python, an object type (class) owns certain methods (functions), which are typically called via
object.method(arg1, arg2)
Depending on the type of object, the runtime system will dispatch to different method definitions.
In Julia, the same call would be "spelled" differently:
method(object, arg1, arg2)
Spelled this way, you should notice something odd about OO programming: why is the first argument so special?
Traditional OO programming corresponds to single dispatch: the runtime chooses a method based on the type of the first argument only. Julia implements multiple dispatch: the runtime chooses a method based on the types of all of the arguments.
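A minimal sketch (the function f and its methods are invented for illustration):

```julia
# each method handles a different combination of argument types
f(x::Integer, y::Integer) = "two integers"
f(x::Integer, y::AbstractFloat) = "an integer and a float"
f(x, y) = "anything else"   # fallback for all other combinations

f(1, 2)      # → "two integers"
f(1, 2.5)    # → "an integer and a float"
f("a", :b)   # → "anything else"
```

The call f(1, 2.5) could not be resolved by looking at the first argument alone; both types matter.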
A classic example of the need for multiple dispatch is binary math operators. If you compute x * y, the definition of the * function depends on both arguments, not just on x.
Julia defines many versions of the * function:
methods(*)
We can add new methods to a given function at any time. The methods don't "belong" to a particular type, and aren't part of the type's definition.
For example, string concatenation in Julia is done via *:
"hello" * "world"
"helloworld"
"hello" + "world"
`+` has no method matching +(::ASCIIString, ::ASCIIString) while loading In[3], in expression starting on line 1
But we can easily extend + to support concatenation of strings if we want:
import Base.+ # we must import a function in order to add methods to it (as opposed to replacing it)
+(x::String, y::String) = x * " " * y
+ (generic function with 147 methods)
"hello" + "world"
"hello world"
This may look a lot like function overloading in languages like C++. The difference is that C++'s overloading is static (= dispatch at compile-time), whereas Julia's overloading is dynamic (= dispatch at run-time), like OO polymorphism.
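A small sketch of the difference (double_it is an invented name): the same call site picks different methods of + at run time, based on the actual types flowing through it.

```julia
double_it(x) = x + x     # no type declarations at all

items = Any[1, 2.5]              # element types known only at run time
[double_it(x) for x in items]    # each call dispatches on the element's actual type
```

A C++ overload set would have to resolve the call from the declared (static) type of x; Julia resolves it from the value actually passed.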
For example, now that we've defined + for strings, we can use strings with any previously defined function that requires a + operation, like sum (summation):
sum(["The", "quick", "brown", "fox", "jumped", "over", "the", "lazy", "dog."])
"The quick brown fox jumped over the lazy dog."
Type declarations are not required for performance: Julia automatically specializes a function on its argument types during compilation. Instead, type declarations act like filters, allowing us to specify which methods are used when.
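To see the specialization at work, here is a one-line untyped function (square is an invented name); Julia compiles a separate specialized version for each argument type it is called with:

```julia
square(x) = x * x   # no type declarations

square(3)      # → 9      (specialized for Int)
square(2.5)    # → 6.25   (specialized for Float64)
square("ab")   # → "abab" (string * is concatenation)
```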
Without this, in a language like Python, you sometimes have to write manual function filters like this example from Matplotlib's quiver.py:
def _parse_args(*args):
    X, Y, U, V, C = [None] * 5
    args = list(args)
    # The use of atleast_1d allows for handling scalar arguments while also
    # keeping masked arrays
    if len(args) == 3 or len(args) == 5:
        C = np.atleast_1d(args.pop(-1))
    V = np.atleast_1d(args.pop(-1))
    U = np.atleast_1d(args.pop(-1))
    if U.ndim == 1:
        nr, nc = 1, U.shape[0]
    else:
        nr, nc = U.shape
    if len(args) == 2:  # remaining after removing U,V,C
        X, Y = [np.array(a).ravel() for a in args]
        if len(X) == nc and len(Y) == nr:
            X, Y = [a.ravel() for a in np.meshgrid(X, Y)]
    else:
        indexgrid = np.meshgrid(np.arange(nc), np.arange(nr))
        X, Y = [np.ravel(a) for a in indexgrid]
    return X, Y, U, V, C
In Julia, you could define different methods for differing numbers of arguments, arrays vs. scalars, etcetera (all eventually calling a single lower-level function to do the work once the arguments have been transformed).
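A hypothetical sketch of that structure (quiverargs and its methods are invented names; a real implementation would do the plotting work in the four-argument method):

```julia
# low-level worker: assumes all arguments are already in canonical form
function quiverargs(X::AbstractVector, Y::AbstractVector,
                    U::AbstractMatrix, V::AbstractMatrix)
    (X, Y, U, V)   # the real work would happen here
end

# no X/Y supplied: default to grid indices, then re-dispatch
quiverargs(U::AbstractMatrix, V::AbstractMatrix) =
    quiverargs(collect(1:size(U, 2)), collect(1:size(U, 1)), U, V)

# scalar U/V: promote to 1×1 matrices, then re-dispatch
quiverargs(U::Real, V::Real) =
    quiverargs(fill(float(U), 1, 1), fill(float(V), 1, 1))
```

Each convenience method normalizes one calling pattern and forwards to the next; the if/else chains of the Python version become entries in the method table.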