Computation seems to come in two flavors, yet we have been using "algorithm" as the catch-all word for any procedure a Turing machine can simulate. The word "protocol" only showed up naturally when we covered network computation during my academic CS training; we never used it in any other context.
I think this is a mistake.
Let's reserve the word "algorithm" for one specific flavor. Think of a Grandmaster chess player, looking at the entire board with all the pieces and positions. Their next move depends on that full view of the board, plus a cloud of knowledge about the previous moves made by both sides. This is a great prototype for when to invoke the word "algorithm". In a naive first pass, algorithms resist parallelization: each step depends on the result of the previous one, so they have to run step by step.
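To make that sequential dependency concrete, here is a small Python sketch I'm adding for illustration (nothing chess-specific; the update rule is an arbitrary toy): each iteration consumes the state produced by the one before it, so there is no independent piece of work to hand to separate workers.

```python
def evolve(state: int, steps: int) -> int:
    """Iterate a toy update rule; step k cannot start until step k-1 is done."""
    for _ in range(steps):
        # The next state is a function of the current one, so the
        # loop body forms a single chain of dependent computations.
        state = (state * 6364136223846793005 + 1442695040888963407) % 2**64
    return state

if __name__ == "__main__":
    # The only way to get the result of step N is to compute steps 1..N-1 first.
    print(evolve(42, 1_000_000))
```

The dependency structure, not the cost of any single step, is what makes this the "Grandmaster" flavor of computation.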
In Distributed Systems coursework, the motivation typically begins with the need for speed. We start to accept "good enough" solutions.