
294119
#1
In Section 8.1.2, page 191, the filter function all-greater-than is defined twice: first using a previously defined function, compute-across, and then using the built-in function reduce.

(defn all-greater-than [threshold numbers]
  (compute-across #(if (> %2 threshold) (conj %1 %2) %1) numbers []))

(defn all-greater-than [threshold numbers]
  (reduce #(if (> %2 threshold) (conj %1 %2) %1) [] numbers))

My question is why, in the anonymous function, is the collection always bound to %1 and the next item in the sequence to %2?

I tried switching %1 and %2, thinking that maybe the macro used the expression (conj %1 %2) to determine the binding based on placement, i.e. conj coll item. That blows up with an error about trying to force a vector into a scalar.

I also thought the reason might be the order of the arguments following the anonymous function, numbers and []. But that order is switched between compute-across and reduce.

I can't seem to find the source for the reader macro #(...); I thought maybe the clue was there.

There is no discussion, in this book or anywhere else I can find, that covers the rules for %1, %2 ... %n.

Are there such rules? Or is it just "suck it and see"?
Francis Avila
#2
> I can't seem to find the source for the reader macro #(...), thinking that maybe the clue is there.
>
> There is no discussion, in this book or anywhere else that I can find, that covers the rules for %1, %2...%n.
>
> Are there such rules? Or is it just "suck it and see"?


The #(...) reader macro (anonymous function shorthand) and the meaning of %1, %2, etc. are explained in section 3.3.5.1, pages 77-78.

Basically, #(conj %1 %2) is sugar for (fn [a b] (conj a b)).
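You can check the expansion at a REPL. This is just a throwaway sketch; the names and the value 5 are arbitrary:

```clojure
;; %1 is the first argument, %2 the second; a bare % means %1.
(def shorthand #(conj %1 %2))
(def longhand (fn [a b] (conj a b)))

(shorthand [] 5) ;; => [5]
(longhand [] 5)  ;; => [5]
```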

> In Section 8.1.2, page 191, the filter function, all-greater-than, is defined twice - first using a previously defined function, compute-across; second using built-in function reduce.
>
> (defn all-greater-than [threshold numbers]
>   (compute-across #(if (> %2 threshold) (conj %1 %2) %1) numbers []))
>
> (defn all-greater-than [threshold numbers]
>   (reduce #(if (> %2 threshold) (conj %1 %2) %1) [] numbers))
>
> My question is why, in the anonymous function, is the collection always bound to %1 and the next item in the sequence to %2?


This is the contract that both compute-across and reduce have with the reducing function they take in as an argument. They both call the function you supply with the accumulating value as the first argument and the next item in the consumed sequence as the second argument. This is clearer with a close look at the implementation of compute-across (page 190):

(defn compute-across [func elements value]
  (if (empty? elements)
    value
    (recur func (rest elements) (func value (first elements)))))
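Here is that definition traced on a concrete call. This is my own example, not from the book, and the definition is repeated so the snippet runs standalone:

```clojure
;; Repeated from the book so this snippet is self-contained.
(defn compute-across [func elements value]
  (if (empty? elements)
    value
    (recur func (rest elements) (func value (first elements)))))

;; With threshold 3, the reducing function f is called accumulator-first
;; on every iteration:
;;   (f []  1) => []     1 is not > 3, so the accumulator is returned
;;   (f []  4) => [4]
;;   (f [4] 2) => [4]
;;   (f [4] 5) => [4 5]
(compute-across #(if (> %2 3) (conj %1 %2) %1) [1 4 2 5] [])
;; => [4 5]
```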


What's happening in the last argument of the recur call, (func value (first elements)), looks like this when we replace the variables with their values: (#(conj %1 %2) [] (first numbers)), which is the same as (conj [] (first numbers)). reduce calls its func argument the same way, which is why both definitions behave identically.

Don't be misled by compute-across and reduce taking their arguments in different orders: compute-across is called like (compute-across func coll val) and reduce like (reduce func val coll), but both call func like (func accumulating-val next-item-in-coll).
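The same contract holds for reduce, and swapping the % bindings reproduces the error described above. Again, this is my own sketch, not from the book:

```clojure
;; reduce calls the supplied function as (func accumulating-val next-item):
(reduce #(if (> %2 3) (conj %1 %2) %1) [] [1 4 2 5])
;; => [4 5]

;; With the bindings swapped, the accumulator (a vector) lands where a
;; number is expected: on the first call %1 is [], so (> %1 3) becomes
;; (> [] 3) and throws a ClassCastException, the "vector into a scalar"
;; error:
;; (reduce #(if (> %1 3) (conj %2 %1) %2) [] [1 4 2 5])
```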


294119
#3
Thank you.

I had the wrong end of the stick: I was looking to the #(...) macro to sort it all out, when I should have been looking at the contract that both reduce and compute-across have with the function being applied.

I did see the contract with compute-across when I was investigating, but did not bother to look at the source for reduce.

Your book is awesome, by the way.