LT4 Trekking Pole

Lightest pole
Insanely light trekking pole. The lightness (less than 4 oz) means you can twitch it really fast to catch yourself because the pole doesn’t have a lot of inertia to overcome. It means your arm and hand tire far less in a day of swing-and-place. It means when you lash it to your pack, it adds little to your burden.
This pole has proved its durability for me on a range of hikes, from casual to intense, across a variety of terrain. Adjusting the length with an untwist and retwist to lock is easy and reliable. Since they're usually sold by the pair, you have a spare in reserve. (Trekking with two poles feels to me like skiing without snow: awkward and hand-encumbering. I like to be a three-legged creature in the bush, always able to brace for stability, striding like a pilgrim with staff.)
09/5/12
LT4 Trekking Pole (adjustable)
$80/pole
Available from Gossamer Gear
LT4S (adjustable, with strap)
$88/pole
Available from Gossamer Gear
LT3C Fixed Poles
$110 per pair
Available from
As a philosopher and ethicist at the London School of Economics, Birch has spent years grappling with one of science's most perplexing questions: how do we know if another being is conscious and capable of suffering? His book, The Edge of Sentience, argues that we've been asking the wrong question all along. Instead of demanding absolute proof of consciousness (which may be impossible to obtain), we should focus on identifying "sentience candidates" and taking practical steps to protect them from harm. This isn't just academic theory. Birch's work has already influenced real-world policy: he led the team whose research convinced the UK government to legally recognize lobsters and octopuses as sentient beings. Now he's turning his attention to an even broader range of cases, from human patients with brain injuries to the possibility of conscious AI.

Here are four key insights from the book:

1. "Assume Sentient" When Lives Are at Stake

"A patient [with a prolonged disorder of consciousness] should not be assumed incapable of experience when an important clinical decision is made. All clinical decisions should consider the patient's best interests as comprehensively as possible, working on the precautionary assumption that there is a realistic possibility of valenced experience and a continuing interest in avoiding suffering and in achieving a state of well-being, but without taking this assumption to have implications regarding prognosis."

2. Look Beyond Brain Size and Intelligence

"Sentience is neither intelligence nor brain size. We should be aware of the possibility of decouplings between intelligence, brain size, and sentience in the animal kingdom. Precautions to safeguard animal welfare should be driven by markers of sentience, not by markers of intelligence or by brain size."

3. On the Hidden Nature of Experience

"At least in principle, there can be phenomenal consciousness without valence: experiences that feel like something but feel neither bad nor good. It is not clear that humans can have such experiences (our overall conscious state arguably always contains an element of mood). But we can conceive of a being that has a subjective point of view on the world in which non-valenced states feature (it consciously experiences shapes, colours, sounds, odours, etc.) but in which everything is evaluatively neutral. Such a being would be technically non-sentient according to the definition we have been using, though it would be sentient in a broader sense. Would such a being have the same moral standing as a being with valenced experiences?"

4. On Future AI Risk

"As these models get larger and larger, we have no sense of the upper limit on the sophistication of the algorithms they could implicitly learn... The point at which this judgement shifts from correct to dangerously incorrect will be very hard for us to see. There is a real risk that we will continue to regard these systems as our tools and playthings long after they become sentient."